Introduction

This project is a realization of the ideas I came up with during my time at Event Galaxy, where I spent a month as a website manager.

The original intent was to create an application that would allow a user to provide an image of an advertising inflatable, either obtained online or captured live. The application would then classify this inflatable and redirect the customer to its matching product page on our website.

In our line of work, customers are often confused by the various advertising inflatables available and thus have difficulty communicating their ideas or vision to us.

A huge thanks to my director for allowing me to use the images on the site.

Scraping the Data

The main products & services page contains links to all the products.

[Image: the main products & services page, listing links to every product]

We will start by scraping the URLs of the product pages. Begin by importing the necessary modules.

In [1]:
import requests
from bs4 import BeautifulSoup

url = "http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/"
request = requests.get(url)
soup = BeautifulSoup(request.text, features="html.parser")

# Every product page is listed as an <li class="page_item"> containing a link
products = soup.find_all("li", {"class": "page_item"})
for product in products:
    print(product.find("a")['href'])
http://www.eventgalaxy.com.sg/products/features/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/
http://www.eventgalaxy.com.sg/products/giant-advertising-helium-balloon/
http://www.eventgalaxy.com.sg/products/inflatable-arches/
http://www.eventgalaxy.com.sg/products/advertising-balloons/
http://www.eventgalaxy.com.sg/products/balloon-decorations/
http://www.eventgalaxy.com.sg/products/dancing-tube/
http://www.eventgalaxy.com.sg/products/remote-control-blimp/
http://www.eventgalaxy.com.sg/products/illuminated-advertising-inflatables/
http://www.eventgalaxy.com.sg/products/carnival-games/
http://www.eventgalaxy.com.sg/products/arcade-games/
http://www.eventgalaxy.com.sg/products/inflatable-games/
http://www.eventgalaxy.com.sg/products/inflatable-movie-screen/
http://www.eventgalaxy.com.sg/products/snow-globe/
http://www.eventgalaxy.com.sg/products/kiddie-rides/
http://www.eventgalaxy.com.sg/products/launch-mechanism/
http://www.eventgalaxy.com.sg/products/events-support/
http://www.eventgalaxy.com.sg/products/event-management/
http://www.eventgalaxy.com.sg/products/festive-decorations/
http://www.eventgalaxy.com.sg/products/human-claw-machine/
http://www.eventgalaxy.com.sg/products/templates/

By inspecting the page elements in Chrome, we can identify the tags needed to obtain the required URLs.
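
To illustrate the structure this relies on, here is a tiny, made-up fragment in the same shape as the site's product menu (not copied from the site), run through the same selector:

from bs4 import BeautifulSoup

# Made-up markup: each product link sits inside an <li class="page_item">
sample_html = """
<ul>
  <li class="page_item"><a href="http://www.eventgalaxy.com.sg/products/inflatable-arches/">Inflatable Arches</a></li>
  <li class="page_item"><a href="http://www.eventgalaxy.com.sg/products/dancing-tube/">Dancing Tube</a></li>
</ul>
"""
sample_soup = BeautifulSoup(sample_html, features="html.parser")
print([li.find("a")["href"] for li in sample_soup.find_all("li", {"class": "page_item"})])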

Opening up a specific product page and scrolling down reveals several categories for this particular product.

[Image: a product page showing the categories under Lighted Balloon Stands]

Obtaining the URL for each category is slightly trickier.

In [2]:
url = "http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/"
request = requests.get(url)
soup = BeautifulSoup(request.text,features="html.parser")
start_at = soup.find("li",{"class":"page_item page-item-1742"})
shortened_soup = start_at.find_all_previous("li")

for category in shortened_soup:
    try: 
        categoryUrl = category.find("a")['href']
        print(categoryUrl)
    except:
        continue
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/battery-operated-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/color-jacket-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/custom-made-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/exhibitions/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/corporate-exhibitions/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/property-launch-events/
http://www.eventgalaxy.com.sg
http://www.eventgalaxy.com.sg/contact-us/
http://www.eventgalaxy.com.sg/catalogue/
http://www.eventgalaxy.com.sg/portfolio/
http://www.eventgalaxy.com.sg/clients/
http://www.eventgalaxy.com.sg/about-us/our-quality-policy/
http://www.eventgalaxy.com.sg/about-us/our-management-team/
http://www.eventgalaxy.com.sg/about-us/our-philosophy/
http://www.eventgalaxy.com.sg/about-us/company-profile/
http://eventgalaxy.com.sg/about-us/our-philosophy/
http://www.eventgalaxy.com.sg/products/launch-mechanism/
http://www.eventgalaxy.com.sg/products/carnival-games/
http://www.eventgalaxy.com.sg/products/inflatable-games/
http://www.eventgalaxy.com.sg/products/illuminated-advertising-inflatables/
http://www.eventgalaxy.com.sg/products/dancing-tube/
http://www.eventgalaxy.com.sg/products/advertising-balloons/
http://www.eventgalaxy.com.sg/products/inflatable-arches/
http://www.eventgalaxy.com.sg/products/giant-advertising-helium-balloon/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/
http://eventgalaxy.com.sg/products/lighted-balloon-stands/
http://eventgalaxy.com.sg/
https://www.youtube.com/user/eventgalaxy?feature=watch
https://www.facebook.com/eventgalaxysg/
mailto:eric@eventgalaxy.com.sg
tel:+6597623936

We amend the loop to skip the irrelevant links: the site-wide navigation links start at the homepage URL, so we stop there.

In [3]:
for category in shortened_soup:
    try:
        categoryUrl = category.find("a")['href']
        # The site-wide navigation links begin with the homepage URL; stop there
        if categoryUrl == "http://www.eventgalaxy.com.sg":
            break
        print(categoryUrl)
    except (TypeError, KeyError):
        continue
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/battery-operated-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/color-jacket-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/custom-made-lbs/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/exhibitions/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/corporate-exhibitions/
http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/property-launch-events/

Clicking into a specific category brings up all the images under that category. These images are what we want to scrape for product classification.

[Image: a category page showing its product images]

The code snippet below scrapes and displays the first image, for illustration. In the actual code used (download_images.py), we save the images to our hard disk instead; a rough sketch of that step follows the snippet.

In [4]:
import matplotlib.pyplot as plt
import requests
from PIL import Image
from io import BytesIO
%matplotlib inline 

url = "http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/features/property-launch-events/"
request = requests.get(url)
soup = BeautifulSoup(request.text, features="html.parser")

# Each image on the category page sits inside a <div class="productCatPhoto">
imgs = soup.find_all("div", {"class": "productCatPhoto"})
for img in imgs:
    img = img.find("img")
    # Fetch the image bytes and display them inline
    im = Image.open(BytesIO(requests.get(img["src"]).content))
    plt.imshow(im)
    plt.show()
    break  # only the first image, for illustration
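
As a minimal sketch of how the disk-saving step in download_images.py might look (the function name, folder layout and file naming here are illustrative assumptions, not the actual script):

import os
import requests
from bs4 import BeautifulSoup

def download_category_images(category_url, out_dir="images"):
    # Save every image on a category page into a folder named after the
    # category slug, e.g. images/property-launch-events/
    soup = BeautifulSoup(requests.get(category_url).text, features="html.parser")
    category_name = category_url.rstrip("/").split("/")[-1]
    folder = os.path.join(out_dir, category_name)
    os.makedirs(folder, exist_ok=True)

    for i, div in enumerate(soup.find_all("div", {"class": "productCatPhoto"})):
        img_url = div.find("img")["src"]
        with open(os.path.join(folder, f"{category_name}_{i}.jpg"), "wb") as f:
            f.write(requests.get(img_url).content)

download_category_images(url)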

By merging the three scripts (find_urls.py, find_categories.py and download_images.py) into one (download_all_images.py), we can download all the images on the site in an orderly manner. A sketch of how the pieces might fit together is shown below.
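
A minimal sketch of how download_all_images.py might combine the steps above; the function names and structure are my assumptions, and it reuses download_category_images from the previous sketch:

import requests
from bs4 import BeautifulSoup

def get_product_urls(start_url):
    # find_urls.py step: every product is an <li class="page_item"> with a link
    soup = BeautifulSoup(requests.get(start_url).text, features="html.parser")
    return [li.find("a")["href"] for li in soup.find_all("li", {"class": "page_item"})]

def get_category_urls(product_url):
    # find_categories.py step: walk backwards from the anchor <li>, stopping at
    # the site-wide navigation links; pages without the anchor yield no categories
    soup = BeautifulSoup(requests.get(product_url).text, features="html.parser")
    start_at = soup.find("li", {"class": "page_item page-item-1742"})
    if start_at is None:
        return []
    urls = []
    for li in start_at.find_all_previous("li"):
        try:
            href = li.find("a")["href"]
        except (TypeError, KeyError):
            continue
        if href == "http://www.eventgalaxy.com.sg":
            break
        urls.append(href)
    return urls

# download_images.py step: download_category_images() as sketched above
for product_url in get_product_urls("http://www.eventgalaxy.com.sg/products/lighted-balloon-stands/"):
    for category_url in get_category_urls(product_url):
        download_category_images(category_url)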

[Image: the downloaded images, organised by category]

That's the end of the scraping component of the project. Next up: product classification using convolutional neural networks!

Product Classification

In [1]:
import cv2
import numpy as np
import os
import tensorflow as tf
import matplotlib.pyplot as plt
%matplotlib inline

# Map integer labels to class names (one class name per line in classes5.txt)
class_dict = {}
with open("classes5.txt", "r") as txt:
    classes = [line.strip() for line in txt.readlines()]
    class_dict = dict(zip(range(len(classes)), classes))

def loadImages(root, txt):
    # Each line of the label file is "<image filename> <integer label>"
    with open(txt, "r") as f:
        all_imgs = []
        all_classes = []
        for line in f.readlines():
            imgname = line.split(" ")[0]
            img_arr = cv2.imread(root + imgname)
            img_arr = img_arr[:, :, ::-1]              # BGR (OpenCV default) -> RGB
            img_arr = cv2.resize(img_arr, (128, 128))  # fixed input size for the network
            all_imgs.append(img_arr)
            label = int(line.split(" ")[1].strip("\n"))
            all_classes.append(label)
        return np.array(all_imgs), np.array(all_classes)

train_root = "D:\\Projects\\train\\"
test_root = "D:\\Projects\\test\\"
X_train, y_train = loadImages(train_root, "D:\\Projects\\Advertising\\train.txt")
X_test, y_test = loadImages(test_root, "D:\\Projects\\Advertising\\test.txt")
# Normalise pixel values to be between 0 and 1
X_train, X_test = X_train/255.0, X_test/255.0

print(X_train.shape)

plt.figure(figsize=(10,10))
for i in range(25):
    plt.subplot(5,5,i+1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(X_train[i], cmap=plt.cm.binary)
    plt.xlabel(class_dict[y_train[i]])
plt.show()
(511, 128, 128, 3)
In [2]:
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential()
model.add(layers.Conv2D(32,(3,3), activation = "relu", input_shape =(128,128,3))) 
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(5, activation='softmax'))

model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 126, 126, 32)      896       
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 63, 63, 32)        0         
_________________________________________________________________
flatten (Flatten)            (None, 127008)            0         
_________________________________________________________________
dense (Dense)                (None, 64)                8128576   
_________________________________________________________________
dense_1 (Dense)              (None, 5)                 325       
=================================================================
Total params: 8,129,797
Trainable params: 8,129,797
Non-trainable params: 0
_________________________________________________________________
In [3]:
model.compile(optimizer='adam', 
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
history = model.fit(X_train, y_train, epochs=100, batch_size = 32)
Epoch 1/100
511/511 [==============================] - 3s 5ms/sample - loss: 7.8350 - accuracy: 0.2446
Epoch 2/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.6060 - accuracy: 0.2740
Epoch 3/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.4184 - accuracy: 0.3933
Epoch 4/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.2526 - accuracy: 0.5010
Epoch 5/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.0886 - accuracy: 0.5949
Epoch 6/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.8424 - accuracy: 0.6869
Epoch 7/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.6667 - accuracy: 0.7906
Epoch 8/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.4878 - accuracy: 0.8826
Epoch 9/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.3512 - accuracy: 0.9569
Epoch 10/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.2540 - accuracy: 0.9785
Epoch 11/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.1771 - accuracy: 0.9863
Epoch 12/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.1274 - accuracy: 0.9980
Epoch 13/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0916 - accuracy: 0.9980
Epoch 14/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0644 - accuracy: 0.9980
Epoch 15/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0608 - accuracy: 1.0000
Epoch 16/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0383 - accuracy: 1.0000
Epoch 17/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0283 - accuracy: 1.0000
Epoch 18/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0246 - accuracy: 1.0000
Epoch 19/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0195 - accuracy: 1.0000
Epoch 20/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0161 - accuracy: 1.0000
Epoch 21/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0132 - accuracy: 1.0000
Epoch 22/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0110 - accuracy: 1.0000
Epoch 23/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0095 - accuracy: 1.0000
Epoch 24/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0081 - accuracy: 1.0000
Epoch 25/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0074 - accuracy: 1.0000
Epoch 26/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0064 - accuracy: 1.0000
Epoch 27/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0055 - accuracy: 1.0000
Epoch 28/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0051 - accuracy: 1.0000
Epoch 29/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0046 - accuracy: 1.0000
Epoch 30/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0041 - accuracy: 1.0000
Epoch 31/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0037 - accuracy: 1.0000
Epoch 32/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0034 - accuracy: 1.0000
Epoch 33/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0032 - accuracy: 1.0000
Epoch 34/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0029 - accuracy: 1.0000
Epoch 35/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0027 - accuracy: 1.0000
Epoch 36/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0026 - accuracy: 1.0000
Epoch 37/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0024 - accuracy: 1.0000
Epoch 38/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0022 - accuracy: 1.0000
Epoch 39/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0021 - accuracy: 1.0000
Epoch 40/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0020 - accuracy: 1.0000
Epoch 41/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0019 - accuracy: 1.0000
Epoch 42/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0018 - accuracy: 1.0000
Epoch 43/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0017 - accuracy: 1.0000
Epoch 44/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0016 - accuracy: 1.0000
Epoch 45/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0015 - accuracy: 1.0000
Epoch 46/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0014 - accuracy: 1.0000
Epoch 47/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0014 - accuracy: 1.0000
Epoch 48/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0013 - accuracy: 1.0000
Epoch 49/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0012 - accuracy: 1.0000
Epoch 50/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0012 - accuracy: 1.0000
Epoch 51/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0011 - accuracy: 1.0000
Epoch 52/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0011 - accuracy: 1.0000
Epoch 53/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0010 - accuracy: 1.0000
Epoch 54/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.7434e-04 - accuracy: 1.0000
Epoch 55/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.3298e-04 - accuracy: 1.0000
Epoch 56/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.9829e-04 - accuracy: 1.0000
Epoch 57/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.6587e-04 - accuracy: 1.0000
Epoch 58/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.2371e-04 - accuracy: 1.0000
Epoch 59/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.9426e-04 - accuracy: 1.0000
Epoch 60/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.6171e-04 - accuracy: 1.0000
Epoch 61/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.3598e-04 - accuracy: 1.0000
Epoch 62/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.1125e-04 - accuracy: 1.0000
Epoch 63/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.8337e-04 - accuracy: 1.0000
Epoch 64/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.6053e-04 - accuracy: 1.0000
Epoch 65/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.3632e-04 - accuracy: 1.0000
Epoch 66/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.1759e-04 - accuracy: 1.0000
Epoch 67/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.9586e-04 - accuracy: 1.0000
Epoch 68/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.7930e-04 - accuracy: 1.0000
Epoch 69/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.5658e-04 - accuracy: 1.0000
Epoch 70/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.4141e-04 - accuracy: 1.0000
Epoch 71/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.2381e-04 - accuracy: 1.0000
Epoch 72/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.0496e-04 - accuracy: 1.0000
Epoch 73/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.8806e-04 - accuracy: 1.0000
Epoch 74/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.7341e-04 - accuracy: 1.0000
Epoch 75/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.5937e-04 - accuracy: 1.0000
Epoch 76/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.4613e-04 - accuracy: 1.0000
Epoch 77/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.3173e-04 - accuracy: 1.0000
Epoch 78/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.2343e-04 - accuracy: 1.0000
Epoch 79/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.0777e-04 - accuracy: 1.0000
Epoch 80/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.9655e-04 - accuracy: 1.0000
Epoch 81/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.8701e-04 - accuracy: 1.0000
Epoch 82/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.7550e-04 - accuracy: 1.0000
Epoch 83/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.6410e-04 - accuracy: 1.0000
Epoch 84/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.5395e-04 - accuracy: 1.0000
Epoch 85/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.4747e-04 - accuracy: 1.0000
Epoch 86/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.3820e-04 - accuracy: 1.0000
Epoch 87/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.2670e-04 - accuracy: 1.0000
Epoch 88/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.1820e-04 - accuracy: 1.0000
Epoch 89/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.1080e-04 - accuracy: 1.0000
Epoch 90/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.0358e-04 - accuracy: 1.0000
Epoch 91/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.9722e-04 - accuracy: 1.0000
Epoch 92/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.8816e-04 - accuracy: 1.0000
Epoch 93/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.8039e-04 - accuracy: 1.0000
Epoch 94/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.7418e-04 - accuracy: 1.0000
Epoch 95/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.6992e-04 - accuracy: 1.0000
Epoch 96/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.6108e-04 - accuracy: 1.0000
Epoch 97/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.5386e-04 - accuracy: 1.0000
Epoch 98/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.4766e-04 - accuracy: 1.0000
Epoch 99/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.4246e-04 - accuracy: 1.0000
Epoch 100/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.3742e-04 - accuracy: 1.0000
In [4]:
# summarize history for accuracy
plt.plot(history.history['accuracy'])
# plt.plot(history.history['val_accuracy'])  # no validation split was used
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train'], loc='upper left')  # only training accuracy is plotted
plt.show()
In [5]:
# summarize history for loss
plt.plot(history.history['loss'])
# plt.plot(history.history['val_loss'])  # no validation split was used
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train'], loc='upper left')  # only training loss is plotted
plt.show()
In [6]:
test_loss, test_acc = model.evaluate(X_test, y_test)
print('Test accuracy:', test_acc)
222/222 [==============================] - 0s 1ms/sample - loss: 2.2503 - accuracy: 0.5270
Test accuracy: 0.527027
In [7]:
model = models.Sequential()
model.add(layers.Conv2D(32,(3,3), activation = "relu", input_shape =(128,128,3))) 
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(5, activation='softmax'))

model.compile(optimizer='adam', 
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
history = model.fit(X_train, y_train, epochs=100, batch_size = 32)

test_loss, test_acc = model.evaluate(X_test, y_test)
print('Test accuracy:', test_acc)
Epoch 1/100
511/511 [==============================] - 1s 2ms/sample - loss: 1.6787 - accuracy: 0.2720
Epoch 2/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.4808 - accuracy: 0.3620
Epoch 3/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.2831 - accuracy: 0.4775
Epoch 4/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.0304 - accuracy: 0.6106
Epoch 5/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.8082 - accuracy: 0.6967
Epoch 6/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.5823 - accuracy: 0.8180
Epoch 7/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.3956 - accuracy: 0.8708
Epoch 8/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.2517 - accuracy: 0.9159
Epoch 9/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.1668 - accuracy: 0.9452
Epoch 10/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.1265 - accuracy: 0.9687
Epoch 11/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0755 - accuracy: 0.9843
Epoch 12/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0716 - accuracy: 0.9843
Epoch 13/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0285 - accuracy: 0.9961
Epoch 14/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0057 - accuracy: 1.0000
Epoch 15/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0029 - accuracy: 1.0000
Epoch 16/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0018 - accuracy: 1.0000
Epoch 17/100
511/511 [==============================] - 1s 1ms/sample - loss: 0.0011 - accuracy: 1.0000
Epoch 18/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.7649e-04 - accuracy: 1.0000
Epoch 19/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.6500e-04 - accuracy: 1.0000
Epoch 20/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.7770e-04 - accuracy: 1.0000
Epoch 21/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.9833e-04 - accuracy: 1.0000
Epoch 22/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.4392e-04 - accuracy: 1.0000
Epoch 23/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.0206e-04 - accuracy: 1.0000
Epoch 24/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.6578e-04 - accuracy: 1.0000
Epoch 25/100
511/511 [==============================] - 1s 2ms/sample - loss: 3.3282e-04 - accuracy: 1.0000
Epoch 26/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.0678e-04 - accuracy: 1.0000
Epoch 27/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.8507e-04 - accuracy: 1.0000
Epoch 28/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.6466e-04 - accuracy: 1.0000
Epoch 29/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.4722e-04 - accuracy: 1.0000
Epoch 30/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.2953e-04 - accuracy: 1.0000
Epoch 31/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.1487e-04 - accuracy: 1.0000
Epoch 32/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.0213e-04 - accuracy: 1.0000
Epoch 33/100
511/511 [==============================] - 1s 2ms/sample - loss: 1.9071e-04 - accuracy: 1.0000
Epoch 34/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.8010e-04 - accuracy: 1.0000
Epoch 35/100
511/511 [==============================] - 1s 2ms/sample - loss: 1.6906e-04 - accuracy: 1.0000
Epoch 36/100
511/511 [==============================] - 1s 2ms/sample - loss: 1.5980e-04 - accuracy: 1.0000
Epoch 37/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.5194e-04 - accuracy: 1.0000
Epoch 38/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.4375e-04 - accuracy: 1.0000
Epoch 39/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.3702e-04 - accuracy: 1.0000
Epoch 40/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.3065e-04 - accuracy: 1.0000
Epoch 41/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.2448e-04 - accuracy: 1.0000
Epoch 42/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.1868e-04 - accuracy: 1.0000
Epoch 43/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.1272e-04 - accuracy: 1.0000
Epoch 44/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.0826e-04 - accuracy: 1.0000
Epoch 45/100
511/511 [==============================] - 1s 1ms/sample - loss: 1.0309e-04 - accuracy: 1.0000
Epoch 46/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.8872e-05 - accuracy: 1.0000
Epoch 47/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.4619e-05 - accuracy: 1.0000
Epoch 48/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.1029e-05 - accuracy: 1.0000
Epoch 49/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.7385e-05 - accuracy: 1.0000
Epoch 50/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.3786e-05 - accuracy: 1.0000
Epoch 51/100
511/511 [==============================] - 1s 1ms/sample - loss: 8.0740e-05 - accuracy: 1.0000
Epoch 52/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.7742e-05 - accuracy: 1.0000
Epoch 53/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.4898e-05 - accuracy: 1.0000
Epoch 54/100
511/511 [==============================] - 1s 1ms/sample - loss: 7.2111e-05 - accuracy: 1.0000
Epoch 55/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.9641e-05 - accuracy: 1.0000
Epoch 56/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.7347e-05 - accuracy: 1.0000
Epoch 57/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.4688e-05 - accuracy: 1.0000
Epoch 58/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.2610e-05 - accuracy: 1.0000
Epoch 59/100
511/511 [==============================] - 1s 1ms/sample - loss: 6.0733e-05 - accuracy: 1.0000
Epoch 60/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.8655e-05 - accuracy: 1.0000
Epoch 61/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.6928e-05 - accuracy: 1.0000
Epoch 62/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.4976e-05 - accuracy: 1.0000
Epoch 63/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.3187e-05 - accuracy: 1.0000
Epoch 64/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.1510e-05 - accuracy: 1.0000
Epoch 65/100
511/511 [==============================] - 1s 1ms/sample - loss: 5.0063e-05 - accuracy: 1.0000
Epoch 66/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.8345e-05 - accuracy: 1.0000
Epoch 67/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.6990e-05 - accuracy: 1.0000
Epoch 68/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.5662e-05 - accuracy: 1.0000
Epoch 69/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.4393e-05 - accuracy: 1.0000
Epoch 70/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.3109e-05 - accuracy: 1.0000
Epoch 71/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.1942e-05 - accuracy: 1.0000
Epoch 72/100
511/511 [==============================] - 1s 1ms/sample - loss: 4.0743e-05 - accuracy: 1.0000
Epoch 73/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.9623e-05 - accuracy: 1.0000
Epoch 74/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.8705e-05 - accuracy: 1.0000
Epoch 75/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.7553e-05 - accuracy: 1.0000
Epoch 76/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.6548e-05 - accuracy: 1.0000
Epoch 77/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.5644e-05 - accuracy: 1.0000
Epoch 78/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.4836e-05 - accuracy: 1.0000
Epoch 79/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.3805e-05 - accuracy: 1.0000
Epoch 80/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.3030e-05 - accuracy: 1.0000
Epoch 81/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.2161e-05 - accuracy: 1.0000
Epoch 82/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.1343e-05 - accuracy: 1.0000
Epoch 83/100
511/511 [==============================] - 1s 1ms/sample - loss: 3.0658e-05 - accuracy: 1.0000
Epoch 84/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.9883e-05 - accuracy: 1.0000
Epoch 85/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.9187e-05 - accuracy: 1.0000
Epoch 86/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.8461e-05 - accuracy: 1.0000
Epoch 87/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.7793e-05 - accuracy: 1.0000
Epoch 88/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.7181e-05 - accuracy: 1.0000
Epoch 89/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.6526e-05 - accuracy: 1.0000
Epoch 90/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.5986e-05 - accuracy: 1.0000
Epoch 91/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.5413e-05 - accuracy: 1.0000
Epoch 92/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.4785e-05 - accuracy: 1.0000
Epoch 93/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.4260e-05 - accuracy: 1.0000
Epoch 94/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.3801e-05 - accuracy: 1.0000
Epoch 95/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.3267e-05 - accuracy: 1.0000
Epoch 96/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.2706e-05 - accuracy: 1.0000
Epoch 97/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.2321e-05 - accuracy: 1.0000
Epoch 98/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.1751e-05 - accuracy: 1.0000
Epoch 99/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.1333e-05 - accuracy: 1.0000
Epoch 100/100
511/511 [==============================] - 1s 1ms/sample - loss: 2.0882e-05 - accuracy: 1.0000
222/222 [==============================] - 0s 1ms/sample - loss: 4.0695 - accuracy: 0.5270
Test accuracy: 0.527027
In [8]:
model = models.Sequential()
model.add(layers.Flatten(input_shape =(128,128,3)))
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(32, activation='relu'))
model.add(layers.Dense(5, activation='softmax'))

model.compile(optimizer='adam', 
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
history = model.fit(X_train, y_train, epochs=100, batch_size = 32)

test_loss, test_acc = model.evaluate(X_test, y_test)
print('Test accuracy:', test_acc)
Epoch 1/100
511/511 [==============================] - 1s 1ms/sample - loss: 9.2519 - accuracy: 0.2427
Epoch 2/100
511/511 [==============================] - 0s 718us/sample - loss: 2.8534 - accuracy: 0.3072
Epoch 3/100
511/511 [==============================] - 0s 720us/sample - loss: 1.3991 - accuracy: 0.4736
Epoch 4/100
511/511 [==============================] - 0s 755us/sample - loss: 1.4476 - accuracy: 0.4481
Epoch 5/100
511/511 [==============================] - 0s 800us/sample - loss: 1.3486 - accuracy: 0.4403
Epoch 6/100
511/511 [==============================] - 0s 772us/sample - loss: 1.3028 - accuracy: 0.4775
Epoch 7/100
511/511 [==============================] - 0s 732us/sample - loss: 1.1407 - accuracy: 0.5460
Epoch 8/100
511/511 [==============================] - 0s 730us/sample - loss: 1.0374 - accuracy: 0.6204
Epoch 9/100
511/511 [==============================] - 0s 724us/sample - loss: 1.0436 - accuracy: 0.5949
Epoch 10/100
511/511 [==============================] - 0s 732us/sample - loss: 1.0708 - accuracy: 0.5656
Epoch 11/100
511/511 [==============================] - 0s 742us/sample - loss: 0.8617 - accuracy: 0.6928
Epoch 12/100
511/511 [==============================] - 0s 744us/sample - loss: 0.8300 - accuracy: 0.6888
Epoch 13/100
511/511 [==============================] - 0s 755us/sample - loss: 0.8394 - accuracy: 0.7084
Epoch 14/100
511/511 [==============================] - 0s 765us/sample - loss: 0.8162 - accuracy: 0.7065
Epoch 15/100
511/511 [==============================] - 0s 773us/sample - loss: 0.7669 - accuracy: 0.7280
Epoch 16/100
511/511 [==============================] - 0s 780us/sample - loss: 0.7407 - accuracy: 0.7221
Epoch 17/100
511/511 [==============================] - 0s 747us/sample - loss: 0.6644 - accuracy: 0.7828
Epoch 18/100
511/511 [==============================] - 0s 736us/sample - loss: 0.6014 - accuracy: 0.8004
Epoch 19/100
511/511 [==============================] - 0s 730us/sample - loss: 0.5567 - accuracy: 0.8337
Epoch 20/100
511/511 [==============================] - 0s 728us/sample - loss: 0.5915 - accuracy: 0.8200
Epoch 21/100
511/511 [==============================] - 0s 771us/sample - loss: 0.6913 - accuracy: 0.7495
Epoch 22/100
511/511 [==============================] - 0s 755us/sample - loss: 0.4915 - accuracy: 0.8454
Epoch 23/100
511/511 [==============================] - 0s 800us/sample - loss: 0.4224 - accuracy: 0.8963
Epoch 24/100
511/511 [==============================] - 0s 817us/sample - loss: 0.4419 - accuracy: 0.8845
Epoch 25/100
511/511 [==============================] - 0s 788us/sample - loss: 0.5061 - accuracy: 0.8297
Epoch 26/100
511/511 [==============================] - 0s 753us/sample - loss: 0.6042 - accuracy: 0.8004
Epoch 27/100
511/511 [==============================] - 0s 780us/sample - loss: 0.3642 - accuracy: 0.8924
Epoch 28/100
511/511 [==============================] - 0s 769us/sample - loss: 0.3804 - accuracy: 0.9041
Epoch 29/100
511/511 [==============================] - 0s 765us/sample - loss: 0.2919 - accuracy: 0.9393
Epoch 30/100
511/511 [==============================] - 0s 782us/sample - loss: 0.4891 - accuracy: 0.8219
Epoch 31/100
511/511 [==============================] - 0s 789us/sample - loss: 0.4607 - accuracy: 0.8591
Epoch 32/100
511/511 [==============================] - 0s 728us/sample - loss: 0.3796 - accuracy: 0.8924
Epoch 33/100
511/511 [==============================] - 0s 757us/sample - loss: 0.5125 - accuracy: 0.8141
Epoch 34/100
511/511 [==============================] - 0s 794us/sample - loss: 0.2592 - accuracy: 0.9472
Epoch 35/100
511/511 [==============================] - 0s 762us/sample - loss: 0.2533 - accuracy: 0.9452
Epoch 36/100
511/511 [==============================] - 0s 743us/sample - loss: 0.2254 - accuracy: 0.9589
Epoch 37/100
511/511 [==============================] - 0s 740us/sample - loss: 0.1834 - accuracy: 0.9765
Epoch 38/100
511/511 [==============================] - 0s 740us/sample - loss: 0.1465 - accuracy: 0.9922
Epoch 39/100
511/511 [==============================] - 0s 733us/sample - loss: 0.1502 - accuracy: 0.9863
Epoch 40/100
511/511 [==============================] - 0s 738us/sample - loss: 0.1639 - accuracy: 0.9726
Epoch 41/100
511/511 [==============================] - 0s 737us/sample - loss: 0.4430 - accuracy: 0.8571
Epoch 42/100
511/511 [==============================] - 0s 780us/sample - loss: 0.6082 - accuracy: 0.7965
Epoch 43/100
511/511 [==============================] - 0s 737us/sample - loss: 0.3585 - accuracy: 0.8728
Epoch 44/100
511/511 [==============================] - 0s 741us/sample - loss: 0.2880 - accuracy: 0.9022
Epoch 45/100
511/511 [==============================] - 0s 751us/sample - loss: 0.1294 - accuracy: 0.9726
Epoch 46/100
511/511 [==============================] - 0s 793us/sample - loss: 0.0848 - accuracy: 0.9863
Epoch 47/100
511/511 [==============================] - 0s 765us/sample - loss: 0.0863 - accuracy: 0.9922
Epoch 48/100
511/511 [==============================] - 0s 772us/sample - loss: 0.0744 - accuracy: 0.9961
Epoch 49/100
511/511 [==============================] - 0s 729us/sample - loss: 0.0773 - accuracy: 0.9863
Epoch 50/100
511/511 [==============================] - 0s 748us/sample - loss: 0.0501 - accuracy: 1.0000
Epoch 51/100
511/511 [==============================] - 0s 908us/sample - loss: 0.0461 - accuracy: 1.0000
Epoch 52/100
511/511 [==============================] - 0s 826us/sample - loss: 0.0391 - accuracy: 0.9980
Epoch 53/100
511/511 [==============================] - 0s 738us/sample - loss: 0.0428 - accuracy: 0.9980
Epoch 54/100
511/511 [==============================] - 0s 747us/sample - loss: 0.0341 - accuracy: 0.9980
Epoch 55/100
511/511 [==============================] - 0s 748us/sample - loss: 0.0380 - accuracy: 0.9980
Epoch 56/100
511/511 [==============================] - 0s 768us/sample - loss: 0.0291 - accuracy: 1.0000
Epoch 57/100
511/511 [==============================] - 0s 807us/sample - loss: 0.0278 - accuracy: 1.0000
Epoch 58/100
511/511 [==============================] - 0s 758us/sample - loss: 0.0313 - accuracy: 0.9980
Epoch 59/100
511/511 [==============================] - 0s 785us/sample - loss: 0.0406 - accuracy: 0.9980
Epoch 60/100
511/511 [==============================] - 0s 768us/sample - loss: 0.0328 - accuracy: 1.0000
Epoch 61/100
511/511 [==============================] - 0s 775us/sample - loss: 0.0206 - accuracy: 1.0000
Epoch 62/100
511/511 [==============================] - 0s 741us/sample - loss: 0.0220 - accuracy: 1.0000
Epoch 63/100
511/511 [==============================] - 0s 741us/sample - loss: 0.0202 - accuracy: 1.0000
Epoch 64/100
511/511 [==============================] - 0s 809us/sample - loss: 0.0176 - accuracy: 1.0000
Epoch 65/100
511/511 [==============================] - 0s 772us/sample - loss: 0.0161 - accuracy: 1.0000
Epoch 66/100
511/511 [==============================] - 0s 791us/sample - loss: 0.0156 - accuracy: 1.0000
Epoch 67/100
511/511 [==============================] - 0s 838us/sample - loss: 0.0153 - accuracy: 1.0000
Epoch 68/100
511/511 [==============================] - 0s 838us/sample - loss: 0.0145 - accuracy: 1.0000
Epoch 69/100
511/511 [==============================] - 0s 849us/sample - loss: 0.0161 - accuracy: 1.0000
Epoch 70/100
511/511 [==============================] - 0s 803us/sample - loss: 0.0151 - accuracy: 1.0000
Epoch 71/100
511/511 [==============================] - 0s 774us/sample - loss: 0.0130 - accuracy: 1.0000
Epoch 72/100
511/511 [==============================] - 0s 779us/sample - loss: 0.0116 - accuracy: 1.0000
Epoch 73/100
511/511 [==============================] - 0s 773us/sample - loss: 0.0126 - accuracy: 1.0000
Epoch 74/100
511/511 [==============================] - 0s 803us/sample - loss: 0.0111 - accuracy: 1.0000
Epoch 75/100
511/511 [==============================] - 0s 827us/sample - loss: 0.0114 - accuracy: 1.0000
Epoch 76/100
511/511 [==============================] - 0s 790us/sample - loss: 0.0123 - accuracy: 1.0000
Epoch 77/100
511/511 [==============================] - 0s 752us/sample - loss: 0.0107 - accuracy: 1.0000
Epoch 78/100
511/511 [==============================] - 0s 738us/sample - loss: 0.0104 - accuracy: 1.0000
Epoch 79/100
511/511 [==============================] - 0s 747us/sample - loss: 0.0090 - accuracy: 1.0000
Epoch 80/100
511/511 [==============================] - 0s 743us/sample - loss: 0.0085 - accuracy: 1.0000
Epoch 81/100
511/511 [==============================] - 0s 753us/sample - loss: 0.0078 - accuracy: 1.0000
Epoch 82/100
511/511 [==============================] - 0s 758us/sample - loss: 0.0076 - accuracy: 1.0000
Epoch 83/100
511/511 [==============================] - 0s 753us/sample - loss: 0.0075 - accuracy: 1.0000
Epoch 84/100
511/511 [==============================] - 0s 745us/sample - loss: 0.0087 - accuracy: 1.0000
Epoch 85/100
511/511 [==============================] - 0s 746us/sample - loss: 0.0077 - accuracy: 1.0000
Epoch 86/100
511/511 [==============================] - 0s 753us/sample - loss: 0.0070 - accuracy: 1.0000
Epoch 87/100
511/511 [==============================] - 0s 738us/sample - loss: 0.0067 - accuracy: 1.0000
Epoch 88/100
511/511 [==============================] - 0s 742us/sample - loss: 0.0070 - accuracy: 1.0000
Epoch 89/100
511/511 [==============================] - 0s 737us/sample - loss: 0.0071 - accuracy: 1.0000
Epoch 90/100
511/511 [==============================] - 0s 734us/sample - loss: 0.0066 - accuracy: 1.0000
Epoch 91/100
511/511 [==============================] - 0s 724us/sample - loss: 0.0059 - accuracy: 1.0000
Epoch 92/100
511/511 [==============================] - 0s 714us/sample - loss: 0.0055 - accuracy: 1.0000
Epoch 93/100
511/511 [==============================] - 0s 713us/sample - loss: 0.0054 - accuracy: 1.0000
Epoch 94/100
511/511 [==============================] - 0s 702us/sample - loss: 0.0053 - accuracy: 1.0000
Epoch 95/100
511/511 [==============================] - 0s 711us/sample - loss: 0.0052 - accuracy: 1.0000
Epoch 96/100
511/511 [==============================] - 0s 715us/sample - loss: 0.0049 - accuracy: 1.0000
Epoch 97/100
511/511 [==============================] - 0s 744us/sample - loss: 0.0049 - accuracy: 1.0000
Epoch 98/100
511/511 [==============================] - 0s 781us/sample - loss: 0.0049 - accuracy: 1.0000
Epoch 99/100
511/511 [==============================] - 0s 744us/sample - loss: 0.0044 - accuracy: 1.0000
Epoch 100/100
511/511 [==============================] - 0s 723us/sample - loss: 0.0044 - accuracy: 1.0000
222/222 [==============================] - 0s 954us/sample - loss: 2.6878 - accuracy: 0.4595
Test accuracy: 0.45945945
In [84]:
predictions = model.predict(X_test)
In [85]:
def convertPredToLabels(predictions):
    # Take the index of the highest softmax score as the predicted label
    labels = []
    for i in range(len(predictions)):
        labels.append(np.argmax(predictions[i]))
    return labels
In [73]:
pred_labels = convertPredToLabels(predictions)
In [35]:
def tabulateResults(pred_labels, actual_labels):
    # Count the number of correct predictions for each class
    counter = dict(zip(range(len(classes)), np.zeros(len(classes))))
    for i in range(len(pred_labels)):
        if pred_labels[i] == actual_labels[i]:
            counter[pred_labels[i]] += 1
    return counter
In [36]:
tabulateResults(pred_labels,y_test)
Out[36]:
{0: 51.0, 1: 13.0, 2: 20.0, 3: 7.0, 4: 27.0}
In [37]:
unique, counts = np.unique(y_test, return_counts=True)
print (np.asarray((unique, counts)).T)
[[ 0 61]
 [ 1 39]
 [ 2 40]
 [ 3 22]
 [ 4 60]]
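
As a quick follow-up (not in the original notebook), the two tables above can be combined into a rough per-class accuracy:

# Sketch: divide the per-class correct counts by the per-class totals
correct = tabulateResults(pred_labels, y_test)
unique, counts = np.unique(y_test, return_counts=True)
for cls, total in zip(unique, counts):
    print(class_dict[cls], round(correct[cls] / total, 3))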