2021 Dev-Matching: Machine Learning Assignment Test
Prologue
Today was a machine learning test that ran for a full eight hours. Honestly, I had wanted to prepare more, but I barely managed any preparation, so I went in with no confidence. I didn't even think I could finish it; I started with the mindset of treating it as experience and giving up if it got too hard.
And sure enough, an exam never goes the way you plan. It really was hard, and I got frustrated at not being able to solve even the small problems with a bit of searching. Wait, this much time has already passed and this is all I've done? In the end I couldn't shake off the frustration and gave up. Then I took an hour-long break.
Fine, it's okay if I can't finish, so let's at least take it as far as I can. With that mindset I started over. The assignment had opened at 10 a.m.; I restarted at 2 p.m. Could I pull it off in four hours?
Problem
It was a 7-class image classification problem: person, horse, house, dog, guitar, giraffe, and elephant. A dataset of 1,698 images was provided, and each image sat in a folder named after the English word for its class.
Deliverables
Code, with explanations
A CSV file of predictions for the test data
Implementation
Unzip the archive containing the image files.
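In Colab this was presumably just an archive extraction. The names train.zip and test.zip below are assumptions for illustration, not the actual contest file names:

# Hypothetical extraction step ('train.zip' / 'test.zip' are assumed archive names).
import zipfile

for archive in ['train.zip', 'test.zip']:
    with zipfile.ZipFile(archive) as zf:
        zf.extractall('./')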
Import the required libraries.
import os
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import cv2
from sklearn.utils import shuffle
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Conv2D, MaxPool2D, Activation, Flatten, Dense
from tensorflow.keras.losses import categorical_crossentropy
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import ModelCheckpoint
from tensorflow.keras.models import load_model
Define the image size, and map each class label to an integer.
image_width = 227
image_height = 227
class_names = ['dog', 'elephant', 'giraffe', 'guitar', 'horse', 'house', 'person',]
class_names_label = {class_name:i for i, class_name in enumerate(class_names)}
num_classes = len(class_names)
train_path = './train'
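Before loading everything, it can help to sanity-check the class distribution. A minimal sketch, assuming ./train really does contain one sub-folder per class name as described above:

# Count how many images each class folder holds.
for class_name in class_names:
    class_dir = os.path.join(train_path, class_name)
    print(class_name, len(os.listdir(class_dir)))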
Load the images from the train folder into lists. Each image's label is taken from the name of the folder it sits in.
images = []
labels = []

# Walk ./train; each sub-folder name is the class label.
for dirname, _, filenames in os.walk(train_path):
    category = dirname.replace('./train/', '')
    if category == './train':   # skip the top-level folder itself
        continue
    label = class_names_label[category]
    for filename in filenames:
        path = os.path.join(dirname, filename)
        image = cv2.imread(path)
        image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)   # OpenCV loads BGR; convert to RGB
        images.append(image)
        labels.append(label)

images = np.array(images, dtype = 'float32')
labels = np.array(labels, dtype = 'int32')
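One caveat: image_width and image_height are defined above but never used in this loop, so np.array only works because the contest images all appear to share a single size. If they did not, a small loader along these lines would be needed instead (a sketch, not part of the original run):

# Sketch: a loader that also resizes, only needed if the source images vary in size.
def load_rgb(path, width=image_width, height=image_height):
    img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
    return cv2.resize(img, (width, height))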
Shuffle the dataset and split off 20% as a test set for measuring performance.
images, labels = shuffle(images, labels, random_state=25)
images = images / 255.0
train_images, test_images, train_labels, test_labels = train_test_split(images, labels, test_size=0.2)
We now have 1,358 training images and 340 test images, 1,698 in total.
n_train = train_labels.shape[0]
n_test = test_labels.shape[0]
print("Number of training examples: {}".format(n_train))
print("Number of test examples: {}".format(n_test))
Number of training examples: 1358
Number of test examples: 340
Let's first build a simple CNN by hand and check its performance. A further 20% of the training data is held out as a validation split so we can track validation accuracy as well.
model = Sequential([
    Conv2D(32, (3, 3), activation = 'relu', input_shape = (image_height, image_width, 3)),
    MaxPool2D(2, 2),
    Conv2D(64, (3, 3), activation = 'relu'),
    MaxPool2D(2, 2),
    Conv2D(128, (3, 3), activation = 'relu'),
    MaxPool2D(2, 2),
    Flatten(),
    Dense(256, activation=tf.nn.relu),
    Dense(num_classes, activation=tf.nn.softmax)
])
model.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(train_images, train_labels, batch_size=64, epochs=20, validation_split = 0.2)
Epoch 1/20
17/17 [==============================] - 38s 240ms/step - loss: 3.6401 - accuracy: 0.1889 - val_loss: 1.9124 - val_accuracy: 0.2243
Epoch 2/20
17/17 [==============================] - 2s 110ms/step - loss: 1.8678 - accuracy: 0.2314 - val_loss: 1.8660 - val_accuracy: 0.2279
Epoch 3/20
17/17 [==============================] - 2s 110ms/step - loss: 1.7636 - accuracy: 0.2779 - val_loss: 1.7660 - val_accuracy: 0.3125
Epoch 4/20
17/17 [==============================] - 2s 111ms/step - loss: 1.6425 - accuracy: 0.3904 - val_loss: 1.6835 - val_accuracy: 0.3566
Epoch 5/20
17/17 [==============================] - 2s 110ms/step - loss: 1.4164 - accuracy: 0.4867 - val_loss: 1.7457 - val_accuracy: 0.3382
Epoch 6/20
17/17 [==============================] - 2s 110ms/step - loss: 1.0570 - accuracy: 0.6507 - val_loss: 1.6682 - val_accuracy: 0.3676
Epoch 7/20
17/17 [==============================] - 2s 109ms/step - loss: 0.7076 - accuracy: 0.8003 - val_loss: 1.8565 - val_accuracy: 0.3787
Epoch 8/20
17/17 [==============================] - 2s 109ms/step - loss: 0.4653 - accuracy: 0.8563 - val_loss: 2.2181 - val_accuracy: 0.3824
Epoch 9/20
17/17 [==============================] - 2s 111ms/step - loss: 0.2103 - accuracy: 0.9435 - val_loss: 2.4412 - val_accuracy: 0.4007
Epoch 10/20
17/17 [==============================] - 2s 112ms/step - loss: 0.0872 - accuracy: 0.9852 - val_loss: 3.1015 - val_accuracy: 0.3787
Epoch 11/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0549 - accuracy: 0.9883 - val_loss: 3.1400 - val_accuracy: 0.3676
Epoch 12/20
17/17 [==============================] - 2s 112ms/step - loss: 0.0546 - accuracy: 0.9836 - val_loss: 3.5330 - val_accuracy: 0.3860
Epoch 13/20
17/17 [==============================] - 2s 111ms/step - loss: 0.0174 - accuracy: 0.9989 - val_loss: 3.5670 - val_accuracy: 0.4118
Epoch 14/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0107 - accuracy: 1.0000 - val_loss: 3.6782 - val_accuracy: 0.4301
Epoch 15/20
17/17 [==============================] - 2s 110ms/step - loss: 0.0028 - accuracy: 1.0000 - val_loss: 3.9017 - val_accuracy: 0.4375
Epoch 16/20
17/17 [==============================] - 2s 111ms/step - loss: 0.0016 - accuracy: 1.0000 - val_loss: 4.0851 - val_accuracy: 0.4375
Epoch 17/20
17/17 [==============================] - 2s 111ms/step - loss: 9.3226e-04 - accuracy: 1.0000 - val_loss: 4.1403 - val_accuracy: 0.4412
Epoch 18/20
17/17 [==============================] - 2s 111ms/step - loss: 7.6645e-04 - accuracy: 1.0000 - val_loss: 4.1752 - val_accuracy: 0.4412
Epoch 19/20
17/17 [==============================] - 2s 111ms/step - loss: 5.6912e-04 - accuracy: 1.0000 - val_loss: 4.2507 - val_accuracy: 0.4412
Epoch 20/20
17/17 [==============================] - 2s 111ms/step - loss: 5.1643e-04 - accuracy: 1.0000 - val_loss: 4.3206 - val_accuracy: 0.4449
Performance on the training data is very good, but performance on the validation data lags far behind: a clear sign of overfitting.
def plot_accuracy_loss(history):
    fig = plt.figure(figsize=(10, 5))

    plt.subplot(221)
    plt.plot(history.history['accuracy'], 'bo--', label = "accuracy")
    plt.plot(history.history['val_accuracy'], 'ro--', label = "val_accuracy")
    plt.title("train_acc vs val_acc")
    plt.ylabel("accuracy")
    plt.xlabel("epochs")
    plt.legend()

    plt.subplot(222)
    plt.plot(history.history['loss'], 'bo--', label = "loss")
    plt.plot(history.history['val_loss'], 'ro--', label = "val_loss")
    plt.title("train_loss vs val_loss")
    plt.ylabel("loss")
    plt.xlabel("epochs")
    plt.legend()

    plt.show()
plot_accuracy_loss(history)
The graphs show that validation performance stops improving after roughly 5 epochs.
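Since validation stops improving so early, one option (not used in this run) would be to let Keras halt training automatically once val_loss stops getting better, for example:

# Sketch: stop training once val_loss stops improving (not part of the actual run above).
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)
history = model.fit(train_images, train_labels, batch_size=64, epochs=20,
                    validation_split=0.2, callbacks=[early_stop])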
test_loss = model.evaluate(test_images, test_labels)
11/11 [==============================] - 1s 45ms/step - loss: 5.0795 - accuracy: 0.4118
Test accuracy comes out around 0.4 (between 0.35 and 0.45 across several runs).
Image Augmentation
The more data we have, the less likely the model is to overfit, and that in turn improves performance.
To grow the dataset we use ImageDataGenerator. The dataset currently holds 1,698 images in total, and we increase that number by transforming the existing images. The original plan was to generate 3 transformed images per original, giving 1698 × 4 = 6792 images in total:
1698 × 2 = 3396
1698 × 3 = 5094
1698 × 4 = 6792
1698 × 5 = 8490
Generating more than that failed on Colab due to resource limits, so in the end only 2 extra images per original were created (3 per image in total).
from tensorflow.keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img

imageGenerator = ImageDataGenerator(
    rotation_range=20,
    width_shift_range=0.1,
    height_shift_range=0.1,
    brightness_range=[0.8, 1.2],
    horizontal_flip=True,
)

# Generate 2 augmented copies of every training image and save them back into the same class folder.
for dirname, _, filenames in os.walk(train_path):
    category = dirname.replace('./train/', '')
    if category == './train':
        continue
    for filename in filenames:
        img = load_img(os.path.join(dirname, filename))
        x = img_to_array(img)
        x = x.reshape((1, ) + x.shape)
        i = 0
        for batch in imageGenerator.flow(x, batch_size = 1,
                                         save_to_dir = os.path.join(train_path, category),
                                         save_format = 'jpg'):
            i += 1
            if i == 2:
                break
The transformations were chosen as follows (a quick way to eyeball their effect is sketched right after this list):
rotation_range = 20: rotate by up to 20 degrees.
width_shift_range = 0.1, height_shift_range = 0.1: shift horizontally or vertically by up to 10% of the image size.
brightness_range = [0.8, 1.2]: vary the brightness between 80% and 120%.
horizontal_flip = True: randomly flip images horizontally.
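To see what these settings actually do, a few augmented variants of a single image can be previewed with the same generator. A sketch, where 'sample.jpg' is a placeholder file name:

# Preview sketch: plot a few augmented variants of one image ('sample.jpg' is a placeholder).
img = img_to_array(load_img('./train/dog/sample.jpg'))
batch = img.reshape((1,) + img.shape)
fig, axes = plt.subplots(1, 4, figsize=(12, 3))
for ax, out in zip(axes, imageGenerator.flow(batch, batch_size=1)):
    ax.imshow(np.clip(out[0] / 255.0, 0, 1))
    ax.axis('off')
plt.show()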
train_path = './train'
images = []
labels = []

# Re-load the training folder, which now also contains the augmented images.
for dirname, _, filenames in os.walk(train_path):
    category = dirname.split('/')[-1]
    if category == 'train':
        continue
    label = class_names_label[category]
    for filename in filenames:
        path = os.path.join(dirname, filename)
        image = cv2.imread(path)
        image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
        images.append(image)
        labels.append(label)

images = np.array(images, dtype = 'float32')
labels = np.array(labels, dtype = 'int32')

images, labels = shuffle(images, labels, random_state=25)
images = images / 255.0
train_images, test_images, train_labels, test_labels = train_test_split(images, labels, test_size=0.2)
n_train = train_labels.shape[0]
n_test = test_labels.shape[0]
print("Number of training examples: {}".format(n_train))
print("Number of test examples: {}".format(n_test))
Number of training examples: 3989
Number of test examples: 998
model.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(train_images, train_labels, batch_size=64, epochs=20, validation_split = 0.2)
Epoch 1/20
50/50 [==============================] - 8s 153ms/step - loss: 2.2086 - accuracy: 0.4334 - val_loss: 1.4567 - val_accuracy: 0.4900
Epoch 2/20
50/50 [==============================] - 5s 109ms/step - loss: 1.1767 - accuracy: 0.5928 - val_loss: 1.3364 - val_accuracy: 0.5426
Epoch 3/20
50/50 [==============================] - 5s 110ms/step - loss: 0.6962 - accuracy: 0.7762 - val_loss: 1.4134 - val_accuracy: 0.5564
Epoch 4/20
50/50 [==============================] - 6s 111ms/step - loss: 0.3223 - accuracy: 0.9011 - val_loss: 1.9013 - val_accuracy: 0.5501
Epoch 5/20
50/50 [==============================] - 6s 111ms/step - loss: 0.1151 - accuracy: 0.9695 - val_loss: 2.3870 - val_accuracy: 0.5313
Epoch 6/20
50/50 [==============================] - 6s 110ms/step - loss: 0.0513 - accuracy: 0.9889 - val_loss: 2.5908 - val_accuracy: 0.5602
Epoch 7/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0282 - accuracy: 0.9938 - val_loss: 2.8788 - val_accuracy: 0.5238
Epoch 8/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0187 - accuracy: 0.9968 - val_loss: 3.5123 - val_accuracy: 0.5125
Epoch 9/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0382 - accuracy: 0.9899 - val_loss: 3.8634 - val_accuracy: 0.4561
Epoch 10/20
50/50 [==============================] - 6s 112ms/step - loss: 0.1146 - accuracy: 0.9645 - val_loss: 2.8725 - val_accuracy: 0.5088
Epoch 11/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0543 - accuracy: 0.9849 - val_loss: 2.9627 - val_accuracy: 0.5276
Epoch 12/20
50/50 [==============================] - 6s 111ms/step - loss: 0.0163 - accuracy: 0.9961 - val_loss: 3.2457 - val_accuracy: 0.4987
Epoch 13/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0150 - accuracy: 0.9974 - val_loss: 3.4321 - val_accuracy: 0.5238
Epoch 14/20
50/50 [==============================] - 6s 112ms/step - loss: 0.0058 - accuracy: 0.9990 - val_loss: 3.7935 - val_accuracy: 0.5288
Epoch 15/20
50/50 [==============================] - 6s 111ms/step - loss: 6.9671e-04 - accuracy: 1.0000 - val_loss: 3.8625 - val_accuracy: 0.5401
Epoch 16/20
50/50 [==============================] - 6s 112ms/step - loss: 2.6550e-04 - accuracy: 1.0000 - val_loss: 3.9490 - val_accuracy: 0.5414
Epoch 17/20
50/50 [==============================] - 6s 112ms/step - loss: 1.7575e-04 - accuracy: 1.0000 - val_loss: 3.9936 - val_accuracy: 0.5439
Epoch 18/20
50/50 [==============================] - 6s 112ms/step - loss: 1.4344e-04 - accuracy: 1.0000 - val_loss: 4.0374 - val_accuracy: 0.5439
Epoch 19/20
50/50 [==============================] - 6s 112ms/step - loss: 1.1886e-04 - accuracy: 1.0000 - val_loss: 4.0765 - val_accuracy: 0.5439
Epoch 20/20
50/50 [==============================] - 6s 112ms/step - loss: 1.0780e-04 - accuracy: 1.0000 - val_loss: 4.1103 - val_accuracy: 0.5439
plot_accuracy_loss(history)
test_loss = model.evaluate(test_images, test_labels)
32/32 [==============================] - 1s 25ms/step - loss: 3.8832 - accuracy: 0.5341
Performance on the test data has clearly improved.
VGG16 Model
The classifier built so far is not very deep. To get a better feature extractor, we use VGG16, one of the pretrained models shipped with TensorFlow. The features produced by passing the images through VGG16 are then fed into fully connected Dense layers that map them to the output classes.
from keras.applications.vgg16 import VGG16
from keras.preprocessing import image
from keras.applications.vgg16 import preprocess_input
model = VGG16(weights='imagenet', include_top=False)
train_features = model.predict(train_images)
test_features = model.predict(test_images)
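Worth noting: preprocess_input is imported above but never applied; the features are computed from the same /255-scaled images as before, which at least keeps train and test consistent. The canonical VGG16 preprocessing would look like this sketch (an alternative, not what was actually run):

# Alternative sketch (not what was run): VGG16's own preprocessing expects raw 0-255 RGB input.
train_features_alt = model.predict(preprocess_input(train_images * 255.0))
test_features_alt = model.predict(preprocess_input(test_images * 255.0))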
n_train, x, y, z = train_features.shape
n_test, x, y, z = test_features.shape
numFeatures = x * y * z
model2 = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape = (x, y, z)),
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dense(num_classes, activation=tf.nn.softmax)
])

model2.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
history2 = model2.fit(train_features, train_labels, batch_size=64, epochs=15, validation_split = 0.2)
Epoch 1/15
50/50 [==============================] - 1s 16ms/step - loss: 1.8851 - accuracy: 0.4079 - val_loss: 0.6769 - val_accuracy: 0.7581
Epoch 2/15
1/50 [..............................] - ETA: 0s - loss: 0.5036 - accuracy: 0.8438
(Training was stopped partway through with a manual KeyboardInterrupt.)
plot_accuracy_loss(history2)
test_loss = model2.evaluate(test_features, test_labels)
32/32 [==============================] - 0s 3ms/step - loss: 0.3025 - accuracy: 0.8938
Performance has gone up dramatically.
VGG16 + Data Augmentation: 0.89
Ensemble
Ensemble learning means training several classifiers and combining their outputs (by averaging or by majority vote) to produce the final prediction.
There are three broad ways to build an ensemble:
get different results from different classifiers on the same dataset;
get different results from the same classifier on different data;
or combine both of the above.
Here, 10 different classifiers are trained to get different results on the same dataset. 'Different classifiers' means that each one's Dense head takes its input differently, a bit like feeding the VGG16 features through ten different randomly dropped-out views of the input. Concretely, in the code below each classifier gets a randomly sized hidden layer and is trained on a random 80% bootstrap sample of the training features.
n_estimators = 10
max_samples = 0.8
max_samples *= n_train
max_samples = int(max_samples)   # each classifier trains on a bootstrap sample of 80% of the training set
models = list()
random = np.random.randint(50, 100, size = n_estimators)   # a random hidden-layer width for each classifier

for i in range(n_estimators):
    model3 = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape = (x, y, z)),
        tf.keras.layers.Dense(random[i], activation=tf.nn.relu),
        tf.keras.layers.Dense(num_classes, activation=tf.nn.softmax)
    ])
    model3.compile(optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics=['accuracy'])
    models.append(model3)
histories = []
for i in range(n_estimators):
    # Draw a random bootstrap sample of the training features for each classifier.
    train_idx = np.random.choice(len(train_features), size = max_samples)
    histories.append(models[i].fit(train_features[train_idx], train_labels[train_idx],
                                   batch_size=64, epochs=10, validation_split = 0.1))
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 1.9823 - accuracy: 0.3059 - val_loss: 1.0176 - val_accuracy: 0.7063
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.9019 - accuracy: 0.7247 - val_loss: 0.5121 - val_accuracy: 0.8875
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3701 - accuracy: 0.9178 - val_loss: 0.3479 - val_accuracy: 0.9094
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.2173 - accuracy: 0.9613 - val_loss: 0.2734 - val_accuracy: 0.9281
Epoch 5/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1204 - accuracy: 0.9880 - val_loss: 0.2201 - val_accuracy: 0.9375
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0818 - accuracy: 0.9940 - val_loss: 0.1895 - val_accuracy: 0.9406
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0501 - accuracy: 0.9982 - val_loss: 0.1696 - val_accuracy: 0.9406
Epoch 8/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0344 - accuracy: 0.9996 - val_loss: 0.1632 - val_accuracy: 0.9312
Epoch 9/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0235 - accuracy: 0.9998 - val_loss: 0.1513 - val_accuracy: 0.9563
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0191 - accuracy: 0.9993 - val_loss: 0.1467 - val_accuracy: 0.9625
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.1596 - accuracy: 0.3024 - val_loss: 1.0193 - val_accuracy: 0.6375
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.8358 - accuracy: 0.7512 - val_loss: 0.6540 - val_accuracy: 0.8031
Epoch 3/10
45/45 [==============================] - 0s 6ms/step - loss: 0.5051 - accuracy: 0.8747 - val_loss: 0.4970 - val_accuracy: 0.8375
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3183 - accuracy: 0.9158 - val_loss: 0.3074 - val_accuracy: 0.9125
Epoch 5/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1145 - accuracy: 0.9833 - val_loss: 0.2050 - val_accuracy: 0.9500
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0617 - accuracy: 0.9970 - val_loss: 0.1897 - val_accuracy: 0.9625
Epoch 7/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0357 - accuracy: 1.0000 - val_loss: 0.1739 - val_accuracy: 0.9656
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0302 - accuracy: 1.0000 - val_loss: 0.1671 - val_accuracy: 0.9563
Epoch 9/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0192 - accuracy: 1.0000 - val_loss: 0.1580 - val_accuracy: 0.9563
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0151 - accuracy: 1.0000 - val_loss: 0.1578 - val_accuracy: 0.9625
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.2014 - accuracy: 0.3511 - val_loss: 0.8670 - val_accuracy: 0.6906
Epoch 2/10
45/45 [==============================] - 0s 7ms/step - loss: 0.6973 - accuracy: 0.7739 - val_loss: 0.6392 - val_accuracy: 0.8125
Epoch 3/10
45/45 [==============================] - 0s 6ms/step - loss: 0.4221 - accuracy: 0.9097 - val_loss: 0.4906 - val_accuracy: 0.8656
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.2574 - accuracy: 0.9617 - val_loss: 0.4151 - val_accuracy: 0.8906
Epoch 5/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1797 - accuracy: 0.9754 - val_loss: 0.3713 - val_accuracy: 0.8813
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1147 - accuracy: 0.9935 - val_loss: 0.3483 - val_accuracy: 0.9000
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0843 - accuracy: 0.9957 - val_loss: 0.3210 - val_accuracy: 0.9031
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0618 - accuracy: 0.9977 - val_loss: 0.3275 - val_accuracy: 0.9125
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0476 - accuracy: 0.9982 - val_loss: 0.3028 - val_accuracy: 0.9125
Epoch 10/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0344 - accuracy: 0.9998 - val_loss: 0.2974 - val_accuracy: 0.9094
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.9052 - accuracy: 0.3392 - val_loss: 0.7875 - val_accuracy: 0.7437
Epoch 2/10
45/45 [==============================] - 0s 7ms/step - loss: 0.5499 - accuracy: 0.8438 - val_loss: 0.5012 - val_accuracy: 0.8406
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.2768 - accuracy: 0.9504 - val_loss: 0.3944 - val_accuracy: 0.8938
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1770 - accuracy: 0.9739 - val_loss: 0.3226 - val_accuracy: 0.9219
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1055 - accuracy: 0.9945 - val_loss: 0.2937 - val_accuracy: 0.9281
Epoch 6/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0730 - accuracy: 0.9987 - val_loss: 0.2598 - val_accuracy: 0.9312
Epoch 7/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0519 - accuracy: 0.9994 - val_loss: 0.2530 - val_accuracy: 0.9281
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0383 - accuracy: 1.0000 - val_loss: 0.2581 - val_accuracy: 0.9250
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0281 - accuracy: 1.0000 - val_loss: 0.2265 - val_accuracy: 0.9406
Epoch 10/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0230 - accuracy: 1.0000 - val_loss: 0.2292 - val_accuracy: 0.9281
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.1120 - accuracy: 0.4169 - val_loss: 0.5158 - val_accuracy: 0.8438
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.3176 - accuracy: 0.9168 - val_loss: 0.2947 - val_accuracy: 0.9094
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1382 - accuracy: 0.9806 - val_loss: 0.2272 - val_accuracy: 0.9281
Epoch 4/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0661 - accuracy: 0.9959 - val_loss: 0.2120 - val_accuracy: 0.9375
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0399 - accuracy: 0.9994 - val_loss: 0.1898 - val_accuracy: 0.9375
Epoch 6/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0238 - accuracy: 1.0000 - val_loss: 0.1754 - val_accuracy: 0.9469
Epoch 7/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0172 - accuracy: 1.0000 - val_loss: 0.1769 - val_accuracy: 0.9344
Epoch 8/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0131 - accuracy: 1.0000 - val_loss: 0.1698 - val_accuracy: 0.9438
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0116 - accuracy: 1.0000 - val_loss: 0.1685 - val_accuracy: 0.9438
Epoch 10/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0087 - accuracy: 1.0000 - val_loss: 0.1692 - val_accuracy: 0.9438
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.4135 - accuracy: 0.3638 - val_loss: 0.8577 - val_accuracy: 0.6750
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.5958 - accuracy: 0.8420 - val_loss: 0.5768 - val_accuracy: 0.7969
Epoch 3/10
45/45 [==============================] - 0s 6ms/step - loss: 0.3599 - accuracy: 0.9117 - val_loss: 0.4977 - val_accuracy: 0.8406
Epoch 4/10
45/45 [==============================] - 0s 6ms/step - loss: 0.2330 - accuracy: 0.9629 - val_loss: 0.3527 - val_accuracy: 0.8844
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1348 - accuracy: 0.9895 - val_loss: 0.3043 - val_accuracy: 0.8969
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1012 - accuracy: 0.9908 - val_loss: 0.2929 - val_accuracy: 0.9062
Epoch 7/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0683 - accuracy: 0.9976 - val_loss: 0.2615 - val_accuracy: 0.9156
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0497 - accuracy: 0.9995 - val_loss: 0.2250 - val_accuracy: 0.9250
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0372 - accuracy: 1.0000 - val_loss: 0.2279 - val_accuracy: 0.9312
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0284 - accuracy: 1.0000 - val_loss: 0.2228 - val_accuracy: 0.9344
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 1.8665 - accuracy: 0.4245 - val_loss: 0.6087 - val_accuracy: 0.7781
Epoch 2/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3551 - accuracy: 0.8953 - val_loss: 0.3399 - val_accuracy: 0.9156
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1476 - accuracy: 0.9772 - val_loss: 0.2929 - val_accuracy: 0.9156
Epoch 4/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0767 - accuracy: 0.9972 - val_loss: 0.2542 - val_accuracy: 0.9344
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0447 - accuracy: 0.9987 - val_loss: 0.2191 - val_accuracy: 0.9438
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0286 - accuracy: 1.0000 - val_loss: 0.2117 - val_accuracy: 0.9563
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0220 - accuracy: 1.0000 - val_loss: 0.2031 - val_accuracy: 0.9500
Epoch 8/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0162 - accuracy: 1.0000 - val_loss: 0.2149 - val_accuracy: 0.9344
Epoch 9/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0127 - accuracy: 1.0000 - val_loss: 0.2075 - val_accuracy: 0.9469
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0110 - accuracy: 1.0000 - val_loss: 0.2015 - val_accuracy: 0.9500
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.4010 - accuracy: 0.3874 - val_loss: 0.9141 - val_accuracy: 0.7000
Epoch 2/10
45/45 [==============================] - 0s 7ms/step - loss: 0.7456 - accuracy: 0.7663 - val_loss: 0.5889 - val_accuracy: 0.8062
Epoch 3/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3749 - accuracy: 0.9113 - val_loss: 0.3908 - val_accuracy: 0.8781
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1923 - accuracy: 0.9694 - val_loss: 0.2747 - val_accuracy: 0.8969
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1132 - accuracy: 0.9871 - val_loss: 0.2336 - val_accuracy: 0.9281
Epoch 6/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0662 - accuracy: 0.9937 - val_loss: 0.2369 - val_accuracy: 0.9062
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0419 - accuracy: 0.9990 - val_loss: 0.2026 - val_accuracy: 0.9406
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0250 - accuracy: 0.9993 - val_loss: 0.2055 - val_accuracy: 0.9406
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0180 - accuracy: 1.0000 - val_loss: 0.1853 - val_accuracy: 0.9344
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0120 - accuracy: 1.0000 - val_loss: 0.1871 - val_accuracy: 0.9312
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.0665 - accuracy: 0.4211 - val_loss: 0.5312 - val_accuracy: 0.8156
Epoch 2/10
45/45 [==============================] - 0s 7ms/step - loss: 0.3373 - accuracy: 0.9030 - val_loss: 0.3433 - val_accuracy: 0.9031
Epoch 3/10
45/45 [==============================] - 0s 6ms/step - loss: 0.1310 - accuracy: 0.9791 - val_loss: 0.2772 - val_accuracy: 0.9125
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0733 - accuracy: 0.9955 - val_loss: 0.2638 - val_accuracy: 0.9187
Epoch 5/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0448 - accuracy: 1.0000 - val_loss: 0.2496 - val_accuracy: 0.9281
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0293 - accuracy: 0.9997 - val_loss: 0.2377 - val_accuracy: 0.9312
Epoch 7/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0232 - accuracy: 1.0000 - val_loss: 0.2428 - val_accuracy: 0.9312
Epoch 8/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0165 - accuracy: 1.0000 - val_loss: 0.2363 - val_accuracy: 0.9312
Epoch 9/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0123 - accuracy: 1.0000 - val_loss: 0.2325 - val_accuracy: 0.9312
Epoch 10/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0106 - accuracy: 1.0000 - val_loss: 0.2233 - val_accuracy: 0.9375
Epoch 1/10
45/45 [==============================] - 1s 9ms/step - loss: 2.2188 - accuracy: 0.3826 - val_loss: 0.7416 - val_accuracy: 0.7563
Epoch 2/10
45/45 [==============================] - 0s 6ms/step - loss: 0.5199 - accuracy: 0.8513 - val_loss: 0.4587 - val_accuracy: 0.8656
Epoch 3/10
45/45 [==============================] - 0s 6ms/step - loss: 0.2311 - accuracy: 0.9584 - val_loss: 0.3136 - val_accuracy: 0.9375
Epoch 4/10
45/45 [==============================] - 0s 7ms/step - loss: 0.1234 - accuracy: 0.9838 - val_loss: 0.2402 - val_accuracy: 0.9469
Epoch 5/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0727 - accuracy: 0.9968 - val_loss: 0.2090 - val_accuracy: 0.9500
Epoch 6/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0448 - accuracy: 1.0000 - val_loss: 0.1904 - val_accuracy: 0.9531
Epoch 7/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0323 - accuracy: 1.0000 - val_loss: 0.1819 - val_accuracy: 0.9531
Epoch 8/10
45/45 [==============================] - 0s 7ms/step - loss: 0.0230 - accuracy: 1.0000 - val_loss: 0.1719 - val_accuracy: 0.9594
Epoch 9/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0167 - accuracy: 1.0000 - val_loss: 0.1676 - val_accuracy: 0.9469
Epoch 10/10
45/45 [==============================] - 0s 6ms/step - loss: 0.0133 - accuracy: 1.0000 - val_loss: 0.1618 - val_accuracy: 0.9469
predictions = []
for i in range(n_estimators):
    predictions.append(models[i].predict(test_features))

# Soft voting: sum the softmax outputs of all classifiers and take the argmax.
predictions = np.array(predictions)
predictions = predictions.sum(axis = 0)
pred_labels = predictions.argmax(axis=1)
Copy print("Accuracy : {}".format(accuracy_score(test_labels, pred_labels)))
Accuracy : 0.8937875751503006
Performance has gone up a little further.
VGG16 + Data Augmentation: 0.89
VGG16 + Data Augmentation + Ensemble: 0.90 (0.89 to 0.91)
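For reference, the aggregation used above (summing the softmax outputs, then taking the argmax) is soft voting. The majority-vote alternative mentioned at the start of this section would only be a small change, sketched here:

# Sketch: hard (majority) voting across the ensemble instead of summing softmax outputs.
votes = np.stack([m.predict(test_features).argmax(axis=1) for m in models])   # shape: (n_estimators, n_samples)
pred_labels_vote = np.apply_along_axis(
    lambda v: np.bincount(v, minlength=num_classes).argmax(), 0, votes)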
Applying the Model to the Test Data
test_path = './test/0'
test_images = []

for dirname, _, filenames in os.walk(test_path):
    category = dirname.split('/')[-1]
    for filename in filenames:
        path = os.path.join(dirname, filename)
        image = cv2.imread(path)
        image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
        test_images.append(image)

test_images = np.array(test_images, dtype = 'float32')
test_images = test_images / 255.0   # scale the same way as the training images
test_features = model.predict(test_images)
predictions = []
for i in range(n_estimators):
    predictions.append(models[i].predict(test_features))

predictions = np.array(predictions)
predictions = predictions.sum(axis = 0)
pred = predictions.argmax(axis=1)
Convert the model's predictions into a DataFrame and save it as a CSV file.
def write_preds(pred, fname):
    pd.DataFrame({"answer value": pred}).to_csv(fname, index=False, header=True)

write_preds(pred, "result.csv")
Download the resulting CSV file.
from google.colab import files
files.download('result.csv')
Epilogue
Hm, I think I actually did this pretty well?? The performance looked good too!! And the result?? I ran out of time. It's entirely my own doing, so the sting is only mild. If I had started a little earlier, I think I could have posted a decent score. Out of regret I at least submitted the .ipynb file. My leaderboard score isn't even a 0; it just says not submitted.
I had almost everything done, but at the very end I gave some variables confusingly similar names and couldn't fix an error I should have cleared up much sooner. I only solved it 30 minutes after the test had ended.
It's a big disappointment, but for my first attempt at this kind of assignment I think it went reasonably well. I was glad I could put the ensembling and image augmentation I learned recently to use, and it was also my first time using the VGG16 model. The performance turned out quite respectable, which felt good.
Next time I want to be fluent enough (and need little enough googling) to write a long piece of code in one sitting! Since I plan to be employed before the next Dev-Matching, it's a shame I won't get to see this test again!
A few takeaways(?) to jot down before I go:
There were more people with 98+ on the leaderboard than I expected. There seemed to be well over 10 people at 100. Four hours in, 3 people already had 100.
I wonder what I would have scored, haha. Probably still nothing near the top, given that my own test/validation accuracy was only just past 90.
Only one person scored 100 on the private test data, and it seems that person wasn't one of the leaderboard 100s.
Colab kept running out of RAM while I was building the ensemble. I must have lost about an hour to this. The session kept resetting, which was exhausting.
I set the number of classifiers fairly high at 10, which is why I couldn't push the epochs past 10. At epoch = 12 the session reset again, haha.
Colab was also nowhere near enough for heavy image augmentation. I tried generating around 10,000 images and it couldn't cope; the compromise was about 4,500.
The problem was easier than I expected.
This was also the first time I wrote the model-building code myself from start to finish. Of course it's stitched together from googling and various references... but isn't that true for everyone?
I need to study more. Fighting!