Dogs Vs Cats With Transfer Learning
Beginning
Imports
Python
from pathlib import Path
PyPI
from tensorflow.keras import layers
from tensorflow.keras.applications.inception_v3 import InceptionV3
from tensorflow.keras.optimizers import RMSprop
from tensorflow.keras.preprocessing.image import ImageDataGenerator
import tensorflow
My Stuff
from graeae import SubPathLoader, EmbedHoloviews, Timer
Set Up
The Timer
TIMER = Timer()
Tensorflow
tensorflow.enable_eager_execution()
Paths
MODELS = Path("~/models/dogs-vs-cats/").expanduser()
The Datasets
environment = SubPathLoader("DATASETS")
base_path = Path(environment["DOGS_VS_CATS"]).expanduser()
for item in base_path.iterdir():
    print(item)
WARNING: Logging before flag parsing goes to stderr.
I0803 17:35:32.889567 139918777980736 environment.py:35] Environment Path: /home/athena/.env
I0803 17:35:32.890873 139918777980736 environment.py:90] Environment Path: /home/athena/.config/datasets/env
/home/athena/data/datasets/images/dogs-vs-cats/train
/home/athena/data/datasets/images/dogs-vs-cats/exercise
/home/athena/data/datasets/images/dogs-vs-cats/test1
training_path = base_path/"train"
testing_path = base_path/"test1"
for item in training_path.iterdir():
    print(item)
/home/athena/data/datasets/images/dogs-vs-cats/train/dogs
/home/athena/data/datasets/images/dogs-vs-cats/train/cats
Note: The download from Kaggle has all of the training files in one folder, with each file name labeled as either cat or dog. I made the subfolders and moved the files into them so the dataset works better with the Data Generator.
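In case you want to reproduce that re-organization, this is roughly what the move looks like (a sketch only: the train path comes from the listing above and the cat.*.jpg/dog.*.jpg naming comes from the Kaggle download, but this isn't the exact code I ran).
from pathlib import Path

train = Path("~/data/datasets/images/dogs-vs-cats/train").expanduser()
for label in ("cat", "dog"):
    # e.g. cat.0.jpg goes into train/cats, dog.0.jpg into train/dogs
    destination = train/f"{label}s"
    destination.mkdir(exist_ok=True)
    for image in train.glob(f"{label}.*.jpg"):
        image.rename(destination/image.name)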
Middle
The Pre-built Model
The Broken Version
Note: This doesn't work; there's a bug (related to this one, I think). You need to pass a tags={"train"} argument to one of the methods called by the KerasLayer, but the KerasLayer isn't defined in a way that lets it be passed through. Oops…
This is going to use the Keras interface to TensorFlow Hub with the TensorFlow 1.x version of the model (the URL determines which model we're using); there is a different one for TensorFlow 2.0. First create a feature-extraction layer from the pre-built model and make it untrainable.
import tensorflow_hub

model_url = "https://tfhub.dev/google/imagenet/inception_v3/feature_vector/3"
input_shape = (150, 150, 3)
feature_extraction_layer = tensorflow_hub.KerasLayer(model_url,
                                                     input_shape=input_shape,
                                                     trainable=False)
Take Two
This uses the InceptionV3 pre-trained model. By default it comes with weights trained on ImageNet. The example on Coursera uses weights downloaded from the web, but I'll try the defaults instead.
input_shape = (150, 150, 3)
base_model = InceptionV3(
    input_shape=input_shape,
    include_top=False,
)
Actually, looking at the URLs, it appears that the example from the course downloads the same weights, just from a different place.
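For comparison, the course version builds InceptionV3 without weights and then loads a weights file that it downloads separately. It looks something like the following sketch, where the download source and local path are my assumptions about that setup rather than anything this post uses.
from tensorflow.keras.applications.inception_v3 import InceptionV3

# The course fetches the "notop" ImageNet weights file separately (e.g. with wget)
# and loads it by hand; this path is just a placeholder.
local_weights = "/tmp/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5"
course_model = InceptionV3(
    input_shape=(150, 150, 3),
    include_top=False,
    weights=None,
)
course_model.load_weights(local_weights)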
for layer in base_model.layers:
    layer.trainable = False
Freeze the Model
base_model.trainable = False
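As a quick check that the freezing took (the summary below says the same thing with its Trainable params: 0 line), you can count the weight tensors that would still get updates.
print(f"Trainable weight tensors: {len(base_model.trainable_weights)}")
print(f"Frozen weight tensors: {len(base_model.non_trainable_weights)}")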
print(base_model.summary())
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 150, 150, 3) 0
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, 74, 74, 32) 864 input_2[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 74, 74, 32) 96 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, 74, 74, 32) 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
conv2d_95 (Conv2D) (None, 72, 72, 32) 9216 activation_94[0][0]
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 72, 72, 32) 96 conv2d_95[0][0]
__________________________________________________________________________________________________
activation_95 (Activation) (None, 72, 72, 32) 0 batch_normalization_95[0][0]
__________________________________________________________________________________________________
conv2d_96 (Conv2D) (None, 72, 72, 64) 18432 activation_95[0][0]
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 72, 72, 64) 192 conv2d_96[0][0]
__________________________________________________________________________________________________
activation_96 (Activation) (None, 72, 72, 64) 0 batch_normalization_96[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 35, 35, 64) 0 activation_96[0][0]
__________________________________________________________________________________________________
conv2d_97 (Conv2D) (None, 35, 35, 80) 5120 max_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 35, 35, 80) 240 conv2d_97[0][0]
__________________________________________________________________________________________________
activation_97 (Activation) (None, 35, 35, 80) 0 batch_normalization_97[0][0]
__________________________________________________________________________________________________
conv2d_98 (Conv2D) (None, 33, 33, 192) 138240 activation_97[0][0]
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 33, 33, 192) 576 conv2d_98[0][0]
__________________________________________________________________________________________________
activation_98 (Activation) (None, 33, 33, 192) 0 batch_normalization_98[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 16, 16, 192) 0 activation_98[0][0]
__________________________________________________________________________________________________
conv2d_102 (Conv2D) (None, 16, 16, 64) 12288 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 16, 16, 64) 192 conv2d_102[0][0]
__________________________________________________________________________________________________
activation_102 (Activation) (None, 16, 16, 64) 0 batch_normalization_102[0][0]
__________________________________________________________________________________________________
conv2d_100 (Conv2D) (None, 16, 16, 48) 9216 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_103 (Conv2D) (None, 16, 16, 96) 55296 activation_102[0][0]
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 16, 16, 48) 144 conv2d_100[0][0]
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 16, 16, 96) 288 conv2d_103[0][0]
__________________________________________________________________________________________________
activation_100 (Activation) (None, 16, 16, 48) 0 batch_normalization_100[0][0]
__________________________________________________________________________________________________
activation_103 (Activation) (None, 16, 16, 96) 0 batch_normalization_103[0][0]
__________________________________________________________________________________________________
average_pooling2d_9 (AveragePoo (None, 16, 16, 192) 0 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_99 (Conv2D) (None, 16, 16, 64) 12288 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_101 (Conv2D) (None, 16, 16, 64) 76800 activation_100[0][0]
__________________________________________________________________________________________________
conv2d_104 (Conv2D) (None, 16, 16, 96) 82944 activation_103[0][0]
__________________________________________________________________________________________________
conv2d_105 (Conv2D) (None, 16, 16, 32) 6144 average_pooling2d_9[0][0]
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 16, 16, 64) 192 conv2d_99[0][0]
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 16, 16, 64) 192 conv2d_101[0][0]
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 16, 16, 96) 288 conv2d_104[0][0]
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 16, 16, 32) 96 conv2d_105[0][0]
__________________________________________________________________________________________________
activation_99 (Activation) (None, 16, 16, 64) 0 batch_normalization_99[0][0]
__________________________________________________________________________________________________
activation_101 (Activation) (None, 16, 16, 64) 0 batch_normalization_101[0][0]
__________________________________________________________________________________________________
activation_104 (Activation) (None, 16, 16, 96) 0 batch_normalization_104[0][0]
__________________________________________________________________________________________________
activation_105 (Activation) (None, 16, 16, 32) 0 batch_normalization_105[0][0]
__________________________________________________________________________________________________
mixed0 (Concatenate) (None, 16, 16, 256) 0 activation_99[0][0]
activation_101[0][0]
activation_104[0][0]
activation_105[0][0]
__________________________________________________________________________________________________
conv2d_109 (Conv2D) (None, 16, 16, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
batch_normalization_109 (BatchN (None, 16, 16, 64) 192 conv2d_109[0][0]
__________________________________________________________________________________________________
activation_109 (Activation) (None, 16, 16, 64) 0 batch_normalization_109[0][0]
__________________________________________________________________________________________________
conv2d_107 (Conv2D) (None, 16, 16, 48) 12288 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_110 (Conv2D) (None, 16, 16, 96) 55296 activation_109[0][0]
__________________________________________________________________________________________________
batch_normalization_107 (BatchN (None, 16, 16, 48) 144 conv2d_107[0][0]
__________________________________________________________________________________________________
batch_normalization_110 (BatchN (None, 16, 16, 96) 288 conv2d_110[0][0]
__________________________________________________________________________________________________
activation_107 (Activation) (None, 16, 16, 48) 0 batch_normalization_107[0][0]
__________________________________________________________________________________________________
activation_110 (Activation) (None, 16, 16, 96) 0 batch_normalization_110[0][0]
__________________________________________________________________________________________________
average_pooling2d_10 (AveragePo (None, 16, 16, 256) 0 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_106 (Conv2D) (None, 16, 16, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_108 (Conv2D) (None, 16, 16, 64) 76800 activation_107[0][0]
__________________________________________________________________________________________________
conv2d_111 (Conv2D) (None, 16, 16, 96) 82944 activation_110[0][0]
__________________________________________________________________________________________________
conv2d_112 (Conv2D) (None, 16, 16, 64) 16384 average_pooling2d_10[0][0]
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 16, 16, 64) 192 conv2d_106[0][0]
__________________________________________________________________________________________________
batch_normalization_108 (BatchN (None, 16, 16, 64) 192 conv2d_108[0][0]
__________________________________________________________________________________________________
batch_normalization_111 (BatchN (None, 16, 16, 96) 288 conv2d_111[0][0]
__________________________________________________________________________________________________
batch_normalization_112 (BatchN (None, 16, 16, 64) 192 conv2d_112[0][0]
__________________________________________________________________________________________________
activation_106 (Activation) (None, 16, 16, 64) 0 batch_normalization_106[0][0]
__________________________________________________________________________________________________
activation_108 (Activation) (None, 16, 16, 64) 0 batch_normalization_108[0][0]
__________________________________________________________________________________________________
activation_111 (Activation) (None, 16, 16, 96) 0 batch_normalization_111[0][0]
__________________________________________________________________________________________________
activation_112 (Activation) (None, 16, 16, 64) 0 batch_normalization_112[0][0]
__________________________________________________________________________________________________
mixed1 (Concatenate) (None, 16, 16, 288) 0 activation_106[0][0]
activation_108[0][0]
activation_111[0][0]
activation_112[0][0]
__________________________________________________________________________________________________
conv2d_116 (Conv2D) (None, 16, 16, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
batch_normalization_116 (BatchN (None, 16, 16, 64) 192 conv2d_116[0][0]
__________________________________________________________________________________________________
activation_116 (Activation) (None, 16, 16, 64) 0 batch_normalization_116[0][0]
__________________________________________________________________________________________________
conv2d_114 (Conv2D) (None, 16, 16, 48) 13824 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_117 (Conv2D) (None, 16, 16, 96) 55296 activation_116[0][0]
__________________________________________________________________________________________________
batch_normalization_114 (BatchN (None, 16, 16, 48) 144 conv2d_114[0][0]
__________________________________________________________________________________________________
batch_normalization_117 (BatchN (None, 16, 16, 96) 288 conv2d_117[0][0]
__________________________________________________________________________________________________
activation_114 (Activation) (None, 16, 16, 48) 0 batch_normalization_114[0][0]
__________________________________________________________________________________________________
activation_117 (Activation) (None, 16, 16, 96) 0 batch_normalization_117[0][0]
__________________________________________________________________________________________________
average_pooling2d_11 (AveragePo (None, 16, 16, 288) 0 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_113 (Conv2D) (None, 16, 16, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_115 (Conv2D) (None, 16, 16, 64) 76800 activation_114[0][0]
__________________________________________________________________________________________________
conv2d_118 (Conv2D) (None, 16, 16, 96) 82944 activation_117[0][0]
__________________________________________________________________________________________________
conv2d_119 (Conv2D) (None, 16, 16, 64) 18432 average_pooling2d_11[0][0]
__________________________________________________________________________________________________
batch_normalization_113 (BatchN (None, 16, 16, 64) 192 conv2d_113[0][0]
__________________________________________________________________________________________________
batch_normalization_115 (BatchN (None, 16, 16, 64) 192 conv2d_115[0][0]
__________________________________________________________________________________________________
batch_normalization_118 (BatchN (None, 16, 16, 96) 288 conv2d_118[0][0]
__________________________________________________________________________________________________
batch_normalization_119 (BatchN (None, 16, 16, 64) 192 conv2d_119[0][0]
__________________________________________________________________________________________________
activation_113 (Activation) (None, 16, 16, 64) 0 batch_normalization_113[0][0]
__________________________________________________________________________________________________
activation_115 (Activation) (None, 16, 16, 64) 0 batch_normalization_115[0][0]
__________________________________________________________________________________________________
activation_118 (Activation) (None, 16, 16, 96) 0 batch_normalization_118[0][0]
__________________________________________________________________________________________________
activation_119 (Activation) (None, 16, 16, 64) 0 batch_normalization_119[0][0]
__________________________________________________________________________________________________
mixed2 (Concatenate) (None, 16, 16, 288) 0 activation_113[0][0]
activation_115[0][0]
activation_118[0][0]
activation_119[0][0]
__________________________________________________________________________________________________
conv2d_121 (Conv2D) (None, 16, 16, 64) 18432 mixed2[0][0]
__________________________________________________________________________________________________
batch_normalization_121 (BatchN (None, 16, 16, 64) 192 conv2d_121[0][0]
__________________________________________________________________________________________________
activation_121 (Activation) (None, 16, 16, 64) 0 batch_normalization_121[0][0]
__________________________________________________________________________________________________
conv2d_122 (Conv2D) (None, 16, 16, 96) 55296 activation_121[0][0]
__________________________________________________________________________________________________
batch_normalization_122 (BatchN (None, 16, 16, 96) 288 conv2d_122[0][0]
__________________________________________________________________________________________________
activation_122 (Activation) (None, 16, 16, 96) 0 batch_normalization_122[0][0]
__________________________________________________________________________________________________
conv2d_120 (Conv2D) (None, 7, 7, 384) 995328 mixed2[0][0]
__________________________________________________________________________________________________
conv2d_123 (Conv2D) (None, 7, 7, 96) 82944 activation_122[0][0]
__________________________________________________________________________________________________
batch_normalization_120 (BatchN (None, 7, 7, 384) 1152 conv2d_120[0][0]
__________________________________________________________________________________________________
batch_normalization_123 (BatchN (None, 7, 7, 96) 288 conv2d_123[0][0]
__________________________________________________________________________________________________
activation_120 (Activation) (None, 7, 7, 384) 0 batch_normalization_120[0][0]
__________________________________________________________________________________________________
activation_123 (Activation) (None, 7, 7, 96) 0 batch_normalization_123[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 7, 7, 288) 0 mixed2[0][0]
__________________________________________________________________________________________________
mixed3 (Concatenate) (None, 7, 7, 768) 0 activation_120[0][0]
activation_123[0][0]
max_pooling2d_6[0][0]
__________________________________________________________________________________________________
conv2d_128 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
batch_normalization_128 (BatchN (None, 7, 7, 128) 384 conv2d_128[0][0]
__________________________________________________________________________________________________
activation_128 (Activation) (None, 7, 7, 128) 0 batch_normalization_128[0][0]
__________________________________________________________________________________________________
conv2d_129 (Conv2D) (None, 7, 7, 128) 114688 activation_128[0][0]
__________________________________________________________________________________________________
batch_normalization_129 (BatchN (None, 7, 7, 128) 384 conv2d_129[0][0]
__________________________________________________________________________________________________
activation_129 (Activation) (None, 7, 7, 128) 0 batch_normalization_129[0][0]
__________________________________________________________________________________________________
conv2d_125 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_130 (Conv2D) (None, 7, 7, 128) 114688 activation_129[0][0]
__________________________________________________________________________________________________
batch_normalization_125 (BatchN (None, 7, 7, 128) 384 conv2d_125[0][0]
__________________________________________________________________________________________________
batch_normalization_130 (BatchN (None, 7, 7, 128) 384 conv2d_130[0][0]
__________________________________________________________________________________________________
activation_125 (Activation) (None, 7, 7, 128) 0 batch_normalization_125[0][0]
__________________________________________________________________________________________________
activation_130 (Activation) (None, 7, 7, 128) 0 batch_normalization_130[0][0]
__________________________________________________________________________________________________
conv2d_126 (Conv2D) (None, 7, 7, 128) 114688 activation_125[0][0]
__________________________________________________________________________________________________
conv2d_131 (Conv2D) (None, 7, 7, 128) 114688 activation_130[0][0]
__________________________________________________________________________________________________
batch_normalization_126 (BatchN (None, 7, 7, 128) 384 conv2d_126[0][0]
__________________________________________________________________________________________________
batch_normalization_131 (BatchN (None, 7, 7, 128) 384 conv2d_131[0][0]
__________________________________________________________________________________________________
activation_126 (Activation) (None, 7, 7, 128) 0 batch_normalization_126[0][0]
__________________________________________________________________________________________________
activation_131 (Activation) (None, 7, 7, 128) 0 batch_normalization_131[0][0]
__________________________________________________________________________________________________
average_pooling2d_12 (AveragePo (None, 7, 7, 768) 0 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_124 (Conv2D) (None, 7, 7, 192) 147456 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_127 (Conv2D) (None, 7, 7, 192) 172032 activation_126[0][0]
__________________________________________________________________________________________________
conv2d_132 (Conv2D) (None, 7, 7, 192) 172032 activation_131[0][0]
__________________________________________________________________________________________________
conv2d_133 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_12[0][0]
__________________________________________________________________________________________________
batch_normalization_124 (BatchN (None, 7, 7, 192) 576 conv2d_124[0][0]
__________________________________________________________________________________________________
batch_normalization_127 (BatchN (None, 7, 7, 192) 576 conv2d_127[0][0]
__________________________________________________________________________________________________
batch_normalization_132 (BatchN (None, 7, 7, 192) 576 conv2d_132[0][0]
__________________________________________________________________________________________________
batch_normalization_133 (BatchN (None, 7, 7, 192) 576 conv2d_133[0][0]
__________________________________________________________________________________________________
activation_124 (Activation) (None, 7, 7, 192) 0 batch_normalization_124[0][0]
__________________________________________________________________________________________________
activation_127 (Activation) (None, 7, 7, 192) 0 batch_normalization_127[0][0]
__________________________________________________________________________________________________
activation_132 (Activation) (None, 7, 7, 192) 0 batch_normalization_132[0][0]
__________________________________________________________________________________________________
activation_133 (Activation) (None, 7, 7, 192) 0 batch_normalization_133[0][0]
__________________________________________________________________________________________________
mixed4 (Concatenate) (None, 7, 7, 768) 0 activation_124[0][0]
activation_127[0][0]
activation_132[0][0]
activation_133[0][0]
__________________________________________________________________________________________________
conv2d_138 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
batch_normalization_138 (BatchN (None, 7, 7, 160) 480 conv2d_138[0][0]
__________________________________________________________________________________________________
activation_138 (Activation) (None, 7, 7, 160) 0 batch_normalization_138[0][0]
__________________________________________________________________________________________________
conv2d_139 (Conv2D) (None, 7, 7, 160) 179200 activation_138[0][0]
__________________________________________________________________________________________________
batch_normalization_139 (BatchN (None, 7, 7, 160) 480 conv2d_139[0][0]
__________________________________________________________________________________________________
activation_139 (Activation) (None, 7, 7, 160) 0 batch_normalization_139[0][0]
__________________________________________________________________________________________________
conv2d_135 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_140 (Conv2D) (None, 7, 7, 160) 179200 activation_139[0][0]
__________________________________________________________________________________________________
batch_normalization_135 (BatchN (None, 7, 7, 160) 480 conv2d_135[0][0]
__________________________________________________________________________________________________
batch_normalization_140 (BatchN (None, 7, 7, 160) 480 conv2d_140[0][0]
__________________________________________________________________________________________________
activation_135 (Activation) (None, 7, 7, 160) 0 batch_normalization_135[0][0]
__________________________________________________________________________________________________
activation_140 (Activation) (None, 7, 7, 160) 0 batch_normalization_140[0][0]
__________________________________________________________________________________________________
conv2d_136 (Conv2D) (None, 7, 7, 160) 179200 activation_135[0][0]
__________________________________________________________________________________________________
conv2d_141 (Conv2D) (None, 7, 7, 160) 179200 activation_140[0][0]
__________________________________________________________________________________________________
batch_normalization_136 (BatchN (None, 7, 7, 160) 480 conv2d_136[0][0]
__________________________________________________________________________________________________
batch_normalization_141 (BatchN (None, 7, 7, 160) 480 conv2d_141[0][0]
__________________________________________________________________________________________________
activation_136 (Activation) (None, 7, 7, 160) 0 batch_normalization_136[0][0]
__________________________________________________________________________________________________
activation_141 (Activation) (None, 7, 7, 160) 0 batch_normalization_141[0][0]
__________________________________________________________________________________________________
average_pooling2d_13 (AveragePo (None, 7, 7, 768) 0 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_134 (Conv2D) (None, 7, 7, 192) 147456 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_137 (Conv2D) (None, 7, 7, 192) 215040 activation_136[0][0]
__________________________________________________________________________________________________
conv2d_142 (Conv2D) (None, 7, 7, 192) 215040 activation_141[0][0]
__________________________________________________________________________________________________
conv2d_143 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_13[0][0]
__________________________________________________________________________________________________
batch_normalization_134 (BatchN (None, 7, 7, 192) 576 conv2d_134[0][0]
__________________________________________________________________________________________________
batch_normalization_137 (BatchN (None, 7, 7, 192) 576 conv2d_137[0][0]
__________________________________________________________________________________________________
batch_normalization_142 (BatchN (None, 7, 7, 192) 576 conv2d_142[0][0]
__________________________________________________________________________________________________
batch_normalization_143 (BatchN (None, 7, 7, 192) 576 conv2d_143[0][0]
__________________________________________________________________________________________________
activation_134 (Activation) (None, 7, 7, 192) 0 batch_normalization_134[0][0]
__________________________________________________________________________________________________
activation_137 (Activation) (None, 7, 7, 192) 0 batch_normalization_137[0][0]
__________________________________________________________________________________________________
activation_142 (Activation) (None, 7, 7, 192) 0 batch_normalization_142[0][0]
__________________________________________________________________________________________________
activation_143 (Activation) (None, 7, 7, 192) 0 batch_normalization_143[0][0]
__________________________________________________________________________________________________
mixed5 (Concatenate) (None, 7, 7, 768) 0 activation_134[0][0]
activation_137[0][0]
activation_142[0][0]
activation_143[0][0]
__________________________________________________________________________________________________
conv2d_148 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
batch_normalization_148 (BatchN (None, 7, 7, 160) 480 conv2d_148[0][0]
__________________________________________________________________________________________________
activation_148 (Activation) (None, 7, 7, 160) 0 batch_normalization_148[0][0]
__________________________________________________________________________________________________
conv2d_149 (Conv2D) (None, 7, 7, 160) 179200 activation_148[0][0]
__________________________________________________________________________________________________
batch_normalization_149 (BatchN (None, 7, 7, 160) 480 conv2d_149[0][0]
__________________________________________________________________________________________________
activation_149 (Activation) (None, 7, 7, 160) 0 batch_normalization_149[0][0]
__________________________________________________________________________________________________
conv2d_145 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_150 (Conv2D) (None, 7, 7, 160) 179200 activation_149[0][0]
__________________________________________________________________________________________________
batch_normalization_145 (BatchN (None, 7, 7, 160) 480 conv2d_145[0][0]
__________________________________________________________________________________________________
batch_normalization_150 (BatchN (None, 7, 7, 160) 480 conv2d_150[0][0]
__________________________________________________________________________________________________
activation_145 (Activation) (None, 7, 7, 160) 0 batch_normalization_145[0][0]
__________________________________________________________________________________________________
activation_150 (Activation) (None, 7, 7, 160) 0 batch_normalization_150[0][0]
__________________________________________________________________________________________________
conv2d_146 (Conv2D) (None, 7, 7, 160) 179200 activation_145[0][0]
__________________________________________________________________________________________________
conv2d_151 (Conv2D) (None, 7, 7, 160) 179200 activation_150[0][0]
__________________________________________________________________________________________________
batch_normalization_146 (BatchN (None, 7, 7, 160) 480 conv2d_146[0][0]
__________________________________________________________________________________________________
batch_normalization_151 (BatchN (None, 7, 7, 160) 480 conv2d_151[0][0]
__________________________________________________________________________________________________
activation_146 (Activation) (None, 7, 7, 160) 0 batch_normalization_146[0][0]
__________________________________________________________________________________________________
activation_151 (Activation) (None, 7, 7, 160) 0 batch_normalization_151[0][0]
__________________________________________________________________________________________________
average_pooling2d_14 (AveragePo (None, 7, 7, 768) 0 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_144 (Conv2D) (None, 7, 7, 192) 147456 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_147 (Conv2D) (None, 7, 7, 192) 215040 activation_146[0][0]
__________________________________________________________________________________________________
conv2d_152 (Conv2D) (None, 7, 7, 192) 215040 activation_151[0][0]
__________________________________________________________________________________________________
conv2d_153 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_144 (BatchN (None, 7, 7, 192) 576 conv2d_144[0][0]
__________________________________________________________________________________________________
batch_normalization_147 (BatchN (None, 7, 7, 192) 576 conv2d_147[0][0]
__________________________________________________________________________________________________
batch_normalization_152 (BatchN (None, 7, 7, 192) 576 conv2d_152[0][0]
__________________________________________________________________________________________________
batch_normalization_153 (BatchN (None, 7, 7, 192) 576 conv2d_153[0][0]
__________________________________________________________________________________________________
activation_144 (Activation) (None, 7, 7, 192) 0 batch_normalization_144[0][0]
__________________________________________________________________________________________________
activation_147 (Activation) (None, 7, 7, 192) 0 batch_normalization_147[0][0]
__________________________________________________________________________________________________
activation_152 (Activation) (None, 7, 7, 192) 0 batch_normalization_152[0][0]
__________________________________________________________________________________________________
activation_153 (Activation) (None, 7, 7, 192) 0 batch_normalization_153[0][0]
__________________________________________________________________________________________________
mixed6 (Concatenate) (None, 7, 7, 768) 0 activation_144[0][0]
activation_147[0][0]
activation_152[0][0]
activation_153[0][0]
__________________________________________________________________________________________________
conv2d_158 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
batch_normalization_158 (BatchN (None, 7, 7, 192) 576 conv2d_158[0][0]
__________________________________________________________________________________________________
activation_158 (Activation) (None, 7, 7, 192) 0 batch_normalization_158[0][0]
__________________________________________________________________________________________________
conv2d_159 (Conv2D) (None, 7, 7, 192) 258048 activation_158[0][0]
__________________________________________________________________________________________________
batch_normalization_159 (BatchN (None, 7, 7, 192) 576 conv2d_159[0][0]
__________________________________________________________________________________________________
activation_159 (Activation) (None, 7, 7, 192) 0 batch_normalization_159[0][0]
__________________________________________________________________________________________________
conv2d_155 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_160 (Conv2D) (None, 7, 7, 192) 258048 activation_159[0][0]
__________________________________________________________________________________________________
batch_normalization_155 (BatchN (None, 7, 7, 192) 576 conv2d_155[0][0]
__________________________________________________________________________________________________
batch_normalization_160 (BatchN (None, 7, 7, 192) 576 conv2d_160[0][0]
__________________________________________________________________________________________________
activation_155 (Activation) (None, 7, 7, 192) 0 batch_normalization_155[0][0]
__________________________________________________________________________________________________
activation_160 (Activation) (None, 7, 7, 192) 0 batch_normalization_160[0][0]
__________________________________________________________________________________________________
conv2d_156 (Conv2D) (None, 7, 7, 192) 258048 activation_155[0][0]
__________________________________________________________________________________________________
conv2d_161 (Conv2D) (None, 7, 7, 192) 258048 activation_160[0][0]
__________________________________________________________________________________________________
batch_normalization_156 (BatchN (None, 7, 7, 192) 576 conv2d_156[0][0]
__________________________________________________________________________________________________
batch_normalization_161 (BatchN (None, 7, 7, 192) 576 conv2d_161[0][0]
__________________________________________________________________________________________________
activation_156 (Activation) (None, 7, 7, 192) 0 batch_normalization_156[0][0]
__________________________________________________________________________________________________
activation_161 (Activation) (None, 7, 7, 192) 0 batch_normalization_161[0][0]
__________________________________________________________________________________________________
average_pooling2d_15 (AveragePo (None, 7, 7, 768) 0 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_154 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_157 (Conv2D) (None, 7, 7, 192) 258048 activation_156[0][0]
__________________________________________________________________________________________________
conv2d_162 (Conv2D) (None, 7, 7, 192) 258048 activation_161[0][0]
__________________________________________________________________________________________________
conv2d_163 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_15[0][0]
__________________________________________________________________________________________________
batch_normalization_154 (BatchN (None, 7, 7, 192) 576 conv2d_154[0][0]
__________________________________________________________________________________________________
batch_normalization_157 (BatchN (None, 7, 7, 192) 576 conv2d_157[0][0]
__________________________________________________________________________________________________
batch_normalization_162 (BatchN (None, 7, 7, 192) 576 conv2d_162[0][0]
__________________________________________________________________________________________________
batch_normalization_163 (BatchN (None, 7, 7, 192) 576 conv2d_163[0][0]
__________________________________________________________________________________________________
activation_154 (Activation) (None, 7, 7, 192) 0 batch_normalization_154[0][0]
__________________________________________________________________________________________________
activation_157 (Activation) (None, 7, 7, 192) 0 batch_normalization_157[0][0]
__________________________________________________________________________________________________
activation_162 (Activation) (None, 7, 7, 192) 0 batch_normalization_162[0][0]
__________________________________________________________________________________________________
activation_163 (Activation) (None, 7, 7, 192) 0 batch_normalization_163[0][0]
__________________________________________________________________________________________________
mixed7 (Concatenate) (None, 7, 7, 768) 0 activation_154[0][0]
activation_157[0][0]
activation_162[0][0]
activation_163[0][0]
__________________________________________________________________________________________________
conv2d_166 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
batch_normalization_166 (BatchN (None, 7, 7, 192) 576 conv2d_166[0][0]
__________________________________________________________________________________________________
activation_166 (Activation) (None, 7, 7, 192) 0 batch_normalization_166[0][0]
__________________________________________________________________________________________________
conv2d_167 (Conv2D) (None, 7, 7, 192) 258048 activation_166[0][0]
__________________________________________________________________________________________________
batch_normalization_167 (BatchN (None, 7, 7, 192) 576 conv2d_167[0][0]
__________________________________________________________________________________________________
activation_167 (Activation) (None, 7, 7, 192) 0 batch_normalization_167[0][0]
__________________________________________________________________________________________________
conv2d_164 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
conv2d_168 (Conv2D) (None, 7, 7, 192) 258048 activation_167[0][0]
__________________________________________________________________________________________________
batch_normalization_164 (BatchN (None, 7, 7, 192) 576 conv2d_164[0][0]
__________________________________________________________________________________________________
batch_normalization_168 (BatchN (None, 7, 7, 192) 576 conv2d_168[0][0]
__________________________________________________________________________________________________
activation_164 (Activation) (None, 7, 7, 192) 0 batch_normalization_164[0][0]
__________________________________________________________________________________________________
activation_168 (Activation) (None, 7, 7, 192) 0 batch_normalization_168[0][0]
__________________________________________________________________________________________________
conv2d_165 (Conv2D) (None, 3, 3, 320) 552960 activation_164[0][0]
__________________________________________________________________________________________________
conv2d_169 (Conv2D) (None, 3, 3, 192) 331776 activation_168[0][0]
__________________________________________________________________________________________________
batch_normalization_165 (BatchN (None, 3, 3, 320) 960 conv2d_165[0][0]
__________________________________________________________________________________________________
batch_normalization_169 (BatchN (None, 3, 3, 192) 576 conv2d_169[0][0]
__________________________________________________________________________________________________
activation_165 (Activation) (None, 3, 3, 320) 0 batch_normalization_165[0][0]
__________________________________________________________________________________________________
activation_169 (Activation) (None, 3, 3, 192) 0 batch_normalization_169[0][0]
__________________________________________________________________________________________________
max_pooling2d_7 (MaxPooling2D) (None, 3, 3, 768) 0 mixed7[0][0]
__________________________________________________________________________________________________
mixed8 (Concatenate) (None, 3, 3, 1280) 0 activation_165[0][0]
activation_169[0][0]
max_pooling2d_7[0][0]
__________________________________________________________________________________________________
conv2d_174 (Conv2D) (None, 3, 3, 448) 573440 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_174 (BatchN (None, 3, 3, 448) 1344 conv2d_174[0][0]
__________________________________________________________________________________________________
activation_174 (Activation) (None, 3, 3, 448) 0 batch_normalization_174[0][0]
__________________________________________________________________________________________________
conv2d_171 (Conv2D) (None, 3, 3, 384) 491520 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_175 (Conv2D) (None, 3, 3, 384) 1548288 activation_174[0][0]
__________________________________________________________________________________________________
batch_normalization_171 (BatchN (None, 3, 3, 384) 1152 conv2d_171[0][0]
__________________________________________________________________________________________________
batch_normalization_175 (BatchN (None, 3, 3, 384) 1152 conv2d_175[0][0]
__________________________________________________________________________________________________
activation_171 (Activation) (None, 3, 3, 384) 0 batch_normalization_171[0][0]
__________________________________________________________________________________________________
activation_175 (Activation) (None, 3, 3, 384) 0 batch_normalization_175[0][0]
__________________________________________________________________________________________________
conv2d_172 (Conv2D) (None, 3, 3, 384) 442368 activation_171[0][0]
__________________________________________________________________________________________________
conv2d_173 (Conv2D) (None, 3, 3, 384) 442368 activation_171[0][0]
__________________________________________________________________________________________________
conv2d_176 (Conv2D) (None, 3, 3, 384) 442368 activation_175[0][0]
__________________________________________________________________________________________________
conv2d_177 (Conv2D) (None, 3, 3, 384) 442368 activation_175[0][0]
__________________________________________________________________________________________________
average_pooling2d_16 (AveragePo (None, 3, 3, 1280) 0 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_170 (Conv2D) (None, 3, 3, 320) 409600 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_172 (BatchN (None, 3, 3, 384) 1152 conv2d_172[0][0]
__________________________________________________________________________________________________
batch_normalization_173 (BatchN (None, 3, 3, 384) 1152 conv2d_173[0][0]
__________________________________________________________________________________________________
batch_normalization_176 (BatchN (None, 3, 3, 384) 1152 conv2d_176[0][0]
__________________________________________________________________________________________________
batch_normalization_177 (BatchN (None, 3, 3, 384) 1152 conv2d_177[0][0]
__________________________________________________________________________________________________
conv2d_178 (Conv2D) (None, 3, 3, 192) 245760 average_pooling2d_16[0][0]
__________________________________________________________________________________________________
batch_normalization_170 (BatchN (None, 3, 3, 320) 960 conv2d_170[0][0]
__________________________________________________________________________________________________
activation_172 (Activation) (None, 3, 3, 384) 0 batch_normalization_172[0][0]
__________________________________________________________________________________________________
activation_173 (Activation) (None, 3, 3, 384) 0 batch_normalization_173[0][0]
__________________________________________________________________________________________________
activation_176 (Activation) (None, 3, 3, 384) 0 batch_normalization_176[0][0]
__________________________________________________________________________________________________
activation_177 (Activation) (None, 3, 3, 384) 0 batch_normalization_177[0][0]
__________________________________________________________________________________________________
batch_normalization_178 (BatchN (None, 3, 3, 192) 576 conv2d_178[0][0]
__________________________________________________________________________________________________
activation_170 (Activation) (None, 3, 3, 320) 0 batch_normalization_170[0][0]
__________________________________________________________________________________________________
mixed9_0 (Concatenate) (None, 3, 3, 768) 0 activation_172[0][0]
activation_173[0][0]
__________________________________________________________________________________________________
concatenate_2 (Concatenate) (None, 3, 3, 768) 0 activation_176[0][0]
activation_177[0][0]
__________________________________________________________________________________________________
activation_178 (Activation) (None, 3, 3, 192) 0 batch_normalization_178[0][0]
__________________________________________________________________________________________________
mixed9 (Concatenate) (None, 3, 3, 2048) 0 activation_170[0][0]
mixed9_0[0][0]
concatenate_2[0][0]
activation_178[0][0]
__________________________________________________________________________________________________
conv2d_183 (Conv2D) (None, 3, 3, 448) 917504 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_183 (BatchN (None, 3, 3, 448) 1344 conv2d_183[0][0]
__________________________________________________________________________________________________
activation_183 (Activation) (None, 3, 3, 448) 0 batch_normalization_183[0][0]
__________________________________________________________________________________________________
conv2d_180 (Conv2D) (None, 3, 3, 384) 786432 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_184 (Conv2D) (None, 3, 3, 384) 1548288 activation_183[0][0]
__________________________________________________________________________________________________
batch_normalization_180 (BatchN (None, 3, 3, 384) 1152 conv2d_180[0][0]
__________________________________________________________________________________________________
batch_normalization_184 (BatchN (None, 3, 3, 384) 1152 conv2d_184[0][0]
__________________________________________________________________________________________________
activation_180 (Activation) (None, 3, 3, 384) 0 batch_normalization_180[0][0]
__________________________________________________________________________________________________
activation_184 (Activation) (None, 3, 3, 384) 0 batch_normalization_184[0][0]
__________________________________________________________________________________________________
conv2d_181 (Conv2D) (None, 3, 3, 384) 442368 activation_180[0][0]
__________________________________________________________________________________________________
conv2d_182 (Conv2D) (None, 3, 3, 384) 442368 activation_180[0][0]
__________________________________________________________________________________________________
conv2d_185 (Conv2D) (None, 3, 3, 384) 442368 activation_184[0][0]
__________________________________________________________________________________________________
conv2d_186 (Conv2D) (None, 3, 3, 384) 442368 activation_184[0][0]
__________________________________________________________________________________________________
average_pooling2d_17 (AveragePo (None, 3, 3, 2048) 0 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_179 (Conv2D) (None, 3, 3, 320) 655360 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_181 (BatchN (None, 3, 3, 384) 1152 conv2d_181[0][0]
__________________________________________________________________________________________________
batch_normalization_182 (BatchN (None, 3, 3, 384) 1152 conv2d_182[0][0]
__________________________________________________________________________________________________
batch_normalization_185 (BatchN (None, 3, 3, 384) 1152 conv2d_185[0][0]
__________________________________________________________________________________________________
batch_normalization_186 (BatchN (None, 3, 3, 384) 1152 conv2d_186[0][0]
__________________________________________________________________________________________________
conv2d_187 (Conv2D) (None, 3, 3, 192) 393216 average_pooling2d_17[0][0]
__________________________________________________________________________________________________
batch_normalization_179 (BatchN (None, 3, 3, 320) 960 conv2d_179[0][0]
__________________________________________________________________________________________________
activation_181 (Activation) (None, 3, 3, 384) 0 batch_normalization_181[0][0]
__________________________________________________________________________________________________
activation_182 (Activation) (None, 3, 3, 384) 0 batch_normalization_182[0][0]
__________________________________________________________________________________________________
activation_185 (Activation) (None, 3, 3, 384) 0 batch_normalization_185[0][0]
__________________________________________________________________________________________________
activation_186 (Activation) (None, 3, 3, 384) 0 batch_normalization_186[0][0]
__________________________________________________________________________________________________
batch_normalization_187 (BatchN (None, 3, 3, 192) 576 conv2d_187[0][0]
__________________________________________________________________________________________________
activation_179 (Activation) (None, 3, 3, 320) 0 batch_normalization_179[0][0]
__________________________________________________________________________________________________
mixed9_1 (Concatenate) (None, 3, 3, 768) 0 activation_181[0][0]
activation_182[0][0]
__________________________________________________________________________________________________
concatenate_3 (Concatenate) (None, 3, 3, 768) 0 activation_185[0][0]
activation_186[0][0]
__________________________________________________________________________________________________
activation_187 (Activation) (None, 3, 3, 192) 0 batch_normalization_187[0][0]
__________________________________________________________________________________________________
mixed10 (Concatenate) (None, 3, 3, 2048) 0 activation_179[0][0]
mixed9_1[0][0]
concatenate_3[0][0]
activation_187[0][0]
==================================================================================================
Total params: 21,802,784
Trainable params: 0
Non-trainable params: 21,802,784
__________________________________________________________________________________________________
None
That is a big network.
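Rather than scrolling through the summary you can also ask for the size directly (a quick sketch; the layer count will vary with the Keras version, but the parameter count should match the total in the summary).
print(f"Layers: {len(base_model.layers)}")
print(f"Parameters: {base_model.count_params():,}")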
Create the Output Layers
# pool the final 3x3x2048 feature maps into one vector per image
x = layers.GlobalAveragePooling2D()(base_model.output)
# a trainable fully-connected layer on top of the frozen features
x = layers.Dense(1024, activation="relu")(x)
# a little dropout for regularization
x = layers.Dropout(0.2)(x)
# a single sigmoid unit, since cats vs dogs is a binary classification
x = layers.Dense(1, activation="sigmoid")(x)
The GlobalAveragePooling2D layer collapses the 3 x 3 x 2048 feature maps coming out of InceptionV3's mixed10 block into a single 2048-element vector, and the Dropout layer helps keep the new head from overfitting. Now build the model by joining the frozen pre-built layers to this dense head (which is the part we're actually going to train). Since there are only two classes, the output layer uses a sigmoid activation.
model = tensorflow.keras.Model(
base_model.input,
x,
)
print(model.summary())
Model: "model_1"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 150, 150, 3) 0
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, 74, 74, 32) 864 input_2[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 74, 74, 32) 96 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, 74, 74, 32) 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
conv2d_95 (Conv2D) (None, 72, 72, 32) 9216 activation_94[0][0]
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 72, 72, 32) 96 conv2d_95[0][0]
__________________________________________________________________________________________________
activation_95 (Activation) (None, 72, 72, 32) 0 batch_normalization_95[0][0]
__________________________________________________________________________________________________
conv2d_96 (Conv2D) (None, 72, 72, 64) 18432 activation_95[0][0]
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 72, 72, 64) 192 conv2d_96[0][0]
__________________________________________________________________________________________________
activation_96 (Activation) (None, 72, 72, 64) 0 batch_normalization_96[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 35, 35, 64) 0 activation_96[0][0]
__________________________________________________________________________________________________
conv2d_97 (Conv2D) (None, 35, 35, 80) 5120 max_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 35, 35, 80) 240 conv2d_97[0][0]
__________________________________________________________________________________________________
activation_97 (Activation) (None, 35, 35, 80) 0 batch_normalization_97[0][0]
__________________________________________________________________________________________________
conv2d_98 (Conv2D) (None, 33, 33, 192) 138240 activation_97[0][0]
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 33, 33, 192) 576 conv2d_98[0][0]
__________________________________________________________________________________________________
activation_98 (Activation) (None, 33, 33, 192) 0 batch_normalization_98[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 16, 16, 192) 0 activation_98[0][0]
__________________________________________________________________________________________________
conv2d_102 (Conv2D) (None, 16, 16, 64) 12288 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 16, 16, 64) 192 conv2d_102[0][0]
__________________________________________________________________________________________________
activation_102 (Activation) (None, 16, 16, 64) 0 batch_normalization_102[0][0]
__________________________________________________________________________________________________
conv2d_100 (Conv2D) (None, 16, 16, 48) 9216 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_103 (Conv2D) (None, 16, 16, 96) 55296 activation_102[0][0]
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 16, 16, 48) 144 conv2d_100[0][0]
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 16, 16, 96) 288 conv2d_103[0][0]
__________________________________________________________________________________________________
activation_100 (Activation) (None, 16, 16, 48) 0 batch_normalization_100[0][0]
__________________________________________________________________________________________________
activation_103 (Activation) (None, 16, 16, 96) 0 batch_normalization_103[0][0]
__________________________________________________________________________________________________
average_pooling2d_9 (AveragePoo (None, 16, 16, 192) 0 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_99 (Conv2D) (None, 16, 16, 64) 12288 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
conv2d_101 (Conv2D) (None, 16, 16, 64) 76800 activation_100[0][0]
__________________________________________________________________________________________________
conv2d_104 (Conv2D) (None, 16, 16, 96) 82944 activation_103[0][0]
__________________________________________________________________________________________________
conv2d_105 (Conv2D) (None, 16, 16, 32) 6144 average_pooling2d_9[0][0]
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 16, 16, 64) 192 conv2d_99[0][0]
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 16, 16, 64) 192 conv2d_101[0][0]
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 16, 16, 96) 288 conv2d_104[0][0]
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 16, 16, 32) 96 conv2d_105[0][0]
__________________________________________________________________________________________________
activation_99 (Activation) (None, 16, 16, 64) 0 batch_normalization_99[0][0]
__________________________________________________________________________________________________
activation_101 (Activation) (None, 16, 16, 64) 0 batch_normalization_101[0][0]
__________________________________________________________________________________________________
activation_104 (Activation) (None, 16, 16, 96) 0 batch_normalization_104[0][0]
__________________________________________________________________________________________________
activation_105 (Activation) (None, 16, 16, 32) 0 batch_normalization_105[0][0]
__________________________________________________________________________________________________
mixed0 (Concatenate) (None, 16, 16, 256) 0 activation_99[0][0]
activation_101[0][0]
activation_104[0][0]
activation_105[0][0]
__________________________________________________________________________________________________
conv2d_109 (Conv2D) (None, 16, 16, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
batch_normalization_109 (BatchN (None, 16, 16, 64) 192 conv2d_109[0][0]
__________________________________________________________________________________________________
activation_109 (Activation) (None, 16, 16, 64) 0 batch_normalization_109[0][0]
__________________________________________________________________________________________________
conv2d_107 (Conv2D) (None, 16, 16, 48) 12288 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_110 (Conv2D) (None, 16, 16, 96) 55296 activation_109[0][0]
__________________________________________________________________________________________________
batch_normalization_107 (BatchN (None, 16, 16, 48) 144 conv2d_107[0][0]
__________________________________________________________________________________________________
batch_normalization_110 (BatchN (None, 16, 16, 96) 288 conv2d_110[0][0]
__________________________________________________________________________________________________
activation_107 (Activation) (None, 16, 16, 48) 0 batch_normalization_107[0][0]
__________________________________________________________________________________________________
activation_110 (Activation) (None, 16, 16, 96) 0 batch_normalization_110[0][0]
__________________________________________________________________________________________________
average_pooling2d_10 (AveragePo (None, 16, 16, 256) 0 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_106 (Conv2D) (None, 16, 16, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_108 (Conv2D) (None, 16, 16, 64) 76800 activation_107[0][0]
__________________________________________________________________________________________________
conv2d_111 (Conv2D) (None, 16, 16, 96) 82944 activation_110[0][0]
__________________________________________________________________________________________________
conv2d_112 (Conv2D) (None, 16, 16, 64) 16384 average_pooling2d_10[0][0]
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 16, 16, 64) 192 conv2d_106[0][0]
__________________________________________________________________________________________________
batch_normalization_108 (BatchN (None, 16, 16, 64) 192 conv2d_108[0][0]
__________________________________________________________________________________________________
batch_normalization_111 (BatchN (None, 16, 16, 96) 288 conv2d_111[0][0]
__________________________________________________________________________________________________
batch_normalization_112 (BatchN (None, 16, 16, 64) 192 conv2d_112[0][0]
__________________________________________________________________________________________________
activation_106 (Activation) (None, 16, 16, 64) 0 batch_normalization_106[0][0]
__________________________________________________________________________________________________
activation_108 (Activation) (None, 16, 16, 64) 0 batch_normalization_108[0][0]
__________________________________________________________________________________________________
activation_111 (Activation) (None, 16, 16, 96) 0 batch_normalization_111[0][0]
__________________________________________________________________________________________________
activation_112 (Activation) (None, 16, 16, 64) 0 batch_normalization_112[0][0]
__________________________________________________________________________________________________
mixed1 (Concatenate) (None, 16, 16, 288) 0 activation_106[0][0]
activation_108[0][0]
activation_111[0][0]
activation_112[0][0]
__________________________________________________________________________________________________
conv2d_116 (Conv2D) (None, 16, 16, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
batch_normalization_116 (BatchN (None, 16, 16, 64) 192 conv2d_116[0][0]
__________________________________________________________________________________________________
activation_116 (Activation) (None, 16, 16, 64) 0 batch_normalization_116[0][0]
__________________________________________________________________________________________________
conv2d_114 (Conv2D) (None, 16, 16, 48) 13824 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_117 (Conv2D) (None, 16, 16, 96) 55296 activation_116[0][0]
__________________________________________________________________________________________________
batch_normalization_114 (BatchN (None, 16, 16, 48) 144 conv2d_114[0][0]
__________________________________________________________________________________________________
batch_normalization_117 (BatchN (None, 16, 16, 96) 288 conv2d_117[0][0]
__________________________________________________________________________________________________
activation_114 (Activation) (None, 16, 16, 48) 0 batch_normalization_114[0][0]
__________________________________________________________________________________________________
activation_117 (Activation) (None, 16, 16, 96) 0 batch_normalization_117[0][0]
__________________________________________________________________________________________________
average_pooling2d_11 (AveragePo (None, 16, 16, 288) 0 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_113 (Conv2D) (None, 16, 16, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_115 (Conv2D) (None, 16, 16, 64) 76800 activation_114[0][0]
__________________________________________________________________________________________________
conv2d_118 (Conv2D) (None, 16, 16, 96) 82944 activation_117[0][0]
__________________________________________________________________________________________________
conv2d_119 (Conv2D) (None, 16, 16, 64) 18432 average_pooling2d_11[0][0]
__________________________________________________________________________________________________
batch_normalization_113 (BatchN (None, 16, 16, 64) 192 conv2d_113[0][0]
__________________________________________________________________________________________________
batch_normalization_115 (BatchN (None, 16, 16, 64) 192 conv2d_115[0][0]
__________________________________________________________________________________________________
batch_normalization_118 (BatchN (None, 16, 16, 96) 288 conv2d_118[0][0]
__________________________________________________________________________________________________
batch_normalization_119 (BatchN (None, 16, 16, 64) 192 conv2d_119[0][0]
__________________________________________________________________________________________________
activation_113 (Activation) (None, 16, 16, 64) 0 batch_normalization_113[0][0]
__________________________________________________________________________________________________
activation_115 (Activation) (None, 16, 16, 64) 0 batch_normalization_115[0][0]
__________________________________________________________________________________________________
activation_118 (Activation) (None, 16, 16, 96) 0 batch_normalization_118[0][0]
__________________________________________________________________________________________________
activation_119 (Activation) (None, 16, 16, 64) 0 batch_normalization_119[0][0]
__________________________________________________________________________________________________
mixed2 (Concatenate) (None, 16, 16, 288) 0 activation_113[0][0]
activation_115[0][0]
activation_118[0][0]
activation_119[0][0]
__________________________________________________________________________________________________
conv2d_121 (Conv2D) (None, 16, 16, 64) 18432 mixed2[0][0]
__________________________________________________________________________________________________
batch_normalization_121 (BatchN (None, 16, 16, 64) 192 conv2d_121[0][0]
__________________________________________________________________________________________________
activation_121 (Activation) (None, 16, 16, 64) 0 batch_normalization_121[0][0]
__________________________________________________________________________________________________
conv2d_122 (Conv2D) (None, 16, 16, 96) 55296 activation_121[0][0]
__________________________________________________________________________________________________
batch_normalization_122 (BatchN (None, 16, 16, 96) 288 conv2d_122[0][0]
__________________________________________________________________________________________________
activation_122 (Activation) (None, 16, 16, 96) 0 batch_normalization_122[0][0]
__________________________________________________________________________________________________
conv2d_120 (Conv2D) (None, 7, 7, 384) 995328 mixed2[0][0]
__________________________________________________________________________________________________
conv2d_123 (Conv2D) (None, 7, 7, 96) 82944 activation_122[0][0]
__________________________________________________________________________________________________
batch_normalization_120 (BatchN (None, 7, 7, 384) 1152 conv2d_120[0][0]
__________________________________________________________________________________________________
batch_normalization_123 (BatchN (None, 7, 7, 96) 288 conv2d_123[0][0]
__________________________________________________________________________________________________
activation_120 (Activation) (None, 7, 7, 384) 0 batch_normalization_120[0][0]
__________________________________________________________________________________________________
activation_123 (Activation) (None, 7, 7, 96) 0 batch_normalization_123[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 7, 7, 288) 0 mixed2[0][0]
__________________________________________________________________________________________________
mixed3 (Concatenate) (None, 7, 7, 768) 0 activation_120[0][0]
activation_123[0][0]
max_pooling2d_6[0][0]
__________________________________________________________________________________________________
conv2d_128 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
batch_normalization_128 (BatchN (None, 7, 7, 128) 384 conv2d_128[0][0]
__________________________________________________________________________________________________
activation_128 (Activation) (None, 7, 7, 128) 0 batch_normalization_128[0][0]
__________________________________________________________________________________________________
conv2d_129 (Conv2D) (None, 7, 7, 128) 114688 activation_128[0][0]
__________________________________________________________________________________________________
batch_normalization_129 (BatchN (None, 7, 7, 128) 384 conv2d_129[0][0]
__________________________________________________________________________________________________
activation_129 (Activation) (None, 7, 7, 128) 0 batch_normalization_129[0][0]
__________________________________________________________________________________________________
conv2d_125 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_130 (Conv2D) (None, 7, 7, 128) 114688 activation_129[0][0]
__________________________________________________________________________________________________
batch_normalization_125 (BatchN (None, 7, 7, 128) 384 conv2d_125[0][0]
__________________________________________________________________________________________________
batch_normalization_130 (BatchN (None, 7, 7, 128) 384 conv2d_130[0][0]
__________________________________________________________________________________________________
activation_125 (Activation) (None, 7, 7, 128) 0 batch_normalization_125[0][0]
__________________________________________________________________________________________________
activation_130 (Activation) (None, 7, 7, 128) 0 batch_normalization_130[0][0]
__________________________________________________________________________________________________
conv2d_126 (Conv2D) (None, 7, 7, 128) 114688 activation_125[0][0]
__________________________________________________________________________________________________
conv2d_131 (Conv2D) (None, 7, 7, 128) 114688 activation_130[0][0]
__________________________________________________________________________________________________
batch_normalization_126 (BatchN (None, 7, 7, 128) 384 conv2d_126[0][0]
__________________________________________________________________________________________________
batch_normalization_131 (BatchN (None, 7, 7, 128) 384 conv2d_131[0][0]
__________________________________________________________________________________________________
activation_126 (Activation) (None, 7, 7, 128) 0 batch_normalization_126[0][0]
__________________________________________________________________________________________________
activation_131 (Activation) (None, 7, 7, 128) 0 batch_normalization_131[0][0]
__________________________________________________________________________________________________
average_pooling2d_12 (AveragePo (None, 7, 7, 768) 0 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_124 (Conv2D) (None, 7, 7, 192) 147456 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_127 (Conv2D) (None, 7, 7, 192) 172032 activation_126[0][0]
__________________________________________________________________________________________________
conv2d_132 (Conv2D) (None, 7, 7, 192) 172032 activation_131[0][0]
__________________________________________________________________________________________________
conv2d_133 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_12[0][0]
__________________________________________________________________________________________________
batch_normalization_124 (BatchN (None, 7, 7, 192) 576 conv2d_124[0][0]
__________________________________________________________________________________________________
batch_normalization_127 (BatchN (None, 7, 7, 192) 576 conv2d_127[0][0]
__________________________________________________________________________________________________
batch_normalization_132 (BatchN (None, 7, 7, 192) 576 conv2d_132[0][0]
__________________________________________________________________________________________________
batch_normalization_133 (BatchN (None, 7, 7, 192) 576 conv2d_133[0][0]
__________________________________________________________________________________________________
activation_124 (Activation) (None, 7, 7, 192) 0 batch_normalization_124[0][0]
__________________________________________________________________________________________________
activation_127 (Activation) (None, 7, 7, 192) 0 batch_normalization_127[0][0]
__________________________________________________________________________________________________
activation_132 (Activation) (None, 7, 7, 192) 0 batch_normalization_132[0][0]
__________________________________________________________________________________________________
activation_133 (Activation) (None, 7, 7, 192) 0 batch_normalization_133[0][0]
__________________________________________________________________________________________________
mixed4 (Concatenate) (None, 7, 7, 768) 0 activation_124[0][0]
activation_127[0][0]
activation_132[0][0]
activation_133[0][0]
__________________________________________________________________________________________________
conv2d_138 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
batch_normalization_138 (BatchN (None, 7, 7, 160) 480 conv2d_138[0][0]
__________________________________________________________________________________________________
activation_138 (Activation) (None, 7, 7, 160) 0 batch_normalization_138[0][0]
__________________________________________________________________________________________________
conv2d_139 (Conv2D) (None, 7, 7, 160) 179200 activation_138[0][0]
__________________________________________________________________________________________________
batch_normalization_139 (BatchN (None, 7, 7, 160) 480 conv2d_139[0][0]
__________________________________________________________________________________________________
activation_139 (Activation) (None, 7, 7, 160) 0 batch_normalization_139[0][0]
__________________________________________________________________________________________________
conv2d_135 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_140 (Conv2D) (None, 7, 7, 160) 179200 activation_139[0][0]
__________________________________________________________________________________________________
batch_normalization_135 (BatchN (None, 7, 7, 160) 480 conv2d_135[0][0]
__________________________________________________________________________________________________
batch_normalization_140 (BatchN (None, 7, 7, 160) 480 conv2d_140[0][0]
__________________________________________________________________________________________________
activation_135 (Activation) (None, 7, 7, 160) 0 batch_normalization_135[0][0]
__________________________________________________________________________________________________
activation_140 (Activation) (None, 7, 7, 160) 0 batch_normalization_140[0][0]
__________________________________________________________________________________________________
conv2d_136 (Conv2D) (None, 7, 7, 160) 179200 activation_135[0][0]
__________________________________________________________________________________________________
conv2d_141 (Conv2D) (None, 7, 7, 160) 179200 activation_140[0][0]
__________________________________________________________________________________________________
batch_normalization_136 (BatchN (None, 7, 7, 160) 480 conv2d_136[0][0]
__________________________________________________________________________________________________
batch_normalization_141 (BatchN (None, 7, 7, 160) 480 conv2d_141[0][0]
__________________________________________________________________________________________________
activation_136 (Activation) (None, 7, 7, 160) 0 batch_normalization_136[0][0]
__________________________________________________________________________________________________
activation_141 (Activation) (None, 7, 7, 160) 0 batch_normalization_141[0][0]
__________________________________________________________________________________________________
average_pooling2d_13 (AveragePo (None, 7, 7, 768) 0 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_134 (Conv2D) (None, 7, 7, 192) 147456 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_137 (Conv2D) (None, 7, 7, 192) 215040 activation_136[0][0]
__________________________________________________________________________________________________
conv2d_142 (Conv2D) (None, 7, 7, 192) 215040 activation_141[0][0]
__________________________________________________________________________________________________
conv2d_143 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_13[0][0]
__________________________________________________________________________________________________
batch_normalization_134 (BatchN (None, 7, 7, 192) 576 conv2d_134[0][0]
__________________________________________________________________________________________________
batch_normalization_137 (BatchN (None, 7, 7, 192) 576 conv2d_137[0][0]
__________________________________________________________________________________________________
batch_normalization_142 (BatchN (None, 7, 7, 192) 576 conv2d_142[0][0]
__________________________________________________________________________________________________
batch_normalization_143 (BatchN (None, 7, 7, 192) 576 conv2d_143[0][0]
__________________________________________________________________________________________________
activation_134 (Activation) (None, 7, 7, 192) 0 batch_normalization_134[0][0]
__________________________________________________________________________________________________
activation_137 (Activation) (None, 7, 7, 192) 0 batch_normalization_137[0][0]
__________________________________________________________________________________________________
activation_142 (Activation) (None, 7, 7, 192) 0 batch_normalization_142[0][0]
__________________________________________________________________________________________________
activation_143 (Activation) (None, 7, 7, 192) 0 batch_normalization_143[0][0]
__________________________________________________________________________________________________
mixed5 (Concatenate) (None, 7, 7, 768) 0 activation_134[0][0]
activation_137[0][0]
activation_142[0][0]
activation_143[0][0]
__________________________________________________________________________________________________
conv2d_148 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
batch_normalization_148 (BatchN (None, 7, 7, 160) 480 conv2d_148[0][0]
__________________________________________________________________________________________________
activation_148 (Activation) (None, 7, 7, 160) 0 batch_normalization_148[0][0]
__________________________________________________________________________________________________
conv2d_149 (Conv2D) (None, 7, 7, 160) 179200 activation_148[0][0]
__________________________________________________________________________________________________
batch_normalization_149 (BatchN (None, 7, 7, 160) 480 conv2d_149[0][0]
__________________________________________________________________________________________________
activation_149 (Activation) (None, 7, 7, 160) 0 batch_normalization_149[0][0]
__________________________________________________________________________________________________
conv2d_145 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_150 (Conv2D) (None, 7, 7, 160) 179200 activation_149[0][0]
__________________________________________________________________________________________________
batch_normalization_145 (BatchN (None, 7, 7, 160) 480 conv2d_145[0][0]
__________________________________________________________________________________________________
batch_normalization_150 (BatchN (None, 7, 7, 160) 480 conv2d_150[0][0]
__________________________________________________________________________________________________
activation_145 (Activation) (None, 7, 7, 160) 0 batch_normalization_145[0][0]
__________________________________________________________________________________________________
activation_150 (Activation) (None, 7, 7, 160) 0 batch_normalization_150[0][0]
__________________________________________________________________________________________________
conv2d_146 (Conv2D) (None, 7, 7, 160) 179200 activation_145[0][0]
__________________________________________________________________________________________________
conv2d_151 (Conv2D) (None, 7, 7, 160) 179200 activation_150[0][0]
__________________________________________________________________________________________________
batch_normalization_146 (BatchN (None, 7, 7, 160) 480 conv2d_146[0][0]
__________________________________________________________________________________________________
batch_normalization_151 (BatchN (None, 7, 7, 160) 480 conv2d_151[0][0]
__________________________________________________________________________________________________
activation_146 (Activation) (None, 7, 7, 160) 0 batch_normalization_146[0][0]
__________________________________________________________________________________________________
activation_151 (Activation) (None, 7, 7, 160) 0 batch_normalization_151[0][0]
__________________________________________________________________________________________________
average_pooling2d_14 (AveragePo (None, 7, 7, 768) 0 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_144 (Conv2D) (None, 7, 7, 192) 147456 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_147 (Conv2D) (None, 7, 7, 192) 215040 activation_146[0][0]
__________________________________________________________________________________________________
conv2d_152 (Conv2D) (None, 7, 7, 192) 215040 activation_151[0][0]
__________________________________________________________________________________________________
conv2d_153 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_144 (BatchN (None, 7, 7, 192) 576 conv2d_144[0][0]
__________________________________________________________________________________________________
batch_normalization_147 (BatchN (None, 7, 7, 192) 576 conv2d_147[0][0]
__________________________________________________________________________________________________
batch_normalization_152 (BatchN (None, 7, 7, 192) 576 conv2d_152[0][0]
__________________________________________________________________________________________________
batch_normalization_153 (BatchN (None, 7, 7, 192) 576 conv2d_153[0][0]
__________________________________________________________________________________________________
activation_144 (Activation) (None, 7, 7, 192) 0 batch_normalization_144[0][0]
__________________________________________________________________________________________________
activation_147 (Activation) (None, 7, 7, 192) 0 batch_normalization_147[0][0]
__________________________________________________________________________________________________
activation_152 (Activation) (None, 7, 7, 192) 0 batch_normalization_152[0][0]
__________________________________________________________________________________________________
activation_153 (Activation) (None, 7, 7, 192) 0 batch_normalization_153[0][0]
__________________________________________________________________________________________________
mixed6 (Concatenate) (None, 7, 7, 768) 0 activation_144[0][0]
activation_147[0][0]
activation_152[0][0]
activation_153[0][0]
__________________________________________________________________________________________________
conv2d_158 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
batch_normalization_158 (BatchN (None, 7, 7, 192) 576 conv2d_158[0][0]
__________________________________________________________________________________________________
activation_158 (Activation) (None, 7, 7, 192) 0 batch_normalization_158[0][0]
__________________________________________________________________________________________________
conv2d_159 (Conv2D) (None, 7, 7, 192) 258048 activation_158[0][0]
__________________________________________________________________________________________________
batch_normalization_159 (BatchN (None, 7, 7, 192) 576 conv2d_159[0][0]
__________________________________________________________________________________________________
activation_159 (Activation) (None, 7, 7, 192) 0 batch_normalization_159[0][0]
__________________________________________________________________________________________________
conv2d_155 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_160 (Conv2D) (None, 7, 7, 192) 258048 activation_159[0][0]
__________________________________________________________________________________________________
batch_normalization_155 (BatchN (None, 7, 7, 192) 576 conv2d_155[0][0]
__________________________________________________________________________________________________
batch_normalization_160 (BatchN (None, 7, 7, 192) 576 conv2d_160[0][0]
__________________________________________________________________________________________________
activation_155 (Activation) (None, 7, 7, 192) 0 batch_normalization_155[0][0]
__________________________________________________________________________________________________
activation_160 (Activation) (None, 7, 7, 192) 0 batch_normalization_160[0][0]
__________________________________________________________________________________________________
conv2d_156 (Conv2D) (None, 7, 7, 192) 258048 activation_155[0][0]
__________________________________________________________________________________________________
conv2d_161 (Conv2D) (None, 7, 7, 192) 258048 activation_160[0][0]
__________________________________________________________________________________________________
batch_normalization_156 (BatchN (None, 7, 7, 192) 576 conv2d_156[0][0]
__________________________________________________________________________________________________
batch_normalization_161 (BatchN (None, 7, 7, 192) 576 conv2d_161[0][0]
__________________________________________________________________________________________________
activation_156 (Activation) (None, 7, 7, 192) 0 batch_normalization_156[0][0]
__________________________________________________________________________________________________
activation_161 (Activation) (None, 7, 7, 192) 0 batch_normalization_161[0][0]
__________________________________________________________________________________________________
average_pooling2d_15 (AveragePo (None, 7, 7, 768) 0 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_154 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_157 (Conv2D) (None, 7, 7, 192) 258048 activation_156[0][0]
__________________________________________________________________________________________________
conv2d_162 (Conv2D) (None, 7, 7, 192) 258048 activation_161[0][0]
__________________________________________________________________________________________________
conv2d_163 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_15[0][0]
__________________________________________________________________________________________________
batch_normalization_154 (BatchN (None, 7, 7, 192) 576 conv2d_154[0][0]
__________________________________________________________________________________________________
batch_normalization_157 (BatchN (None, 7, 7, 192) 576 conv2d_157[0][0]
__________________________________________________________________________________________________
batch_normalization_162 (BatchN (None, 7, 7, 192) 576 conv2d_162[0][0]
__________________________________________________________________________________________________
batch_normalization_163 (BatchN (None, 7, 7, 192) 576 conv2d_163[0][0]
__________________________________________________________________________________________________
activation_154 (Activation) (None, 7, 7, 192) 0 batch_normalization_154[0][0]
__________________________________________________________________________________________________
activation_157 (Activation) (None, 7, 7, 192) 0 batch_normalization_157[0][0]
__________________________________________________________________________________________________
activation_162 (Activation) (None, 7, 7, 192) 0 batch_normalization_162[0][0]
__________________________________________________________________________________________________
activation_163 (Activation) (None, 7, 7, 192) 0 batch_normalization_163[0][0]
__________________________________________________________________________________________________
mixed7 (Concatenate) (None, 7, 7, 768) 0 activation_154[0][0]
activation_157[0][0]
activation_162[0][0]
activation_163[0][0]
__________________________________________________________________________________________________
conv2d_166 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
batch_normalization_166 (BatchN (None, 7, 7, 192) 576 conv2d_166[0][0]
__________________________________________________________________________________________________
activation_166 (Activation) (None, 7, 7, 192) 0 batch_normalization_166[0][0]
__________________________________________________________________________________________________
conv2d_167 (Conv2D) (None, 7, 7, 192) 258048 activation_166[0][0]
__________________________________________________________________________________________________
batch_normalization_167 (BatchN (None, 7, 7, 192) 576 conv2d_167[0][0]
__________________________________________________________________________________________________
activation_167 (Activation) (None, 7, 7, 192) 0 batch_normalization_167[0][0]
__________________________________________________________________________________________________
conv2d_164 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
conv2d_168 (Conv2D) (None, 7, 7, 192) 258048 activation_167[0][0]
__________________________________________________________________________________________________
batch_normalization_164 (BatchN (None, 7, 7, 192) 576 conv2d_164[0][0]
__________________________________________________________________________________________________
batch_normalization_168 (BatchN (None, 7, 7, 192) 576 conv2d_168[0][0]
__________________________________________________________________________________________________
activation_164 (Activation) (None, 7, 7, 192) 0 batch_normalization_164[0][0]
__________________________________________________________________________________________________
activation_168 (Activation) (None, 7, 7, 192) 0 batch_normalization_168[0][0]
__________________________________________________________________________________________________
conv2d_165 (Conv2D) (None, 3, 3, 320) 552960 activation_164[0][0]
__________________________________________________________________________________________________
conv2d_169 (Conv2D) (None, 3, 3, 192) 331776 activation_168[0][0]
__________________________________________________________________________________________________
batch_normalization_165 (BatchN (None, 3, 3, 320) 960 conv2d_165[0][0]
__________________________________________________________________________________________________
batch_normalization_169 (BatchN (None, 3, 3, 192) 576 conv2d_169[0][0]
__________________________________________________________________________________________________
activation_165 (Activation) (None, 3, 3, 320) 0 batch_normalization_165[0][0]
__________________________________________________________________________________________________
activation_169 (Activation) (None, 3, 3, 192) 0 batch_normalization_169[0][0]
__________________________________________________________________________________________________
max_pooling2d_7 (MaxPooling2D) (None, 3, 3, 768) 0 mixed7[0][0]
__________________________________________________________________________________________________
mixed8 (Concatenate) (None, 3, 3, 1280) 0 activation_165[0][0]
activation_169[0][0]
max_pooling2d_7[0][0]
__________________________________________________________________________________________________
conv2d_174 (Conv2D) (None, 3, 3, 448) 573440 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_174 (BatchN (None, 3, 3, 448) 1344 conv2d_174[0][0]
__________________________________________________________________________________________________
activation_174 (Activation) (None, 3, 3, 448) 0 batch_normalization_174[0][0]
__________________________________________________________________________________________________
conv2d_171 (Conv2D) (None, 3, 3, 384) 491520 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_175 (Conv2D) (None, 3, 3, 384) 1548288 activation_174[0][0]
__________________________________________________________________________________________________
batch_normalization_171 (BatchN (None, 3, 3, 384) 1152 conv2d_171[0][0]
__________________________________________________________________________________________________
batch_normalization_175 (BatchN (None, 3, 3, 384) 1152 conv2d_175[0][0]
__________________________________________________________________________________________________
activation_171 (Activation) (None, 3, 3, 384) 0 batch_normalization_171[0][0]
__________________________________________________________________________________________________
activation_175 (Activation) (None, 3, 3, 384) 0 batch_normalization_175[0][0]
__________________________________________________________________________________________________
conv2d_172 (Conv2D) (None, 3, 3, 384) 442368 activation_171[0][0]
__________________________________________________________________________________________________
conv2d_173 (Conv2D) (None, 3, 3, 384) 442368 activation_171[0][0]
__________________________________________________________________________________________________
conv2d_176 (Conv2D) (None, 3, 3, 384) 442368 activation_175[0][0]
__________________________________________________________________________________________________
conv2d_177 (Conv2D) (None, 3, 3, 384) 442368 activation_175[0][0]
__________________________________________________________________________________________________
average_pooling2d_16 (AveragePo (None, 3, 3, 1280) 0 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_170 (Conv2D) (None, 3, 3, 320) 409600 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_172 (BatchN (None, 3, 3, 384) 1152 conv2d_172[0][0]
__________________________________________________________________________________________________
batch_normalization_173 (BatchN (None, 3, 3, 384) 1152 conv2d_173[0][0]
__________________________________________________________________________________________________
batch_normalization_176 (BatchN (None, 3, 3, 384) 1152 conv2d_176[0][0]
__________________________________________________________________________________________________
batch_normalization_177 (BatchN (None, 3, 3, 384) 1152 conv2d_177[0][0]
__________________________________________________________________________________________________
conv2d_178 (Conv2D) (None, 3, 3, 192) 245760 average_pooling2d_16[0][0]
__________________________________________________________________________________________________
batch_normalization_170 (BatchN (None, 3, 3, 320) 960 conv2d_170[0][0]
__________________________________________________________________________________________________
activation_172 (Activation) (None, 3, 3, 384) 0 batch_normalization_172[0][0]
__________________________________________________________________________________________________
activation_173 (Activation) (None, 3, 3, 384) 0 batch_normalization_173[0][0]
__________________________________________________________________________________________________
activation_176 (Activation) (None, 3, 3, 384) 0 batch_normalization_176[0][0]
__________________________________________________________________________________________________
activation_177 (Activation) (None, 3, 3, 384) 0 batch_normalization_177[0][0]
__________________________________________________________________________________________________
batch_normalization_178 (BatchN (None, 3, 3, 192) 576 conv2d_178[0][0]
__________________________________________________________________________________________________
activation_170 (Activation) (None, 3, 3, 320) 0 batch_normalization_170[0][0]
__________________________________________________________________________________________________
mixed9_0 (Concatenate) (None, 3, 3, 768) 0 activation_172[0][0]
activation_173[0][0]
__________________________________________________________________________________________________
concatenate_2 (Concatenate) (None, 3, 3, 768) 0 activation_176[0][0]
activation_177[0][0]
__________________________________________________________________________________________________
activation_178 (Activation) (None, 3, 3, 192) 0 batch_normalization_178[0][0]
__________________________________________________________________________________________________
mixed9 (Concatenate) (None, 3, 3, 2048) 0 activation_170[0][0]
mixed9_0[0][0]
concatenate_2[0][0]
activation_178[0][0]
__________________________________________________________________________________________________
conv2d_183 (Conv2D) (None, 3, 3, 448) 917504 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_183 (BatchN (None, 3, 3, 448) 1344 conv2d_183[0][0]
__________________________________________________________________________________________________
activation_183 (Activation) (None, 3, 3, 448) 0 batch_normalization_183[0][0]
__________________________________________________________________________________________________
conv2d_180 (Conv2D) (None, 3, 3, 384) 786432 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_184 (Conv2D) (None, 3, 3, 384) 1548288 activation_183[0][0]
__________________________________________________________________________________________________
batch_normalization_180 (BatchN (None, 3, 3, 384) 1152 conv2d_180[0][0]
__________________________________________________________________________________________________
batch_normalization_184 (BatchN (None, 3, 3, 384) 1152 conv2d_184[0][0]
__________________________________________________________________________________________________
activation_180 (Activation) (None, 3, 3, 384) 0 batch_normalization_180[0][0]
__________________________________________________________________________________________________
activation_184 (Activation) (None, 3, 3, 384) 0 batch_normalization_184[0][0]
__________________________________________________________________________________________________
conv2d_181 (Conv2D) (None, 3, 3, 384) 442368 activation_180[0][0]
__________________________________________________________________________________________________
conv2d_182 (Conv2D) (None, 3, 3, 384) 442368 activation_180[0][0]
__________________________________________________________________________________________________
conv2d_185 (Conv2D) (None, 3, 3, 384) 442368 activation_184[0][0]
__________________________________________________________________________________________________
conv2d_186 (Conv2D) (None, 3, 3, 384) 442368 activation_184[0][0]
__________________________________________________________________________________________________
average_pooling2d_17 (AveragePo (None, 3, 3, 2048) 0 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_179 (Conv2D) (None, 3, 3, 320) 655360 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_181 (BatchN (None, 3, 3, 384) 1152 conv2d_181[0][0]
__________________________________________________________________________________________________
batch_normalization_182 (BatchN (None, 3, 3, 384) 1152 conv2d_182[0][0]
__________________________________________________________________________________________________
batch_normalization_185 (BatchN (None, 3, 3, 384) 1152 conv2d_185[0][0]
__________________________________________________________________________________________________
batch_normalization_186 (BatchN (None, 3, 3, 384) 1152 conv2d_186[0][0]
__________________________________________________________________________________________________
conv2d_187 (Conv2D) (None, 3, 3, 192) 393216 average_pooling2d_17[0][0]
__________________________________________________________________________________________________
batch_normalization_179 (BatchN (None, 3, 3, 320) 960 conv2d_179[0][0]
__________________________________________________________________________________________________
activation_181 (Activation) (None, 3, 3, 384) 0 batch_normalization_181[0][0]
__________________________________________________________________________________________________
activation_182 (Activation) (None, 3, 3, 384) 0 batch_normalization_182[0][0]
__________________________________________________________________________________________________
activation_185 (Activation) (None, 3, 3, 384) 0 batch_normalization_185[0][0]
__________________________________________________________________________________________________
activation_186 (Activation) (None, 3, 3, 384) 0 batch_normalization_186[0][0]
__________________________________________________________________________________________________
batch_normalization_187 (BatchN (None, 3, 3, 192) 576 conv2d_187[0][0]
__________________________________________________________________________________________________
activation_179 (Activation) (None, 3, 3, 320) 0 batch_normalization_179[0][0]
__________________________________________________________________________________________________
mixed9_1 (Concatenate) (None, 3, 3, 768) 0 activation_181[0][0]
activation_182[0][0]
__________________________________________________________________________________________________
concatenate_3 (Concatenate) (None, 3, 3, 768) 0 activation_185[0][0]
activation_186[0][0]
__________________________________________________________________________________________________
activation_187 (Activation) (None, 3, 3, 192) 0 batch_normalization_187[0][0]
__________________________________________________________________________________________________
mixed10 (Concatenate) (None, 3, 3, 2048) 0 activation_179[0][0]
mixed9_1[0][0]
concatenate_3[0][0]
activation_187[0][0]
__________________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 2048) 0 mixed10[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 1024) 2098176 global_average_pooling2d_1[0][0]
__________________________________________________________________________________________________
dropout_1 (Dropout) (None, 1024) 0 dense_2[0][0]
__________________________________________________________________________________________________
dense_3 (Dense) (None, 1) 1025 dropout_1[0][0]
==================================================================================================
Total params: 23,901,985
Trainable params: 2,099,201
Non-trainable params: 21,802,784
__________________________________________________________________________________________________
None
Compile the Model
model.compile(optimizer = RMSprop(lr=0.0001),
loss = 'binary_crossentropy',
metrics = ['acc'])
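Note: lr is the older spelling of the learning-rate argument; in newer releases of tensorflow.keras the same call would look something like this (just a sketch, not what was run here).
model.compile(optimizer=RMSprop(learning_rate=0.0001),
              loss="binary_crossentropy",
              metrics=["acc"])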
Train the Model
A Model Saver
checkpoint = tensorflow.keras.callbacks.ModelCheckpoint(
str(MODELS/"inception_transfer.hdf5"), monitor="val_acc", verbose=1,
save_best_only=True)
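Since the checkpoint writes the best weights to disk, the model can be loaded back in a later session with load_model. This is only a sketch (it assumes the training run below actually wrote the file).
# Re-load the best checkpoint from disk (assumes the callback above saved it).
from tensorflow.keras.models import load_model
best_model = load_model(str(MODELS/"inception_transfer.hdf5"))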
A Data Generator
This class bundles up the steps to build the training and validation data generators; a small usage sketch follows the class definition.
class Data:
    """creates the data generators

    Args:
     path: path to the images
     validation_split: fraction that goes to the validation set
     batch_size: size for the batches in the epochs
    """
    def __init__(self, path: str, validation_split: float=0.2,
                 batch_size: int=20) -> None:
        self.path = path
        self.validation_split = validation_split
        self.batch_size = batch_size
        self._data_generator = None
        self._testing_data_generator = None
        self._training_generator = None
        self._validation_generator = None
        return

    @property
    def data_generator(self) -> ImageDataGenerator:
        """The data generator for training and validation"""
        if self._data_generator is None:
            self._data_generator = ImageDataGenerator(
                rescale=1/255,
                rotation_range=40,
                width_shift_range=0.2,
                height_shift_range=0.2,
                horizontal_flip=True,
                shear_range=0.2,
                zoom_range=0.2,
                fill_mode="nearest",
                validation_split=self.validation_split)
        return self._data_generator

    @property
    def training_generator(self):
        """The training data generator"""
        if self._training_generator is None:
            self._training_generator = self.data_generator.flow_from_directory(
                self.path,
                batch_size=self.batch_size,
                class_mode="binary",
                target_size=(150, 150),
                subset="training",
            )
        return self._training_generator

    @property
    def validation_generator(self):
        """The validation data generator"""
        if self._validation_generator is None:
            self._validation_generator = self.data_generator.flow_from_directory(
                self.path,
                batch_size=self.batch_size,
                class_mode="binary",
                target_size=(150, 150),
                subset="validation",
            )
        return self._validation_generator

    def __str__(self) -> str:
        return (f"(Data) - Path: {self.path}, "
                f"Validation Split: {self.validation_split}, "
                f"Batch Size: {self.batch_size}")
A Model Builder
class Network:
    """The model to categorize the images

    Args:
     path: path to the training data
     epochs: number of epochs to train
     batch_size: size of the batches for each epoch
     convolution_layers: layers of cnn/max-pooling
     callbacks: things to stop the training
     set_steps: whether to set the training steps-per-epoch
    """
    def __init__(self, path: str, epochs: int=15,
                 batch_size: int=128, convolution_layers: int=3,
                 set_steps: bool=True,
                 callbacks: list=None) -> None:
        self.path = path
        self.epochs = epochs
        self.batch_size = batch_size
        self.convolution_layers = convolution_layers
        self.set_steps = set_steps
        self.callbacks = callbacks
        self._data = None
        self._model = None
        self.history = None
        return

    @property
    def data(self) -> Data:
        """The data generator builder"""
        if self._data is None:
            self._data = Data(self.path, batch_size=self.batch_size)
        return self._data

    @property
    def model(self) -> tensorflow.keras.models.Sequential:
        """The neural network"""
        if self._model is None:
            self._model = tensorflow.keras.models.Sequential([
                tensorflow.keras.layers.Conv2D(
                    32, (3, 3), activation='relu',
                    input_shape=(150, 150, 3)),
                tensorflow.keras.layers.MaxPooling2D(2, 2)])
            self._model.add(
                tensorflow.keras.layers.Conv2D(
                    64, (3, 3), activation='relu'))
            self._model.add(
                tensorflow.keras.layers.MaxPooling2D(2, 2))
            for _ in range(self.convolution_layers - 2):
                self._model.add(tensorflow.keras.layers.Conv2D(
                    128, (3, 3), activation='relu'))
                self._model.add(tensorflow.keras.layers.MaxPooling2D(2, 2))
            for layer in [
                    tensorflow.keras.layers.Flatten(),
                    tensorflow.keras.layers.Dense(512, activation='relu'),
                    tensorflow.keras.layers.Dense(1, activation='sigmoid')]:
                self._model.add(layer)
            self._model.compile(optimizer=RMSprop(lr=0.001),
                                loss='binary_crossentropy',
                                metrics=['acc'])
        return self._model

    def summary(self) -> None:
        """Prints the model summary"""
        print(self.model.summary())
        return

    def train(self) -> None:
        """Trains the model"""
        callbacks = self.callbacks if self.callbacks else []
        arguments = dict(
            generator=self.data.training_generator,
            validation_data=self.data.validation_generator,
            epochs=self.epochs,
            callbacks=callbacks,
            verbose=2,
        )
        if self.set_steps:
            arguments["steps_per_epoch"] = int(
                self.data.training_generator.samples/self.batch_size)
            arguments["validation_steps"] = int(
                self.data.validation_generator.samples/self.batch_size)
        self.history = self.model.fit_generator(**arguments)
        return

    def __str__(self) -> str:
        return (f"(Network) -\n Path: {self.path}\n Epochs: {self.epochs}\n"
                f" Batch Size: {self.batch_size}\n Callbacks: {self.callbacks}\n"
                f" Data: {self.data}")
Train It
network = Network(str(training_path),
                  set_steps=True,
                  epochs=10,
                  callbacks=[checkpoint],
                  batch_size=1)
# swap the pre-built transfer-learning model in for the default from-scratch CNN
network._model = model
with TIMER:
    network.train()
2019-08-03 19:28:04,102 graeae.timers.timer start: Started: 2019-08-03 19:28:04.102954
I0803 19:28:04.102986 139918777980736 timer.py:70] Started: 2019-08-03 19:28:04.102954
Found 20000 images belonging to 2 classes.
Found 5000 images belonging to 2 classes.
Epoch 1/10
Epoch 00001: val_acc improved from -inf to 0.43660, saving model to /home/athena/models/dogs-vs-cats/inception_transfer.hdf5
20000/20000 - 615s - loss: 0.7032 - acc: 0.4977 - val_loss: 0.8069 - val_acc: 0.4366
Epoch 2/10
Epoch 00002: val_acc improved from 0.43660 to 0.43780, saving model to /home/athena/models/dogs-vs-cats/inception_transfer.hdf5
20000/20000 - 631s - loss: 0.6933 - acc: 0.5049 - val_loss: 0.7958 - val_acc: 0.4378
Epoch 3/10
Epoch 00003: val_acc did not improve from 0.43780
20000/20000 - 670s - loss: 0.6932 - acc: 0.4990 - val_loss: 0.8142 - val_acc: 0.4230
Epoch 4/10
Epoch 00004: val_acc improved from 0.43780 to 0.45020, saving model to /home/athena/models/dogs-vs-cats/inception_transfer.hdf5
20000/20000 - 666s - loss: 0.6932 - acc: 0.4990 - val_loss: 0.7856 - val_acc: 0.4502
Epoch 5/10
Epoch 00005: val_acc did not improve from 0.45020
20000/20000 - 636s - loss: 0.6932 - acc: 0.4983 - val_loss: 0.7982 - val_acc: 0.4312
Epoch 6/10
Epoch 00006: val_acc did not improve from 0.45020
20000/20000 - 618s - loss: 0.6932 - acc: 0.4999 - val_loss: 0.8018 - val_acc: 0.4326
Epoch 7/10
Epoch 00007: val_acc did not improve from 0.45020
20000/20000 - 614s - loss: 0.6932 - acc: 0.4999 - val_loss: 0.7870 - val_acc: 0.4484
Epoch 8/10
Epoch 00008: val_acc improved from 0.45020 to 0.45660, saving model to /home/athena/models/dogs-vs-cats/inception_transfer.hdf5
20000/20000 - 607s - loss: 0.6932 - acc: 0.4981 - val_loss: 0.7773 - val_acc: 0.4566
Epoch 9/10
Epoch 00009: val_acc did not improve from 0.45660
20000/20000 - 608s - loss: 0.6932 - acc: 0.4891 - val_loss: 0.7811 - val_acc: 0.4414
Epoch 10/10
Epoch 00010: val_acc did not improve from 0.45660
20000/20000 - 619s - loss: 0.6932 - acc: 0.5010 - val_loss: 0.7878 - val_acc: 0.4474
2019-08-03 21:12:49,142 graeae.timers.timer end: Ended: 2019-08-03 21:12:49.142478
I0803 21:12:49.142507 139918777980736 timer.py:77] Ended: 2019-08-03 21:12:49.142478
2019-08-03 21:12:49,143 graeae.timers.timer end: Elapsed: 1:44:45.039524
I0803 21:12:49.143225 139918777980736 timer.py:78] Elapsed: 1:44:45.039524
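The per-epoch metrics are kept in network.history.history once the run finishes, so they can be pulled into a table for a closer look. A minimal sketch (pandas is an extra dependency here, used only for display):

import pandas

# history.history maps each metric name ("loss", "acc", "val_loss", "val_acc")
# to a list with one entry per epoch
frame = pandas.DataFrame(network.history.history)
print(frame[["acc", "val_acc"]].round(3))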