Tanzania Cassava Disease Image Classification¶

0. Table of Contents¶

  1. Introduction

  2. Objective

  3. Methodology

  4. Exploratory Data Analysis

    4.1 Load the Cassava dataset

    4.2 Data description

    4.3 Visualize class distribution

    4.4 Build a training pipeline

    4.5 Plot examples

  5. Developing Classifiers

    5.1 Baseline classifier from scratch

    5.2 Improve baseline classifier

    5.3 Regularization and model tuning

    5.4 Using pretrained models and architectures

    5.5 Ensemble with majority voting

  6. Interpreting what Convnets learn

    6.1 Preprocessing an input for Xception

    6.2 Get the last convolutional output

    6.3 Reapply classifier on last convolutional output

    6.4 Heatmap post-processing

  7. Conclusion

    7.1 Summary of model performance

    7.2 Key takeaways

    7.3 Future considerations

  8. References

1. Introduction¶

The iCassava 2019 Fine-Grained Visual Categorization Challenge is focused on addressing the problem of cassava diseases in Africa through the use of fine-grained visual recognition technology. Cassava is a vital food crop in the continent, but it is constantly threatened by various diseases, such as cassava brown streak disease and cassava mosaic disease. These diseases can lead to significant yield losses, thereby putting the food security of millions of people at risk. The challenge seeks to encourage the development of innovative solutions to accurately identify and classify cassava diseases through machine learning and computer vision techniques. This initiative aims to assist farmers in taking proactive measures in controlling the spread of cassava diseases and safeguarding their crop yields. [1]

By 2021 Smart Data Finance reports that most farmers, livestock keepers, and fishermen continue to use very low technology and productivity is very low even by regional standards. If productivity is still low then so is the level of commercialization. [2]

Given the crucial role that cassava plays in the lives and livelihoods of millions of people in Africa, and especially Tanzania, I saw an opportunity to continue contributing to the current solutions in order to explore improvement possibilities and develop innovative solutions to address these challenges. Therefore, it is the aim in this project to experiment on a variety of state of the art computer vision models and try to improve their overall performance on the iCassava dataset. The scope of this project does not cover the important consideration of distributing these Convnets on edge devices and scaling them to farmers across the continent.

2. Objective¶

The objective of image classification is to assign one or more labels or categories to an input image. In the context of the iCassava 2019 Fine-Grained Visual Categorization Challenge, the objective is to accurately identify and classify different types of cassava diseases from images of cassava plants.

The list of target labels will be one of 6 classes:

  • bacterial blight,
  • brown streak disease,
  • green mite, mosaic disease,
  • healthy,
  • unknown

The main measure of this task's success is to improve the model's Precision-Recall Curve. This metric was chosen because of the class imbalance inherent in the dataset collected from Makerere University's AI Lab. Categorical accuracy is also monitored along the various model iterations.

3. Methodology¶

✝️ - Refers to the customizations added in the various model development iterations.

Screen Shot 2023-03-19 at 4.47.22 AM.png

4. Exploratory Data Analysis¶

In [2]:
import os
import math
import numpy as np
import seaborn as sns
import matplotlib
import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_datasets as tfds
import tensorflow_hub as hub
from tensorflow import keras
from tensorflow.keras import layers
import tensorflow_addons as tfa
from sklearn.metrics import confusion_matrix
import warnings
warnings.filterwarnings('ignore') 
/opt/conda/lib/python3.7/site-packages/tensorflow_addons/utils/ensure_tf_install.py:67: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.9.0 and strictly below 2.12.0 (nightly versions are not supported). 
 The versions of TensorFlow you are currently using is 2.8.4 and is not supported. 
Some things might work, some things might not.
If you were to encounter a bug, do not file an issue.
If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. 
You can find the compatibility matrix in TensorFlow Addon's readme:
https://github.com/tensorflow/addons
  UserWarning,
In [4]:
print(tf.__version__)
2.8.4

Note: Version 2.9.3 is currently required to train the EfficientNet B4

In [5]:
print(tf.config.list_physical_devices('GPU'))
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
2023-03-19 02:42:30.408677: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 02:42:30.624291: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 02:42:30.626334: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero

Set Batch Size

In [3]:
strategy = tf.distribute.get_strategy()
BATCH_SIZE= 32 * strategy.num_replicas_in_sync

4.1 Load the cassava dataset from TensorFlow Datasets (TFDS)¶

In [6]:
(ds_train, ds_validation, ds_test), ds_info = tfds.load('cassava', 
                                         split=['train', 'validation', 'test'],
                                         shuffle_files=True,
                                         as_supervised=True,
                                         with_info=True)
2023-03-19 03:00:21.787546: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:21.914259: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:21.916117: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:21.945246: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-03-19 03:00:21.957168: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:21.959138: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:21.960888: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:28.954046: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:28.956144: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:28.957950: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:936] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2023-03-19 03:00:28.982248: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1525] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13598 MB memory:  -> device: 0, name: Tesla T4, pci bus id: 0000:00:04.0, compute capability: 7.5

4.2 Data description¶

Here we will take a look at the description and citation information provided along with the dataset.

In [7]:
ds_info
Out[7]:
tfds.core.DatasetInfo(
    name='cassava',
    full_name='cassava/0.1.0',
    description="""
    Cassava consists of leaf images for the cassava plant depicting healthy and
    four (4) disease conditions; Cassava Mosaic Disease (CMD), Cassava Bacterial
    Blight (CBB), Cassava Greem Mite (CGM) and Cassava Brown Streak Disease (CBSD).
    Dataset consists of a total of 9430 labelled images.
    The 9430 labelled images are split into a training set (5656), a test set(1885)
    and a validation set (1889). The number of images per class are unbalanced with
    the two disease classes CMD and CBSD having 72% of the images.
    """,
    homepage='https://www.kaggle.com/c/cassava-disease/overview',
    data_path='/home/jupyter/tensorflow_datasets/cassava/0.1.0',
    file_format=tfrecord,
    download_size=1.26 GiB,
    dataset_size=1.26 GiB,
    features=FeaturesDict({
        'image': Image(shape=(None, None, 3), dtype=uint8),
        'image/filename': Text(shape=(), dtype=string),
        'label': ClassLabel(shape=(), dtype=int64, num_classes=5),
    }),
    supervised_keys=('image', 'label'),
    disable_shuffling=False,
    splits={
        'test': <SplitInfo num_examples=1885, num_shards=4>,
        'train': <SplitInfo num_examples=5656, num_shards=8>,
        'validation': <SplitInfo num_examples=1889, num_shards=4>,
    },
    citation="""@misc{mwebaze2019icassava,
        title={iCassava 2019Fine-Grained Visual Categorization Challenge},
        author={Ernest Mwebaze and Timnit Gebru and Andrea Frome and Solomon Nsumba and Jeremy Tusubira},
        year={2019},
        eprint={1908.02900},
        archivePrefix={arXiv},
        primaryClass={cs.CV}
    }""",
)
In [8]:
# List of label categories
ds_info.features['label'].names
Out[8]:
['cbb', 'cbsd', 'cgm', 'cmd', 'healthy']

The .names attribute returns string names for the integer classes. The order in which the names are provided is kept.

In [9]:
# Extend the cassava dataset classes with 'unknown'
class_names = ds_info.features['label'].names + ['unknown']

# Map the class names to human readable names
name_map = dict(
    cmd='Mosaic Disease',
    cbb='Bacterial Blight',
    cgm='Green Mite',
    cbsd='Brown Streak Disease',
    healthy='Healthy',
    unknown='Unknown')

label_map = {
    0:'cbb',
    1:'cbsd',
    2:'cgm',
    3:'cmd',
    4:'healthy',
    5:'unknown'
}

# print(len(class_names), 'classes:')
print(class_names)
print([name_map[name] for name in class_names])
['cbb', 'cbsd', 'cgm', 'cmd', 'healthy', 'unknown']
['Bacterial Blight', 'Brown Streak Disease', 'Green Mite', 'Mosaic Disease', 'Healthy', 'Unknown']
In [11]:
ds_info.splits['train'].num_examples
Out[11]:
5656
In [12]:
 tf.data.experimental.cardinality(ds_train).numpy()
Out[12]:
5656

4.3 Visualize class distribution¶

In [13]:
def get_label_frequency(dataset):
    class_distribution = np.array([record[1] for record in dataset.as_numpy_iterator()])
    labels, frequency = np.unique(class_distribution, return_counts = True)
    return frequency
In [13]:
get_label_frequency(ds_train)
Out[13]:
array([ 466, 1443,  773, 2658,  316])
In [14]:
# most frequent category represents close to 50% of all training samples
get_label_frequency(ds_train)[3]/tf.data.experimental.cardinality(ds_train).numpy()
Out[14]:
0.46994342291371993
In [15]:
# most frequent category represents close to 50% of all validation samples
get_label_frequency(ds_validation)[3]/tf.data.experimental.cardinality(ds_validation).numpy()
Out[15]:
0.4695606140815246
In [14]:
def plot_distribution(frequency):
    fig, ax = plt.subplots()
    bar_colors = ['tab:red', 'tab:blue', 'tab:green', 'tab:orange', 'tab:cyan']
    ax.bar(class_names[:-1], frequency, color=bar_colors)
    ax.set_ylabel('Frequency')
    ax.set_title('Class distribution')
    plt.show()

Plot training set class distribution

In [13]:
plot_distribution(get_label_frequency(ds_train))

Plot validation set class distribution

In [17]:
plot_distribution(get_label_frequency(ds_validation))

Plot test set class distribution

In [18]:
plot_distribution(get_label_frequency(ds_test))

Clearly the datasets seem to be highly imbalanced however the class distributions in validation and test set are representative of the training set.

4.4 Build a training pipeline¶

Our model will process 224 x 224 images but before we feed this data to the model, we are required to transform the data appropriately.

Let's apply the following transformations:

  • tf.data.Dataset.map - TFDS provides images of type tf.uint8, but the model expects tf.float32. Therefore we need to normalize images and resize them

  • tf.one_hot - The chosen metric, categorical accuracy expects one hot encoded labels, we perform this transformation using look up table

  • tf.data.Dataset.cache - For better performance, we cache data before shuffling.

  • tf.data.Dataset.shuffle - Since our data is larage we will use buffer_size=1000 for randomness.

  • tf.data.Dataset.batch - Batch elements of the dataset after shuffling to get unique batches at each epoch.

  • tf.data.Dataset.prefetch - For better performance, we will end the pipeline by prefetching

Prepare lookup table for mapping from integer to one hot encoding labels

In [10]:
indices = list(label_map.keys())
depth=6
one_hot_lookup = tf.one_hot(indices, depth)
In [194]:
one_hot_lookup
Out[194]:
<tf.Tensor: shape=(6, 6), dtype=float32, numpy=
array([[1., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0.],
       [0., 0., 1., 0., 0., 0.],
       [0., 0., 0., 1., 0., 0.],
       [0., 0., 0., 0., 1., 0.],
       [0., 0., 0., 0., 0., 1.]], dtype=float32)>
In [195]:
indices
Out[195]:
[0, 1, 2, 3, 4, 5]
In [165]:
def data_preprocessing(image, label, img_size=(224, 224)):
    
    # Normalize [0, 255] to [0, 1]
    image = tf.cast(image, tf.float32)
    image = image / 255.

    # Resize the images to 224 x 224
    image = tf.image.resize(image, img_size)
    
    # image = tf.image.resize(image, (512,512)) # Trainning efficientnetb4
    
    # Map integer label to one hot encode value
    label = one_hot_lookup[label]

    return image, label
In [17]:
# Training pipeline
ds_train = ds_train.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
ds_train = ds_train.cache()
ds_train = ds_train.shuffle(ds_info.splits['train'].num_examples)
ds_train = ds_train.batch(BATCH_SIZE)
ds_train = ds_train.prefetch(tf.data.AUTOTUNE)
In [18]:
# Validation pipeline
ds_validation = ds_validation.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
# ds_validation = ds_validation.shuffle(ds_info.splits['validation'].num_examples) # no need for shuffle
ds_validation = ds_validation.batch(BATCH_SIZE)
ds_validation = ds_validation.cache() # cache after because batches can be the same between epochs
ds_validation = ds_train.prefetch(tf.data.AUTOTUNE)

4.5 Plot examples¶

In [12]:
def plot(examples, predictions=None):
    
    # Get the images, labels, and optionally predictions
    images = examples[0] # images
    labels = examples[1] # labels
    # batch_size = len(images)
    # batch_size = 25

    if predictions is None:
        predictions = BATCH_SIZE * [None]

    # Configure the layout of the grid
    x = np.ceil(np.sqrt(BATCH_SIZE))
    y = np.ceil(BATCH_SIZE / x)
    fig = plt.figure(figsize=(x * 6, y * 7))

    for i, (image, label, prediction) in enumerate(zip(images, labels, predictions)):
        # Render the image
        ax = fig.add_subplot(int(x), int(y), i+1)
        ax.imshow(image, aspect='auto')
        ax.grid(False)
        ax.set_xticks([])
        ax.set_yticks([])

        # Display the label and optionally prediction
        x_label = 'Label: ' + name_map[label_map[label.argmax()]]
        if prediction is not None:
          x_label = 'Prediction: ' + name_map[label_map[prediction.argmax()]] + '\n' + x_label
          ax.xaxis.label.set_color('green' if label == prediction else 'red')
        ax.set_xlabel(x_label)

    plt.show()

Notes:

  • as_numpy_terator: returns an iterator which converts all elements of the dataset to numpy. We use as_numpy_iterator to inspect the content of our dataset.

  • batch: combines consecuritve elements of this dataset into batches of 25.

In [17]:
examples = next(ds_validation.as_numpy_iterator())

plot(examples)
2023-02-27 18:05:05.570767: I tensorflow/core/kernels/data/shuffle_dataset_op.cc:392] Filling up shuffle buffer (this may take a while): 3284 of 5656
2023-02-27 18:05:13.129758: I tensorflow/core/kernels/data/shuffle_dataset_op.cc:417] Shuffle buffer filled.

5. Develop classifier¶

5.1 Baseline classifier from scratch¶

Let's begin by training a baseline classifier and scale from there.

This convnet takes as input tensors of shape (image_height, image_width, image_channels). In this case we will configure the convnet to process inputs of size (224, 224, 3).

The images are passed through a series of Convolutional and max-pooling layers where the output of these layers is a rank 3 tensor.

The next part of the feeds this output to a flatten layer and finally into a densely connected classifier to perform a 6 way classification

5.1.1 Build model¶
In [59]:
inputs = keras.Input(shape=(224, 224, 3), name='preprocessedimage')
x = layers.Conv2D(filters=32, kernel_size=3, activation='relu')(inputs)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=64, kernel_size=3, activation='relu')(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=128, kernel_size=3, activation='relu')(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=256, kernel_size=3, activation='relu')(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=512, kernel_size=3, activation='relu')(x)
x = layers.Flatten()(x)
outputs = layers.Dense(6, activation='softmax', name='softmax_layer')(x)
model = keras.Model(inputs=inputs, outputs=outputs, name='baseline_classifier')
In [60]:
model.summary()
Model: "baseline_classifier"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d_18 (Conv2D)          (None, 222, 222, 32)      896       
                                                                 
 max_pooling2d_15 (MaxPoolin  (None, 111, 111, 32)     0         
 g2D)                                                            
                                                                 
 conv2d_19 (Conv2D)          (None, 109, 109, 64)      18496     
                                                                 
 max_pooling2d_16 (MaxPoolin  (None, 54, 54, 64)       0         
 g2D)                                                            
                                                                 
 conv2d_20 (Conv2D)          (None, 52, 52, 128)       73856     
                                                                 
 max_pooling2d_17 (MaxPoolin  (None, 26, 26, 128)      0         
 g2D)                                                            
                                                                 
 conv2d_21 (Conv2D)          (None, 24, 24, 256)       295168    
                                                                 
 max_pooling2d_18 (MaxPoolin  (None, 12, 12, 256)      0         
 g2D)                                                            
                                                                 
 conv2d_22 (Conv2D)          (None, 10, 10, 512)       1180160   
                                                                 
 flatten_3 (Flatten)         (None, 51200)             0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 307206    
                                                                 
=================================================================
Total params: 1,875,782
Trainable params: 1,875,782
Non-trainable params: 0
_________________________________________________________________

Plot Classifier DAG

Let's plot the baseline classfier directed acyclic graph (DAG) along with it's shapes.

We notice that the dimensions shrink as you go deeper in the model. On the one hand, the maxpooling operation is used to downsample feature maps. It halves the feature maps using 2x2 windows. On the other hand, the convolution operation extracts patches that encode specific aspects of input image data. It is configured using 3x3 windows and stride=1 (no strides)

In [61]:
keras.utils.plot_model(model, "baseline_classifier.png", show_shapes=True)
Out[61]:

Setup callbacks

Before we launch a training run on this large dataset for multiple epochs, we can mitigate bad outcomes by sending data back and dynamically take action based on the progression.

Here is a list of callbacks that I will use.

  1. Model checkpointing to save the current state of the model at different points during training
  2. Tensorboard to enhance monitoring everything that goes on inside the model during training
  3. Learning rate scheduler (in the following iterations of baseline classifier)
In [21]:
callbacks_list = [
    # keras.callbacks.EarlyStopping(monitor="val_categorical_accuracy", patience=5), # interrupts training when categorical accurcay has stopped improving for 5 epochs
    keras.callbacks.ModelCheckpoint(filepath="models/baseline_classifier_checkpoint.keras", monitor="val_loss", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir="./tensorboard/baseline_classifier") # path where callback writes logs
]

Picking the loss function

It's not possible to directly optimize for ROC AUC on classification tasks. However this metric is important since it can accurately measure success on imbalanced dataset such as the one we are working with.

Therefore for this problem, we will go with a proxy metric of ROC AUC. Focal cross entropy loss is extremely useful for classification when you have highly imbalanced classes. It down weights well-classified examples and focuses on hard examples. The loss value is much higher for a sample which is misclassified by the classifier as compared to the loss value corresponding to a well-classified example.

In [63]:
# sparse_categorical_crossentropy is for integer type categorical labels
model.compile(optimizer='rmsprop',
              loss=tfa.losses.SigmoidFocalCrossEntropy(),
              metrics=[tf.keras.metrics.CategoricalAccuracy()])
5.1.2 Train the initial baseline model¶

In case training is interrupted we can continue from the model checkpoint by setting the continue_training_from_saved_model boolean variable to True

In [64]:
model_name="models/baseline_classifier_checkpoint.keras"
continue_training_from_saved_model=False

if os.path.exists(model_name) and continue_training_from_saved_model:
    model = keras.models.load_model(model_name)
    history = model.fit(ds_train,
          validation_data=ds_validation,
          epochs=150,
          callbacks=callbacks_list
        )
else:
    history = model.fit(ds_train,
          validation_data=ds_validation,
          epochs=150,
          callbacks=callbacks_list
        )
Epoch 1/150
2023-03-01 17:38:37.190678: I tensorflow/compiler/xla/service/service.cc:173] XLA service 0x7f23e402eae0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2023-03-01 17:38:37.190723: I tensorflow/compiler/xla/service/service.cc:181]   StreamExecutor device (0): Tesla T4, Compute Capability 7.5
2023-03-01 17:38:37.244627: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
2023-03-01 17:38:37.784582: I tensorflow/compiler/jit/xla_compilation_cache.cc:477] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
177/177 [==============================] - 23s 78ms/step - loss: 0.2990 - categorical_accuracy: 0.4401 - val_loss: 0.2368 - val_categorical_accuracy: 0.4701
Epoch 2/150
177/177 [==============================] - 14s 77ms/step - loss: 0.2371 - categorical_accuracy: 0.5152 - val_loss: 0.2020 - val_categorical_accuracy: 0.5925
Epoch 3/150
177/177 [==============================] - 13s 72ms/step - loss: 0.2109 - categorical_accuracy: 0.5728 - val_loss: 0.2090 - val_categorical_accuracy: 0.5713
Epoch 4/150
177/177 [==============================] - 13s 73ms/step - loss: 0.1995 - categorical_accuracy: 0.6087 - val_loss: 0.1868 - val_categorical_accuracy: 0.6220
Epoch 5/150
177/177 [==============================] - 13s 73ms/step - loss: 0.1881 - categorical_accuracy: 0.6296 - val_loss: 0.1748 - val_categorical_accuracy: 0.6528
Epoch 6/150
177/177 [==============================] - 13s 72ms/step - loss: 0.1796 - categorical_accuracy: 0.6452 - val_loss: 0.1764 - val_categorical_accuracy: 0.6653
Epoch 7/150
177/177 [==============================] - 13s 75ms/step - loss: 0.1699 - categorical_accuracy: 0.6699 - val_loss: 0.1461 - val_categorical_accuracy: 0.7235
Epoch 8/150
177/177 [==============================] - 13s 73ms/step - loss: 0.1570 - categorical_accuracy: 0.7008 - val_loss: 0.1303 - val_categorical_accuracy: 0.7629
Epoch 9/150
177/177 [==============================] - 13s 73ms/step - loss: 0.1387 - categorical_accuracy: 0.7330 - val_loss: 0.1351 - val_categorical_accuracy: 0.7219
Epoch 10/150
177/177 [==============================] - 13s 73ms/step - loss: 0.1217 - categorical_accuracy: 0.7756 - val_loss: 0.0757 - val_categorical_accuracy: 0.8796
Epoch 11/150
177/177 [==============================] - 13s 76ms/step - loss: 0.0939 - categorical_accuracy: 0.8262 - val_loss: 0.0930 - val_categorical_accuracy: 0.8600
Epoch 12/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0717 - categorical_accuracy: 0.8782 - val_loss: 0.0679 - val_categorical_accuracy: 0.8752
Epoch 13/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0568 - categorical_accuracy: 0.9061 - val_loss: 0.0321 - val_categorical_accuracy: 0.9501
Epoch 14/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0467 - categorical_accuracy: 0.9310 - val_loss: 0.0330 - val_categorical_accuracy: 0.9634
Epoch 15/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0381 - categorical_accuracy: 0.9517 - val_loss: 0.2616 - val_categorical_accuracy: 0.6867
Epoch 16/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0373 - categorical_accuracy: 0.9519 - val_loss: 0.0662 - val_categorical_accuracy: 0.8943
Epoch 17/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0378 - categorical_accuracy: 0.9585 - val_loss: 0.0104 - val_categorical_accuracy: 0.9887
Epoch 18/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0335 - categorical_accuracy: 0.9576 - val_loss: 0.0474 - val_categorical_accuracy: 0.9450
Epoch 19/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0324 - categorical_accuracy: 0.9685 - val_loss: 0.0221 - val_categorical_accuracy: 0.9768
Epoch 20/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0270 - categorical_accuracy: 0.9648 - val_loss: 0.0077 - val_categorical_accuracy: 0.9954
Epoch 21/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0316 - categorical_accuracy: 0.9692 - val_loss: 0.0168 - val_categorical_accuracy: 0.9804
Epoch 22/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0245 - categorical_accuracy: 0.9754 - val_loss: 0.0120 - val_categorical_accuracy: 0.9878
Epoch 23/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0325 - categorical_accuracy: 0.9703 - val_loss: 0.0198 - val_categorical_accuracy: 0.9841
Epoch 24/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0401 - categorical_accuracy: 0.9646 - val_loss: 0.0105 - val_categorical_accuracy: 0.9933
Epoch 25/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0294 - categorical_accuracy: 0.9744 - val_loss: 0.0208 - val_categorical_accuracy: 0.9786
Epoch 26/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0389 - categorical_accuracy: 0.9643 - val_loss: 0.0239 - val_categorical_accuracy: 0.9735
Epoch 27/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0287 - categorical_accuracy: 0.9721 - val_loss: 0.0092 - val_categorical_accuracy: 0.9929
Epoch 28/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0343 - categorical_accuracy: 0.9698 - val_loss: 0.0297 - val_categorical_accuracy: 0.9646
Epoch 29/150
177/177 [==============================] - 14s 78ms/step - loss: 0.0287 - categorical_accuracy: 0.9710 - val_loss: 0.0124 - val_categorical_accuracy: 0.9945
Epoch 30/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0256 - categorical_accuracy: 0.9747 - val_loss: 0.2413 - val_categorical_accuracy: 0.8653
Epoch 31/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0292 - categorical_accuracy: 0.9768 - val_loss: 0.0114 - val_categorical_accuracy: 0.9906
Epoch 32/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0230 - categorical_accuracy: 0.9761 - val_loss: 0.0185 - val_categorical_accuracy: 0.9860
Epoch 33/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0361 - categorical_accuracy: 0.9701 - val_loss: 0.0185 - val_categorical_accuracy: 0.9807
Epoch 34/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0267 - categorical_accuracy: 0.9791 - val_loss: 0.0499 - val_categorical_accuracy: 0.9249
Epoch 35/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0284 - categorical_accuracy: 0.9715 - val_loss: 0.0147 - val_categorical_accuracy: 0.9894
Epoch 36/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0266 - categorical_accuracy: 0.9790 - val_loss: 0.0136 - val_categorical_accuracy: 0.9894
Epoch 37/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0277 - categorical_accuracy: 0.9751 - val_loss: 0.0139 - val_categorical_accuracy: 0.9894
Epoch 38/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0330 - categorical_accuracy: 0.9682 - val_loss: 0.0181 - val_categorical_accuracy: 0.9813
Epoch 39/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0294 - categorical_accuracy: 0.9735 - val_loss: 0.0213 - val_categorical_accuracy: 0.9807
Epoch 40/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0310 - categorical_accuracy: 0.9737 - val_loss: 0.0121 - val_categorical_accuracy: 0.9885
Epoch 41/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0361 - categorical_accuracy: 0.9724 - val_loss: 0.0189 - val_categorical_accuracy: 0.9832
Epoch 42/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0279 - categorical_accuracy: 0.9751 - val_loss: 0.0135 - val_categorical_accuracy: 0.9896
Epoch 43/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0248 - categorical_accuracy: 0.9793 - val_loss: 0.0258 - val_categorical_accuracy: 0.9728
Epoch 44/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0276 - categorical_accuracy: 0.9765 - val_loss: 0.0351 - val_categorical_accuracy: 0.9517
Epoch 45/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0255 - categorical_accuracy: 0.9806 - val_loss: 0.0131 - val_categorical_accuracy: 0.9935
Epoch 46/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0283 - categorical_accuracy: 0.9784 - val_loss: 0.0145 - val_categorical_accuracy: 0.9864
Epoch 47/150
177/177 [==============================] - 14s 77ms/step - loss: 0.0251 - categorical_accuracy: 0.9818 - val_loss: 0.0447 - val_categorical_accuracy: 0.9721
Epoch 48/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0289 - categorical_accuracy: 0.9790 - val_loss: 0.0102 - val_categorical_accuracy: 0.9956
Epoch 49/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0306 - categorical_accuracy: 0.9765 - val_loss: 0.0148 - val_categorical_accuracy: 0.9882
Epoch 50/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0263 - categorical_accuracy: 0.9795 - val_loss: 0.0111 - val_categorical_accuracy: 0.9933
Epoch 51/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0309 - categorical_accuracy: 0.9754 - val_loss: 0.0125 - val_categorical_accuracy: 0.9943
Epoch 52/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0306 - categorical_accuracy: 0.9770 - val_loss: 0.0193 - val_categorical_accuracy: 0.9811
Epoch 53/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0226 - categorical_accuracy: 0.9820 - val_loss: 0.0391 - val_categorical_accuracy: 0.9747
Epoch 54/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0279 - categorical_accuracy: 0.9839 - val_loss: 0.0219 - val_categorical_accuracy: 0.9846
Epoch 55/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0258 - categorical_accuracy: 0.9818 - val_loss: 0.0083 - val_categorical_accuracy: 0.9926
Epoch 56/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0217 - categorical_accuracy: 0.9804 - val_loss: 0.0195 - val_categorical_accuracy: 0.9802
Epoch 57/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0327 - categorical_accuracy: 0.9793 - val_loss: 0.0080 - val_categorical_accuracy: 0.9945
Epoch 58/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0330 - categorical_accuracy: 0.9791 - val_loss: 0.0182 - val_categorical_accuracy: 0.9890
Epoch 59/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0274 - categorical_accuracy: 0.9821 - val_loss: 0.0216 - val_categorical_accuracy: 0.9843
Epoch 60/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0252 - categorical_accuracy: 0.9806 - val_loss: 0.0254 - val_categorical_accuracy: 0.9839
Epoch 61/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0375 - categorical_accuracy: 0.9791 - val_loss: 0.0088 - val_categorical_accuracy: 0.9913
Epoch 62/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0239 - categorical_accuracy: 0.9797 - val_loss: 0.0166 - val_categorical_accuracy: 0.9910
Epoch 63/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0354 - categorical_accuracy: 0.9761 - val_loss: 0.0118 - val_categorical_accuracy: 0.9928
Epoch 64/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0336 - categorical_accuracy: 0.9781 - val_loss: 0.0093 - val_categorical_accuracy: 0.9929
Epoch 65/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0321 - categorical_accuracy: 0.9806 - val_loss: 0.0226 - val_categorical_accuracy: 0.9862
Epoch 66/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0301 - categorical_accuracy: 0.9839 - val_loss: 0.0128 - val_categorical_accuracy: 0.9929
Epoch 67/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0253 - categorical_accuracy: 0.9820 - val_loss: 0.0267 - val_categorical_accuracy: 0.9871
Epoch 68/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0327 - categorical_accuracy: 0.9823 - val_loss: 0.1401 - val_categorical_accuracy: 0.9151
Epoch 69/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0359 - categorical_accuracy: 0.9802 - val_loss: 0.0237 - val_categorical_accuracy: 0.9848
Epoch 70/150
177/177 [==============================] - 14s 77ms/step - loss: 0.0288 - categorical_accuracy: 0.9813 - val_loss: 0.0185 - val_categorical_accuracy: 0.9844
Epoch 71/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0319 - categorical_accuracy: 0.9790 - val_loss: 0.0152 - val_categorical_accuracy: 0.9889
Epoch 72/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0412 - categorical_accuracy: 0.9774 - val_loss: 0.0190 - val_categorical_accuracy: 0.9781
Epoch 73/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0247 - categorical_accuracy: 0.9814 - val_loss: 0.0174 - val_categorical_accuracy: 0.9933
Epoch 74/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0325 - categorical_accuracy: 0.9790 - val_loss: 0.0534 - val_categorical_accuracy: 0.9623
Epoch 75/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0269 - categorical_accuracy: 0.9855 - val_loss: 0.0113 - val_categorical_accuracy: 0.9924
Epoch 76/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0351 - categorical_accuracy: 0.9821 - val_loss: 0.0404 - val_categorical_accuracy: 0.9699
Epoch 77/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0281 - categorical_accuracy: 0.9806 - val_loss: 0.0177 - val_categorical_accuracy: 0.9908
Epoch 78/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0496 - categorical_accuracy: 0.9788 - val_loss: 0.0122 - val_categorical_accuracy: 0.9920
Epoch 79/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0305 - categorical_accuracy: 0.9790 - val_loss: 0.0252 - val_categorical_accuracy: 0.9882
Epoch 80/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0289 - categorical_accuracy: 0.9846 - val_loss: 0.0200 - val_categorical_accuracy: 0.9883
Epoch 81/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0421 - categorical_accuracy: 0.9791 - val_loss: 0.0247 - val_categorical_accuracy: 0.9897
Epoch 82/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0409 - categorical_accuracy: 0.9770 - val_loss: 0.0127 - val_categorical_accuracy: 0.9901
Epoch 83/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0335 - categorical_accuracy: 0.9802 - val_loss: 0.0439 - val_categorical_accuracy: 0.9761
Epoch 84/150
177/177 [==============================] - 14s 78ms/step - loss: 0.0373 - categorical_accuracy: 0.9818 - val_loss: 0.0154 - val_categorical_accuracy: 0.9910
Epoch 85/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0363 - categorical_accuracy: 0.9807 - val_loss: 0.0333 - val_categorical_accuracy: 0.9843
Epoch 86/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0376 - categorical_accuracy: 0.9821 - val_loss: 0.0272 - val_categorical_accuracy: 0.9892
Epoch 87/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0255 - categorical_accuracy: 0.9860 - val_loss: 0.0184 - val_categorical_accuracy: 0.9935
Epoch 88/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0369 - categorical_accuracy: 0.9827 - val_loss: 0.0305 - val_categorical_accuracy: 0.9913
Epoch 89/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0347 - categorical_accuracy: 0.9811 - val_loss: 0.1911 - val_categorical_accuracy: 0.9194
Epoch 90/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0326 - categorical_accuracy: 0.9823 - val_loss: 0.1026 - val_categorical_accuracy: 0.9263
Epoch 91/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0406 - categorical_accuracy: 0.9816 - val_loss: 0.0315 - val_categorical_accuracy: 0.9848
Epoch 92/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0430 - categorical_accuracy: 0.9791 - val_loss: 0.0261 - val_categorical_accuracy: 0.9857
Epoch 93/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0287 - categorical_accuracy: 0.9816 - val_loss: 0.0241 - val_categorical_accuracy: 0.9853
Epoch 94/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0303 - categorical_accuracy: 0.9834 - val_loss: 0.0186 - val_categorical_accuracy: 0.9942
Epoch 95/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0431 - categorical_accuracy: 0.9809 - val_loss: 0.0341 - val_categorical_accuracy: 0.9653
Epoch 96/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0386 - categorical_accuracy: 0.9834 - val_loss: 0.0280 - val_categorical_accuracy: 0.9878
Epoch 97/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0408 - categorical_accuracy: 0.9798 - val_loss: 0.0175 - val_categorical_accuracy: 0.9922
Epoch 98/150
177/177 [==============================] - 14s 77ms/step - loss: 0.0381 - categorical_accuracy: 0.9823 - val_loss: 0.0211 - val_categorical_accuracy: 0.9933
Epoch 99/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0272 - categorical_accuracy: 0.9843 - val_loss: 0.0234 - val_categorical_accuracy: 0.9878
Epoch 100/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0455 - categorical_accuracy: 0.9800 - val_loss: 0.0315 - val_categorical_accuracy: 0.9837
Epoch 101/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0362 - categorical_accuracy: 0.9843 - val_loss: 0.0190 - val_categorical_accuracy: 0.9940
Epoch 102/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0503 - categorical_accuracy: 0.9848 - val_loss: 0.0253 - val_categorical_accuracy: 0.9954
Epoch 103/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0417 - categorical_accuracy: 0.9851 - val_loss: 0.0174 - val_categorical_accuracy: 0.9928
Epoch 104/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0494 - categorical_accuracy: 0.9834 - val_loss: 0.0454 - val_categorical_accuracy: 0.9850
Epoch 105/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0563 - categorical_accuracy: 0.9818 - val_loss: 0.0375 - val_categorical_accuracy: 0.9894
Epoch 106/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0502 - categorical_accuracy: 0.9832 - val_loss: 0.0188 - val_categorical_accuracy: 0.9922
Epoch 107/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0509 - categorical_accuracy: 0.9841 - val_loss: 0.0338 - val_categorical_accuracy: 0.9903
Epoch 108/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0434 - categorical_accuracy: 0.9818 - val_loss: 0.0349 - val_categorical_accuracy: 0.9901
Epoch 109/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0379 - categorical_accuracy: 0.9836 - val_loss: 0.0300 - val_categorical_accuracy: 0.9910
Epoch 110/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0616 - categorical_accuracy: 0.9848 - val_loss: 0.0627 - val_categorical_accuracy: 0.9643
Epoch 111/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0433 - categorical_accuracy: 0.9818 - val_loss: 0.0259 - val_categorical_accuracy: 0.9917
Epoch 112/150
177/177 [==============================] - 13s 76ms/step - loss: 0.0363 - categorical_accuracy: 0.9885 - val_loss: 0.0257 - val_categorical_accuracy: 0.9928
Epoch 113/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0485 - categorical_accuracy: 0.9774 - val_loss: 0.0123 - val_categorical_accuracy: 0.9885
Epoch 114/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0502 - categorical_accuracy: 0.9827 - val_loss: 0.0585 - val_categorical_accuracy: 0.9783
Epoch 115/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0494 - categorical_accuracy: 0.9784 - val_loss: 0.0844 - val_categorical_accuracy: 0.9493
Epoch 116/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0388 - categorical_accuracy: 0.9816 - val_loss: 0.0635 - val_categorical_accuracy: 0.9650
Epoch 117/150
177/177 [==============================] - 13s 75ms/step - loss: 0.0356 - categorical_accuracy: 0.9821 - val_loss: 0.0237 - val_categorical_accuracy: 0.9890
Epoch 118/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0431 - categorical_accuracy: 0.9830 - val_loss: 0.0300 - val_categorical_accuracy: 0.9802
Epoch 119/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0302 - categorical_accuracy: 0.9857 - val_loss: 0.0606 - val_categorical_accuracy: 0.9721
Epoch 120/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0363 - categorical_accuracy: 0.9839 - val_loss: 0.0422 - val_categorical_accuracy: 0.9836
Epoch 121/150
177/177 [==============================] - 14s 77ms/step - loss: 0.0394 - categorical_accuracy: 0.9844 - val_loss: 0.0368 - val_categorical_accuracy: 0.9806
Epoch 122/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0341 - categorical_accuracy: 0.9873 - val_loss: 0.0391 - val_categorical_accuracy: 0.9882
Epoch 123/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0510 - categorical_accuracy: 0.9818 - val_loss: 0.0162 - val_categorical_accuracy: 0.9936
Epoch 124/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0399 - categorical_accuracy: 0.9846 - val_loss: 0.0276 - val_categorical_accuracy: 0.9878
Epoch 125/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0413 - categorical_accuracy: 0.9816 - val_loss: 0.0241 - val_categorical_accuracy: 0.9919
Epoch 126/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0479 - categorical_accuracy: 0.9818 - val_loss: 0.0412 - val_categorical_accuracy: 0.9882
Epoch 127/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0499 - categorical_accuracy: 0.9813 - val_loss: 0.0310 - val_categorical_accuracy: 0.9722
Epoch 128/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0302 - categorical_accuracy: 0.9850 - val_loss: 0.0207 - val_categorical_accuracy: 0.9963
Epoch 129/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0464 - categorical_accuracy: 0.9876 - val_loss: 0.0323 - val_categorical_accuracy: 0.9949
Epoch 130/150
177/177 [==============================] - 13s 76ms/step - loss: 0.0563 - categorical_accuracy: 0.9813 - val_loss: 0.0391 - val_categorical_accuracy: 0.9890
Epoch 131/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0593 - categorical_accuracy: 0.9830 - val_loss: 0.0422 - val_categorical_accuracy: 0.9867
Epoch 132/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0645 - categorical_accuracy: 0.9843 - val_loss: 0.0513 - val_categorical_accuracy: 0.9857
Epoch 133/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0639 - categorical_accuracy: 0.9836 - val_loss: 0.0338 - val_categorical_accuracy: 0.9947
Epoch 134/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0538 - categorical_accuracy: 0.9892 - val_loss: 0.0330 - val_categorical_accuracy: 0.9928
Epoch 135/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0737 - categorical_accuracy: 0.9850 - val_loss: 0.0984 - val_categorical_accuracy: 0.9691
Epoch 136/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0670 - categorical_accuracy: 0.9823 - val_loss: 0.0226 - val_categorical_accuracy: 0.9915
Epoch 137/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0496 - categorical_accuracy: 0.9874 - val_loss: 0.0809 - val_categorical_accuracy: 0.9744
Epoch 138/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0646 - categorical_accuracy: 0.9813 - val_loss: 0.0499 - val_categorical_accuracy: 0.9760
Epoch 139/150
177/177 [==============================] - 13s 76ms/step - loss: 0.0415 - categorical_accuracy: 0.9857 - val_loss: 0.0293 - val_categorical_accuracy: 0.9899
Epoch 140/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0705 - categorical_accuracy: 0.9829 - val_loss: 0.0251 - val_categorical_accuracy: 0.9892
Epoch 141/150
177/177 [==============================] - 13s 73ms/step - loss: 0.0548 - categorical_accuracy: 0.9804 - val_loss: 0.0345 - val_categorical_accuracy: 0.9851
Epoch 142/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0459 - categorical_accuracy: 0.9793 - val_loss: 0.0333 - val_categorical_accuracy: 0.9885
Epoch 143/150
177/177 [==============================] - 14s 82ms/step - loss: 0.0502 - categorical_accuracy: 0.9857 - val_loss: 0.0259 - val_categorical_accuracy: 0.9938
Epoch 144/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0653 - categorical_accuracy: 0.9834 - val_loss: 0.0371 - val_categorical_accuracy: 0.9910
Epoch 145/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0536 - categorical_accuracy: 0.9825 - val_loss: 0.0489 - val_categorical_accuracy: 0.9915
Epoch 146/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0774 - categorical_accuracy: 0.9851 - val_loss: 0.3977 - val_categorical_accuracy: 0.9374
Epoch 147/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0695 - categorical_accuracy: 0.9843 - val_loss: 0.0405 - val_categorical_accuracy: 0.9913
Epoch 148/150
177/177 [==============================] - 13s 74ms/step - loss: 0.0576 - categorical_accuracy: 0.9869 - val_loss: 0.0726 - val_categorical_accuracy: 0.9813
Epoch 149/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0650 - categorical_accuracy: 0.9839 - val_loss: 0.0433 - val_categorical_accuracy: 0.9806
Epoch 150/150
177/177 [==============================] - 13s 72ms/step - loss: 0.0715 - categorical_accuracy: 0.9814 - val_loss: 0.0600 - val_categorical_accuracy: 0.9722
5.1.3 Analyze training history¶
In [427]:
# %reload_ext tensorboard
# %tensorboard --logdir ./tensorboard/baseline_classifier --bind_all

Screen Shot 2023-03-18 at 1.28.45 AM.png

In [66]:
val_loss, val_acc = model.evaluate(ds_validation)
print(f"Validation loss: {val_loss:.3f}")
print(f"Validation accuracy: {val_acc:.3f}")
177/177 [==============================] - 4s 20ms/step - loss: 0.0600 - categorical_accuracy: 0.9722
Validation loss: 0.060
Validation accuracy: 0.972
5.1.4 Evaluate model on TEST set¶
In [24]:
# Evaluation pipeline
ds_test = ds_test.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
# ds_validation = ds_validation.shuffle(ds_info.splits['validation'].num_examples) # no need for shuffle
ds_test = ds_test.batch(BATCH_SIZE)
ds_test = ds_test.cache() # cache after because batches can be the same between epochs
ds_test = ds_test.prefetch(tf.data.AUTOTUNE)
In [68]:
test_loss, test_acc = model.evaluate(ds_test)
print(f"Test loss: {test_loss:.3f}")
print(f"Test accuracy: {test_acc:.3f}")
59/59 [==============================] - 3s 41ms/step - loss: 2.5941 - categorical_accuracy: 0.5491
Test loss: 2.594
Test accuracy: 0.549

The model is clearly overfitting.

The discrepancy between test accuracy and validation accuracy is very high.

5.2 Improve baseline classifier¶

Training and validation loss both go down over time, but the model loss on the validation data begins to degrade after 60 epochs. This indicative of overfitting. One downside is that the model above is not monitoring some important metrics. In the following section I will introduce the following changes.

  • Train for more epochs
  • Add layers
  • Monitor additional useful metrics for a robust evaluation (val_prc) and log additional metrics

The first two changes are introduce to scale up the baseline model and test whether it will show any further signs of overfitting. The last change is introduced to ensure that our measure of success aligns with the problem better.

5.2.1 Monitoring additional robust metrics¶
In [13]:
# https://www.tensorflow.org/tutorials/structured_data/imbalanced_data#understanding_useful_metrics
METRICS = [
        tf.keras.metrics.CategoricalAccuracy(name='categorical_accuracy'),
        tf.keras.metrics.TruePositives(name='tp'),
        tf.keras.metrics.FalsePositives(name='fp'),
        tf.keras.metrics.TrueNegatives(name='tn'),
        tf.keras.metrics.FalseNegatives(name='fn'), 
        tf.keras.metrics.Precision(name='precision'),
        tf.keras.metrics.Recall(name='recall'),
        tf.keras.metrics.AUC(name='auc'),
        tf.keras.metrics.AUC(name='prc', curve='PR'), # precision-recall curve
]
5.2.2 Increase model capacity¶
In [26]:
def get_model(modelname):
    # Model architecture
    inputs = keras.Input(shape=(224, 224, 3), name='preprocessedimage')
    x = layers.Conv2D(filters=32, kernel_size=3, activation='relu')(inputs)
    x = layers.MaxPooling2D(pool_size=2)(x)
    x = layers.Conv2D(filters=64, kernel_size=3, activation='relu')(x)
    x = layers.MaxPooling2D(pool_size=2)(x)
    x = layers.Conv2D(filters=128, kernel_size=3, activation='relu')(x)
    x = layers.MaxPooling2D(pool_size=2)(x)
    x = layers.Conv2D(filters=256, kernel_size=3, activation='relu')(x)
    x = layers.MaxPooling2D(pool_size=2)(x)
    x = layers.Conv2D(filters=512, kernel_size=3, activation='relu')(x)
    x = layers.MaxPooling2D(pool_size=2)(x) # New 
    x = layers.Conv2D(filters=1024, kernel_size=3, activation='relu')(x) # New
    x = layers.Flatten()(x)
    outputs = layers.Dense(6, activation='softmax', name='softmax_layer')(x)
    model = keras.Model(inputs=inputs, outputs=outputs, name=modelname)

    # Compile model
    model.compile(optimizer='rmsprop',
                  loss=tfa.losses.SigmoidFocalCrossEntropy(),
                  metrics=METRICS)
    return model
5.2.3 Model validation using K-Fold Cross Validation¶

Implement 10-fold cross validation using the string API.

The validation datasets are each going to be 10%: [0%:10%], [10%:20%], ..., [90%:100%].

And the training datasets are each going to be the complementary 90%:

  • [10%:100%] (for a corresponding validation set of [0%:10%]),
  • [20%:100%] (for a validation set of [10%:20%]), and so on
In [25]:
vals_ds = tfds.load('cassava', split=[
    f'train+validation[{k}%:{k+10}%]' for k in range(0, 100, 10)
], as_supervised=True)
trains_ds = tfds.load('cassava', split=[
    f'train+validation[:{k}%]+train+validation[{k+10}%:]' for k in range(0, 100, 10)
], as_supervised=True)
In [ ]:
modelname = "baseline_classifier_kfold" # set the model name prefix
k=10 # split data to K partitions

validation_scores = []

for fold in range(k):
    
    # select 10% of validation data partition
    validation_data = vals_ds[fold]
    
    # Preprocess validation data
    validation_data = validation_data.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
    validation_data = validation_data.batch(BATCH_SIZE)
    validation_data = validation_data.cache() # cache after because batches can be the same between epochs
    validation_data = validation_data.prefetch(tf.data.AUTOTUNE)
    
    # select 90% of training data partition
    training_data = trains_ds[fold]
    
    # preprocess training data
    training_data = training_data.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
    training_data = training_data.cache()
    training_data = training_data.shuffle(ds_info.splits['train'].num_examples)
    training_data = training_data.batch(BATCH_SIZE)
    training_data = training_data.prefetch(tf.data.AUTOTUNE)
    
    model_fp=f"models/{modelname}_{fold}_checkpoint.keras" # Model's checkpoint file path
    if os.path.exists(model_fp):
        continue
    else:
        
        # Get brand new instance of the untrained model
        model = get_model(f"{modelname}_fold{fold}")
        print(f"---------------------Training model: {model.name}---------------------")
        # Set callbacks for model
        callbacks_list = [
            keras.callbacks.EarlyStopping(monitor="val_categorical_accuracy", patience=5, restore_best_weights=True), # interrupts training when categorical accurcay has stopped improving for 5 epochs
            keras.callbacks.ModelCheckpoint(filepath=model_fp, monitor="val_loss", save_best_only=True), # prevents overwriting model file unless validation loss has improved
            # tf.keras.callbacks.LearningRateScheduler(lambda epoch: lrfn(epoch), verbose=1)
        ]
        model.fit(training_data,
              validation_data=validation_data,
              epochs=50,
              callbacks=callbacks_list
            )
        validation_score = model.evaluate(validation_data) # get val_loss and val_acc
        validation_scores.append(validation_score) # average of the validation scores of the k folds
validation_score = np.average(validation_scores)
---------------------Training model: baseline_classifier_kfold_fold7---------------------
Epoch 1/50
407/407 [==============================] - 44s 82ms/step - loss: 0.2413 - categorical_accuracy: 0.5038 - tp: 224.0000 - fp: 200.0000 - tn: 65020.0000 - fn: 12820.0000 - precision: 0.5283 - recall: 0.0172 - auc: 0.8213 - prc: 0.4637 - val_loss: 0.2048 - val_categorical_accuracy: 0.5906 - val_tp: 11.0000 - val_fp: 3.0000 - val_tn: 29222.0000 - val_fn: 5834.0000 - val_precision: 0.7857 - val_recall: 0.0019 - val_auc: 0.8637 - val_prc: 0.5734
Epoch 2/50
407/407 [==============================] - 28s 68ms/step - loss: 0.2029 - categorical_accuracy: 0.5965 - tp: 854.0000 - fp: 272.0000 - tn: 64788.0000 - fn: 12158.0000 - precision: 0.7584 - recall: 0.0656 - auc: 0.8679 - prc: 0.5798 - val_loss: 0.1880 - val_categorical_accuracy: 0.6352 - val_tp: 469.0000 - val_fp: 146.0000 - val_tn: 29079.0000 - val_fn: 5376.0000 - val_precision: 0.7626 - val_recall: 0.0802 - val_auc: 0.8884 - val_prc: 0.6377
Epoch 3/50
407/407 [==============================] - 27s 67ms/step - loss: 0.1869 - categorical_accuracy: 0.6301 - tp: 2142.0000 - fp: 550.0000 - tn: 64510.0000 - fn: 10870.0000 - precision: 0.7957 - recall: 0.1646 - auc: 0.8910 - prc: 0.6385 - val_loss: 0.1694 - val_categorical_accuracy: 0.6753 - val_tp: 1218.0000 - val_fp: 187.0000 - val_tn: 29038.0000 - val_fn: 4627.0000 - val_precision: 0.8669 - val_recall: 0.2084 - val_auc: 0.9150 - val_prc: 0.7121
Epoch 4/50
407/407 [==============================] - 28s 68ms/step - loss: 0.1710 - categorical_accuracy: 0.6604 - tp: 3358.0000 - fp: 652.0000 - tn: 64408.0000 - fn: 9654.0000 - precision: 0.8374 - recall: 0.2581 - auc: 0.9101 - prc: 0.6991 - val_loss: 0.1585 - val_categorical_accuracy: 0.7021 - val_tp: 828.0000 - val_fp: 60.0000 - val_tn: 29165.0000 - val_fn: 5017.0000 - val_precision: 0.9324 - val_recall: 0.1417 - val_auc: 0.9253 - val_prc: 0.7558
Epoch 5/50
407/407 [==============================] - 27s 68ms/step - loss: 0.1446 - categorical_accuracy: 0.7186 - tp: 5120.0000 - fp: 617.0000 - tn: 64443.0000 - fn: 7892.0000 - precision: 0.8925 - recall: 0.3935 - auc: 0.9372 - prc: 0.7865 - val_loss: 0.1082 - val_categorical_accuracy: 0.8039 - val_tp: 3182.0000 - val_fp: 159.0000 - val_tn: 29066.0000 - val_fn: 2663.0000 - val_precision: 0.9524 - val_recall: 0.5444 - val_auc: 0.9639 - val_prc: 0.8823
Epoch 6/50
407/407 [==============================] - 30s 73ms/step - loss: 0.1038 - categorical_accuracy: 0.8136 - tp: 7762.0000 - fp: 586.0000 - tn: 64474.0000 - fn: 5250.0000 - precision: 0.9298 - recall: 0.5965 - auc: 0.9687 - prc: 0.8880 - val_loss: 0.0801 - val_categorical_accuracy: 0.8867 - val_tp: 3366.0000 - val_fp: 84.0000 - val_tn: 29141.0000 - val_fn: 2479.0000 - val_precision: 0.9757 - val_recall: 0.5759 - val_auc: 0.9841 - val_prc: 0.9414
Epoch 7/50
407/407 [==============================] - 28s 68ms/step - loss: 0.0713 - categorical_accuracy: 0.8828 - tp: 9737.0000 - fp: 516.0000 - tn: 64544.0000 - fn: 3275.0000 - precision: 0.9497 - recall: 0.7483 - auc: 0.9859 - prc: 0.9465 - val_loss: 0.0554 - val_categorical_accuracy: 0.9244 - val_tp: 4832.0000 - val_fp: 177.0000 - val_tn: 29048.0000 - val_fn: 1013.0000 - val_precision: 0.9647 - val_recall: 0.8267 - val_auc: 0.9918 - val_prc: 0.9684
Epoch 8/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0490 - categorical_accuracy: 0.9277 - tp: 11146.0000 - fp: 396.0000 - tn: 64664.0000 - fn: 1866.0000 - precision: 0.9657 - recall: 0.8566 - auc: 0.9936 - prc: 0.9748 - val_loss: 0.0434 - val_categorical_accuracy: 0.9541 - val_tp: 5300.0000 - val_fp: 138.0000 - val_tn: 29087.0000 - val_fn: 545.0000 - val_precision: 0.9746 - val_recall: 0.9068 - val_auc: 0.9946 - val_prc: 0.9812
Epoch 9/50
407/407 [==============================] - 27s 68ms/step - loss: 0.0422 - categorical_accuracy: 0.9446 - tp: 11607.0000 - fp: 375.0000 - tn: 64685.0000 - fn: 1405.0000 - precision: 0.9687 - recall: 0.8920 - auc: 0.9955 - prc: 0.9822 - val_loss: 0.0448 - val_categorical_accuracy: 0.9526 - val_tp: 5287.0000 - val_fp: 148.0000 - val_tn: 29077.0000 - val_fn: 558.0000 - val_precision: 0.9728 - val_recall: 0.9045 - val_auc: 0.9939 - val_prc: 0.9801
Epoch 10/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0400 - categorical_accuracy: 0.9501 - tp: 11806.0000 - fp: 352.0000 - tn: 64708.0000 - fn: 1206.0000 - precision: 0.9710 - recall: 0.9073 - auc: 0.9960 - prc: 0.9844 - val_loss: 0.0390 - val_categorical_accuracy: 0.9661 - val_tp: 5529.0000 - val_fp: 147.0000 - val_tn: 29078.0000 - val_fn: 316.0000 - val_precision: 0.9741 - val_recall: 0.9459 - val_auc: 0.9954 - val_prc: 0.9860
Epoch 11/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0394 - categorical_accuracy: 0.9493 - tp: 11843.0000 - fp: 367.0000 - tn: 64693.0000 - fn: 1169.0000 - precision: 0.9699 - recall: 0.9102 - auc: 0.9963 - prc: 0.9855 - val_loss: 0.0593 - val_categorical_accuracy: 0.9263 - val_tp: 4880.0000 - val_fp: 168.0000 - val_tn: 29057.0000 - val_fn: 965.0000 - val_precision: 0.9667 - val_recall: 0.8349 - val_auc: 0.9912 - val_prc: 0.9691
Epoch 12/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0403 - categorical_accuracy: 0.9524 - tp: 11917.0000 - fp: 370.0000 - tn: 64690.0000 - fn: 1095.0000 - precision: 0.9699 - recall: 0.9158 - auc: 0.9963 - prc: 0.9856 - val_loss: 0.0973 - val_categorical_accuracy: 0.9408 - val_tp: 5446.0000 - val_fp: 296.0000 - val_tn: 28929.0000 - val_fn: 399.0000 - val_precision: 0.9485 - val_recall: 0.9317 - val_auc: 0.9922 - val_prc: 0.9759
Epoch 13/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0472 - categorical_accuracy: 0.9520 - tp: 12019.0000 - fp: 409.0000 - tn: 64651.0000 - fn: 993.0000 - precision: 0.9671 - recall: 0.9237 - auc: 0.9957 - prc: 0.9842 - val_loss: 0.0495 - val_categorical_accuracy: 0.9757 - val_tp: 5670.0000 - val_fp: 125.0000 - val_tn: 29100.0000 - val_fn: 175.0000 - val_precision: 0.9784 - val_recall: 0.9701 - val_auc: 0.9956 - val_prc: 0.9887
Epoch 16/50
407/407 [==============================] - 27s 66ms/step - loss: 0.0426 - categorical_accuracy: 0.9523 - tp: 12067.0000 - fp: 397.0000 - tn: 64663.0000 - fn: 945.0000 - precision: 0.9681 - recall: 0.9274 - auc: 0.9964 - prc: 0.9864 - val_loss: 0.0564 - val_categorical_accuracy: 0.9603 - val_tp: 5529.0000 - val_fp: 165.0000 - val_tn: 29060.0000 - val_fn: 316.0000 - val_precision: 0.9710 - val_recall: 0.9459 - val_auc: 0.9949 - val_prc: 0.9849
Epoch 17/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0479 - categorical_accuracy: 0.9504 - tp: 12022.0000 - fp: 392.0000 - tn: 64668.0000 - fn: 990.0000 - precision: 0.9684 - recall: 0.9239 - auc: 0.9958 - prc: 0.9846 - val_loss: 0.0802 - val_categorical_accuracy: 0.9444 - val_tp: 5403.0000 - val_fp: 226.0000 - val_tn: 28999.0000 - val_fn: 442.0000 - val_precision: 0.9599 - val_recall: 0.9244 - val_auc: 0.9914 - val_prc: 0.9720
Epoch 18/50
407/407 [==============================] - 27s 66ms/step - loss: 0.0493 - categorical_accuracy: 0.9493 - tp: 11996.0000 - fp: 426.0000 - tn: 64634.0000 - fn: 1016.0000 - precision: 0.9657 - recall: 0.9219 - auc: 0.9956 - prc: 0.9837 - val_loss: 0.0543 - val_categorical_accuracy: 0.9642 - val_tp: 5563.0000 - val_fp: 147.0000 - val_tn: 29078.0000 - val_fn: 282.0000 - val_precision: 0.9743 - val_recall: 0.9518 - val_auc: 0.9948 - val_prc: 0.9843
Epoch 19/50
407/407 [==============================] - 27s 67ms/step - loss: 0.0459 - categorical_accuracy: 0.9526 - tp: 12053.0000 - fp: 399.0000 - tn: 64661.0000 - fn: 959.0000 - precision: 0.9680 - recall: 0.9263 - auc: 0.9962 - prc: 0.9856 - val_loss: 0.0785 - val_categorical_accuracy: 0.9307 - val_tp: 5285.0000 - val_fp: 272.0000 - val_tn: 28953.0000 - val_fn: 560.0000 - val_precision: 0.9511 - val_recall: 0.9042 - val_auc: 0.9912 - val_prc: 0.9697
Epoch 20/50
407/407 [==============================] - 27s 66ms/step - loss: 0.0539 - categorical_accuracy: 0.9453 - tp: 11893.0000 - fp: 461.0000 - tn: 64599.0000 - fn: 1119.0000 - precision: 0.9627 - recall: 0.9140 - auc: 0.9952 - prc: 0.9815 - val_loss: 0.1056 - val_categorical_accuracy: 0.9088 - val_tp: 5188.0000 - val_fp: 423.0000 - val_tn: 28802.0000 - val_fn: 657.0000 - val_precision: 0.9246 - val_recall: 0.8876 - val_auc: 0.9900 - val_prc: 0.9643
183/183 [==============================] - 3s 18ms/step - loss: 0.0495 - categorical_accuracy: 0.9757 - tp: 5670.0000 - fp: 125.0000 - tn: 29100.0000 - fn: 175.0000 - precision: 0.9784 - recall: 0.9701 - auc: 0.9956 - prc: 0.9887
---------------------Training model: baseline_classifier_kfold_fold8---------------------
Epoch 1/50
407/407 [==============================] - 41s 83ms/step - loss: 0.2541 - categorical_accuracy: 0.6366 - tp: 5990.0000 - fp: 407.0000 - tn: 93878.0000 - fn: 12867.0000 - precision: 0.9364 - recall: 0.3177 - auc: 0.9042 - prc: 0.7232 - val_loss: 0.2069 - val_categorical_accuracy: 0.5916 - val_tp: 12.0000 - val_fp: 3.0000 - val_tn: 29222.0000 - val_fn: 5833.0000 - val_precision: 0.8000 - val_recall: 0.0021 - val_auc: 0.8650 - val_prc: 0.5663
Epoch 2/50
407/407 [==============================] - 28s 68ms/step - loss: 0.2045 - categorical_accuracy: 0.5938 - tp: 674.0000 - fp: 293.0000 - tn: 64767.0000 - fn: 12338.0000 - precision: 0.6970 - recall: 0.0518 - auc: 0.8665 - prc: 0.5677 - val_loss: 0.1911 - val_categorical_accuracy: 0.6250 - val_tp: 132.0000 - val_fp: 42.0000 - val_tn: 29183.0000 - val_fn: 5713.0000 - val_precision: 0.7586 - val_recall: 0.0226 - val_auc: 0.8861 - val_prc: 0.6125
Epoch 3/50
407/407 [==============================] - 28s 69ms/step - loss: 0.1882 - categorical_accuracy: 0.6236 - tp: 1893.0000 - fp: 517.0000 - tn: 64543.0000 - fn: 11119.0000 - precision: 0.7855 - recall: 0.1455 - auc: 0.8891 - prc: 0.6322 - val_loss: 0.1681 - val_categorical_accuracy: 0.6631 - val_tp: 282.0000 - val_fp: 22.0000 - val_tn: 29203.0000 - val_fn: 5563.0000 - val_precision: 0.9276 - val_recall: 0.0482 - val_auc: 0.9160 - val_prc: 0.7260
Epoch 4/50
407/407 [==============================] - 28s 68ms/step - loss: 0.1691 - categorical_accuracy: 0.6582 - tp: 3480.0000 - fp: 591.0000 - tn: 64469.0000 - fn: 9532.0000 - precision: 0.8548 - recall: 0.2674 - auc: 0.9125 - prc: 0.7076 - val_loss: 0.1581 - val_categorical_accuracy: 0.6980 - val_tp: 769.0000 - val_fp: 82.0000 - val_tn: 29143.0000 - val_fn: 5076.0000 - val_precision: 0.9036 - val_recall: 0.1316 - val_auc: 0.9278 - val_prc: 0.7498
Epoch 5/50
407/407 [==============================] - 28s 69ms/step - loss: 0.1396 - categorical_accuracy: 0.7303 - tp: 5517.0000 - fp: 611.0000 - tn: 64449.0000 - fn: 7495.0000 - precision: 0.9003 - recall: 0.4240 - auc: 0.9422 - prc: 0.8011 - val_loss: 0.0999 - val_categorical_accuracy: 0.8224 - val_tp: 3021.0000 - val_fp: 105.0000 - val_tn: 29120.0000 - val_fn: 2824.0000 - val_precision: 0.9664 - val_recall: 0.5169 - val_auc: 0.9717 - val_prc: 0.9015
Epoch 6/50
407/407 [==============================] - 28s 68ms/step - loss: 0.0976 - categorical_accuracy: 0.8252 - tp: 8154.0000 - fp: 564.0000 - tn: 64496.0000 - fn: 4858.0000 - precision: 0.9353 - recall: 0.6267 - auc: 0.9728 - prc: 0.9019 - val_loss: 0.0546 - val_categorical_accuracy: 0.9252 - val_tp: 4424.0000 - val_fp: 93.0000 - val_tn: 29132.0000 - val_fn: 1421.0000 - val_precision: 0.9794 - val_recall: 0.7569 - val_auc: 0.9918 - val_prc: 0.9700
Epoch 7/50
407/407 [==============================] - 28s 69ms/step - loss: 0.0712 - categorical_accuracy: 0.8888 - tp: 9980.0000 - fp: 514.0000 - tn: 64546.0000 - fn: 3032.0000 - precision: 0.9510 - recall: 0.7670 - auc: 0.9865 - prc: 0.9487 - val_loss: 0.0338 - val_categorical_accuracy: 0.9603 - val_tp: 5347.0000 - val_fp: 99.0000 - val_tn: 29126.0000 - val_fn: 498.0000 - val_precision: 0.9818 - val_recall: 0.9148 - val_auc: 0.9961 - val_prc: 0.9867
Epoch 8/50
  8/407 [..............................] - ETA: 23s - loss: 0.0246 - categorical_accuracy: 0.9570 - tp: 236.0000 - fp: 5.0000 - tn: 1275.0000 - fn: 20.0000 - precision: 0.9793 - recall: 0.9219 - auc: 0.9982 - prc: 0.9925
In [21]:
def revaluate_k_fold_models(modelname="baseline_classifier_kfold"):
    """ Utility function to reevaluate the performance of K fold models
        on validation data
    """
    
    k=10 # split data to K partitions
    validation_scores = []
    
    for fold in range(k):
        # select 10% of validation data partition
        validation_data = vals_ds[fold]

        # Preprocess validation data
        validation_data = validation_data.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
        validation_data = validation_data.batch(BATCH_SIZE)
        validation_data = validation_data.cache() # cache after because batches can be the same between epochs
        validation_data = validation_data.prefetch(tf.data.AUTOTUNE)

        # select 90% of training data partition
        training_data = trains_ds[fold]

        # preprocess training data
        training_data = training_data.map(data_preprocessing, num_parallel_calls=tf.data.AUTOTUNE)
        training_data = training_data.cache()
        training_data = training_data.shuffle(ds_info.splits['train'].num_examples)
        training_data = training_data.batch(BATCH_SIZE)
        training_data = training_data.prefetch(tf.data.AUTOTUNE)

        model_fp=f"models/{modelname}_{fold}_checkpoint.keras" # Model's checkpoint file path

        # Load saved model
        model = keras.models.load_model(model_fp)

        #Evaluate and append score
        validation_score = model.evaluate(validation_data) # get val_loss and val_acc
        validation_scores.append(validation_score) # average of the validation scores of the k folds
    return validation_scores
In [30]:
val_scores = revaluate_k_fold_models()
183/183 [==============================] - 7s 32ms/step - loss: 0.0297 - categorical_accuracy: 0.9721 - tp: 5543.0000 - fp: 99.0000 - tn: 29126.0000 - fn: 302.0000 - precision: 0.9825 - recall: 0.9483 - auc: 0.9971 - prc: 0.9901
183/183 [==============================] - 6s 31ms/step - loss: 8.0632 - categorical_accuracy: 0.4729 - tp: 2764.0000 - fp: 3081.0000 - tn: 26144.0000 - fn: 3081.0000 - precision: 0.4729 - recall: 0.4729 - auc: 0.6837 - prc: 0.3486
183/183 [==============================] - 6s 31ms/step - loss: 0.0361 - categorical_accuracy: 0.9666 - tp: 5495.0000 - fp: 120.0000 - tn: 29105.0000 - fn: 350.0000 - precision: 0.9786 - recall: 0.9401 - auc: 0.9964 - prc: 0.9883
183/183 [==============================] - 6s 31ms/step - loss: 0.0338 - categorical_accuracy: 0.9682 - tp: 5485.0000 - fp: 108.0000 - tn: 29117.0000 - fn: 360.0000 - precision: 0.9807 - recall: 0.9384 - auc: 0.9962 - prc: 0.9875
183/183 [==============================] - 6s 32ms/step - loss: 0.0349 - categorical_accuracy: 0.9665 - tp: 5501.0000 - fp: 116.0000 - tn: 29104.0000 - fn: 343.0000 - precision: 0.9793 - recall: 0.9413 - auc: 0.9965 - prc: 0.9883
183/183 [==============================] - 8s 39ms/step - loss: 0.0412 - categorical_accuracy: 0.9516 - tp: 5293.0000 - fp: 148.0000 - tn: 29077.0000 - fn: 552.0000 - precision: 0.9728 - recall: 0.9056 - auc: 0.9948 - prc: 0.9825
183/183 [==============================] - 12s 57ms/step - loss: 0.2108 - categorical_accuracy: 0.5884 - tp: 128.0000 - fp: 97.0000 - tn: 29128.0000 - fn: 5717.0000 - precision: 0.5689 - recall: 0.0219 - auc: 0.8588 - prc: 0.5364
183/183 [==============================] - 6s 31ms/step - loss: 0.0390 - categorical_accuracy: 0.9661 - tp: 5529.0000 - fp: 147.0000 - tn: 29078.0000 - fn: 316.0000 - precision: 0.9741 - recall: 0.9459 - auc: 0.9954 - prc: 0.9860
183/183 [==============================] - 6s 32ms/step - loss: 0.0338 - categorical_accuracy: 0.9603 - tp: 5347.0000 - fp: 99.0000 - tn: 29126.0000 - fn: 498.0000 - precision: 0.9818 - recall: 0.9148 - auc: 0.9961 - prc: 0.9867
183/183 [==============================] - 6s 32ms/step - loss: 0.1620 - categorical_accuracy: 0.6809 - tp: 1220.0000 - fp: 145.0000 - tn: 29080.0000 - fn: 4625.0000 - precision: 0.8938 - recall: 0.2087 - auc: 0.9183 - prc: 0.7285
In [41]:
average_categorical_accuracy = np.average([val_scores[i][1] for i in range(10)])
print(f"Average categorical accuracy for 10 fold validation: {average_categorical_accuracy}")
Average categorical accuracy for 10 fold validation: 0.8493578493595123
5.2.4 Train the final baseline model¶

Train final model on all non-test data available

In [135]:
modelname = "improved_baseline_classifier"
model = get_model(modelname)
model.summary()
callbacks_list = [
    # keras.callbacks.EarlyStopping(monitor="val_categorical_accuracy", patience=15), # interrupts training when categorical accurcay has stopped improving for 15 epochs
    keras.callbacks.ModelCheckpoint(filepath=f"models/{modelname}.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/{modelname}") # path where callback writes logs
]
baseline_history = model.fit(
    ds_train,
    epochs=300,
    validation_data=ds_validation,
    callbacks=callbacks_list
    )
Model: "improved_baseline_classifier"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d_29 (Conv2D)          (None, 222, 222, 32)      896       
                                                                 
 max_pooling2d_24 (MaxPoolin  (None, 111, 111, 32)     0         
 g2D)                                                            
                                                                 
 conv2d_30 (Conv2D)          (None, 109, 109, 64)      18496     
                                                                 
 max_pooling2d_25 (MaxPoolin  (None, 54, 54, 64)       0         
 g2D)                                                            
                                                                 
 conv2d_31 (Conv2D)          (None, 52, 52, 128)       73856     
                                                                 
 max_pooling2d_26 (MaxPoolin  (None, 26, 26, 128)      0         
 g2D)                                                            
                                                                 
 conv2d_32 (Conv2D)          (None, 24, 24, 256)       295168    
                                                                 
 max_pooling2d_27 (MaxPoolin  (None, 12, 12, 256)      0         
 g2D)                                                            
                                                                 
 conv2d_33 (Conv2D)          (None, 10, 10, 512)       1180160   
                                                                 
 max_pooling2d_28 (MaxPoolin  (None, 5, 5, 512)        0         
 g2D)                                                            
                                                                 
 conv2d_34 (Conv2D)          (None, 3, 3, 1024)        4719616   
                                                                 
 flatten_5 (Flatten)         (None, 9216)              0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 55302     
                                                                 
=================================================================
Total params: 6,343,494
Trainable params: 6,343,494
Non-trainable params: 0
_________________________________________________________________
Epoch 1/300
177/177 [==============================] - 18s 85ms/step - loss: 0.3049 - categorical_accuracy: 0.4777 - tp: 1065.0000 - fp: 788.0000 - tn: 36917.0000 - fn: 6476.0000 - precision: 0.5747 - recall: 0.1412 - auc: 0.7961 - prc: 0.4483 - val_loss: 0.2360 - val_categorical_accuracy: 0.4701 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8172 - val_prc: 0.4601
Epoch 2/300
177/177 [==============================] - 14s 81ms/step - loss: 0.2303 - categorical_accuracy: 0.5171 - tp: 154.0000 - fp: 92.0000 - tn: 28188.0000 - fn: 5502.0000 - precision: 0.6260 - recall: 0.0272 - auc: 0.8239 - prc: 0.4739 - val_loss: 0.2224 - val_categorical_accuracy: 0.5502 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8494 - val_prc: 0.5433
Epoch 3/300
177/177 [==============================] - 14s 81ms/step - loss: 0.2138 - categorical_accuracy: 0.5718 - tp: 167.0000 - fp: 74.0000 - tn: 28206.0000 - fn: 5489.0000 - precision: 0.6929 - recall: 0.0295 - auc: 0.8511 - prc: 0.5340 - val_loss: 0.2066 - val_categorical_accuracy: 0.5663 - val_tp: 55.0000 - val_fp: 18.0000 - val_tn: 28262.0000 - val_fn: 5601.0000 - val_precision: 0.7534 - val_recall: 0.0097 - val_auc: 0.8634 - val_prc: 0.5589
Epoch 4/300
177/177 [==============================] - 15s 83ms/step - loss: 0.2032 - categorical_accuracy: 0.5925 - tp: 306.0000 - fp: 119.0000 - tn: 28161.0000 - fn: 5350.0000 - precision: 0.7200 - recall: 0.0541 - auc: 0.8678 - prc: 0.5714 - val_loss: 0.1881 - val_categorical_accuracy: 0.6259 - val_tp: 82.0000 - val_fp: 13.0000 - val_tn: 28267.0000 - val_fn: 5574.0000 - val_precision: 0.8632 - val_recall: 0.0145 - val_auc: 0.8866 - val_prc: 0.6348
Epoch 5/300
177/177 [==============================] - 14s 82ms/step - loss: 0.1965 - categorical_accuracy: 0.6077 - tp: 581.0000 - fp: 188.0000 - tn: 28092.0000 - fn: 5075.0000 - precision: 0.7555 - recall: 0.1027 - auc: 0.8778 - prc: 0.6018 - val_loss: 0.1952 - val_categorical_accuracy: 0.6252 - val_tp: 62.0000 - val_fp: 10.0000 - val_tn: 28270.0000 - val_fn: 5594.0000 - val_precision: 0.8611 - val_recall: 0.0110 - val_auc: 0.8882 - val_prc: 0.6246
Epoch 6/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1887 - categorical_accuracy: 0.6250 - tp: 796.0000 - fp: 213.0000 - tn: 28067.0000 - fn: 4860.0000 - precision: 0.7889 - recall: 0.1407 - auc: 0.8881 - prc: 0.6311 - val_loss: 0.1736 - val_categorical_accuracy: 0.6565 - val_tp: 199.0000 - val_fp: 20.0000 - val_tn: 28260.0000 - val_fn: 5457.0000 - val_precision: 0.9087 - val_recall: 0.0352 - val_auc: 0.9088 - val_prc: 0.7010
Epoch 7/300
177/177 [==============================] - 14s 82ms/step - loss: 0.1818 - categorical_accuracy: 0.6404 - tp: 1040.0000 - fp: 250.0000 - tn: 28030.0000 - fn: 4616.0000 - precision: 0.8062 - recall: 0.1839 - auc: 0.8972 - prc: 0.6568 - val_loss: 0.1670 - val_categorical_accuracy: 0.6713 - val_tp: 568.0000 - val_fp: 71.0000 - val_tn: 28209.0000 - val_fn: 5088.0000 - val_precision: 0.8889 - val_recall: 0.1004 - val_auc: 0.9140 - val_prc: 0.7146
Epoch 8/300
177/177 [==============================] - 15s 83ms/step - loss: 0.1719 - categorical_accuracy: 0.6581 - tp: 1310.0000 - fp: 265.0000 - tn: 28015.0000 - fn: 4346.0000 - precision: 0.8317 - recall: 0.2316 - auc: 0.9092 - prc: 0.6945 - val_loss: 0.1488 - val_categorical_accuracy: 0.7033 - val_tp: 1331.0000 - val_fp: 102.0000 - val_tn: 28178.0000 - val_fn: 4325.0000 - val_precision: 0.9288 - val_recall: 0.2353 - val_auc: 0.9325 - val_prc: 0.7720
Epoch 9/300
177/177 [==============================] - 14s 82ms/step - loss: 0.1592 - categorical_accuracy: 0.6830 - tp: 1821.0000 - fp: 280.0000 - tn: 28000.0000 - fn: 3835.0000 - precision: 0.8667 - recall: 0.3220 - auc: 0.9235 - prc: 0.7380 - val_loss: 0.1384 - val_categorical_accuracy: 0.7184 - val_tp: 2064.0000 - val_fp: 127.0000 - val_tn: 28153.0000 - val_fn: 3592.0000 - val_precision: 0.9420 - val_recall: 0.3649 - val_auc: 0.9423 - val_prc: 0.8105
Epoch 10/300
177/177 [==============================] - 14s 82ms/step - loss: 0.1416 - categorical_accuracy: 0.7184 - tp: 2335.0000 - fp: 275.0000 - tn: 28005.0000 - fn: 3321.0000 - precision: 0.8946 - recall: 0.4128 - auc: 0.9403 - prc: 0.7928 - val_loss: 0.1204 - val_categorical_accuracy: 0.7679 - val_tp: 2459.0000 - val_fp: 132.0000 - val_tn: 28148.0000 - val_fn: 3197.0000 - val_precision: 0.9491 - val_recall: 0.4348 - val_auc: 0.9573 - val_prc: 0.8562
Epoch 11/300
177/177 [==============================] - 14s 82ms/step - loss: 0.1182 - categorical_accuracy: 0.7726 - tp: 2929.0000 - fp: 247.0000 - tn: 28033.0000 - fn: 2727.0000 - precision: 0.9222 - recall: 0.5179 - auc: 0.9582 - prc: 0.8566 - val_loss: 0.0917 - val_categorical_accuracy: 0.8527 - val_tp: 2643.0000 - val_fp: 88.0000 - val_tn: 28192.0000 - val_fn: 3013.0000 - val_precision: 0.9678 - val_recall: 0.4673 - val_auc: 0.9801 - val_prc: 0.9206
Epoch 12/300
177/177 [==============================] - 15s 87ms/step - loss: 0.0898 - categorical_accuracy: 0.8432 - tp: 3664.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 1992.0000 - precision: 0.9376 - recall: 0.6478 - auc: 0.9769 - prc: 0.9143 - val_loss: 0.0516 - val_categorical_accuracy: 0.9107 - val_tp: 4578.0000 - val_fp: 155.0000 - val_tn: 28125.0000 - val_fn: 1078.0000 - val_precision: 0.9673 - val_recall: 0.8094 - val_auc: 0.9921 - val_prc: 0.9697
Epoch 13/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0657 - categorical_accuracy: 0.8858 - tp: 4354.0000 - fp: 223.0000 - tn: 28057.0000 - fn: 1302.0000 - precision: 0.9513 - recall: 0.7698 - auc: 0.9882 - prc: 0.9534 - val_loss: 0.0421 - val_categorical_accuracy: 0.9470 - val_tp: 4634.0000 - val_fp: 64.0000 - val_tn: 28216.0000 - val_fn: 1022.0000 - val_precision: 0.9864 - val_recall: 0.8193 - val_auc: 0.9960 - val_prc: 0.9831
Epoch 14/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0563 - categorical_accuracy: 0.9185 - tp: 4714.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 942.0000 - precision: 0.9636 - recall: 0.8335 - auc: 0.9919 - prc: 0.9679 - val_loss: 0.0302 - val_categorical_accuracy: 0.9576 - val_tp: 5173.0000 - val_fp: 133.0000 - val_tn: 28147.0000 - val_fn: 483.0000 - val_precision: 0.9749 - val_recall: 0.9146 - val_auc: 0.9975 - val_prc: 0.9898
Epoch 15/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0456 - categorical_accuracy: 0.9351 - tp: 4948.0000 - fp: 164.0000 - tn: 28116.0000 - fn: 708.0000 - precision: 0.9679 - recall: 0.8748 - auc: 0.9948 - prc: 0.9786 - val_loss: 0.1137 - val_categorical_accuracy: 0.8462 - val_tp: 4067.0000 - val_fp: 432.0000 - val_tn: 27848.0000 - val_fn: 1589.0000 - val_precision: 0.9040 - val_recall: 0.7191 - val_auc: 0.9748 - val_prc: 0.9000
Epoch 16/300
177/177 [==============================] - 15s 87ms/step - loss: 0.0415 - categorical_accuracy: 0.9413 - tp: 5059.0000 - fp: 174.0000 - tn: 28106.0000 - fn: 597.0000 - precision: 0.9667 - recall: 0.8944 - auc: 0.9956 - prc: 0.9827 - val_loss: 0.0479 - val_categorical_accuracy: 0.9309 - val_tp: 4610.0000 - val_fp: 143.0000 - val_tn: 28137.0000 - val_fn: 1046.0000 - val_precision: 0.9699 - val_recall: 0.8151 - val_auc: 0.9941 - val_prc: 0.9750
Epoch 17/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0390 - categorical_accuracy: 0.9514 - tp: 5139.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 517.0000 - precision: 0.9720 - recall: 0.9086 - auc: 0.9959 - prc: 0.9851 - val_loss: 0.0314 - val_categorical_accuracy: 0.9581 - val_tp: 5017.0000 - val_fp: 95.0000 - val_tn: 28185.0000 - val_fn: 639.0000 - val_precision: 0.9814 - val_recall: 0.8870 - val_auc: 0.9974 - val_prc: 0.9889
Epoch 18/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0377 - categorical_accuracy: 0.9514 - tp: 5201.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 455.0000 - precision: 0.9712 - recall: 0.9196 - auc: 0.9963 - prc: 0.9863 - val_loss: 0.0218 - val_categorical_accuracy: 0.9729 - val_tp: 5422.0000 - val_fp: 104.0000 - val_tn: 28176.0000 - val_fn: 234.0000 - val_precision: 0.9812 - val_recall: 0.9586 - val_auc: 0.9986 - val_prc: 0.9954
Epoch 19/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0344 - categorical_accuracy: 0.9569 - tp: 5211.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 445.0000 - precision: 0.9733 - recall: 0.9213 - auc: 0.9971 - prc: 0.9885 - val_loss: 0.0216 - val_categorical_accuracy: 0.9744 - val_tp: 5421.0000 - val_fp: 100.0000 - val_tn: 28180.0000 - val_fn: 235.0000 - val_precision: 0.9819 - val_recall: 0.9585 - val_auc: 0.9987 - val_prc: 0.9949
Epoch 20/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0339 - categorical_accuracy: 0.9562 - tp: 5257.0000 - fp: 160.0000 - tn: 28120.0000 - fn: 399.0000 - precision: 0.9705 - recall: 0.9295 - auc: 0.9971 - prc: 0.9891 - val_loss: 0.0167 - val_categorical_accuracy: 0.9825 - val_tp: 5463.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 193.0000 - val_precision: 0.9890 - val_recall: 0.9659 - val_auc: 0.9989 - val_prc: 0.9962
Epoch 21/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0348 - categorical_accuracy: 0.9622 - tp: 5282.0000 - fp: 130.0000 - tn: 28150.0000 - fn: 374.0000 - precision: 0.9760 - recall: 0.9339 - auc: 0.9971 - prc: 0.9890 - val_loss: 0.0457 - val_categorical_accuracy: 0.9341 - val_tp: 4992.0000 - val_fp: 213.0000 - val_tn: 28067.0000 - val_fn: 664.0000 - val_precision: 0.9591 - val_recall: 0.8826 - val_auc: 0.9949 - val_prc: 0.9805
Epoch 22/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0347 - categorical_accuracy: 0.9586 - tp: 5289.0000 - fp: 159.0000 - tn: 28121.0000 - fn: 367.0000 - precision: 0.9708 - recall: 0.9351 - auc: 0.9972 - prc: 0.9892 - val_loss: 0.0112 - val_categorical_accuracy: 0.9889 - val_tp: 5535.0000 - val_fp: 38.0000 - val_tn: 28242.0000 - val_fn: 121.0000 - val_precision: 0.9932 - val_recall: 0.9786 - val_auc: 0.9992 - val_prc: 0.9979
Epoch 23/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0338 - categorical_accuracy: 0.9620 - tp: 5290.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 366.0000 - precision: 0.9748 - recall: 0.9353 - auc: 0.9974 - prc: 0.9900 - val_loss: 0.0195 - val_categorical_accuracy: 0.9763 - val_tp: 5351.0000 - val_fp: 73.0000 - val_tn: 28207.0000 - val_fn: 305.0000 - val_precision: 0.9865 - val_recall: 0.9461 - val_auc: 0.9987 - val_prc: 0.9952
Epoch 24/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0408 - categorical_accuracy: 0.9600 - tp: 5280.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 376.0000 - precision: 0.9747 - recall: 0.9335 - auc: 0.9960 - prc: 0.9866 - val_loss: 0.0240 - val_categorical_accuracy: 0.9758 - val_tp: 5461.0000 - val_fp: 95.0000 - val_tn: 28185.0000 - val_fn: 195.0000 - val_precision: 0.9829 - val_recall: 0.9655 - val_auc: 0.9982 - val_prc: 0.9942
Epoch 25/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0312 - categorical_accuracy: 0.9684 - tp: 5366.0000 - fp: 114.0000 - tn: 28166.0000 - fn: 290.0000 - precision: 0.9792 - recall: 0.9487 - auc: 0.9976 - prc: 0.9912 - val_loss: 0.0302 - val_categorical_accuracy: 0.9701 - val_tp: 5422.0000 - val_fp: 128.0000 - val_tn: 28152.0000 - val_fn: 234.0000 - val_precision: 0.9769 - val_recall: 0.9586 - val_auc: 0.9982 - val_prc: 0.9933
Epoch 26/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0328 - categorical_accuracy: 0.9616 - tp: 5332.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 324.0000 - precision: 0.9749 - recall: 0.9427 - auc: 0.9975 - prc: 0.9907 - val_loss: 0.0095 - val_categorical_accuracy: 0.9920 - val_tp: 5590.0000 - val_fp: 34.0000 - val_tn: 28246.0000 - val_fn: 66.0000 - val_precision: 0.9940 - val_recall: 0.9883 - val_auc: 0.9995 - val_prc: 0.9985
Epoch 27/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0325 - categorical_accuracy: 0.9652 - tp: 5370.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 286.0000 - precision: 0.9780 - recall: 0.9494 - auc: 0.9977 - prc: 0.9917 - val_loss: 0.0244 - val_categorical_accuracy: 0.9735 - val_tp: 5467.0000 - val_fp: 129.0000 - val_tn: 28151.0000 - val_fn: 189.0000 - val_precision: 0.9769 - val_recall: 0.9666 - val_auc: 0.9988 - val_prc: 0.9956
Epoch 28/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0314 - categorical_accuracy: 0.9648 - tp: 5343.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 313.0000 - precision: 0.9759 - recall: 0.9447 - auc: 0.9978 - prc: 0.9918 - val_loss: 0.0152 - val_categorical_accuracy: 0.9827 - val_tp: 5481.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 175.0000 - val_precision: 0.9904 - val_recall: 0.9691 - val_auc: 0.9989 - val_prc: 0.9972
Epoch 29/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0318 - categorical_accuracy: 0.9694 - tp: 5404.0000 - fp: 122.0000 - tn: 28158.0000 - fn: 252.0000 - precision: 0.9779 - recall: 0.9554 - auc: 0.9978 - prc: 0.9921 - val_loss: 0.0174 - val_categorical_accuracy: 0.9818 - val_tp: 5476.0000 - val_fp: 75.0000 - val_tn: 28205.0000 - val_fn: 180.0000 - val_precision: 0.9865 - val_recall: 0.9682 - val_auc: 0.9988 - val_prc: 0.9962
Epoch 30/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0341 - categorical_accuracy: 0.9629 - tp: 5350.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 306.0000 - precision: 0.9743 - recall: 0.9459 - auc: 0.9975 - prc: 0.9913 - val_loss: 0.0133 - val_categorical_accuracy: 0.9874 - val_tp: 5564.0000 - val_fp: 57.0000 - val_tn: 28223.0000 - val_fn: 92.0000 - val_precision: 0.9899 - val_recall: 0.9837 - val_auc: 0.9995 - val_prc: 0.9984
Epoch 31/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0476 - categorical_accuracy: 0.9627 - tp: 5375.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 281.0000 - precision: 0.9730 - recall: 0.9503 - auc: 0.9970 - prc: 0.9883 - val_loss: 0.0091 - val_categorical_accuracy: 0.9920 - val_tp: 5603.0000 - val_fp: 32.0000 - val_tn: 28248.0000 - val_fn: 53.0000 - val_precision: 0.9943 - val_recall: 0.9906 - val_auc: 0.9995 - val_prc: 0.9988
Epoch 32/300
177/177 [==============================] - 15s 87ms/step - loss: 0.0383 - categorical_accuracy: 0.9659 - tp: 5370.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 286.0000 - precision: 0.9753 - recall: 0.9494 - auc: 0.9974 - prc: 0.9900 - val_loss: 0.0157 - val_categorical_accuracy: 0.9832 - val_tp: 5444.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 212.0000 - val_precision: 0.9904 - val_recall: 0.9625 - val_auc: 0.9991 - val_prc: 0.9966
Epoch 33/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0358 - categorical_accuracy: 0.9650 - tp: 5346.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 310.0000 - precision: 0.9759 - recall: 0.9452 - auc: 0.9976 - prc: 0.9908 - val_loss: 0.0274 - val_categorical_accuracy: 0.9707 - val_tp: 5416.0000 - val_fp: 108.0000 - val_tn: 28172.0000 - val_fn: 240.0000 - val_precision: 0.9804 - val_recall: 0.9576 - val_auc: 0.9982 - val_prc: 0.9938
Epoch 34/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0353 - categorical_accuracy: 0.9715 - tp: 5424.0000 - fp: 118.0000 - tn: 28162.0000 - fn: 232.0000 - precision: 0.9787 - recall: 0.9590 - auc: 0.9976 - prc: 0.9918 - val_loss: 0.0152 - val_categorical_accuracy: 0.9894 - val_tp: 5589.0000 - val_fp: 49.0000 - val_tn: 28231.0000 - val_fn: 67.0000 - val_precision: 0.9913 - val_recall: 0.9882 - val_auc: 0.9993 - val_prc: 0.9980
Epoch 35/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0447 - categorical_accuracy: 0.9627 - tp: 5346.0000 - fp: 156.0000 - tn: 28124.0000 - fn: 310.0000 - precision: 0.9716 - recall: 0.9452 - auc: 0.9967 - prc: 0.9879 - val_loss: 0.0102 - val_categorical_accuracy: 0.9890 - val_tp: 5532.0000 - val_fp: 29.0000 - val_tn: 28251.0000 - val_fn: 124.0000 - val_precision: 0.9948 - val_recall: 0.9781 - val_auc: 0.9995 - val_prc: 0.9983
Epoch 36/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0396 - categorical_accuracy: 0.9632 - tp: 5369.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 287.0000 - precision: 0.9732 - recall: 0.9493 - auc: 0.9972 - prc: 0.9901 - val_loss: 0.0126 - val_categorical_accuracy: 0.9869 - val_tp: 5567.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 89.0000 - val_precision: 0.9897 - val_recall: 0.9843 - val_auc: 0.9997 - val_prc: 0.9987
Epoch 37/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0410 - categorical_accuracy: 0.9618 - tp: 5348.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 308.0000 - precision: 0.9718 - recall: 0.9455 - auc: 0.9975 - prc: 0.9901 - val_loss: 0.0114 - val_categorical_accuracy: 0.9919 - val_tp: 5606.0000 - val_fp: 39.0000 - val_tn: 28241.0000 - val_fn: 50.0000 - val_precision: 0.9931 - val_recall: 0.9912 - val_auc: 0.9994 - val_prc: 0.9986
Epoch 38/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0417 - categorical_accuracy: 0.9602 - tp: 5326.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 330.0000 - precision: 0.9726 - recall: 0.9417 - auc: 0.9969 - prc: 0.9890 - val_loss: 0.0167 - val_categorical_accuracy: 0.9866 - val_tp: 5550.0000 - val_fp: 52.0000 - val_tn: 28228.0000 - val_fn: 106.0000 - val_precision: 0.9907 - val_recall: 0.9813 - val_auc: 0.9990 - val_prc: 0.9967
Epoch 39/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0386 - categorical_accuracy: 0.9675 - tp: 5383.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 273.0000 - precision: 0.9754 - recall: 0.9517 - auc: 0.9976 - prc: 0.9911 - val_loss: 0.0707 - val_categorical_accuracy: 0.9413 - val_tp: 5254.0000 - val_fp: 264.0000 - val_tn: 28016.0000 - val_fn: 402.0000 - val_precision: 0.9522 - val_recall: 0.9289 - val_auc: 0.9955 - val_prc: 0.9835
Epoch 40/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0392 - categorical_accuracy: 0.9653 - tp: 5397.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 259.0000 - precision: 0.9731 - recall: 0.9542 - auc: 0.9977 - prc: 0.9917 - val_loss: 0.0208 - val_categorical_accuracy: 0.9807 - val_tp: 5527.0000 - val_fp: 86.0000 - val_tn: 28194.0000 - val_fn: 129.0000 - val_precision: 0.9847 - val_recall: 0.9772 - val_auc: 0.9991 - val_prc: 0.9971
Epoch 41/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0418 - categorical_accuracy: 0.9623 - tp: 5368.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 288.0000 - precision: 0.9737 - recall: 0.9491 - auc: 0.9974 - prc: 0.9904 - val_loss: 0.0292 - val_categorical_accuracy: 0.9698 - val_tp: 5402.0000 - val_fp: 126.0000 - val_tn: 28154.0000 - val_fn: 254.0000 - val_precision: 0.9772 - val_recall: 0.9551 - val_auc: 0.9980 - val_prc: 0.9929
Epoch 42/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0448 - categorical_accuracy: 0.9620 - tp: 5370.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 286.0000 - precision: 0.9741 - recall: 0.9494 - auc: 0.9968 - prc: 0.9888 - val_loss: 0.0257 - val_categorical_accuracy: 0.9710 - val_tp: 5366.0000 - val_fp: 98.0000 - val_tn: 28182.0000 - val_fn: 290.0000 - val_precision: 0.9821 - val_recall: 0.9487 - val_auc: 0.9981 - val_prc: 0.9930
Epoch 43/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0358 - categorical_accuracy: 0.9673 - tp: 5400.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 256.0000 - precision: 0.9758 - recall: 0.9547 - auc: 0.9979 - prc: 0.9925 - val_loss: 0.0237 - val_categorical_accuracy: 0.9684 - val_tp: 5300.0000 - val_fp: 96.0000 - val_tn: 28184.0000 - val_fn: 356.0000 - val_precision: 0.9822 - val_recall: 0.9371 - val_auc: 0.9984 - val_prc: 0.9935
Epoch 44/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0399 - categorical_accuracy: 0.9641 - tp: 5383.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 273.0000 - precision: 0.9729 - recall: 0.9517 - auc: 0.9971 - prc: 0.9910 - val_loss: 0.0590 - val_categorical_accuracy: 0.9593 - val_tp: 5381.0000 - val_fp: 191.0000 - val_tn: 28089.0000 - val_fn: 275.0000 - val_precision: 0.9657 - val_recall: 0.9514 - val_auc: 0.9965 - val_prc: 0.9876
Epoch 45/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0430 - categorical_accuracy: 0.9607 - tp: 5349.0000 - fp: 164.0000 - tn: 28116.0000 - fn: 307.0000 - precision: 0.9703 - recall: 0.9457 - auc: 0.9973 - prc: 0.9899 - val_loss: 0.0193 - val_categorical_accuracy: 0.9848 - val_tp: 5544.0000 - val_fp: 66.0000 - val_tn: 28214.0000 - val_fn: 112.0000 - val_precision: 0.9882 - val_recall: 0.9802 - val_auc: 0.9992 - val_prc: 0.9972
Epoch 46/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0451 - categorical_accuracy: 0.9634 - tp: 5373.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 283.0000 - precision: 0.9723 - recall: 0.9500 - auc: 0.9970 - prc: 0.9901 - val_loss: 0.0328 - val_categorical_accuracy: 0.9687 - val_tp: 5449.0000 - val_fp: 130.0000 - val_tn: 28150.0000 - val_fn: 207.0000 - val_precision: 0.9767 - val_recall: 0.9634 - val_auc: 0.9985 - val_prc: 0.9946
Epoch 47/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0456 - categorical_accuracy: 0.9691 - tp: 5443.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 213.0000 - precision: 0.9744 - recall: 0.9623 - auc: 0.9972 - prc: 0.9907 - val_loss: 0.0193 - val_categorical_accuracy: 0.9873 - val_tp: 5555.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 101.0000 - val_precision: 0.9891 - val_recall: 0.9821 - val_auc: 0.9989 - val_prc: 0.9966
Epoch 48/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0462 - categorical_accuracy: 0.9648 - tp: 5400.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 256.0000 - precision: 0.9737 - recall: 0.9547 - auc: 0.9970 - prc: 0.9896 - val_loss: 0.0284 - val_categorical_accuracy: 0.9710 - val_tp: 5359.0000 - val_fp: 115.0000 - val_tn: 28165.0000 - val_fn: 297.0000 - val_precision: 0.9790 - val_recall: 0.9475 - val_auc: 0.9980 - val_prc: 0.9927
Epoch 49/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0444 - categorical_accuracy: 0.9630 - tp: 5392.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 264.0000 - precision: 0.9717 - recall: 0.9533 - auc: 0.9974 - prc: 0.9910 - val_loss: 0.0342 - val_categorical_accuracy: 0.9450 - val_tp: 5205.0000 - val_fp: 173.0000 - val_tn: 28107.0000 - val_fn: 451.0000 - val_precision: 0.9678 - val_recall: 0.9203 - val_auc: 0.9973 - val_prc: 0.9889
Epoch 50/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0420 - categorical_accuracy: 0.9669 - tp: 5424.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 232.0000 - precision: 0.9740 - recall: 0.9590 - auc: 0.9977 - prc: 0.9915 - val_loss: 0.0201 - val_categorical_accuracy: 0.9834 - val_tp: 5540.0000 - val_fp: 76.0000 - val_tn: 28204.0000 - val_fn: 116.0000 - val_precision: 0.9865 - val_recall: 0.9795 - val_auc: 0.9993 - val_prc: 0.9971
Epoch 51/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0446 - categorical_accuracy: 0.9650 - tp: 5377.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 279.0000 - precision: 0.9736 - recall: 0.9507 - auc: 0.9972 - prc: 0.9895 - val_loss: 0.0284 - val_categorical_accuracy: 0.9790 - val_tp: 5516.0000 - val_fp: 103.0000 - val_tn: 28177.0000 - val_fn: 140.0000 - val_precision: 0.9817 - val_recall: 0.9752 - val_auc: 0.9987 - val_prc: 0.9960
Epoch 52/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0427 - categorical_accuracy: 0.9678 - tp: 5425.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 231.0000 - precision: 0.9722 - recall: 0.9592 - auc: 0.9976 - prc: 0.9916 - val_loss: 0.0164 - val_categorical_accuracy: 0.9839 - val_tp: 5529.0000 - val_fp: 67.0000 - val_tn: 28213.0000 - val_fn: 127.0000 - val_precision: 0.9880 - val_recall: 0.9775 - val_auc: 0.9993 - val_prc: 0.9974
Epoch 53/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0426 - categorical_accuracy: 0.9659 - tp: 5416.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 240.0000 - precision: 0.9722 - recall: 0.9576 - auc: 0.9977 - prc: 0.9914 - val_loss: 0.0210 - val_categorical_accuracy: 0.9890 - val_tp: 5588.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 68.0000 - val_precision: 0.9906 - val_recall: 0.9880 - val_auc: 0.9991 - val_prc: 0.9972
Epoch 54/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0509 - categorical_accuracy: 0.9638 - tp: 5387.0000 - fp: 164.0000 - tn: 28116.0000 - fn: 269.0000 - precision: 0.9705 - recall: 0.9524 - auc: 0.9970 - prc: 0.9891 - val_loss: 0.0459 - val_categorical_accuracy: 0.9770 - val_tp: 5512.0000 - val_fp: 127.0000 - val_tn: 28153.0000 - val_fn: 144.0000 - val_precision: 0.9775 - val_recall: 0.9745 - val_auc: 0.9983 - val_prc: 0.9947
Epoch 55/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0588 - categorical_accuracy: 0.9599 - tp: 5370.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 286.0000 - precision: 0.9679 - recall: 0.9494 - auc: 0.9962 - prc: 0.9877 - val_loss: 0.0356 - val_categorical_accuracy: 0.9655 - val_tp: 5427.0000 - val_fp: 171.0000 - val_tn: 28109.0000 - val_fn: 229.0000 - val_precision: 0.9695 - val_recall: 0.9595 - val_auc: 0.9985 - val_prc: 0.9944
Epoch 56/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0491 - categorical_accuracy: 0.9646 - tp: 5391.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 265.0000 - precision: 0.9735 - recall: 0.9531 - auc: 0.9969 - prc: 0.9896 - val_loss: 0.0234 - val_categorical_accuracy: 0.9772 - val_tp: 5503.0000 - val_fp: 99.0000 - val_tn: 28181.0000 - val_fn: 153.0000 - val_precision: 0.9823 - val_recall: 0.9729 - val_auc: 0.9989 - val_prc: 0.9960
Epoch 57/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0425 - categorical_accuracy: 0.9676 - tp: 5424.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 232.0000 - precision: 0.9731 - recall: 0.9590 - auc: 0.9973 - prc: 0.9911 - val_loss: 0.0405 - val_categorical_accuracy: 0.9668 - val_tp: 5401.0000 - val_fp: 151.0000 - val_tn: 28129.0000 - val_fn: 255.0000 - val_precision: 0.9728 - val_recall: 0.9549 - val_auc: 0.9974 - val_prc: 0.9916
Epoch 58/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0419 - categorical_accuracy: 0.9652 - tp: 5398.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 258.0000 - precision: 0.9728 - recall: 0.9544 - auc: 0.9975 - prc: 0.9918 - val_loss: 0.0204 - val_categorical_accuracy: 0.9784 - val_tp: 5476.0000 - val_fp: 95.0000 - val_tn: 28185.0000 - val_fn: 180.0000 - val_precision: 0.9829 - val_recall: 0.9682 - val_auc: 0.9990 - val_prc: 0.9964
Epoch 59/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0419 - categorical_accuracy: 0.9703 - tp: 5440.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 216.0000 - precision: 0.9756 - recall: 0.9618 - auc: 0.9973 - prc: 0.9917 - val_loss: 0.0098 - val_categorical_accuracy: 0.9906 - val_tp: 5567.0000 - val_fp: 37.0000 - val_tn: 28243.0000 - val_fn: 89.0000 - val_precision: 0.9934 - val_recall: 0.9843 - val_auc: 0.9995 - val_prc: 0.9987
Epoch 60/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0392 - categorical_accuracy: 0.9646 - tp: 5389.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 267.0000 - precision: 0.9722 - recall: 0.9528 - auc: 0.9975 - prc: 0.9919 - val_loss: 0.0247 - val_categorical_accuracy: 0.9830 - val_tp: 5514.0000 - val_fp: 67.0000 - val_tn: 28213.0000 - val_fn: 142.0000 - val_precision: 0.9880 - val_recall: 0.9749 - val_auc: 0.9987 - val_prc: 0.9955
Epoch 61/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0430 - categorical_accuracy: 0.9685 - tp: 5413.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 243.0000 - precision: 0.9755 - recall: 0.9570 - auc: 0.9973 - prc: 0.9911 - val_loss: 0.0114 - val_categorical_accuracy: 0.9883 - val_tp: 5540.0000 - val_fp: 51.0000 - val_tn: 28229.0000 - val_fn: 116.0000 - val_precision: 0.9909 - val_recall: 0.9795 - val_auc: 0.9994 - val_prc: 0.9985
Epoch 62/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0605 - categorical_accuracy: 0.9615 - tp: 5380.0000 - fp: 176.0000 - tn: 28104.0000 - fn: 276.0000 - precision: 0.9683 - recall: 0.9512 - auc: 0.9959 - prc: 0.9868 - val_loss: 0.0203 - val_categorical_accuracy: 0.9829 - val_tp: 5531.0000 - val_fp: 75.0000 - val_tn: 28205.0000 - val_fn: 125.0000 - val_precision: 0.9866 - val_recall: 0.9779 - val_auc: 0.9991 - val_prc: 0.9966
Epoch 63/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0402 - categorical_accuracy: 0.9708 - tp: 5450.0000 - fp: 128.0000 - tn: 28152.0000 - fn: 206.0000 - precision: 0.9771 - recall: 0.9636 - auc: 0.9979 - prc: 0.9931 - val_loss: 0.0491 - val_categorical_accuracy: 0.9554 - val_tp: 5330.0000 - val_fp: 198.0000 - val_tn: 28082.0000 - val_fn: 326.0000 - val_precision: 0.9642 - val_recall: 0.9424 - val_auc: 0.9972 - val_prc: 0.9892
Epoch 64/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0593 - categorical_accuracy: 0.9630 - tp: 5411.0000 - fp: 172.0000 - tn: 28108.0000 - fn: 245.0000 - precision: 0.9692 - recall: 0.9567 - auc: 0.9965 - prc: 0.9885 - val_loss: 0.0353 - val_categorical_accuracy: 0.9719 - val_tp: 5447.0000 - val_fp: 132.0000 - val_tn: 28148.0000 - val_fn: 209.0000 - val_precision: 0.9763 - val_recall: 0.9630 - val_auc: 0.9980 - val_prc: 0.9937
Epoch 65/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0384 - categorical_accuracy: 0.9705 - tp: 5440.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 216.0000 - precision: 0.9758 - recall: 0.9618 - auc: 0.9980 - prc: 0.9935 - val_loss: 0.0159 - val_categorical_accuracy: 0.9899 - val_tp: 5569.0000 - val_fp: 44.0000 - val_tn: 28236.0000 - val_fn: 87.0000 - val_precision: 0.9922 - val_recall: 0.9846 - val_auc: 0.9988 - val_prc: 0.9973
Epoch 66/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0537 - categorical_accuracy: 0.9676 - tp: 5441.0000 - fp: 158.0000 - tn: 28122.0000 - fn: 215.0000 - precision: 0.9718 - recall: 0.9620 - auc: 0.9967 - prc: 0.9901 - val_loss: 0.0271 - val_categorical_accuracy: 0.9797 - val_tp: 5525.0000 - val_fp: 93.0000 - val_tn: 28187.0000 - val_fn: 131.0000 - val_precision: 0.9834 - val_recall: 0.9768 - val_auc: 0.9990 - val_prc: 0.9969
Epoch 67/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0412 - categorical_accuracy: 0.9659 - tp: 5407.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 249.0000 - precision: 0.9739 - recall: 0.9560 - auc: 0.9972 - prc: 0.9911 - val_loss: 0.0288 - val_categorical_accuracy: 0.9775 - val_tp: 5509.0000 - val_fp: 99.0000 - val_tn: 28181.0000 - val_fn: 147.0000 - val_precision: 0.9823 - val_recall: 0.9740 - val_auc: 0.9987 - val_prc: 0.9956
Epoch 68/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0482 - categorical_accuracy: 0.9630 - tp: 5398.0000 - fp: 156.0000 - tn: 28124.0000 - fn: 258.0000 - precision: 0.9719 - recall: 0.9544 - auc: 0.9972 - prc: 0.9906 - val_loss: 0.1062 - val_categorical_accuracy: 0.9279 - val_tp: 5172.0000 - val_fp: 308.0000 - val_tn: 27972.0000 - val_fn: 484.0000 - val_precision: 0.9438 - val_recall: 0.9144 - val_auc: 0.9922 - val_prc: 0.9733
Epoch 69/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0384 - categorical_accuracy: 0.9692 - tp: 5437.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 219.0000 - precision: 0.9758 - recall: 0.9613 - auc: 0.9980 - prc: 0.9929 - val_loss: 0.1514 - val_categorical_accuracy: 0.9134 - val_tp: 5095.0000 - val_fp: 415.0000 - val_tn: 27865.0000 - val_fn: 561.0000 - val_precision: 0.9247 - val_recall: 0.9008 - val_auc: 0.9891 - val_prc: 0.9625
Epoch 70/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0593 - categorical_accuracy: 0.9645 - tp: 5407.0000 - fp: 167.0000 - tn: 28113.0000 - fn: 249.0000 - precision: 0.9700 - recall: 0.9560 - auc: 0.9964 - prc: 0.9889 - val_loss: 0.0460 - val_categorical_accuracy: 0.9669 - val_tp: 5426.0000 - val_fp: 138.0000 - val_tn: 28142.0000 - val_fn: 230.0000 - val_precision: 0.9752 - val_recall: 0.9593 - val_auc: 0.9978 - val_prc: 0.9923
Epoch 71/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0563 - categorical_accuracy: 0.9630 - tp: 5390.0000 - fp: 167.0000 - tn: 28113.0000 - fn: 266.0000 - precision: 0.9699 - recall: 0.9530 - auc: 0.9964 - prc: 0.9889 - val_loss: 0.0497 - val_categorical_accuracy: 0.9701 - val_tp: 5418.0000 - val_fp: 133.0000 - val_tn: 28147.0000 - val_fn: 238.0000 - val_precision: 0.9760 - val_recall: 0.9579 - val_auc: 0.9971 - val_prc: 0.9907
Epoch 72/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0577 - categorical_accuracy: 0.9653 - tp: 5425.0000 - fp: 165.0000 - tn: 28115.0000 - fn: 231.0000 - precision: 0.9705 - recall: 0.9592 - auc: 0.9963 - prc: 0.9892 - val_loss: 0.0273 - val_categorical_accuracy: 0.9871 - val_tp: 5558.0000 - val_fp: 68.0000 - val_tn: 28212.0000 - val_fn: 98.0000 - val_precision: 0.9879 - val_recall: 0.9827 - val_auc: 0.9987 - val_prc: 0.9962
Epoch 73/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0615 - categorical_accuracy: 0.9641 - tp: 5407.0000 - fp: 159.0000 - tn: 28121.0000 - fn: 249.0000 - precision: 0.9714 - recall: 0.9560 - auc: 0.9967 - prc: 0.9891 - val_loss: 0.0166 - val_categorical_accuracy: 0.9832 - val_tp: 5542.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 114.0000 - val_precision: 0.9872 - val_recall: 0.9798 - val_auc: 0.9994 - val_prc: 0.9977
Epoch 74/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0415 - categorical_accuracy: 0.9691 - tp: 5449.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 207.0000 - precision: 0.9748 - recall: 0.9634 - auc: 0.9975 - prc: 0.9921 - val_loss: 0.1774 - val_categorical_accuracy: 0.9441 - val_tp: 5319.0000 - val_fp: 290.0000 - val_tn: 27990.0000 - val_fn: 337.0000 - val_precision: 0.9483 - val_recall: 0.9404 - val_auc: 0.9900 - val_prc: 0.9703
Epoch 75/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0639 - categorical_accuracy: 0.9602 - tp: 5375.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 281.0000 - precision: 0.9679 - recall: 0.9503 - auc: 0.9961 - prc: 0.9871 - val_loss: 0.0222 - val_categorical_accuracy: 0.9804 - val_tp: 5499.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 157.0000 - val_precision: 0.9871 - val_recall: 0.9722 - val_auc: 0.9988 - val_prc: 0.9960
Epoch 76/300
177/177 [==============================] - 14s 82ms/step - loss: 0.0634 - categorical_accuracy: 0.9657 - tp: 5437.0000 - fp: 158.0000 - tn: 28122.0000 - fn: 219.0000 - precision: 0.9718 - recall: 0.9613 - auc: 0.9968 - prc: 0.9899 - val_loss: 0.0220 - val_categorical_accuracy: 0.9844 - val_tp: 5533.0000 - val_fp: 71.0000 - val_tn: 28209.0000 - val_fn: 123.0000 - val_precision: 0.9873 - val_recall: 0.9783 - val_auc: 0.9990 - val_prc: 0.9967
Epoch 77/300
177/177 [==============================] - 15s 87ms/step - loss: 0.0531 - categorical_accuracy: 0.9627 - tp: 5397.0000 - fp: 168.0000 - tn: 28112.0000 - fn: 259.0000 - precision: 0.9698 - recall: 0.9542 - auc: 0.9968 - prc: 0.9900 - val_loss: 0.0294 - val_categorical_accuracy: 0.9802 - val_tp: 5527.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 129.0000 - val_precision: 0.9840 - val_recall: 0.9772 - val_auc: 0.9986 - val_prc: 0.9960
Epoch 78/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0487 - categorical_accuracy: 0.9691 - tp: 5450.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 206.0000 - precision: 0.9736 - recall: 0.9636 - auc: 0.9977 - prc: 0.9924 - val_loss: 0.0253 - val_categorical_accuracy: 0.9837 - val_tp: 5557.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 99.0000 - val_precision: 0.9853 - val_recall: 0.9825 - val_auc: 0.9989 - val_prc: 0.9966
Epoch 79/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0625 - categorical_accuracy: 0.9669 - tp: 5439.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 217.0000 - precision: 0.9726 - recall: 0.9616 - auc: 0.9963 - prc: 0.9891 - val_loss: 0.0869 - val_categorical_accuracy: 0.9588 - val_tp: 5403.0000 - val_fp: 213.0000 - val_tn: 28067.0000 - val_fn: 253.0000 - val_precision: 0.9621 - val_recall: 0.9553 - val_auc: 0.9957 - val_prc: 0.9868
Epoch 80/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0710 - categorical_accuracy: 0.9687 - tp: 5461.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 195.0000 - precision: 0.9729 - recall: 0.9655 - auc: 0.9965 - prc: 0.9893 - val_loss: 0.0288 - val_categorical_accuracy: 0.9791 - val_tp: 5511.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 145.0000 - val_precision: 0.9832 - val_recall: 0.9744 - val_auc: 0.9986 - val_prc: 0.9959
Epoch 81/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0586 - categorical_accuracy: 0.9641 - tp: 5404.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 252.0000 - precision: 0.9718 - recall: 0.9554 - auc: 0.9966 - prc: 0.9888 - val_loss: 0.0558 - val_categorical_accuracy: 0.9618 - val_tp: 5374.0000 - val_fp: 166.0000 - val_tn: 28114.0000 - val_fn: 282.0000 - val_precision: 0.9700 - val_recall: 0.9501 - val_auc: 0.9969 - val_prc: 0.9889
Epoch 82/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0593 - categorical_accuracy: 0.9646 - tp: 5422.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 234.0000 - precision: 0.9703 - recall: 0.9586 - auc: 0.9969 - prc: 0.9903 - val_loss: 0.0796 - val_categorical_accuracy: 0.9652 - val_tp: 5438.0000 - val_fp: 167.0000 - val_tn: 28113.0000 - val_fn: 218.0000 - val_precision: 0.9702 - val_recall: 0.9615 - val_auc: 0.9953 - val_prc: 0.9862
Epoch 83/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0564 - categorical_accuracy: 0.9687 - tp: 5454.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 202.0000 - precision: 0.9731 - recall: 0.9643 - auc: 0.9967 - prc: 0.9905 - val_loss: 0.0616 - val_categorical_accuracy: 0.9558 - val_tp: 5361.0000 - val_fp: 211.0000 - val_tn: 28069.0000 - val_fn: 295.0000 - val_precision: 0.9621 - val_recall: 0.9478 - val_auc: 0.9967 - val_prc: 0.9881
Epoch 84/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0566 - categorical_accuracy: 0.9632 - tp: 5412.0000 - fp: 162.0000 - tn: 28118.0000 - fn: 244.0000 - precision: 0.9709 - recall: 0.9569 - auc: 0.9970 - prc: 0.9903 - val_loss: 0.0263 - val_categorical_accuracy: 0.9768 - val_tp: 5457.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 199.0000 - val_precision: 0.9838 - val_recall: 0.9648 - val_auc: 0.9984 - val_prc: 0.9946
Epoch 85/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0624 - categorical_accuracy: 0.9703 - tp: 5459.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 197.0000 - precision: 0.9736 - recall: 0.9652 - auc: 0.9964 - prc: 0.9895 - val_loss: 0.0590 - val_categorical_accuracy: 0.9781 - val_tp: 5530.0000 - val_fp: 122.0000 - val_tn: 28158.0000 - val_fn: 126.0000 - val_precision: 0.9784 - val_recall: 0.9777 - val_auc: 0.9972 - val_prc: 0.9930
Epoch 86/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0699 - categorical_accuracy: 0.9676 - tp: 5442.0000 - fp: 162.0000 - tn: 28118.0000 - fn: 214.0000 - precision: 0.9711 - recall: 0.9622 - auc: 0.9960 - prc: 0.9877 - val_loss: 0.0319 - val_categorical_accuracy: 0.9788 - val_tp: 5508.0000 - val_fp: 93.0000 - val_tn: 28187.0000 - val_fn: 148.0000 - val_precision: 0.9834 - val_recall: 0.9738 - val_auc: 0.9983 - val_prc: 0.9954
Epoch 87/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0765 - categorical_accuracy: 0.9689 - tp: 5462.0000 - fp: 158.0000 - tn: 28122.0000 - fn: 194.0000 - precision: 0.9719 - recall: 0.9657 - auc: 0.9963 - prc: 0.9896 - val_loss: 0.0435 - val_categorical_accuracy: 0.9825 - val_tp: 5547.0000 - val_fp: 91.0000 - val_tn: 28189.0000 - val_fn: 109.0000 - val_precision: 0.9839 - val_recall: 0.9807 - val_auc: 0.9980 - val_prc: 0.9943
Epoch 88/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0758 - categorical_accuracy: 0.9615 - tp: 5393.0000 - fp: 177.0000 - tn: 28103.0000 - fn: 263.0000 - precision: 0.9682 - recall: 0.9535 - auc: 0.9957 - prc: 0.9863 - val_loss: 0.0428 - val_categorical_accuracy: 0.9804 - val_tp: 5529.0000 - val_fp: 98.0000 - val_tn: 28182.0000 - val_fn: 127.0000 - val_precision: 0.9826 - val_recall: 0.9775 - val_auc: 0.9981 - val_prc: 0.9943
Epoch 89/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0520 - categorical_accuracy: 0.9721 - tp: 5471.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 185.0000 - precision: 0.9764 - recall: 0.9673 - auc: 0.9975 - prc: 0.9919 - val_loss: 0.1204 - val_categorical_accuracy: 0.9409 - val_tp: 5297.0000 - val_fp: 301.0000 - val_tn: 27979.0000 - val_fn: 359.0000 - val_precision: 0.9462 - val_recall: 0.9365 - val_auc: 0.9934 - val_prc: 0.9787
Epoch 90/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0604 - categorical_accuracy: 0.9659 - tp: 5437.0000 - fp: 165.0000 - tn: 28115.0000 - fn: 219.0000 - precision: 0.9705 - recall: 0.9613 - auc: 0.9964 - prc: 0.9890 - val_loss: 0.0233 - val_categorical_accuracy: 0.9880 - val_tp: 5582.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 74.0000 - val_precision: 0.9892 - val_recall: 0.9869 - val_auc: 0.9989 - val_prc: 0.9973
Epoch 91/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0619 - categorical_accuracy: 0.9629 - tp: 5394.0000 - fp: 168.0000 - tn: 28112.0000 - fn: 262.0000 - precision: 0.9698 - recall: 0.9537 - auc: 0.9965 - prc: 0.9892 - val_loss: 0.0339 - val_categorical_accuracy: 0.9726 - val_tp: 5455.0000 - val_fp: 123.0000 - val_tn: 28157.0000 - val_fn: 201.0000 - val_precision: 0.9779 - val_recall: 0.9645 - val_auc: 0.9980 - val_prc: 0.9938
Epoch 92/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0583 - categorical_accuracy: 0.9636 - tp: 5394.0000 - fp: 162.0000 - tn: 28118.0000 - fn: 262.0000 - precision: 0.9708 - recall: 0.9537 - auc: 0.9961 - prc: 0.9885 - val_loss: 0.0604 - val_categorical_accuracy: 0.9535 - val_tp: 5313.0000 - val_fp: 196.0000 - val_tn: 28084.0000 - val_fn: 343.0000 - val_precision: 0.9644 - val_recall: 0.9394 - val_auc: 0.9960 - val_prc: 0.9869
Epoch 93/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0504 - categorical_accuracy: 0.9662 - tp: 5436.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 220.0000 - precision: 0.9719 - recall: 0.9611 - auc: 0.9972 - prc: 0.9916 - val_loss: 0.0226 - val_categorical_accuracy: 0.9837 - val_tp: 5522.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 134.0000 - val_precision: 0.9875 - val_recall: 0.9763 - val_auc: 0.9991 - val_prc: 0.9967
Epoch 94/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0474 - categorical_accuracy: 0.9760 - tp: 5487.0000 - fp: 116.0000 - tn: 28164.0000 - fn: 169.0000 - precision: 0.9793 - recall: 0.9701 - auc: 0.9976 - prc: 0.9924 - val_loss: 0.0436 - val_categorical_accuracy: 0.9641 - val_tp: 5380.0000 - val_fp: 153.0000 - val_tn: 28127.0000 - val_fn: 276.0000 - val_precision: 0.9723 - val_recall: 0.9512 - val_auc: 0.9975 - val_prc: 0.9911
Epoch 95/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0726 - categorical_accuracy: 0.9625 - tp: 5420.0000 - fp: 182.0000 - tn: 28098.0000 - fn: 236.0000 - precision: 0.9675 - recall: 0.9583 - auc: 0.9959 - prc: 0.9878 - val_loss: 0.0745 - val_categorical_accuracy: 0.9438 - val_tp: 5279.0000 - val_fp: 250.0000 - val_tn: 28030.0000 - val_fn: 377.0000 - val_precision: 0.9548 - val_recall: 0.9333 - val_auc: 0.9956 - val_prc: 0.9842
Epoch 96/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0579 - categorical_accuracy: 0.9661 - tp: 5415.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 241.0000 - precision: 0.9739 - recall: 0.9574 - auc: 0.9967 - prc: 0.9888 - val_loss: 0.0134 - val_categorical_accuracy: 0.9880 - val_tp: 5572.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 84.0000 - val_precision: 0.9897 - val_recall: 0.9851 - val_auc: 0.9994 - val_prc: 0.9986
Epoch 97/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0519 - categorical_accuracy: 0.9707 - tp: 5465.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 191.0000 - precision: 0.9736 - recall: 0.9662 - auc: 0.9972 - prc: 0.9917 - val_loss: 0.0230 - val_categorical_accuracy: 0.9860 - val_tp: 5564.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 92.0000 - val_precision: 0.9892 - val_recall: 0.9837 - val_auc: 0.9990 - val_prc: 0.9973
Epoch 98/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0558 - categorical_accuracy: 0.9659 - tp: 5424.0000 - fp: 160.0000 - tn: 28120.0000 - fn: 232.0000 - precision: 0.9713 - recall: 0.9590 - auc: 0.9969 - prc: 0.9905 - val_loss: 0.0233 - val_categorical_accuracy: 0.9878 - val_tp: 5572.0000 - val_fp: 54.0000 - val_tn: 28226.0000 - val_fn: 84.0000 - val_precision: 0.9904 - val_recall: 0.9851 - val_auc: 0.9990 - val_prc: 0.9972
Epoch 99/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0412 - categorical_accuracy: 0.9767 - tp: 5507.0000 - fp: 109.0000 - tn: 28171.0000 - fn: 149.0000 - precision: 0.9806 - recall: 0.9737 - auc: 0.9979 - prc: 0.9942 - val_loss: 0.1175 - val_categorical_accuracy: 0.9611 - val_tp: 5420.0000 - val_fp: 209.0000 - val_tn: 28071.0000 - val_fn: 236.0000 - val_precision: 0.9629 - val_recall: 0.9583 - val_auc: 0.9940 - val_prc: 0.9829
Epoch 100/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0528 - categorical_accuracy: 0.9737 - tp: 5495.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 161.0000 - precision: 0.9767 - recall: 0.9715 - auc: 0.9977 - prc: 0.9929 - val_loss: 0.0369 - val_categorical_accuracy: 0.9890 - val_tp: 5591.0000 - val_fp: 57.0000 - val_tn: 28223.0000 - val_fn: 65.0000 - val_precision: 0.9899 - val_recall: 0.9885 - val_auc: 0.9978 - val_prc: 0.9948
Epoch 101/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0806 - categorical_accuracy: 0.9673 - tp: 5444.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 212.0000 - precision: 0.9704 - recall: 0.9625 - auc: 0.9955 - prc: 0.9867 - val_loss: 0.0463 - val_categorical_accuracy: 0.9717 - val_tp: 5472.0000 - val_fp: 136.0000 - val_tn: 28144.0000 - val_fn: 184.0000 - val_precision: 0.9757 - val_recall: 0.9675 - val_auc: 0.9977 - val_prc: 0.9931
Epoch 102/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0686 - categorical_accuracy: 0.9705 - tp: 5460.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 196.0000 - precision: 0.9729 - recall: 0.9653 - auc: 0.9966 - prc: 0.9897 - val_loss: 0.0667 - val_categorical_accuracy: 0.9726 - val_tp: 5490.0000 - val_fp: 140.0000 - val_tn: 28140.0000 - val_fn: 166.0000 - val_precision: 0.9751 - val_recall: 0.9707 - val_auc: 0.9970 - val_prc: 0.9915
Epoch 103/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0602 - categorical_accuracy: 0.9643 - tp: 5406.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 250.0000 - precision: 0.9702 - recall: 0.9558 - auc: 0.9968 - prc: 0.9900 - val_loss: 0.0365 - val_categorical_accuracy: 0.9860 - val_tp: 5571.0000 - val_fp: 66.0000 - val_tn: 28214.0000 - val_fn: 85.0000 - val_precision: 0.9883 - val_recall: 0.9850 - val_auc: 0.9979 - val_prc: 0.9941
Epoch 104/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0659 - categorical_accuracy: 0.9613 - tp: 5390.0000 - fp: 172.0000 - tn: 28108.0000 - fn: 266.0000 - precision: 0.9691 - recall: 0.9530 - auc: 0.9955 - prc: 0.9865 - val_loss: 0.0237 - val_categorical_accuracy: 0.9871 - val_tp: 5573.0000 - val_fp: 64.0000 - val_tn: 28216.0000 - val_fn: 83.0000 - val_precision: 0.9886 - val_recall: 0.9853 - val_auc: 0.9991 - val_prc: 0.9978
Epoch 105/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0684 - categorical_accuracy: 0.9684 - tp: 5447.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 209.0000 - precision: 0.9734 - recall: 0.9630 - auc: 0.9965 - prc: 0.9889 - val_loss: 0.0494 - val_categorical_accuracy: 0.9744 - val_tp: 5494.0000 - val_fp: 129.0000 - val_tn: 28151.0000 - val_fn: 162.0000 - val_precision: 0.9771 - val_recall: 0.9714 - val_auc: 0.9974 - val_prc: 0.9921
Epoch 106/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0653 - categorical_accuracy: 0.9639 - tp: 5418.0000 - fp: 170.0000 - tn: 28110.0000 - fn: 238.0000 - precision: 0.9696 - recall: 0.9579 - auc: 0.9961 - prc: 0.9890 - val_loss: 0.0607 - val_categorical_accuracy: 0.9275 - val_tp: 5087.0000 - val_fp: 279.0000 - val_tn: 28001.0000 - val_fn: 569.0000 - val_precision: 0.9480 - val_recall: 0.8994 - val_auc: 0.9952 - val_prc: 0.9803
Epoch 107/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0583 - categorical_accuracy: 0.9703 - tp: 5462.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 194.0000 - precision: 0.9734 - recall: 0.9657 - auc: 0.9968 - prc: 0.9909 - val_loss: 0.0651 - val_categorical_accuracy: 0.9760 - val_tp: 5513.0000 - val_fp: 129.0000 - val_tn: 28151.0000 - val_fn: 143.0000 - val_precision: 0.9771 - val_recall: 0.9747 - val_auc: 0.9964 - val_prc: 0.9911
Epoch 108/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0499 - categorical_accuracy: 0.9738 - tp: 5484.0000 - fp: 127.0000 - tn: 28153.0000 - fn: 172.0000 - precision: 0.9774 - recall: 0.9696 - auc: 0.9974 - prc: 0.9927 - val_loss: 0.0290 - val_categorical_accuracy: 0.9859 - val_tp: 5565.0000 - val_fp: 73.0000 - val_tn: 28207.0000 - val_fn: 91.0000 - val_precision: 0.9871 - val_recall: 0.9839 - val_auc: 0.9983 - val_prc: 0.9954
Epoch 109/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0654 - categorical_accuracy: 0.9692 - tp: 5464.0000 - fp: 159.0000 - tn: 28121.0000 - fn: 192.0000 - precision: 0.9717 - recall: 0.9661 - auc: 0.9968 - prc: 0.9908 - val_loss: 0.0874 - val_categorical_accuracy: 0.9729 - val_tp: 5499.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 157.0000 - val_precision: 0.9741 - val_recall: 0.9722 - val_auc: 0.9952 - val_prc: 0.9865
Epoch 110/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0650 - categorical_accuracy: 0.9701 - tp: 5457.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 199.0000 - precision: 0.9739 - recall: 0.9648 - auc: 0.9966 - prc: 0.9903 - val_loss: 0.0320 - val_categorical_accuracy: 0.9797 - val_tp: 5526.0000 - val_fp: 105.0000 - val_tn: 28175.0000 - val_fn: 130.0000 - val_precision: 0.9814 - val_recall: 0.9770 - val_auc: 0.9985 - val_prc: 0.9957
Epoch 111/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0744 - categorical_accuracy: 0.9722 - tp: 5484.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 172.0000 - precision: 0.9749 - recall: 0.9696 - auc: 0.9962 - prc: 0.9899 - val_loss: 0.0476 - val_categorical_accuracy: 0.9837 - val_tp: 5562.0000 - val_fp: 87.0000 - val_tn: 28193.0000 - val_fn: 94.0000 - val_precision: 0.9846 - val_recall: 0.9834 - val_auc: 0.9974 - val_prc: 0.9938
Epoch 112/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1062 - categorical_accuracy: 0.9676 - tp: 5460.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 196.0000 - precision: 0.9705 - recall: 0.9653 - auc: 0.9949 - prc: 0.9856 - val_loss: 0.0642 - val_categorical_accuracy: 0.9806 - val_tp: 5532.0000 - val_fp: 100.0000 - val_tn: 28180.0000 - val_fn: 124.0000 - val_precision: 0.9822 - val_recall: 0.9781 - val_auc: 0.9968 - val_prc: 0.9908
Epoch 113/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0914 - categorical_accuracy: 0.9653 - tp: 5444.0000 - fp: 175.0000 - tn: 28105.0000 - fn: 212.0000 - precision: 0.9689 - recall: 0.9625 - auc: 0.9953 - prc: 0.9866 - val_loss: 0.0549 - val_categorical_accuracy: 0.9763 - val_tp: 5509.0000 - val_fp: 120.0000 - val_tn: 28160.0000 - val_fn: 147.0000 - val_precision: 0.9787 - val_recall: 0.9740 - val_auc: 0.9971 - val_prc: 0.9919
Epoch 114/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0898 - categorical_accuracy: 0.9692 - tp: 5465.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 191.0000 - precision: 0.9736 - recall: 0.9662 - auc: 0.9953 - prc: 0.9863 - val_loss: 0.0453 - val_categorical_accuracy: 0.9770 - val_tp: 5499.0000 - val_fp: 107.0000 - val_tn: 28173.0000 - val_fn: 157.0000 - val_precision: 0.9809 - val_recall: 0.9722 - val_auc: 0.9977 - val_prc: 0.9924
Epoch 115/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0789 - categorical_accuracy: 0.9717 - tp: 5479.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 177.0000 - precision: 0.9740 - recall: 0.9687 - auc: 0.9960 - prc: 0.9887 - val_loss: 0.0810 - val_categorical_accuracy: 0.9770 - val_tp: 5518.0000 - val_fp: 120.0000 - val_tn: 28160.0000 - val_fn: 138.0000 - val_precision: 0.9787 - val_recall: 0.9756 - val_auc: 0.9958 - val_prc: 0.9886
Epoch 116/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0906 - categorical_accuracy: 0.9717 - tp: 5474.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 182.0000 - precision: 0.9735 - recall: 0.9678 - auc: 0.9957 - prc: 0.9878 - val_loss: 0.0375 - val_categorical_accuracy: 0.9859 - val_tp: 5573.0000 - val_fp: 74.0000 - val_tn: 28206.0000 - val_fn: 83.0000 - val_precision: 0.9869 - val_recall: 0.9853 - val_auc: 0.9983 - val_prc: 0.9957
Epoch 117/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0708 - categorical_accuracy: 0.9729 - tp: 5480.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 176.0000 - precision: 0.9758 - recall: 0.9689 - auc: 0.9969 - prc: 0.9908 - val_loss: 0.0364 - val_categorical_accuracy: 0.9807 - val_tp: 5538.0000 - val_fp: 101.0000 - val_tn: 28179.0000 - val_fn: 118.0000 - val_precision: 0.9821 - val_recall: 0.9791 - val_auc: 0.9988 - val_prc: 0.9960
Epoch 118/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0835 - categorical_accuracy: 0.9671 - tp: 5452.0000 - fp: 163.0000 - tn: 28117.0000 - fn: 204.0000 - precision: 0.9710 - recall: 0.9639 - auc: 0.9959 - prc: 0.9875 - val_loss: 0.0329 - val_categorical_accuracy: 0.9866 - val_tp: 5574.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 82.0000 - val_precision: 0.9872 - val_recall: 0.9855 - val_auc: 0.9981 - val_prc: 0.9952
Epoch 119/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0532 - categorical_accuracy: 0.9708 - tp: 5457.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 199.0000 - precision: 0.9766 - recall: 0.9648 - auc: 0.9974 - prc: 0.9921 - val_loss: 0.0163 - val_categorical_accuracy: 0.9862 - val_tp: 5550.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 106.0000 - val_precision: 0.9897 - val_recall: 0.9813 - val_auc: 0.9992 - val_prc: 0.9978
Epoch 120/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0553 - categorical_accuracy: 0.9701 - tp: 5462.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 194.0000 - precision: 0.9745 - recall: 0.9657 - auc: 0.9971 - prc: 0.9916 - val_loss: 0.0732 - val_categorical_accuracy: 0.9531 - val_tp: 5362.0000 - val_fp: 222.0000 - val_tn: 28058.0000 - val_fn: 294.0000 - val_precision: 0.9602 - val_recall: 0.9480 - val_auc: 0.9957 - val_prc: 0.9864
Epoch 121/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0847 - categorical_accuracy: 0.9652 - tp: 5435.0000 - fp: 179.0000 - tn: 28101.0000 - fn: 221.0000 - precision: 0.9681 - recall: 0.9609 - auc: 0.9955 - prc: 0.9871 - val_loss: 0.0641 - val_categorical_accuracy: 0.9475 - val_tp: 5238.0000 - val_fp: 206.0000 - val_tn: 28074.0000 - val_fn: 418.0000 - val_precision: 0.9622 - val_recall: 0.9261 - val_auc: 0.9951 - val_prc: 0.9831
Epoch 122/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0675 - categorical_accuracy: 0.9728 - tp: 5486.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 170.0000 - precision: 0.9762 - recall: 0.9699 - auc: 0.9961 - prc: 0.9893 - val_loss: 0.0365 - val_categorical_accuracy: 0.9798 - val_tp: 5529.0000 - val_fp: 103.0000 - val_tn: 28177.0000 - val_fn: 127.0000 - val_precision: 0.9817 - val_recall: 0.9775 - val_auc: 0.9977 - val_prc: 0.9942
Epoch 123/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0675 - categorical_accuracy: 0.9717 - tp: 5472.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 184.0000 - precision: 0.9744 - recall: 0.9675 - auc: 0.9968 - prc: 0.9909 - val_loss: 0.0271 - val_categorical_accuracy: 0.9878 - val_tp: 5581.0000 - val_fp: 65.0000 - val_tn: 28215.0000 - val_fn: 75.0000 - val_precision: 0.9885 - val_recall: 0.9867 - val_auc: 0.9988 - val_prc: 0.9971
Epoch 124/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0690 - categorical_accuracy: 0.9737 - tp: 5484.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 172.0000 - precision: 0.9756 - recall: 0.9696 - auc: 0.9969 - prc: 0.9907 - val_loss: 0.0457 - val_categorical_accuracy: 0.9765 - val_tp: 5511.0000 - val_fp: 114.0000 - val_tn: 28166.0000 - val_fn: 145.0000 - val_precision: 0.9797 - val_recall: 0.9744 - val_auc: 0.9977 - val_prc: 0.9934
Epoch 125/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0697 - categorical_accuracy: 0.9740 - tp: 5482.0000 - fp: 129.0000 - tn: 28151.0000 - fn: 174.0000 - precision: 0.9770 - recall: 0.9692 - auc: 0.9960 - prc: 0.9893 - val_loss: 0.0565 - val_categorical_accuracy: 0.9797 - val_tp: 5531.0000 - val_fp: 105.0000 - val_tn: 28175.0000 - val_fn: 125.0000 - val_precision: 0.9814 - val_recall: 0.9779 - val_auc: 0.9970 - val_prc: 0.9919
Epoch 126/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0795 - categorical_accuracy: 0.9740 - tp: 5500.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 156.0000 - precision: 0.9754 - recall: 0.9724 - auc: 0.9963 - prc: 0.9898 - val_loss: 0.0735 - val_categorical_accuracy: 0.9795 - val_tp: 5532.0000 - val_fp: 113.0000 - val_tn: 28167.0000 - val_fn: 124.0000 - val_precision: 0.9800 - val_recall: 0.9781 - val_auc: 0.9960 - val_prc: 0.9895
Epoch 127/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0755 - categorical_accuracy: 0.9705 - tp: 5464.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 192.0000 - precision: 0.9740 - recall: 0.9661 - auc: 0.9964 - prc: 0.9894 - val_loss: 0.0379 - val_categorical_accuracy: 0.9821 - val_tp: 5532.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 124.0000 - val_precision: 0.9852 - val_recall: 0.9781 - val_auc: 0.9981 - val_prc: 0.9949
Epoch 128/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0604 - categorical_accuracy: 0.9719 - tp: 5478.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 178.0000 - precision: 0.9749 - recall: 0.9685 - auc: 0.9974 - prc: 0.9920 - val_loss: 0.1036 - val_categorical_accuracy: 0.9721 - val_tp: 5492.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 164.0000 - val_precision: 0.9741 - val_recall: 0.9710 - val_auc: 0.9947 - val_prc: 0.9857
Epoch 129/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0901 - categorical_accuracy: 0.9669 - tp: 5445.0000 - fp: 170.0000 - tn: 28110.0000 - fn: 211.0000 - precision: 0.9697 - recall: 0.9627 - auc: 0.9948 - prc: 0.9859 - val_loss: 0.1177 - val_categorical_accuracy: 0.9699 - val_tp: 5473.0000 - val_fp: 165.0000 - val_tn: 28115.0000 - val_fn: 183.0000 - val_precision: 0.9707 - val_recall: 0.9676 - val_auc: 0.9941 - val_prc: 0.9835
Epoch 130/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0990 - categorical_accuracy: 0.9664 - tp: 5443.0000 - fp: 177.0000 - tn: 28103.0000 - fn: 213.0000 - precision: 0.9685 - recall: 0.9623 - auc: 0.9950 - prc: 0.9862 - val_loss: 0.0292 - val_categorical_accuracy: 0.9887 - val_tp: 5584.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 72.0000 - val_precision: 0.9897 - val_recall: 0.9873 - val_auc: 0.9985 - val_prc: 0.9962
Epoch 131/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0960 - categorical_accuracy: 0.9698 - tp: 5470.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 186.0000 - precision: 0.9721 - recall: 0.9671 - auc: 0.9954 - prc: 0.9871 - val_loss: 0.0903 - val_categorical_accuracy: 0.9740 - val_tp: 5500.0000 - val_fp: 137.0000 - val_tn: 28143.0000 - val_fn: 156.0000 - val_precision: 0.9757 - val_recall: 0.9724 - val_auc: 0.9949 - val_prc: 0.9864
Epoch 132/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0791 - categorical_accuracy: 0.9698 - tp: 5449.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 207.0000 - precision: 0.9741 - recall: 0.9634 - auc: 0.9959 - prc: 0.9879 - val_loss: 0.0439 - val_categorical_accuracy: 0.9890 - val_tp: 5591.0000 - val_fp: 62.0000 - val_tn: 28218.0000 - val_fn: 65.0000 - val_precision: 0.9890 - val_recall: 0.9885 - val_auc: 0.9979 - val_prc: 0.9947
Epoch 133/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0894 - categorical_accuracy: 0.9661 - tp: 5450.0000 - fp: 173.0000 - tn: 28107.0000 - fn: 206.0000 - precision: 0.9692 - recall: 0.9636 - auc: 0.9950 - prc: 0.9861 - val_loss: 0.0556 - val_categorical_accuracy: 0.9775 - val_tp: 5504.0000 - val_fp: 117.0000 - val_tn: 28163.0000 - val_fn: 152.0000 - val_precision: 0.9792 - val_recall: 0.9731 - val_auc: 0.9972 - val_prc: 0.9921
Epoch 134/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1108 - categorical_accuracy: 0.9652 - tp: 5448.0000 - fp: 180.0000 - tn: 28100.0000 - fn: 208.0000 - precision: 0.9680 - recall: 0.9632 - auc: 0.9943 - prc: 0.9838 - val_loss: 0.0817 - val_categorical_accuracy: 0.9701 - val_tp: 5480.0000 - val_fp: 148.0000 - val_tn: 28132.0000 - val_fn: 176.0000 - val_precision: 0.9737 - val_recall: 0.9689 - val_auc: 0.9951 - val_prc: 0.9869
Epoch 135/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0745 - categorical_accuracy: 0.9733 - tp: 5487.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 169.0000 - precision: 0.9756 - recall: 0.9701 - auc: 0.9962 - prc: 0.9893 - val_loss: 0.0476 - val_categorical_accuracy: 0.9806 - val_tp: 5531.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 125.0000 - val_precision: 0.9840 - val_recall: 0.9779 - val_auc: 0.9975 - val_prc: 0.9929
Epoch 136/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0822 - categorical_accuracy: 0.9726 - tp: 5481.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 175.0000 - precision: 0.9756 - recall: 0.9691 - auc: 0.9952 - prc: 0.9866 - val_loss: 0.0500 - val_categorical_accuracy: 0.9829 - val_tp: 5556.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 100.0000 - val_precision: 0.9834 - val_recall: 0.9823 - val_auc: 0.9976 - val_prc: 0.9938
Epoch 137/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0921 - categorical_accuracy: 0.9740 - tp: 5501.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 155.0000 - precision: 0.9750 - recall: 0.9726 - auc: 0.9954 - prc: 0.9872 - val_loss: 0.1129 - val_categorical_accuracy: 0.9606 - val_tp: 5420.0000 - val_fp: 194.0000 - val_tn: 28086.0000 - val_fn: 236.0000 - val_precision: 0.9654 - val_recall: 0.9583 - val_auc: 0.9940 - val_prc: 0.9826
Epoch 138/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1019 - categorical_accuracy: 0.9724 - tp: 5483.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 173.0000 - precision: 0.9748 - recall: 0.9694 - auc: 0.9948 - prc: 0.9858 - val_loss: 0.0343 - val_categorical_accuracy: 0.9908 - val_tp: 5601.0000 - val_fp: 48.0000 - val_tn: 28232.0000 - val_fn: 55.0000 - val_precision: 0.9915 - val_recall: 0.9903 - val_auc: 0.9983 - val_prc: 0.9960
Epoch 139/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0857 - categorical_accuracy: 0.9707 - tp: 5467.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 189.0000 - precision: 0.9733 - recall: 0.9666 - auc: 0.9951 - prc: 0.9864 - val_loss: 0.0525 - val_categorical_accuracy: 0.9673 - val_tp: 5435.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 221.0000 - val_precision: 0.9738 - val_recall: 0.9609 - val_auc: 0.9972 - val_prc: 0.9915
Epoch 140/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0746 - categorical_accuracy: 0.9699 - tp: 5455.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 201.0000 - precision: 0.9734 - recall: 0.9645 - auc: 0.9962 - prc: 0.9888 - val_loss: 0.1016 - val_categorical_accuracy: 0.9747 - val_tp: 5507.0000 - val_fp: 138.0000 - val_tn: 28142.0000 - val_fn: 149.0000 - val_precision: 0.9756 - val_recall: 0.9737 - val_auc: 0.9947 - val_prc: 0.9860
Epoch 141/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0674 - categorical_accuracy: 0.9694 - tp: 5444.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 212.0000 - precision: 0.9739 - recall: 0.9625 - auc: 0.9968 - prc: 0.9903 - val_loss: 0.0461 - val_categorical_accuracy: 0.9710 - val_tp: 5458.0000 - val_fp: 135.0000 - val_tn: 28145.0000 - val_fn: 198.0000 - val_precision: 0.9759 - val_recall: 0.9650 - val_auc: 0.9979 - val_prc: 0.9935
Epoch 142/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0527 - categorical_accuracy: 0.9791 - tp: 5521.0000 - fp: 108.0000 - tn: 28172.0000 - fn: 135.0000 - precision: 0.9808 - recall: 0.9761 - auc: 0.9974 - prc: 0.9931 - val_loss: 0.0469 - val_categorical_accuracy: 0.9866 - val_tp: 5576.0000 - val_fp: 73.0000 - val_tn: 28207.0000 - val_fn: 80.0000 - val_precision: 0.9871 - val_recall: 0.9859 - val_auc: 0.9972 - val_prc: 0.9928
Epoch 143/300
177/177 [==============================] - 15s 84ms/step - loss: 0.1031 - categorical_accuracy: 0.9622 - tp: 5406.0000 - fp: 187.0000 - tn: 28093.0000 - fn: 250.0000 - precision: 0.9666 - recall: 0.9558 - auc: 0.9940 - prc: 0.9828 - val_loss: 0.0646 - val_categorical_accuracy: 0.9724 - val_tp: 5475.0000 - val_fp: 138.0000 - val_tn: 28142.0000 - val_fn: 181.0000 - val_precision: 0.9754 - val_recall: 0.9680 - val_auc: 0.9972 - val_prc: 0.9910
Epoch 144/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0906 - categorical_accuracy: 0.9692 - tp: 5466.0000 - fp: 160.0000 - tn: 28120.0000 - fn: 190.0000 - precision: 0.9716 - recall: 0.9664 - auc: 0.9953 - prc: 0.9870 - val_loss: 0.0302 - val_categorical_accuracy: 0.9791 - val_tp: 5517.0000 - val_fp: 92.0000 - val_tn: 28188.0000 - val_fn: 139.0000 - val_precision: 0.9836 - val_recall: 0.9754 - val_auc: 0.9985 - val_prc: 0.9959
Epoch 145/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0564 - categorical_accuracy: 0.9691 - tp: 5456.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 200.0000 - precision: 0.9736 - recall: 0.9646 - auc: 0.9968 - prc: 0.9916 - val_loss: 0.0542 - val_categorical_accuracy: 0.9708 - val_tp: 5474.0000 - val_fp: 139.0000 - val_tn: 28141.0000 - val_fn: 182.0000 - val_precision: 0.9752 - val_recall: 0.9678 - val_auc: 0.9976 - val_prc: 0.9925
Epoch 146/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0522 - categorical_accuracy: 0.9742 - tp: 5490.0000 - fp: 115.0000 - tn: 28165.0000 - fn: 166.0000 - precision: 0.9795 - recall: 0.9707 - auc: 0.9975 - prc: 0.9926 - val_loss: 0.0470 - val_categorical_accuracy: 0.9772 - val_tp: 5505.0000 - val_fp: 117.0000 - val_tn: 28163.0000 - val_fn: 151.0000 - val_precision: 0.9792 - val_recall: 0.9733 - val_auc: 0.9977 - val_prc: 0.9933
Epoch 147/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0621 - categorical_accuracy: 0.9751 - tp: 5507.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 149.0000 - precision: 0.9768 - recall: 0.9737 - auc: 0.9973 - prc: 0.9922 - val_loss: 0.1523 - val_categorical_accuracy: 0.9567 - val_tp: 5394.0000 - val_fp: 232.0000 - val_tn: 28048.0000 - val_fn: 262.0000 - val_precision: 0.9588 - val_recall: 0.9537 - val_auc: 0.9918 - val_prc: 0.9765
Epoch 148/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0849 - categorical_accuracy: 0.9740 - tp: 5498.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 158.0000 - precision: 0.9766 - recall: 0.9721 - auc: 0.9954 - prc: 0.9871 - val_loss: 0.0385 - val_categorical_accuracy: 0.9675 - val_tp: 5403.0000 - val_fp: 126.0000 - val_tn: 28154.0000 - val_fn: 253.0000 - val_precision: 0.9772 - val_recall: 0.9553 - val_auc: 0.9979 - val_prc: 0.9923
Epoch 149/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0638 - categorical_accuracy: 0.9721 - tp: 5477.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 179.0000 - precision: 0.9761 - recall: 0.9684 - auc: 0.9966 - prc: 0.9902 - val_loss: 0.0331 - val_categorical_accuracy: 0.9885 - val_tp: 5591.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 65.0000 - val_precision: 0.9892 - val_recall: 0.9885 - val_auc: 0.9982 - val_prc: 0.9960
Epoch 150/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0539 - categorical_accuracy: 0.9811 - tp: 5539.0000 - fp: 103.0000 - tn: 28177.0000 - fn: 117.0000 - precision: 0.9817 - recall: 0.9793 - auc: 0.9975 - prc: 0.9930 - val_loss: 0.0977 - val_categorical_accuracy: 0.9668 - val_tp: 5450.0000 - val_fp: 168.0000 - val_tn: 28112.0000 - val_fn: 206.0000 - val_precision: 0.9701 - val_recall: 0.9636 - val_auc: 0.9942 - val_prc: 0.9847
Epoch 151/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0945 - categorical_accuracy: 0.9737 - tp: 5500.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 156.0000 - precision: 0.9754 - recall: 0.9724 - auc: 0.9951 - prc: 0.9867 - val_loss: 0.0459 - val_categorical_accuracy: 0.9749 - val_tp: 5492.0000 - val_fp: 120.0000 - val_tn: 28160.0000 - val_fn: 164.0000 - val_precision: 0.9786 - val_recall: 0.9710 - val_auc: 0.9981 - val_prc: 0.9945
Epoch 152/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0679 - categorical_accuracy: 0.9749 - tp: 5501.0000 - fp: 118.0000 - tn: 28162.0000 - fn: 155.0000 - precision: 0.9790 - recall: 0.9726 - auc: 0.9965 - prc: 0.9902 - val_loss: 0.0199 - val_categorical_accuracy: 0.9866 - val_tp: 5567.0000 - val_fp: 63.0000 - val_tn: 28217.0000 - val_fn: 89.0000 - val_precision: 0.9888 - val_recall: 0.9843 - val_auc: 0.9994 - val_prc: 0.9984
Epoch 153/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0617 - categorical_accuracy: 0.9763 - tp: 5510.0000 - fp: 124.0000 - tn: 28156.0000 - fn: 146.0000 - precision: 0.9780 - recall: 0.9742 - auc: 0.9973 - prc: 0.9924 - val_loss: 0.1583 - val_categorical_accuracy: 0.9731 - val_tp: 5503.0000 - val_fp: 150.0000 - val_tn: 28130.0000 - val_fn: 153.0000 - val_precision: 0.9735 - val_recall: 0.9729 - val_auc: 0.9920 - val_prc: 0.9788
Epoch 154/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0881 - categorical_accuracy: 0.9754 - tp: 5511.0000 - fp: 125.0000 - tn: 28155.0000 - fn: 145.0000 - precision: 0.9778 - recall: 0.9744 - auc: 0.9956 - prc: 0.9880 - val_loss: 0.0741 - val_categorical_accuracy: 0.9781 - val_tp: 5531.0000 - val_fp: 116.0000 - val_tn: 28164.0000 - val_fn: 125.0000 - val_precision: 0.9795 - val_recall: 0.9779 - val_auc: 0.9960 - val_prc: 0.9892
Epoch 155/300
177/177 [==============================] - 15s 84ms/step - loss: 0.1073 - categorical_accuracy: 0.9669 - tp: 5454.0000 - fp: 169.0000 - tn: 28111.0000 - fn: 202.0000 - precision: 0.9699 - recall: 0.9643 - auc: 0.9943 - prc: 0.9842 - val_loss: 0.0666 - val_categorical_accuracy: 0.9717 - val_tp: 5489.0000 - val_fp: 149.0000 - val_tn: 28131.0000 - val_fn: 167.0000 - val_precision: 0.9736 - val_recall: 0.9705 - val_auc: 0.9970 - val_prc: 0.9918
Epoch 156/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0934 - categorical_accuracy: 0.9717 - tp: 5473.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 183.0000 - precision: 0.9740 - recall: 0.9676 - auc: 0.9954 - prc: 0.9871 - val_loss: 0.0671 - val_categorical_accuracy: 0.9790 - val_tp: 5516.0000 - val_fp: 101.0000 - val_tn: 28179.0000 - val_fn: 140.0000 - val_precision: 0.9820 - val_recall: 0.9752 - val_auc: 0.9965 - val_prc: 0.9901
Epoch 157/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0924 - categorical_accuracy: 0.9783 - tp: 5522.0000 - fp: 115.0000 - tn: 28165.0000 - fn: 134.0000 - precision: 0.9796 - recall: 0.9763 - auc: 0.9953 - prc: 0.9870 - val_loss: 0.0795 - val_categorical_accuracy: 0.9756 - val_tp: 5507.0000 - val_fp: 130.0000 - val_tn: 28150.0000 - val_fn: 149.0000 - val_precision: 0.9769 - val_recall: 0.9737 - val_auc: 0.9961 - val_prc: 0.9898
Epoch 158/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0917 - categorical_accuracy: 0.9775 - tp: 5525.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 131.0000 - precision: 0.9786 - recall: 0.9768 - auc: 0.9950 - prc: 0.9872 - val_loss: 0.0603 - val_categorical_accuracy: 0.9795 - val_tp: 5538.0000 - val_fp: 110.0000 - val_tn: 28170.0000 - val_fn: 118.0000 - val_precision: 0.9805 - val_recall: 0.9791 - val_auc: 0.9976 - val_prc: 0.9928
Epoch 159/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0815 - categorical_accuracy: 0.9802 - tp: 5541.0000 - fp: 108.0000 - tn: 28172.0000 - fn: 115.0000 - precision: 0.9809 - recall: 0.9797 - auc: 0.9955 - prc: 0.9880 - val_loss: 0.0836 - val_categorical_accuracy: 0.9827 - val_tp: 5556.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 100.0000 - val_precision: 0.9834 - val_recall: 0.9823 - val_auc: 0.9952 - val_prc: 0.9871
Epoch 160/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1021 - categorical_accuracy: 0.9772 - tp: 5523.0000 - fp: 123.0000 - tn: 28157.0000 - fn: 133.0000 - precision: 0.9782 - recall: 0.9765 - auc: 0.9945 - prc: 0.9860 - val_loss: 0.0634 - val_categorical_accuracy: 0.9775 - val_tp: 5520.0000 - val_fp: 117.0000 - val_tn: 28163.0000 - val_fn: 136.0000 - val_precision: 0.9792 - val_recall: 0.9760 - val_auc: 0.9972 - val_prc: 0.9920
Epoch 161/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1053 - categorical_accuracy: 0.9775 - tp: 5524.0000 - fp: 124.0000 - tn: 28156.0000 - fn: 132.0000 - precision: 0.9780 - recall: 0.9767 - auc: 0.9945 - prc: 0.9850 - val_loss: 0.0436 - val_categorical_accuracy: 0.9857 - val_tp: 5571.0000 - val_fp: 73.0000 - val_tn: 28207.0000 - val_fn: 85.0000 - val_precision: 0.9871 - val_recall: 0.9850 - val_auc: 0.9977 - val_prc: 0.9939
Epoch 162/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0828 - categorical_accuracy: 0.9699 - tp: 5472.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 184.0000 - precision: 0.9726 - recall: 0.9675 - auc: 0.9954 - prc: 0.9872 - val_loss: 0.0717 - val_categorical_accuracy: 0.9627 - val_tp: 5416.0000 - val_fp: 184.0000 - val_tn: 28096.0000 - val_fn: 240.0000 - val_precision: 0.9671 - val_recall: 0.9576 - val_auc: 0.9966 - val_prc: 0.9895
Epoch 163/300
177/177 [==============================] - 16s 89ms/step - loss: 0.0790 - categorical_accuracy: 0.9772 - tp: 5517.0000 - fp: 123.0000 - tn: 28157.0000 - fn: 139.0000 - precision: 0.9782 - recall: 0.9754 - auc: 0.9963 - prc: 0.9900 - val_loss: 0.0608 - val_categorical_accuracy: 0.9846 - val_tp: 5566.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 90.0000 - val_precision: 0.9853 - val_recall: 0.9841 - val_auc: 0.9968 - val_prc: 0.9912
Epoch 164/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0781 - categorical_accuracy: 0.9809 - tp: 5546.0000 - fp: 104.0000 - tn: 28176.0000 - fn: 110.0000 - precision: 0.9816 - recall: 0.9806 - auc: 0.9964 - prc: 0.9907 - val_loss: 0.1212 - val_categorical_accuracy: 0.9602 - val_tp: 5422.0000 - val_fp: 206.0000 - val_tn: 28074.0000 - val_fn: 234.0000 - val_precision: 0.9634 - val_recall: 0.9586 - val_auc: 0.9931 - val_prc: 0.9811
Epoch 165/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1069 - categorical_accuracy: 0.9758 - tp: 5514.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 142.0000 - precision: 0.9768 - recall: 0.9749 - auc: 0.9946 - prc: 0.9856 - val_loss: 0.0873 - val_categorical_accuracy: 0.9756 - val_tp: 5513.0000 - val_fp: 137.0000 - val_tn: 28143.0000 - val_fn: 143.0000 - val_precision: 0.9758 - val_recall: 0.9747 - val_auc: 0.9958 - val_prc: 0.9890
Epoch 166/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1047 - categorical_accuracy: 0.9772 - tp: 5525.0000 - fp: 125.0000 - tn: 28155.0000 - fn: 131.0000 - precision: 0.9779 - recall: 0.9768 - auc: 0.9951 - prc: 0.9871 - val_loss: 0.0327 - val_categorical_accuracy: 0.9905 - val_tp: 5597.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 59.0000 - val_precision: 0.9906 - val_recall: 0.9896 - val_auc: 0.9983 - val_prc: 0.9960
Epoch 167/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0767 - categorical_accuracy: 0.9784 - tp: 5525.0000 - fp: 113.0000 - tn: 28167.0000 - fn: 131.0000 - precision: 0.9800 - recall: 0.9768 - auc: 0.9966 - prc: 0.9909 - val_loss: 0.0786 - val_categorical_accuracy: 0.9638 - val_tp: 5433.0000 - val_fp: 177.0000 - val_tn: 28103.0000 - val_fn: 223.0000 - val_precision: 0.9684 - val_recall: 0.9606 - val_auc: 0.9960 - val_prc: 0.9870
Epoch 168/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0629 - categorical_accuracy: 0.9784 - tp: 5525.0000 - fp: 112.0000 - tn: 28168.0000 - fn: 131.0000 - precision: 0.9801 - recall: 0.9768 - auc: 0.9970 - prc: 0.9920 - val_loss: 0.0371 - val_categorical_accuracy: 0.9920 - val_tp: 5607.0000 - val_fp: 42.0000 - val_tn: 28238.0000 - val_fn: 49.0000 - val_precision: 0.9926 - val_recall: 0.9913 - val_auc: 0.9979 - val_prc: 0.9945
Epoch 169/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0790 - categorical_accuracy: 0.9790 - tp: 5524.0000 - fp: 112.0000 - tn: 28168.0000 - fn: 132.0000 - precision: 0.9801 - recall: 0.9767 - auc: 0.9961 - prc: 0.9894 - val_loss: 0.0411 - val_categorical_accuracy: 0.9896 - val_tp: 5589.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 67.0000 - val_precision: 0.9897 - val_recall: 0.9882 - val_auc: 0.9981 - val_prc: 0.9948
Epoch 170/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0754 - categorical_accuracy: 0.9768 - tp: 5516.0000 - fp: 120.0000 - tn: 28160.0000 - fn: 140.0000 - precision: 0.9787 - recall: 0.9752 - auc: 0.9963 - prc: 0.9900 - val_loss: 0.0536 - val_categorical_accuracy: 0.9846 - val_tp: 5567.0000 - val_fp: 84.0000 - val_tn: 28196.0000 - val_fn: 89.0000 - val_precision: 0.9851 - val_recall: 0.9843 - val_auc: 0.9969 - val_prc: 0.9924
Epoch 171/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0930 - categorical_accuracy: 0.9798 - tp: 5541.0000 - fp: 111.0000 - tn: 28169.0000 - fn: 115.0000 - precision: 0.9804 - recall: 0.9797 - auc: 0.9949 - prc: 0.9875 - val_loss: 0.0583 - val_categorical_accuracy: 0.9912 - val_tp: 5606.0000 - val_fp: 50.0000 - val_tn: 28230.0000 - val_fn: 50.0000 - val_precision: 0.9912 - val_recall: 0.9912 - val_auc: 0.9968 - val_prc: 0.9921
Epoch 172/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0801 - categorical_accuracy: 0.9800 - tp: 5541.0000 - fp: 111.0000 - tn: 28169.0000 - fn: 115.0000 - precision: 0.9804 - recall: 0.9797 - auc: 0.9960 - prc: 0.9901 - val_loss: 0.0365 - val_categorical_accuracy: 0.9942 - val_tp: 5622.0000 - val_fp: 33.0000 - val_tn: 28247.0000 - val_fn: 34.0000 - val_precision: 0.9942 - val_recall: 0.9940 - val_auc: 0.9978 - val_prc: 0.9953
Epoch 173/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0819 - categorical_accuracy: 0.9834 - tp: 5560.0000 - fp: 88.0000 - tn: 28192.0000 - fn: 96.0000 - precision: 0.9844 - recall: 0.9830 - auc: 0.9962 - prc: 0.9897 - val_loss: 0.0805 - val_categorical_accuracy: 0.9459 - val_tp: 5288.0000 - val_fp: 238.0000 - val_tn: 28042.0000 - val_fn: 368.0000 - val_precision: 0.9569 - val_recall: 0.9349 - val_auc: 0.9948 - val_prc: 0.9836
Epoch 174/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0827 - categorical_accuracy: 0.9722 - tp: 5484.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 172.0000 - precision: 0.9739 - recall: 0.9696 - auc: 0.9960 - prc: 0.9888 - val_loss: 0.0727 - val_categorical_accuracy: 0.9883 - val_tp: 5590.0000 - val_fp: 66.0000 - val_tn: 28214.0000 - val_fn: 66.0000 - val_precision: 0.9883 - val_recall: 0.9883 - val_auc: 0.9964 - val_prc: 0.9905
Epoch 175/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0898 - categorical_accuracy: 0.9818 - tp: 5550.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 106.0000 - precision: 0.9823 - recall: 0.9813 - auc: 0.9957 - prc: 0.9885 - val_loss: 0.1052 - val_categorical_accuracy: 0.9846 - val_tp: 5568.0000 - val_fp: 87.0000 - val_tn: 28193.0000 - val_fn: 88.0000 - val_precision: 0.9846 - val_recall: 0.9844 - val_auc: 0.9952 - val_prc: 0.9870
Epoch 176/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0984 - categorical_accuracy: 0.9692 - tp: 5466.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 190.0000 - precision: 0.9726 - recall: 0.9664 - auc: 0.9951 - prc: 0.9868 - val_loss: 0.0289 - val_categorical_accuracy: 0.9915 - val_tp: 5602.0000 - val_fp: 42.0000 - val_tn: 28238.0000 - val_fn: 54.0000 - val_precision: 0.9926 - val_recall: 0.9905 - val_auc: 0.9984 - val_prc: 0.9959
Epoch 177/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0793 - categorical_accuracy: 0.9834 - tp: 5560.0000 - fp: 91.0000 - tn: 28189.0000 - fn: 96.0000 - precision: 0.9839 - recall: 0.9830 - auc: 0.9959 - prc: 0.9895 - val_loss: 0.0717 - val_categorical_accuracy: 0.9880 - val_tp: 5588.0000 - val_fp: 68.0000 - val_tn: 28212.0000 - val_fn: 68.0000 - val_precision: 0.9880 - val_recall: 0.9880 - val_auc: 0.9963 - val_prc: 0.9907
Epoch 178/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0873 - categorical_accuracy: 0.9738 - tp: 5504.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 152.0000 - precision: 0.9748 - recall: 0.9731 - auc: 0.9956 - prc: 0.9875 - val_loss: 0.0530 - val_categorical_accuracy: 0.9768 - val_tp: 5510.0000 - val_fp: 115.0000 - val_tn: 28165.0000 - val_fn: 146.0000 - val_precision: 0.9796 - val_recall: 0.9742 - val_auc: 0.9979 - val_prc: 0.9940
Epoch 179/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0905 - categorical_accuracy: 0.9698 - tp: 5474.0000 - fp: 161.0000 - tn: 28119.0000 - fn: 182.0000 - precision: 0.9714 - recall: 0.9678 - auc: 0.9955 - prc: 0.9878 - val_loss: 0.0343 - val_categorical_accuracy: 0.9860 - val_tp: 5574.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 82.0000 - val_precision: 0.9872 - val_recall: 0.9855 - val_auc: 0.9986 - val_prc: 0.9964
Epoch 180/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0624 - categorical_accuracy: 0.9836 - tp: 5556.0000 - fp: 88.0000 - tn: 28192.0000 - fn: 100.0000 - precision: 0.9844 - recall: 0.9823 - auc: 0.9970 - prc: 0.9923 - val_loss: 0.0235 - val_categorical_accuracy: 0.9936 - val_tp: 5620.0000 - val_fp: 35.0000 - val_tn: 28245.0000 - val_fn: 36.0000 - val_precision: 0.9938 - val_recall: 0.9936 - val_auc: 0.9989 - val_prc: 0.9972
Epoch 181/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0747 - categorical_accuracy: 0.9844 - tp: 5565.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 91.0000 - precision: 0.9850 - recall: 0.9839 - auc: 0.9960 - prc: 0.9897 - val_loss: 0.0703 - val_categorical_accuracy: 0.9876 - val_tp: 5586.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 70.0000 - val_precision: 0.9876 - val_recall: 0.9876 - val_auc: 0.9963 - val_prc: 0.9908
Epoch 182/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0975 - categorical_accuracy: 0.9779 - tp: 5531.0000 - fp: 122.0000 - tn: 28158.0000 - fn: 125.0000 - precision: 0.9784 - recall: 0.9779 - auc: 0.9956 - prc: 0.9883 - val_loss: 0.0563 - val_categorical_accuracy: 0.9905 - val_tp: 5602.0000 - val_fp: 54.0000 - val_tn: 28226.0000 - val_fn: 54.0000 - val_precision: 0.9905 - val_recall: 0.9905 - val_auc: 0.9971 - val_prc: 0.9924
Epoch 183/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0903 - categorical_accuracy: 0.9814 - tp: 5549.0000 - fp: 102.0000 - tn: 28178.0000 - fn: 107.0000 - precision: 0.9820 - recall: 0.9811 - auc: 0.9957 - prc: 0.9884 - val_loss: 0.0605 - val_categorical_accuracy: 0.9839 - val_tp: 5564.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 92.0000 - val_precision: 0.9841 - val_recall: 0.9837 - val_auc: 0.9971 - val_prc: 0.9929
Epoch 184/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0691 - categorical_accuracy: 0.9807 - tp: 5543.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 113.0000 - precision: 0.9823 - recall: 0.9800 - auc: 0.9968 - prc: 0.9917 - val_loss: 0.0571 - val_categorical_accuracy: 0.9816 - val_tp: 5537.0000 - val_fp: 96.0000 - val_tn: 28184.0000 - val_fn: 119.0000 - val_precision: 0.9830 - val_recall: 0.9790 - val_auc: 0.9972 - val_prc: 0.9922
Epoch 185/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0798 - categorical_accuracy: 0.9783 - tp: 5519.0000 - fp: 113.0000 - tn: 28167.0000 - fn: 137.0000 - precision: 0.9799 - recall: 0.9758 - auc: 0.9955 - prc: 0.9880 - val_loss: 0.0727 - val_categorical_accuracy: 0.9832 - val_tp: 5558.0000 - val_fp: 93.0000 - val_tn: 28187.0000 - val_fn: 98.0000 - val_precision: 0.9835 - val_recall: 0.9827 - val_auc: 0.9962 - val_prc: 0.9902
Epoch 186/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1147 - categorical_accuracy: 0.9775 - tp: 5527.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 129.0000 - precision: 0.9786 - recall: 0.9772 - auc: 0.9945 - prc: 0.9850 - val_loss: 0.0690 - val_categorical_accuracy: 0.9892 - val_tp: 5594.0000 - val_fp: 60.0000 - val_tn: 28220.0000 - val_fn: 62.0000 - val_precision: 0.9894 - val_recall: 0.9890 - val_auc: 0.9962 - val_prc: 0.9900
Epoch 187/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0781 - categorical_accuracy: 0.9806 - tp: 5541.0000 - fp: 102.0000 - tn: 28178.0000 - fn: 115.0000 - precision: 0.9819 - recall: 0.9797 - auc: 0.9966 - prc: 0.9908 - val_loss: 0.0382 - val_categorical_accuracy: 0.9885 - val_tp: 5586.0000 - val_fp: 62.0000 - val_tn: 28218.0000 - val_fn: 70.0000 - val_precision: 0.9890 - val_recall: 0.9876 - val_auc: 0.9981 - val_prc: 0.9950
Epoch 188/300
177/177 [==============================] - 15s 85ms/step - loss: 0.1067 - categorical_accuracy: 0.9837 - tp: 5563.0000 - fp: 89.0000 - tn: 28191.0000 - fn: 93.0000 - precision: 0.9843 - recall: 0.9836 - auc: 0.9952 - prc: 0.9873 - val_loss: 0.0514 - val_categorical_accuracy: 0.9924 - val_tp: 5613.0000 - val_fp: 43.0000 - val_tn: 28237.0000 - val_fn: 43.0000 - val_precision: 0.9924 - val_recall: 0.9924 - val_auc: 0.9977 - val_prc: 0.9937
Epoch 189/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0779 - categorical_accuracy: 0.9806 - tp: 5542.0000 - fp: 107.0000 - tn: 28173.0000 - fn: 114.0000 - precision: 0.9811 - recall: 0.9798 - auc: 0.9956 - prc: 0.9889 - val_loss: 0.0503 - val_categorical_accuracy: 0.9912 - val_tp: 5606.0000 - val_fp: 49.0000 - val_tn: 28231.0000 - val_fn: 50.0000 - val_precision: 0.9913 - val_recall: 0.9912 - val_auc: 0.9971 - val_prc: 0.9925
Epoch 190/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0884 - categorical_accuracy: 0.9811 - tp: 5548.0000 - fp: 104.0000 - tn: 28176.0000 - fn: 108.0000 - precision: 0.9816 - recall: 0.9809 - auc: 0.9957 - prc: 0.9885 - val_loss: 0.0915 - val_categorical_accuracy: 0.9866 - val_tp: 5580.0000 - val_fp: 74.0000 - val_tn: 28206.0000 - val_fn: 76.0000 - val_precision: 0.9869 - val_recall: 0.9866 - val_auc: 0.9959 - val_prc: 0.9890
Epoch 191/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1554 - categorical_accuracy: 0.9772 - tp: 5525.0000 - fp: 127.0000 - tn: 28153.0000 - fn: 131.0000 - precision: 0.9775 - recall: 0.9768 - auc: 0.9918 - prc: 0.9789 - val_loss: 0.0582 - val_categorical_accuracy: 0.9889 - val_tp: 5591.0000 - val_fp: 60.0000 - val_tn: 28220.0000 - val_fn: 65.0000 - val_precision: 0.9894 - val_recall: 0.9885 - val_auc: 0.9976 - val_prc: 0.9935
Epoch 192/300
177/177 [==============================] - 15s 85ms/step - loss: 0.1141 - categorical_accuracy: 0.9775 - tp: 5528.0000 - fp: 122.0000 - tn: 28158.0000 - fn: 128.0000 - precision: 0.9784 - recall: 0.9774 - auc: 0.9940 - prc: 0.9843 - val_loss: 0.0611 - val_categorical_accuracy: 0.9896 - val_tp: 5596.0000 - val_fp: 59.0000 - val_tn: 28221.0000 - val_fn: 60.0000 - val_precision: 0.9896 - val_recall: 0.9894 - val_auc: 0.9970 - val_prc: 0.9928
Epoch 193/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1022 - categorical_accuracy: 0.9751 - tp: 5512.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 144.0000 - precision: 0.9759 - recall: 0.9745 - auc: 0.9952 - prc: 0.9874 - val_loss: 0.0611 - val_categorical_accuracy: 0.9648 - val_tp: 5429.0000 - val_fp: 169.0000 - val_tn: 28111.0000 - val_fn: 227.0000 - val_precision: 0.9698 - val_recall: 0.9599 - val_auc: 0.9971 - val_prc: 0.9910
Epoch 194/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0552 - categorical_accuracy: 0.9814 - tp: 5545.0000 - fp: 94.0000 - tn: 28186.0000 - fn: 111.0000 - precision: 0.9833 - recall: 0.9804 - auc: 0.9971 - prc: 0.9927 - val_loss: 0.0285 - val_categorical_accuracy: 0.9890 - val_tp: 5591.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 65.0000 - val_precision: 0.9892 - val_recall: 0.9885 - val_auc: 0.9986 - val_prc: 0.9966
Epoch 195/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0804 - categorical_accuracy: 0.9823 - tp: 5555.0000 - fp: 97.0000 - tn: 28183.0000 - fn: 101.0000 - precision: 0.9828 - recall: 0.9821 - auc: 0.9954 - prc: 0.9889 - val_loss: 0.0442 - val_categorical_accuracy: 0.9901 - val_tp: 5599.0000 - val_fp: 56.0000 - val_tn: 28224.0000 - val_fn: 57.0000 - val_precision: 0.9901 - val_recall: 0.9899 - val_auc: 0.9977 - val_prc: 0.9942
Epoch 196/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0988 - categorical_accuracy: 0.9811 - tp: 5547.0000 - fp: 105.0000 - tn: 28175.0000 - fn: 109.0000 - precision: 0.9814 - recall: 0.9807 - auc: 0.9952 - prc: 0.9878 - val_loss: 0.0917 - val_categorical_accuracy: 0.9844 - val_tp: 5565.0000 - val_fp: 87.0000 - val_tn: 28193.0000 - val_fn: 91.0000 - val_precision: 0.9846 - val_recall: 0.9839 - val_auc: 0.9956 - val_prc: 0.9882
Epoch 197/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0752 - categorical_accuracy: 0.9823 - tp: 5554.0000 - fp: 98.0000 - tn: 28182.0000 - fn: 102.0000 - precision: 0.9827 - recall: 0.9820 - auc: 0.9964 - prc: 0.9909 - val_loss: 0.1054 - val_categorical_accuracy: 0.9811 - val_tp: 5549.0000 - val_fp: 104.0000 - val_tn: 28176.0000 - val_fn: 107.0000 - val_precision: 0.9816 - val_recall: 0.9811 - val_auc: 0.9946 - val_prc: 0.9863
Epoch 198/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0758 - categorical_accuracy: 0.9791 - tp: 5533.0000 - fp: 107.0000 - tn: 28173.0000 - fn: 123.0000 - precision: 0.9810 - recall: 0.9783 - auc: 0.9961 - prc: 0.9901 - val_loss: 0.0353 - val_categorical_accuracy: 0.9889 - val_tp: 5592.0000 - val_fp: 55.0000 - val_tn: 28225.0000 - val_fn: 64.0000 - val_precision: 0.9903 - val_recall: 0.9887 - val_auc: 0.9991 - val_prc: 0.9974
Epoch 199/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0583 - categorical_accuracy: 0.9797 - tp: 5530.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 126.0000 - precision: 0.9822 - recall: 0.9777 - auc: 0.9974 - prc: 0.9928 - val_loss: 0.0331 - val_categorical_accuracy: 0.9894 - val_tp: 5593.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 63.0000 - val_precision: 0.9906 - val_recall: 0.9889 - val_auc: 0.9987 - val_prc: 0.9970
Epoch 200/300
177/177 [==============================] - 16s 88ms/step - loss: 0.0717 - categorical_accuracy: 0.9829 - tp: 5556.0000 - fp: 92.0000 - tn: 28188.0000 - fn: 100.0000 - precision: 0.9837 - recall: 0.9823 - auc: 0.9965 - prc: 0.9910 - val_loss: 0.0306 - val_categorical_accuracy: 0.9899 - val_tp: 5597.0000 - val_fp: 49.0000 - val_tn: 28231.0000 - val_fn: 59.0000 - val_precision: 0.9913 - val_recall: 0.9896 - val_auc: 0.9984 - val_prc: 0.9957
Epoch 201/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0603 - categorical_accuracy: 0.9857 - tp: 5568.0000 - fp: 72.0000 - tn: 28208.0000 - fn: 88.0000 - precision: 0.9872 - recall: 0.9844 - auc: 0.9975 - prc: 0.9935 - val_loss: 0.0468 - val_categorical_accuracy: 0.9837 - val_tp: 5560.0000 - val_fp: 89.0000 - val_tn: 28191.0000 - val_fn: 96.0000 - val_precision: 0.9842 - val_recall: 0.9830 - val_auc: 0.9976 - val_prc: 0.9940
Epoch 202/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0982 - categorical_accuracy: 0.9809 - tp: 5548.0000 - fp: 105.0000 - tn: 28175.0000 - fn: 108.0000 - precision: 0.9814 - recall: 0.9809 - auc: 0.9948 - prc: 0.9869 - val_loss: 0.0843 - val_categorical_accuracy: 0.9869 - val_tp: 5582.0000 - val_fp: 74.0000 - val_tn: 28206.0000 - val_fn: 74.0000 - val_precision: 0.9869 - val_recall: 0.9869 - val_auc: 0.9956 - val_prc: 0.9891
Epoch 203/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0827 - categorical_accuracy: 0.9848 - tp: 5570.0000 - fp: 86.0000 - tn: 28194.0000 - fn: 86.0000 - precision: 0.9848 - recall: 0.9848 - auc: 0.9957 - prc: 0.9887 - val_loss: 0.0543 - val_categorical_accuracy: 0.9928 - val_tp: 5614.0000 - val_fp: 41.0000 - val_tn: 28239.0000 - val_fn: 42.0000 - val_precision: 0.9927 - val_recall: 0.9926 - val_auc: 0.9974 - val_prc: 0.9932
Epoch 204/300
177/177 [==============================] - 15s 85ms/step - loss: 0.0731 - categorical_accuracy: 0.9829 - tp: 5557.0000 - fp: 91.0000 - tn: 28189.0000 - fn: 99.0000 - precision: 0.9839 - recall: 0.9825 - auc: 0.9960 - prc: 0.9901 - val_loss: 0.0677 - val_categorical_accuracy: 0.9850 - val_tp: 5568.0000 - val_fp: 82.0000 - val_tn: 28198.0000 - val_fn: 88.0000 - val_precision: 0.9855 - val_recall: 0.9844 - val_auc: 0.9966 - val_prc: 0.9909
Epoch 205/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0730 - categorical_accuracy: 0.9843 - tp: 5567.0000 - fp: 88.0000 - tn: 28192.0000 - fn: 89.0000 - precision: 0.9844 - recall: 0.9843 - auc: 0.9971 - prc: 0.9924 - val_loss: 0.0449 - val_categorical_accuracy: 0.9936 - val_tp: 5619.0000 - val_fp: 36.0000 - val_tn: 28244.0000 - val_fn: 37.0000 - val_precision: 0.9936 - val_recall: 0.9935 - val_auc: 0.9977 - val_prc: 0.9937
Epoch 206/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0862 - categorical_accuracy: 0.9850 - tp: 5570.0000 - fp: 84.0000 - tn: 28196.0000 - fn: 86.0000 - precision: 0.9851 - recall: 0.9848 - auc: 0.9951 - prc: 0.9877 - val_loss: 0.1199 - val_categorical_accuracy: 0.9825 - val_tp: 5556.0000 - val_fp: 98.0000 - val_tn: 28182.0000 - val_fn: 100.0000 - val_precision: 0.9827 - val_recall: 0.9823 - val_auc: 0.9942 - val_prc: 0.9847
Epoch 207/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1134 - categorical_accuracy: 0.9813 - tp: 5549.0000 - fp: 104.0000 - tn: 28176.0000 - fn: 107.0000 - precision: 0.9816 - recall: 0.9811 - auc: 0.9943 - prc: 0.9852 - val_loss: 0.0748 - val_categorical_accuracy: 0.9871 - val_tp: 5583.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 73.0000 - val_precision: 0.9876 - val_recall: 0.9871 - val_auc: 0.9963 - val_prc: 0.9905
Epoch 208/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0914 - categorical_accuracy: 0.9790 - tp: 5535.0000 - fp: 113.0000 - tn: 28167.0000 - fn: 121.0000 - precision: 0.9800 - recall: 0.9786 - auc: 0.9951 - prc: 0.9871 - val_loss: 0.0405 - val_categorical_accuracy: 0.9928 - val_tp: 5613.0000 - val_fp: 41.0000 - val_tn: 28239.0000 - val_fn: 43.0000 - val_precision: 0.9927 - val_recall: 0.9924 - val_auc: 0.9979 - val_prc: 0.9950
Epoch 209/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0886 - categorical_accuracy: 0.9795 - tp: 5540.0000 - fp: 111.0000 - tn: 28169.0000 - fn: 116.0000 - precision: 0.9804 - recall: 0.9795 - auc: 0.9953 - prc: 0.9882 - val_loss: 0.0429 - val_categorical_accuracy: 0.9917 - val_tp: 5608.0000 - val_fp: 47.0000 - val_tn: 28233.0000 - val_fn: 48.0000 - val_precision: 0.9917 - val_recall: 0.9915 - val_auc: 0.9980 - val_prc: 0.9950
Epoch 210/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0945 - categorical_accuracy: 0.9818 - tp: 5552.0000 - fp: 101.0000 - tn: 28179.0000 - fn: 104.0000 - precision: 0.9821 - recall: 0.9816 - auc: 0.9958 - prc: 0.9888 - val_loss: 0.0804 - val_categorical_accuracy: 0.9804 - val_tp: 5543.0000 - val_fp: 110.0000 - val_tn: 28170.0000 - val_fn: 113.0000 - val_precision: 0.9805 - val_recall: 0.9800 - val_auc: 0.9958 - val_prc: 0.9891
Epoch 211/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0907 - categorical_accuracy: 0.9837 - tp: 5564.0000 - fp: 89.0000 - tn: 28191.0000 - fn: 92.0000 - precision: 0.9843 - recall: 0.9837 - auc: 0.9961 - prc: 0.9900 - val_loss: 0.1587 - val_categorical_accuracy: 0.9638 - val_tp: 5450.0000 - val_fp: 197.0000 - val_tn: 28083.0000 - val_fn: 206.0000 - val_precision: 0.9651 - val_recall: 0.9636 - val_auc: 0.9922 - val_prc: 0.9794
Epoch 212/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1139 - categorical_accuracy: 0.9715 - tp: 5478.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 178.0000 - precision: 0.9740 - recall: 0.9685 - auc: 0.9944 - prc: 0.9850 - val_loss: 0.0740 - val_categorical_accuracy: 0.9897 - val_tp: 5598.0000 - val_fp: 57.0000 - val_tn: 28223.0000 - val_fn: 58.0000 - val_precision: 0.9899 - val_recall: 0.9897 - val_auc: 0.9966 - val_prc: 0.9913
Epoch 213/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0859 - categorical_accuracy: 0.9859 - tp: 5576.0000 - fp: 80.0000 - tn: 28200.0000 - fn: 80.0000 - precision: 0.9859 - recall: 0.9859 - auc: 0.9955 - prc: 0.9889 - val_loss: 0.1263 - val_categorical_accuracy: 0.9853 - val_tp: 5573.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 83.0000 - val_precision: 0.9853 - val_recall: 0.9853 - val_auc: 0.9941 - val_prc: 0.9842
Epoch 214/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1120 - categorical_accuracy: 0.9839 - tp: 5564.0000 - fp: 90.0000 - tn: 28190.0000 - fn: 92.0000 - precision: 0.9841 - recall: 0.9837 - auc: 0.9943 - prc: 0.9854 - val_loss: 0.1608 - val_categorical_accuracy: 0.9615 - val_tp: 5435.0000 - val_fp: 208.0000 - val_tn: 28072.0000 - val_fn: 221.0000 - val_precision: 0.9631 - val_recall: 0.9609 - val_auc: 0.9921 - val_prc: 0.9786
Epoch 215/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0931 - categorical_accuracy: 0.9820 - tp: 5551.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 105.0000 - precision: 0.9823 - recall: 0.9814 - auc: 0.9956 - prc: 0.9886 - val_loss: 0.0490 - val_categorical_accuracy: 0.9874 - val_tp: 5583.0000 - val_fp: 71.0000 - val_tn: 28209.0000 - val_fn: 73.0000 - val_precision: 0.9874 - val_recall: 0.9871 - val_auc: 0.9978 - val_prc: 0.9943
Epoch 216/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0826 - categorical_accuracy: 0.9807 - tp: 5546.0000 - fp: 105.0000 - tn: 28175.0000 - fn: 110.0000 - precision: 0.9814 - recall: 0.9806 - auc: 0.9961 - prc: 0.9908 - val_loss: 0.1044 - val_categorical_accuracy: 0.9830 - val_tp: 5559.0000 - val_fp: 95.0000 - val_tn: 28185.0000 - val_fn: 97.0000 - val_precision: 0.9832 - val_recall: 0.9829 - val_auc: 0.9948 - val_prc: 0.9864
Epoch 217/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0832 - categorical_accuracy: 0.9821 - tp: 5554.0000 - fp: 96.0000 - tn: 28184.0000 - fn: 102.0000 - precision: 0.9830 - recall: 0.9820 - auc: 0.9957 - prc: 0.9885 - val_loss: 0.0322 - val_categorical_accuracy: 0.9945 - val_tp: 5625.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 31.0000 - val_precision: 0.9945 - val_recall: 0.9945 - val_auc: 0.9984 - val_prc: 0.9961
Epoch 218/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0856 - categorical_accuracy: 0.9885 - tp: 5590.0000 - fp: 65.0000 - tn: 28215.0000 - fn: 66.0000 - precision: 0.9885 - recall: 0.9883 - auc: 0.9960 - prc: 0.9895 - val_loss: 0.0966 - val_categorical_accuracy: 0.9871 - val_tp: 5583.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 73.0000 - val_precision: 0.9873 - val_recall: 0.9871 - val_auc: 0.9956 - val_prc: 0.9883
Epoch 219/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1229 - categorical_accuracy: 0.9850 - tp: 5571.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 85.0000 - precision: 0.9850 - recall: 0.9850 - auc: 0.9942 - prc: 0.9849 - val_loss: 0.2140 - val_categorical_accuracy: 0.9763 - val_tp: 5522.0000 - val_fp: 134.0000 - val_tn: 28146.0000 - val_fn: 134.0000 - val_precision: 0.9763 - val_recall: 0.9763 - val_auc: 0.9898 - val_prc: 0.9732
Epoch 220/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1082 - categorical_accuracy: 0.9848 - tp: 5569.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 87.0000 - precision: 0.9850 - recall: 0.9846 - auc: 0.9947 - prc: 0.9864 - val_loss: 0.0795 - val_categorical_accuracy: 0.9869 - val_tp: 5580.0000 - val_fp: 74.0000 - val_tn: 28206.0000 - val_fn: 76.0000 - val_precision: 0.9869 - val_recall: 0.9866 - val_auc: 0.9964 - val_prc: 0.9903
Epoch 221/300
177/177 [==============================] - 15s 82ms/step - loss: 0.1065 - categorical_accuracy: 0.9784 - tp: 5532.0000 - fp: 117.0000 - tn: 28163.0000 - fn: 124.0000 - precision: 0.9793 - recall: 0.9781 - auc: 0.9949 - prc: 0.9863 - val_loss: 0.0598 - val_categorical_accuracy: 0.9906 - val_tp: 5602.0000 - val_fp: 52.0000 - val_tn: 28228.0000 - val_fn: 54.0000 - val_precision: 0.9908 - val_recall: 0.9905 - val_auc: 0.9971 - val_prc: 0.9926
Epoch 222/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1470 - categorical_accuracy: 0.9798 - tp: 5542.0000 - fp: 113.0000 - tn: 28167.0000 - fn: 114.0000 - precision: 0.9800 - recall: 0.9798 - auc: 0.9929 - prc: 0.9815 - val_loss: 0.2193 - val_categorical_accuracy: 0.9539 - val_tp: 5386.0000 - val_fp: 241.0000 - val_tn: 28039.0000 - val_fn: 270.0000 - val_precision: 0.9572 - val_recall: 0.9523 - val_auc: 0.9878 - val_prc: 0.9673
Epoch 223/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1627 - categorical_accuracy: 0.9765 - tp: 5520.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 136.0000 - precision: 0.9766 - recall: 0.9760 - auc: 0.9923 - prc: 0.9798 - val_loss: 0.2002 - val_categorical_accuracy: 0.9729 - val_tp: 5501.0000 - val_fp: 153.0000 - val_tn: 28127.0000 - val_fn: 155.0000 - val_precision: 0.9729 - val_recall: 0.9726 - val_auc: 0.9907 - val_prc: 0.9759
Epoch 224/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0975 - categorical_accuracy: 0.9818 - tp: 5552.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 104.0000 - precision: 0.9823 - recall: 0.9816 - auc: 0.9952 - prc: 0.9872 - val_loss: 0.0662 - val_categorical_accuracy: 0.9820 - val_tp: 5549.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 107.0000 - val_precision: 0.9833 - val_recall: 0.9811 - val_auc: 0.9968 - val_prc: 0.9919
Epoch 225/300
177/177 [==============================] - 15s 82ms/step - loss: 0.1001 - categorical_accuracy: 0.9784 - tp: 5529.0000 - fp: 118.0000 - tn: 28162.0000 - fn: 127.0000 - precision: 0.9791 - recall: 0.9775 - auc: 0.9948 - prc: 0.9866 - val_loss: 0.0679 - val_categorical_accuracy: 0.9836 - val_tp: 5560.0000 - val_fp: 89.0000 - val_tn: 28191.0000 - val_fn: 96.0000 - val_precision: 0.9842 - val_recall: 0.9830 - val_auc: 0.9966 - val_prc: 0.9909
Epoch 226/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1035 - categorical_accuracy: 0.9837 - tp: 5563.0000 - fp: 89.0000 - tn: 28191.0000 - fn: 93.0000 - precision: 0.9843 - recall: 0.9836 - auc: 0.9950 - prc: 0.9869 - val_loss: 0.0492 - val_categorical_accuracy: 0.9885 - val_tp: 5591.0000 - val_fp: 63.0000 - val_tn: 28217.0000 - val_fn: 65.0000 - val_precision: 0.9889 - val_recall: 0.9885 - val_auc: 0.9973 - val_prc: 0.9936
Epoch 227/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1020 - categorical_accuracy: 0.9850 - tp: 5571.0000 - fp: 84.0000 - tn: 28196.0000 - fn: 85.0000 - precision: 0.9851 - recall: 0.9850 - auc: 0.9951 - prc: 0.9874 - val_loss: 0.0860 - val_categorical_accuracy: 0.9896 - val_tp: 5597.0000 - val_fp: 59.0000 - val_tn: 28221.0000 - val_fn: 59.0000 - val_precision: 0.9896 - val_recall: 0.9896 - val_auc: 0.9958 - val_prc: 0.9887
Epoch 228/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1105 - categorical_accuracy: 0.9846 - tp: 5566.0000 - fp: 87.0000 - tn: 28193.0000 - fn: 90.0000 - precision: 0.9846 - recall: 0.9841 - auc: 0.9950 - prc: 0.9868 - val_loss: 0.3568 - val_categorical_accuracy: 0.9639 - val_tp: 5452.0000 - val_fp: 204.0000 - val_tn: 28076.0000 - val_fn: 204.0000 - val_precision: 0.9639 - val_recall: 0.9639 - val_auc: 0.9833 - val_prc: 0.9558
Epoch 229/300
177/177 [==============================] - 15s 87ms/step - loss: 0.1359 - categorical_accuracy: 0.9829 - tp: 5558.0000 - fp: 95.0000 - tn: 28185.0000 - fn: 98.0000 - precision: 0.9832 - recall: 0.9827 - auc: 0.9937 - prc: 0.9833 - val_loss: 0.0681 - val_categorical_accuracy: 0.9867 - val_tp: 5580.0000 - val_fp: 71.0000 - val_tn: 28209.0000 - val_fn: 76.0000 - val_precision: 0.9874 - val_recall: 0.9866 - val_auc: 0.9964 - val_prc: 0.9909
Epoch 230/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0980 - categorical_accuracy: 0.9843 - tp: 5567.0000 - fp: 87.0000 - tn: 28193.0000 - fn: 89.0000 - precision: 0.9846 - recall: 0.9843 - auc: 0.9952 - prc: 0.9876 - val_loss: 0.0740 - val_categorical_accuracy: 0.9919 - val_tp: 5610.0000 - val_fp: 46.0000 - val_tn: 28234.0000 - val_fn: 46.0000 - val_precision: 0.9919 - val_recall: 0.9919 - val_auc: 0.9969 - val_prc: 0.9917
Epoch 231/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1107 - categorical_accuracy: 0.9853 - tp: 5572.0000 - fp: 82.0000 - tn: 28198.0000 - fn: 84.0000 - precision: 0.9855 - recall: 0.9851 - auc: 0.9948 - prc: 0.9861 - val_loss: 0.0882 - val_categorical_accuracy: 0.9883 - val_tp: 5590.0000 - val_fp: 66.0000 - val_tn: 28214.0000 - val_fn: 66.0000 - val_precision: 0.9883 - val_recall: 0.9883 - val_auc: 0.9960 - val_prc: 0.9899
Epoch 232/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1296 - categorical_accuracy: 0.9728 - tp: 5501.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 155.0000 - precision: 0.9736 - recall: 0.9726 - auc: 0.9933 - prc: 0.9828 - val_loss: 0.0766 - val_categorical_accuracy: 0.9837 - val_tp: 5558.0000 - val_fp: 89.0000 - val_tn: 28191.0000 - val_fn: 98.0000 - val_precision: 0.9842 - val_recall: 0.9827 - val_auc: 0.9963 - val_prc: 0.9900
Epoch 233/300
177/177 [==============================] - 15s 83ms/step - loss: 0.1007 - categorical_accuracy: 0.9825 - tp: 5555.0000 - fp: 97.0000 - tn: 28183.0000 - fn: 101.0000 - precision: 0.9828 - recall: 0.9821 - auc: 0.9943 - prc: 0.9856 - val_loss: 0.2709 - val_categorical_accuracy: 0.9256 - val_tp: 5216.0000 - val_fp: 390.0000 - val_tn: 27890.0000 - val_fn: 440.0000 - val_precision: 0.9304 - val_recall: 0.9222 - val_auc: 0.9844 - val_prc: 0.9567
Epoch 234/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1351 - categorical_accuracy: 0.9765 - tp: 5521.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 135.0000 - precision: 0.9768 - recall: 0.9761 - auc: 0.9933 - prc: 0.9825 - val_loss: 0.2048 - val_categorical_accuracy: 0.9692 - val_tp: 5481.0000 - val_fp: 171.0000 - val_tn: 28109.0000 - val_fn: 175.0000 - val_precision: 0.9697 - val_recall: 0.9691 - val_auc: 0.9904 - val_prc: 0.9744
Epoch 235/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1096 - categorical_accuracy: 0.9806 - tp: 5543.0000 - fp: 105.0000 - tn: 28175.0000 - fn: 113.0000 - precision: 0.9814 - recall: 0.9800 - auc: 0.9948 - prc: 0.9859 - val_loss: 0.0649 - val_categorical_accuracy: 0.9906 - val_tp: 5602.0000 - val_fp: 51.0000 - val_tn: 28229.0000 - val_fn: 54.0000 - val_precision: 0.9910 - val_recall: 0.9905 - val_auc: 0.9970 - val_prc: 0.9921
Epoch 236/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0976 - categorical_accuracy: 0.9836 - tp: 5562.0000 - fp: 93.0000 - tn: 28187.0000 - fn: 94.0000 - precision: 0.9836 - recall: 0.9834 - auc: 0.9945 - prc: 0.9853 - val_loss: 0.0798 - val_categorical_accuracy: 0.9889 - val_tp: 5593.0000 - val_fp: 63.0000 - val_tn: 28217.0000 - val_fn: 63.0000 - val_precision: 0.9889 - val_recall: 0.9889 - val_auc: 0.9960 - val_prc: 0.9896
Epoch 237/300
177/177 [==============================] - 15s 84ms/step - loss: 0.1182 - categorical_accuracy: 0.9821 - tp: 5554.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 102.0000 - precision: 0.9823 - recall: 0.9820 - auc: 0.9944 - prc: 0.9851 - val_loss: 0.0539 - val_categorical_accuracy: 0.9887 - val_tp: 5592.0000 - val_fp: 63.0000 - val_tn: 28217.0000 - val_fn: 64.0000 - val_precision: 0.9889 - val_recall: 0.9887 - val_auc: 0.9976 - val_prc: 0.9937
Epoch 238/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0911 - categorical_accuracy: 0.9859 - tp: 5573.0000 - fp: 79.0000 - tn: 28201.0000 - fn: 83.0000 - precision: 0.9860 - recall: 0.9853 - auc: 0.9953 - prc: 0.9879 - val_loss: 0.1808 - val_categorical_accuracy: 0.9547 - val_tp: 5393.0000 - val_fp: 246.0000 - val_tn: 28034.0000 - val_fn: 263.0000 - val_precision: 0.9564 - val_recall: 0.9535 - val_auc: 0.9904 - val_prc: 0.9743
Epoch 239/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0812 - categorical_accuracy: 0.9795 - tp: 5537.0000 - fp: 110.0000 - tn: 28170.0000 - fn: 119.0000 - precision: 0.9805 - recall: 0.9790 - auc: 0.9960 - prc: 0.9893 - val_loss: 0.0855 - val_categorical_accuracy: 0.9851 - val_tp: 5572.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 84.0000 - val_precision: 0.9853 - val_recall: 0.9851 - val_auc: 0.9960 - val_prc: 0.9893
Epoch 240/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0699 - categorical_accuracy: 0.9843 - tp: 5567.0000 - fp: 87.0000 - tn: 28193.0000 - fn: 89.0000 - precision: 0.9846 - recall: 0.9843 - auc: 0.9966 - prc: 0.9908 - val_loss: 0.1111 - val_categorical_accuracy: 0.9814 - val_tp: 5548.0000 - val_fp: 103.0000 - val_tn: 28177.0000 - val_fn: 108.0000 - val_precision: 0.9818 - val_recall: 0.9809 - val_auc: 0.9946 - val_prc: 0.9858
Epoch 241/300
177/177 [==============================] - 15s 85ms/step - loss: 0.1022 - categorical_accuracy: 0.9859 - tp: 5576.0000 - fp: 80.0000 - tn: 28200.0000 - fn: 80.0000 - precision: 0.9859 - recall: 0.9859 - auc: 0.9952 - prc: 0.9873 - val_loss: 0.0924 - val_categorical_accuracy: 0.9850 - val_tp: 5571.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 85.0000 - val_precision: 0.9853 - val_recall: 0.9850 - val_auc: 0.9957 - val_prc: 0.9888
Epoch 242/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1055 - categorical_accuracy: 0.9802 - tp: 5542.0000 - fp: 104.0000 - tn: 28176.0000 - fn: 114.0000 - precision: 0.9816 - recall: 0.9798 - auc: 0.9950 - prc: 0.9872 - val_loss: 0.0573 - val_categorical_accuracy: 0.9933 - val_tp: 5618.0000 - val_fp: 38.0000 - val_tn: 28242.0000 - val_fn: 38.0000 - val_precision: 0.9933 - val_recall: 0.9933 - val_auc: 0.9969 - val_prc: 0.9919
Epoch 243/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1223 - categorical_accuracy: 0.9790 - tp: 5535.0000 - fp: 117.0000 - tn: 28163.0000 - fn: 121.0000 - precision: 0.9793 - recall: 0.9786 - auc: 0.9943 - prc: 0.9852 - val_loss: 0.0610 - val_categorical_accuracy: 0.9906 - val_tp: 5602.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 54.0000 - val_precision: 0.9906 - val_recall: 0.9905 - val_auc: 0.9973 - val_prc: 0.9928
Epoch 244/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0947 - categorical_accuracy: 0.9837 - tp: 5562.0000 - fp: 90.0000 - tn: 28190.0000 - fn: 94.0000 - precision: 0.9841 - recall: 0.9834 - auc: 0.9949 - prc: 0.9877 - val_loss: 0.0785 - val_categorical_accuracy: 0.9798 - val_tp: 5539.0000 - val_fp: 110.0000 - val_tn: 28170.0000 - val_fn: 117.0000 - val_precision: 0.9805 - val_recall: 0.9793 - val_auc: 0.9962 - val_prc: 0.9901
Epoch 245/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0792 - categorical_accuracy: 0.9850 - tp: 5568.0000 - fp: 84.0000 - tn: 28196.0000 - fn: 88.0000 - precision: 0.9851 - recall: 0.9844 - auc: 0.9959 - prc: 0.9895 - val_loss: 0.0813 - val_categorical_accuracy: 0.9894 - val_tp: 5595.0000 - val_fp: 60.0000 - val_tn: 28220.0000 - val_fn: 61.0000 - val_precision: 0.9894 - val_recall: 0.9892 - val_auc: 0.9960 - val_prc: 0.9895
Epoch 246/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1021 - categorical_accuracy: 0.9827 - tp: 5557.0000 - fp: 96.0000 - tn: 28184.0000 - fn: 99.0000 - precision: 0.9830 - recall: 0.9825 - auc: 0.9947 - prc: 0.9866 - val_loss: 0.0748 - val_categorical_accuracy: 0.9862 - val_tp: 5577.0000 - val_fp: 78.0000 - val_tn: 28202.0000 - val_fn: 79.0000 - val_precision: 0.9862 - val_recall: 0.9860 - val_auc: 0.9962 - val_prc: 0.9897
Epoch 247/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0808 - categorical_accuracy: 0.9851 - tp: 5572.0000 - fp: 83.0000 - tn: 28197.0000 - fn: 84.0000 - precision: 0.9853 - recall: 0.9851 - auc: 0.9959 - prc: 0.9895 - val_loss: 0.0537 - val_categorical_accuracy: 0.9919 - val_tp: 5610.0000 - val_fp: 45.0000 - val_tn: 28235.0000 - val_fn: 46.0000 - val_precision: 0.9920 - val_recall: 0.9919 - val_auc: 0.9973 - val_prc: 0.9931
Epoch 248/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0812 - categorical_accuracy: 0.9827 - tp: 5556.0000 - fp: 95.0000 - tn: 28185.0000 - fn: 100.0000 - precision: 0.9832 - recall: 0.9823 - auc: 0.9956 - prc: 0.9891 - val_loss: 0.0747 - val_categorical_accuracy: 0.9887 - val_tp: 5591.0000 - val_fp: 62.0000 - val_tn: 28218.0000 - val_fn: 65.0000 - val_precision: 0.9890 - val_recall: 0.9885 - val_auc: 0.9966 - val_prc: 0.9907
Epoch 249/300
177/177 [==============================] - 15s 86ms/step - loss: 0.0818 - categorical_accuracy: 0.9882 - tp: 5589.0000 - fp: 66.0000 - tn: 28214.0000 - fn: 67.0000 - precision: 0.9883 - recall: 0.9882 - auc: 0.9960 - prc: 0.9893 - val_loss: 0.1052 - val_categorical_accuracy: 0.9848 - val_tp: 5569.0000 - val_fp: 86.0000 - val_tn: 28194.0000 - val_fn: 87.0000 - val_precision: 0.9848 - val_recall: 0.9846 - val_auc: 0.9949 - val_prc: 0.9864
Epoch 250/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0905 - categorical_accuracy: 0.9867 - tp: 5581.0000 - fp: 75.0000 - tn: 28205.0000 - fn: 75.0000 - precision: 0.9867 - recall: 0.9867 - auc: 0.9953 - prc: 0.9875 - val_loss: 0.1474 - val_categorical_accuracy: 0.9809 - val_tp: 5548.0000 - val_fp: 108.0000 - val_tn: 28172.0000 - val_fn: 108.0000 - val_precision: 0.9809 - val_recall: 0.9809 - val_auc: 0.9929 - val_prc: 0.9807
Epoch 251/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0988 - categorical_accuracy: 0.9859 - tp: 5575.0000 - fp: 80.0000 - tn: 28200.0000 - fn: 81.0000 - precision: 0.9859 - recall: 0.9857 - auc: 0.9956 - prc: 0.9889 - val_loss: 0.2867 - val_categorical_accuracy: 0.9729 - val_tp: 5502.0000 - val_fp: 151.0000 - val_tn: 28129.0000 - val_fn: 154.0000 - val_precision: 0.9733 - val_recall: 0.9728 - val_auc: 0.9873 - val_prc: 0.9661
Epoch 252/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1096 - categorical_accuracy: 0.9807 - tp: 5546.0000 - fp: 106.0000 - tn: 28174.0000 - fn: 110.0000 - precision: 0.9812 - recall: 0.9806 - auc: 0.9941 - prc: 0.9857 - val_loss: 0.0992 - val_categorical_accuracy: 0.9837 - val_tp: 5562.0000 - val_fp: 91.0000 - val_tn: 28189.0000 - val_fn: 94.0000 - val_precision: 0.9839 - val_recall: 0.9834 - val_auc: 0.9953 - val_prc: 0.9878
Epoch 253/300
177/177 [==============================] - 16s 89ms/step - loss: 0.0747 - categorical_accuracy: 0.9857 - tp: 5573.0000 - fp: 80.0000 - tn: 28200.0000 - fn: 83.0000 - precision: 0.9858 - recall: 0.9853 - auc: 0.9964 - prc: 0.9909 - val_loss: 0.0442 - val_categorical_accuracy: 0.9938 - val_tp: 5620.0000 - val_fp: 34.0000 - val_tn: 28246.0000 - val_fn: 36.0000 - val_precision: 0.9940 - val_recall: 0.9936 - val_auc: 0.9979 - val_prc: 0.9945
Epoch 254/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1044 - categorical_accuracy: 0.9839 - tp: 5564.0000 - fp: 91.0000 - tn: 28189.0000 - fn: 92.0000 - precision: 0.9839 - recall: 0.9837 - auc: 0.9948 - prc: 0.9864 - val_loss: 0.0468 - val_categorical_accuracy: 0.9915 - val_tp: 5608.0000 - val_fp: 47.0000 - val_tn: 28233.0000 - val_fn: 48.0000 - val_precision: 0.9917 - val_recall: 0.9915 - val_auc: 0.9974 - val_prc: 0.9931
Epoch 255/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0982 - categorical_accuracy: 0.9860 - tp: 5576.0000 - fp: 78.0000 - tn: 28202.0000 - fn: 80.0000 - precision: 0.9862 - recall: 0.9859 - auc: 0.9954 - prc: 0.9878 - val_loss: 0.0956 - val_categorical_accuracy: 0.9802 - val_tp: 5544.0000 - val_fp: 112.0000 - val_tn: 28168.0000 - val_fn: 112.0000 - val_precision: 0.9802 - val_recall: 0.9802 - val_auc: 0.9955 - val_prc: 0.9877
Epoch 256/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1357 - categorical_accuracy: 0.9768 - tp: 5524.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 132.0000 - precision: 0.9786 - recall: 0.9767 - auc: 0.9935 - prc: 0.9829 - val_loss: 0.1071 - val_categorical_accuracy: 0.9738 - val_tp: 5504.0000 - val_fp: 142.0000 - val_tn: 28138.0000 - val_fn: 152.0000 - val_precision: 0.9748 - val_recall: 0.9731 - val_auc: 0.9943 - val_prc: 0.9851
Epoch 257/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1077 - categorical_accuracy: 0.9873 - tp: 5584.0000 - fp: 72.0000 - tn: 28208.0000 - fn: 72.0000 - precision: 0.9873 - recall: 0.9873 - auc: 0.9949 - prc: 0.9865 - val_loss: 0.0736 - val_categorical_accuracy: 0.9919 - val_tp: 5610.0000 - val_fp: 45.0000 - val_tn: 28235.0000 - val_fn: 46.0000 - val_precision: 0.9920 - val_recall: 0.9919 - val_auc: 0.9966 - val_prc: 0.9910
Epoch 258/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1101 - categorical_accuracy: 0.9797 - tp: 5538.0000 - fp: 111.0000 - tn: 28169.0000 - fn: 118.0000 - precision: 0.9804 - recall: 0.9791 - auc: 0.9943 - prc: 0.9853 - val_loss: 0.1285 - val_categorical_accuracy: 0.9779 - val_tp: 5529.0000 - val_fp: 123.0000 - val_tn: 28157.0000 - val_fn: 127.0000 - val_precision: 0.9782 - val_recall: 0.9775 - val_auc: 0.9939 - val_prc: 0.9838
Epoch 259/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1016 - categorical_accuracy: 0.9813 - tp: 5546.0000 - fp: 104.0000 - tn: 28176.0000 - fn: 110.0000 - precision: 0.9816 - recall: 0.9806 - auc: 0.9945 - prc: 0.9857 - val_loss: 0.0889 - val_categorical_accuracy: 0.9878 - val_tp: 5587.0000 - val_fp: 69.0000 - val_tn: 28211.0000 - val_fn: 69.0000 - val_precision: 0.9878 - val_recall: 0.9878 - val_auc: 0.9953 - val_prc: 0.9875
Epoch 260/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1022 - categorical_accuracy: 0.9798 - tp: 5539.0000 - fp: 108.0000 - tn: 28172.0000 - fn: 117.0000 - precision: 0.9809 - recall: 0.9793 - auc: 0.9946 - prc: 0.9859 - val_loss: 0.0423 - val_categorical_accuracy: 0.9906 - val_tp: 5602.0000 - val_fp: 50.0000 - val_tn: 28230.0000 - val_fn: 54.0000 - val_precision: 0.9912 - val_recall: 0.9905 - val_auc: 0.9981 - val_prc: 0.9947
Epoch 261/300
177/177 [==============================] - 15s 84ms/step - loss: 0.0953 - categorical_accuracy: 0.9823 - tp: 5555.0000 - fp: 97.0000 - tn: 28183.0000 - fn: 101.0000 - precision: 0.9828 - recall: 0.9821 - auc: 0.9959 - prc: 0.9891 - val_loss: 0.0322 - val_categorical_accuracy: 0.9940 - val_tp: 5622.0000 - val_fp: 34.0000 - val_tn: 28246.0000 - val_fn: 34.0000 - val_precision: 0.9940 - val_recall: 0.9940 - val_auc: 0.9985 - val_prc: 0.9966
Epoch 262/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0847 - categorical_accuracy: 0.9883 - tp: 5589.0000 - fp: 65.0000 - tn: 28215.0000 - fn: 67.0000 - precision: 0.9885 - recall: 0.9882 - auc: 0.9956 - prc: 0.9887 - val_loss: 0.2084 - val_categorical_accuracy: 0.9590 - val_tp: 5422.0000 - val_fp: 225.0000 - val_tn: 28055.0000 - val_fn: 234.0000 - val_precision: 0.9602 - val_recall: 0.9586 - val_auc: 0.9892 - val_prc: 0.9714
Epoch 263/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1174 - categorical_accuracy: 0.9821 - tp: 5555.0000 - fp: 101.0000 - tn: 28179.0000 - fn: 101.0000 - precision: 0.9821 - recall: 0.9821 - auc: 0.9946 - prc: 0.9859 - val_loss: 0.0864 - val_categorical_accuracy: 0.9897 - val_tp: 5597.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 59.0000 - val_precision: 0.9897 - val_recall: 0.9896 - val_auc: 0.9959 - val_prc: 0.9890
Epoch 264/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1875 - categorical_accuracy: 0.9811 - tp: 5549.0000 - fp: 106.0000 - tn: 28174.0000 - fn: 107.0000 - precision: 0.9813 - recall: 0.9811 - auc: 0.9907 - prc: 0.9757 - val_loss: 0.1798 - val_categorical_accuracy: 0.9774 - val_tp: 5528.0000 - val_fp: 128.0000 - val_tn: 28152.0000 - val_fn: 128.0000 - val_precision: 0.9774 - val_recall: 0.9774 - val_auc: 0.9909 - val_prc: 0.9770
Epoch 265/300
177/177 [==============================] - 15s 82ms/step - loss: 0.1530 - categorical_accuracy: 0.9800 - tp: 5543.0000 - fp: 112.0000 - tn: 28168.0000 - fn: 113.0000 - precision: 0.9802 - recall: 0.9800 - auc: 0.9921 - prc: 0.9797 - val_loss: 0.0571 - val_categorical_accuracy: 0.9926 - val_tp: 5614.0000 - val_fp: 42.0000 - val_tn: 28238.0000 - val_fn: 42.0000 - val_precision: 0.9926 - val_recall: 0.9926 - val_auc: 0.9975 - val_prc: 0.9932
Epoch 266/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1368 - categorical_accuracy: 0.9821 - tp: 5555.0000 - fp: 100.0000 - tn: 28180.0000 - fn: 101.0000 - precision: 0.9823 - recall: 0.9821 - auc: 0.9935 - prc: 0.9825 - val_loss: 0.0443 - val_categorical_accuracy: 0.9922 - val_tp: 5611.0000 - val_fp: 44.0000 - val_tn: 28236.0000 - val_fn: 45.0000 - val_precision: 0.9922 - val_recall: 0.9920 - val_auc: 0.9979 - val_prc: 0.9944
Epoch 267/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1295 - categorical_accuracy: 0.9772 - tp: 5527.0000 - fp: 124.0000 - tn: 28156.0000 - fn: 129.0000 - precision: 0.9781 - recall: 0.9772 - auc: 0.9936 - prc: 0.9835 - val_loss: 0.0690 - val_categorical_accuracy: 0.9903 - val_tp: 5601.0000 - val_fp: 55.0000 - val_tn: 28225.0000 - val_fn: 55.0000 - val_precision: 0.9903 - val_recall: 0.9903 - val_auc: 0.9966 - val_prc: 0.9914
Epoch 268/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1326 - categorical_accuracy: 0.9823 - tp: 5556.0000 - fp: 99.0000 - tn: 28181.0000 - fn: 100.0000 - precision: 0.9825 - recall: 0.9823 - auc: 0.9938 - prc: 0.9837 - val_loss: 0.1381 - val_categorical_accuracy: 0.9800 - val_tp: 5542.0000 - val_fp: 111.0000 - val_tn: 28169.0000 - val_fn: 114.0000 - val_precision: 0.9804 - val_recall: 0.9798 - val_auc: 0.9933 - val_prc: 0.9820
Epoch 269/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0985 - categorical_accuracy: 0.9846 - tp: 5569.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 87.0000 - precision: 0.9850 - recall: 0.9846 - auc: 0.9954 - prc: 0.9879 - val_loss: 0.1190 - val_categorical_accuracy: 0.9745 - val_tp: 5510.0000 - val_fp: 141.0000 - val_tn: 28139.0000 - val_fn: 146.0000 - val_precision: 0.9750 - val_recall: 0.9742 - val_auc: 0.9943 - val_prc: 0.9851
Epoch 270/300
177/177 [==============================] - 15s 82ms/step - loss: 0.1518 - categorical_accuracy: 0.9774 - tp: 5526.0000 - fp: 126.0000 - tn: 28154.0000 - fn: 130.0000 - precision: 0.9777 - recall: 0.9770 - auc: 0.9937 - prc: 0.9836 - val_loss: 0.0933 - val_categorical_accuracy: 0.9790 - val_tp: 5530.0000 - val_fp: 113.0000 - val_tn: 28167.0000 - val_fn: 126.0000 - val_precision: 0.9800 - val_recall: 0.9777 - val_auc: 0.9954 - val_prc: 0.9882
Epoch 271/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1143 - categorical_accuracy: 0.9802 - tp: 5542.0000 - fp: 106.0000 - tn: 28174.0000 - fn: 114.0000 - precision: 0.9812 - recall: 0.9798 - auc: 0.9949 - prc: 0.9863 - val_loss: 0.0349 - val_categorical_accuracy: 0.9943 - val_tp: 5624.0000 - val_fp: 32.0000 - val_tn: 28248.0000 - val_fn: 32.0000 - val_precision: 0.9943 - val_recall: 0.9943 - val_auc: 0.9981 - val_prc: 0.9953
Epoch 272/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0703 - categorical_accuracy: 0.9883 - tp: 5588.0000 - fp: 63.0000 - tn: 28217.0000 - fn: 68.0000 - precision: 0.9889 - recall: 0.9880 - auc: 0.9964 - prc: 0.9907 - val_loss: 0.0670 - val_categorical_accuracy: 0.9816 - val_tp: 5547.0000 - val_fp: 92.0000 - val_tn: 28188.0000 - val_fn: 109.0000 - val_precision: 0.9837 - val_recall: 0.9807 - val_auc: 0.9967 - val_prc: 0.9914
Epoch 273/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0911 - categorical_accuracy: 0.9867 - tp: 5581.0000 - fp: 73.0000 - tn: 28207.0000 - fn: 75.0000 - precision: 0.9871 - recall: 0.9867 - auc: 0.9955 - prc: 0.9886 - val_loss: 0.1087 - val_categorical_accuracy: 0.9859 - val_tp: 5575.0000 - val_fp: 80.0000 - val_tn: 28200.0000 - val_fn: 81.0000 - val_precision: 0.9859 - val_recall: 0.9857 - val_auc: 0.9952 - val_prc: 0.9869
Epoch 274/300
177/177 [==============================] - 15s 82ms/step - loss: 0.0968 - categorical_accuracy: 0.9848 - tp: 5569.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 87.0000 - precision: 0.9850 - recall: 0.9846 - auc: 0.9956 - prc: 0.9879 - val_loss: 0.0610 - val_categorical_accuracy: 0.9869 - val_tp: 5581.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 75.0000 - val_precision: 0.9876 - val_recall: 0.9867 - val_auc: 0.9972 - val_prc: 0.9923
Epoch 275/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0896 - categorical_accuracy: 0.9859 - tp: 5576.0000 - fp: 78.0000 - tn: 28202.0000 - fn: 80.0000 - precision: 0.9862 - recall: 0.9859 - auc: 0.9956 - prc: 0.9883 - val_loss: 0.0646 - val_categorical_accuracy: 0.9906 - val_tp: 5603.0000 - val_fp: 53.0000 - val_tn: 28227.0000 - val_fn: 53.0000 - val_precision: 0.9906 - val_recall: 0.9906 - val_auc: 0.9973 - val_prc: 0.9928
Epoch 276/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1035 - categorical_accuracy: 0.9811 - tp: 5547.0000 - fp: 103.0000 - tn: 28177.0000 - fn: 109.0000 - precision: 0.9818 - recall: 0.9807 - auc: 0.9951 - prc: 0.9868 - val_loss: 0.0646 - val_categorical_accuracy: 0.9889 - val_tp: 5593.0000 - val_fp: 63.0000 - val_tn: 28217.0000 - val_fn: 63.0000 - val_precision: 0.9889 - val_recall: 0.9889 - val_auc: 0.9965 - val_prc: 0.9914
Epoch 277/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0941 - categorical_accuracy: 0.9827 - tp: 5554.0000 - fp: 96.0000 - tn: 28184.0000 - fn: 102.0000 - precision: 0.9830 - recall: 0.9820 - auc: 0.9953 - prc: 0.9879 - val_loss: 0.0855 - val_categorical_accuracy: 0.9777 - val_tp: 5529.0000 - val_fp: 122.0000 - val_tn: 28158.0000 - val_fn: 127.0000 - val_precision: 0.9784 - val_recall: 0.9775 - val_auc: 0.9958 - val_prc: 0.9889
Epoch 278/300
177/177 [==============================] - 15s 83ms/step - loss: 0.0707 - categorical_accuracy: 0.9873 - tp: 5581.0000 - fp: 69.0000 - tn: 28211.0000 - fn: 75.0000 - precision: 0.9878 - recall: 0.9867 - auc: 0.9965 - prc: 0.9913 - val_loss: 0.1020 - val_categorical_accuracy: 0.9848 - val_tp: 5570.0000 - val_fp: 86.0000 - val_tn: 28194.0000 - val_fn: 86.0000 - val_precision: 0.9848 - val_recall: 0.9848 - val_auc: 0.9949 - val_prc: 0.9861
Epoch 279/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1222 - categorical_accuracy: 0.9682 - tp: 5455.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 201.0000 - precision: 0.9720 - recall: 0.9645 - auc: 0.9936 - prc: 0.9840 - val_loss: 0.0456 - val_categorical_accuracy: 0.9926 - val_tp: 5613.0000 - val_fp: 40.0000 - val_tn: 28240.0000 - val_fn: 43.0000 - val_precision: 0.9929 - val_recall: 0.9924 - val_auc: 0.9975 - val_prc: 0.9942
Epoch 280/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0797 - categorical_accuracy: 0.9860 - tp: 5571.0000 - fp: 77.0000 - tn: 28203.0000 - fn: 85.0000 - precision: 0.9864 - recall: 0.9850 - auc: 0.9955 - prc: 0.9885 - val_loss: 0.0810 - val_categorical_accuracy: 0.9912 - val_tp: 5606.0000 - val_fp: 50.0000 - val_tn: 28230.0000 - val_fn: 50.0000 - val_precision: 0.9912 - val_recall: 0.9912 - val_auc: 0.9964 - val_prc: 0.9901
Epoch 281/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0967 - categorical_accuracy: 0.9844 - tp: 5568.0000 - fp: 84.0000 - tn: 28196.0000 - fn: 88.0000 - precision: 0.9851 - recall: 0.9844 - auc: 0.9947 - prc: 0.9865 - val_loss: 0.0449 - val_categorical_accuracy: 0.9945 - val_tp: 5625.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 31.0000 - val_precision: 0.9945 - val_recall: 0.9945 - val_auc: 0.9980 - val_prc: 0.9946
Epoch 282/300
177/177 [==============================] - 15s 84ms/step - loss: 0.1155 - categorical_accuracy: 0.9850 - tp: 5570.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 86.0000 - precision: 0.9850 - recall: 0.9848 - auc: 0.9941 - prc: 0.9846 - val_loss: 0.0844 - val_categorical_accuracy: 0.9910 - val_tp: 5605.0000 - val_fp: 51.0000 - val_tn: 28229.0000 - val_fn: 51.0000 - val_precision: 0.9910 - val_recall: 0.9910 - val_auc: 0.9961 - val_prc: 0.9896
Epoch 283/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1334 - categorical_accuracy: 0.9735 - tp: 5500.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 156.0000 - precision: 0.9743 - recall: 0.9724 - auc: 0.9939 - prc: 0.9832 - val_loss: 0.0843 - val_categorical_accuracy: 0.9832 - val_tp: 5560.0000 - val_fp: 93.0000 - val_tn: 28187.0000 - val_fn: 96.0000 - val_precision: 0.9835 - val_recall: 0.9830 - val_auc: 0.9955 - val_prc: 0.9887
Epoch 284/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1200 - categorical_accuracy: 0.9806 - tp: 5543.0000 - fp: 103.0000 - tn: 28177.0000 - fn: 113.0000 - precision: 0.9818 - recall: 0.9800 - auc: 0.9943 - prc: 0.9852 - val_loss: 0.2205 - val_categorical_accuracy: 0.9590 - val_tp: 5421.0000 - val_fp: 224.0000 - val_tn: 28056.0000 - val_fn: 235.0000 - val_precision: 0.9603 - val_recall: 0.9585 - val_auc: 0.9883 - val_prc: 0.9702
Epoch 285/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1057 - categorical_accuracy: 0.9843 - tp: 5567.0000 - fp: 88.0000 - tn: 28192.0000 - fn: 89.0000 - precision: 0.9844 - recall: 0.9843 - auc: 0.9942 - prc: 0.9851 - val_loss: 0.0849 - val_categorical_accuracy: 0.9857 - val_tp: 5575.0000 - val_fp: 80.0000 - val_tn: 28200.0000 - val_fn: 81.0000 - val_precision: 0.9859 - val_recall: 0.9857 - val_auc: 0.9959 - val_prc: 0.9892
Epoch 286/300
177/177 [==============================] - 15s 83ms/step - loss: 0.1023 - categorical_accuracy: 0.9811 - tp: 5549.0000 - fp: 103.0000 - tn: 28177.0000 - fn: 107.0000 - precision: 0.9818 - recall: 0.9811 - auc: 0.9946 - prc: 0.9864 - val_loss: 0.0528 - val_categorical_accuracy: 0.9894 - val_tp: 5595.0000 - val_fp: 55.0000 - val_tn: 28225.0000 - val_fn: 61.0000 - val_precision: 0.9903 - val_recall: 0.9892 - val_auc: 0.9974 - val_prc: 0.9937
Epoch 287/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0775 - categorical_accuracy: 0.9869 - tp: 5581.0000 - fp: 74.0000 - tn: 28206.0000 - fn: 75.0000 - precision: 0.9869 - recall: 0.9867 - auc: 0.9957 - prc: 0.9889 - val_loss: 0.0913 - val_categorical_accuracy: 0.9798 - val_tp: 5539.0000 - val_fp: 112.0000 - val_tn: 28168.0000 - val_fn: 117.0000 - val_precision: 0.9802 - val_recall: 0.9793 - val_auc: 0.9957 - val_prc: 0.9886
Epoch 288/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1317 - categorical_accuracy: 0.9844 - tp: 5566.0000 - fp: 85.0000 - tn: 28195.0000 - fn: 90.0000 - precision: 0.9850 - recall: 0.9841 - auc: 0.9937 - prc: 0.9831 - val_loss: 0.0851 - val_categorical_accuracy: 0.9892 - val_tp: 5593.0000 - val_fp: 59.0000 - val_tn: 28221.0000 - val_fn: 63.0000 - val_precision: 0.9896 - val_recall: 0.9889 - val_auc: 0.9961 - val_prc: 0.9897
Epoch 289/300
177/177 [==============================] - 14s 80ms/step - loss: 0.1013 - categorical_accuracy: 0.9836 - tp: 5561.0000 - fp: 90.0000 - tn: 28190.0000 - fn: 95.0000 - precision: 0.9841 - recall: 0.9832 - auc: 0.9950 - prc: 0.9868 - val_loss: 0.0553 - val_categorical_accuracy: 0.9897 - val_tp: 5597.0000 - val_fp: 55.0000 - val_tn: 28225.0000 - val_fn: 59.0000 - val_precision: 0.9903 - val_recall: 0.9896 - val_auc: 0.9970 - val_prc: 0.9919
Epoch 290/300
177/177 [==============================] - 15s 82ms/step - loss: 0.1350 - categorical_accuracy: 0.9763 - tp: 5521.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 135.0000 - precision: 0.9766 - recall: 0.9761 - auc: 0.9930 - prc: 0.9818 - val_loss: 0.0589 - val_categorical_accuracy: 0.9860 - val_tp: 5576.0000 - val_fp: 75.0000 - val_tn: 28205.0000 - val_fn: 80.0000 - val_precision: 0.9867 - val_recall: 0.9859 - val_auc: 0.9971 - val_prc: 0.9924
Epoch 291/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0717 - categorical_accuracy: 0.9883 - tp: 5589.0000 - fp: 65.0000 - tn: 28215.0000 - fn: 67.0000 - precision: 0.9885 - recall: 0.9882 - auc: 0.9965 - prc: 0.9913 - val_loss: 0.0565 - val_categorical_accuracy: 0.9915 - val_tp: 5607.0000 - val_fp: 47.0000 - val_tn: 28233.0000 - val_fn: 49.0000 - val_precision: 0.9917 - val_recall: 0.9913 - val_auc: 0.9976 - val_prc: 0.9937
Epoch 292/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0935 - categorical_accuracy: 0.9851 - tp: 5572.0000 - fp: 82.0000 - tn: 28198.0000 - fn: 84.0000 - precision: 0.9855 - recall: 0.9851 - auc: 0.9956 - prc: 0.9884 - val_loss: 0.0505 - val_categorical_accuracy: 0.9931 - val_tp: 5617.0000 - val_fp: 39.0000 - val_tn: 28241.0000 - val_fn: 39.0000 - val_precision: 0.9931 - val_recall: 0.9931 - val_auc: 0.9976 - val_prc: 0.9940
Epoch 293/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0878 - categorical_accuracy: 0.9878 - tp: 5586.0000 - fp: 68.0000 - tn: 28212.0000 - fn: 70.0000 - precision: 0.9880 - recall: 0.9876 - auc: 0.9961 - prc: 0.9896 - val_loss: 0.1942 - val_categorical_accuracy: 0.9749 - val_tp: 5513.0000 - val_fp: 141.0000 - val_tn: 28139.0000 - val_fn: 143.0000 - val_precision: 0.9751 - val_recall: 0.9747 - val_auc: 0.9909 - val_prc: 0.9762
Epoch 294/300
177/177 [==============================] - 15s 83ms/step - loss: 0.1171 - categorical_accuracy: 0.9832 - tp: 5560.0000 - fp: 93.0000 - tn: 28187.0000 - fn: 96.0000 - precision: 0.9835 - recall: 0.9830 - auc: 0.9944 - prc: 0.9857 - val_loss: 0.0863 - val_categorical_accuracy: 0.9892 - val_tp: 5595.0000 - val_fp: 61.0000 - val_tn: 28219.0000 - val_fn: 61.0000 - val_precision: 0.9892 - val_recall: 0.9892 - val_auc: 0.9962 - val_prc: 0.9901
Epoch 295/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0861 - categorical_accuracy: 0.9873 - tp: 5583.0000 - fp: 72.0000 - tn: 28208.0000 - fn: 73.0000 - precision: 0.9873 - recall: 0.9871 - auc: 0.9960 - prc: 0.9893 - val_loss: 0.1054 - val_categorical_accuracy: 0.9841 - val_tp: 5565.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 91.0000 - val_precision: 0.9841 - val_recall: 0.9839 - val_auc: 0.9950 - val_prc: 0.9868
Epoch 296/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0727 - categorical_accuracy: 0.9896 - tp: 5596.0000 - fp: 58.0000 - tn: 28222.0000 - fn: 60.0000 - precision: 0.9897 - recall: 0.9894 - auc: 0.9965 - prc: 0.9908 - val_loss: 0.2651 - val_categorical_accuracy: 0.9530 - val_tp: 5389.0000 - val_fp: 263.0000 - val_tn: 28017.0000 - val_fn: 267.0000 - val_precision: 0.9535 - val_recall: 0.9528 - val_auc: 0.9855 - val_prc: 0.9620
Epoch 297/300
177/177 [==============================] - 14s 81ms/step - loss: 0.0947 - categorical_accuracy: 0.9871 - tp: 5583.0000 - fp: 72.0000 - tn: 28208.0000 - fn: 73.0000 - precision: 0.9873 - recall: 0.9871 - auc: 0.9954 - prc: 0.9881 - val_loss: 0.0661 - val_categorical_accuracy: 0.9850 - val_tp: 5571.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 85.0000 - val_precision: 0.9853 - val_recall: 0.9850 - val_auc: 0.9973 - val_prc: 0.9932
Epoch 298/300
177/177 [==============================] - 15s 83ms/step - loss: 0.1194 - categorical_accuracy: 0.9806 - tp: 5546.0000 - fp: 110.0000 - tn: 28170.0000 - fn: 110.0000 - precision: 0.9806 - recall: 0.9806 - auc: 0.9940 - prc: 0.9846 - val_loss: 0.0516 - val_categorical_accuracy: 0.9942 - val_tp: 5623.0000 - val_fp: 32.0000 - val_tn: 28248.0000 - val_fn: 33.0000 - val_precision: 0.9943 - val_recall: 0.9942 - val_auc: 0.9978 - val_prc: 0.9941
Epoch 299/300
177/177 [==============================] - 14s 80ms/step - loss: 0.0999 - categorical_accuracy: 0.9896 - tp: 5597.0000 - fp: 59.0000 - tn: 28221.0000 - fn: 59.0000 - precision: 0.9896 - recall: 0.9896 - auc: 0.9961 - prc: 0.9894 - val_loss: 0.0654 - val_categorical_accuracy: 0.9942 - val_tp: 5623.0000 - val_fp: 33.0000 - val_tn: 28247.0000 - val_fn: 33.0000 - val_precision: 0.9942 - val_recall: 0.9942 - val_auc: 0.9972 - val_prc: 0.9923
Epoch 300/300
177/177 [==============================] - 14s 81ms/step - loss: 0.1355 - categorical_accuracy: 0.9813 - tp: 5549.0000 - fp: 101.0000 - tn: 28179.0000 - fn: 107.0000 - precision: 0.9821 - recall: 0.9811 - auc: 0.9933 - prc: 0.9824 - val_loss: 0.1055 - val_categorical_accuracy: 0.9874 - val_tp: 5585.0000 - val_fp: 69.0000 - val_tn: 28211.0000 - val_fn: 71.0000 - val_precision: 0.9878 - val_recall: 0.9874 - val_auc: 0.9955 - val_prc: 0.9879
5.2.5 Analyze training history¶

In this section, I will produce plots of the baseline model's accuracy and loss on the training and validation set. These are useful to check for overfitting/underfitting scenario

Additionally a utility function has been added to visualize the results from Tensorboard more compactly.

In [31]:
matplotlib.rcParams['figure.figsize'] = (12, 10)
colors = plt.rcParams['axes.prop_cycle'].by_key()['color']
def plot_metrics(history):
  metrics = ['loss', 'prc', 'precision', 'recall']
  for n, metric in enumerate(metrics):
    name = metric.replace("_"," ").capitalize()
    plt.subplot(2,2,n+1)
    plt.plot(history.epoch[:70], history.history[metric][:70], color=colors[0], label='Train')
    plt.plot(history.epoch[:70], history.history['val_'+metric][:70],
             color=colors[0], linestyle="--", label='Val')
    plt.xlabel('Epoch')
    plt.ylabel(name)
    if metric == 'loss':
      plt.ylim([0, plt.ylim()[1]])
    elif metric == 'auc':
      plt.ylim([0.8,1])
    else:
      plt.ylim([0,1])

    plt.legend()
In [149]:
plot_metrics(baseline_history)
In [428]:
# %reload_ext tensorboard
# %tensorboard --logdir ./tensorboard/improved_baseline_classifier --bind_all

Screen Shot 2023-03-18 at 1.31.12 AM.png

The lowest validation loss was at epoch 66, thereafter the model overfits and there were no further improvements.

image.png

In [152]:
validation_eval_improved_baseline = model.evaluate(ds_validation, batch_size=BATCH_SIZE)
print(f"Validation AUC: {validation_eval_improved_baseline[8]:.3f}")
print(f"Validation PRC: {validation_eval_improved_baseline[9]:.3f}")
print(f"Validation categorical accuracy: {validation_eval_improved_baseline[1]:.3f}")
177/177 [==============================] - 4s 25ms/step - loss: 0.1055 - categorical_accuracy: 0.9874 - tp: 5585.0000 - fp: 69.0000 - tn: 28211.0000 - fn: 71.0000 - precision: 0.9878 - recall: 0.9874 - auc: 0.9955 - prc: 0.9879
Validation AUC: 0.996
Validation PRC: 0.988
Validation categorical accuracy: 0.987
5.2.6 Evaluate model on TEST set¶

Next let's evaluate the model performance on unseen test set

In [131]:
test_eval_improved_baseline = model.evaluate(ds_test, batch_size=BATCH_SIZE)
59/59 [==============================] - 1s 19ms/step - loss: 1.0047 - categorical_accuracy: 0.5549 - tp: 952.0000 - fp: 676.0000 - tn: 8749.0000 - fn: 933.0000 - precision: 0.5848 - recall: 0.5050 - auc: 0.8296 - prc: 0.5559
In [134]:
print(f"Test AUC: {test_eval_improved_baseline[8]:.3f}")
print(f"Test PRC: {test_eval_improved_baseline[9]:.3f}")
print(f"Test categorical accuracy: {test_eval_improved_baseline[1]:.3f}")
Test AUC: 0.830
Test PRC: 0.556
Test categorical accuracy: 0.555

The AUC on test set is .83 and PRC is .556. Comparing this result to the validation set AUC of .99 and PRC of .98, it turns out that the test performance is significantly worse. This may mean that the validation procedure is not very reliable, or that the model is overfitting to the validation data.

Additionally if we observe the categorical accuracy, we notice a similar variance in performance between the validation and test performance. In the next phase, I will try to switch to a more reliable evaluation protocol (K-fold validation) and additionally explore how regularization, and data augmentation can help mitigate overfitting.

5.3 Regularization and model tuning¶

As we have seen in the case of the baseline classifier, overfitting disrupts the model ability to generalize to new data.

In this section, our goal is to maximize generalization performance on test set. To do that I will mitigate overfitting by introducing the following best practice techniques:

1. Reduce model capacity and retrain on optimal epoch

Reduce the number of layers

2. Add dropout

Dropout is one of the most effective regularization techniques. It involves randomly dropping out a number of output features of the layer during training. The dropout rate is a the fraction of the features that are zeroed out. (Usually between 0.2 - 0.5). In this case we use dropout layer which drops 50% of output values of the layer before the dropout layer.

3. Perform learning rate optimization

Experimenting a learning rate optimization technique found by the winning image classifier developed by a Kaggle Grandmaster.

4. Add data augmentation

Data augmentation will generate more training data from existing training samples by augmenting the samples via a number of random transformations that yield believable looking images.

The goal in this section is to expose the model to more aspects of the data so it can generalize better.

We will leverage the following data augmentation:

  • RandomFlip("horizontal") - To apply horizontal flipping to a random 50% of the images
  • RandomRotation(0.1) - Rotates the input images by a random value in the range [-10%, +10%] or [-36 degrees, +36 degrees]
  • RandomZoom(0.2) - Zooms in or out of the image by a random factor in the rane [-20%, +20%]
  • RandomSaturation - Adjusts the saturation of RGB by a random factor
  • RandomContrast - Adjust the contrast of an image by a random factor
  • RandomBrightness - Adjust the brightness of images by a random factor

The approach used here is achieved via Dataset.map to create a dataset that yields batches of augmented images. In this case Data augmentation will happen asynchronously on the CPU, and is non-blocking. I also overlap the model training on the GPU with data preprocessing by using Dataset.prefetch API.

5.3.1 Adding Dropout¶

Try adding dropout with lower capacity model to tackle overfitting

In [30]:
def get_model_with_dropout(modelname, low_capacity=True, dropout=True):
    # Model architecture
    inputs = keras.Input(shape=(224, 224, 3), name='preprocessedimage')
    if low_capacity:
        x = layers.Conv2D(filters=8, kernel_size=3, activation='relu')(inputs)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=16, kernel_size=3, activation='relu')(x)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=32, kernel_size=3, activation='relu')(x)
    else:
        x = layers.Conv2D(filters=32, kernel_size=3, activation='relu')(inputs)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=64, kernel_size=3, activation='relu')(x)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=128, kernel_size=3, activation='relu')(x)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=256, kernel_size=3, activation='relu')(x)
        x = layers.MaxPooling2D(pool_size=2)(x)
        x = layers.Conv2D(filters=512, kernel_size=3, activation='relu')(x)
        x = layers.MaxPooling2D(pool_size=2)(x) # New 
        x = layers.Conv2D(filters=1024, kernel_size=3, activation='relu')(x) # New    
    if dropout:
        x = layers.Dropout(0.5)(x)
    x = layers.Flatten()(x)
    if dropout:
        x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(6, activation='softmax', name='softmax_layer')(x)
    model = keras.Model(inputs=inputs, outputs=outputs, name=modelname)

    # Compile model
    model.compile(optimizer='rmsprop',
                  loss=tfa.losses.SigmoidFocalCrossEntropy(),
                  metrics=METRICS)
    return model

Low capacity model with dropout

In [194]:
low_cap_model = get_model_with_dropout("low_capacity_reg_model", low_capacity=True, dropout=True)
low_cap_model.summary()
callbacks_list = [
    keras.callbacks.ModelCheckpoint(filepath=f"models/low_capacity_reg_model.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/low_capacity_reg_model") # path where callback writes logs
]
low_capacity_reg_history = low_cap_model.fit(
    ds_train,
    epochs=80,
    validation_data=ds_validation,
    callbacks=callbacks_list
)
Model: "low_capacity_reg_model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d_74 (Conv2D)          (None, 222, 222, 8)       224       
                                                                 
 max_pooling2d_55 (MaxPoolin  (None, 111, 111, 8)      0         
 g2D)                                                            
                                                                 
 conv2d_75 (Conv2D)          (None, 109, 109, 16)      1168      
                                                                 
 max_pooling2d_56 (MaxPoolin  (None, 54, 54, 16)       0         
 g2D)                                                            
                                                                 
 conv2d_76 (Conv2D)          (None, 52, 52, 32)        4640      
                                                                 
 dropout_6 (Dropout)         (None, 52, 52, 32)        0         
                                                                 
 flatten_19 (Flatten)        (None, 86528)             0         
                                                                 
 dropout_7 (Dropout)         (None, 86528)             0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 519174    
                                                                 
=================================================================
Total params: 525,206
Trainable params: 525,206
Non-trainable params: 0
_________________________________________________________________
Epoch 1/80
2023-03-02 03:18:36.622286: E tensorflow/core/grappler/optimizers/meta_optimizer.cc:954] layout failed: INVALID_ARGUMENT: Size of values 0 does not match size of permutation 4 @ fanin shape inlow_capacity_reg_model/dropout_6/dropout/SelectV2-2-TransposeNHWCToNCHW-LayoutOptimizer
177/177 [==============================] - 8s 36ms/step - loss: 0.2431 - categorical_accuracy: 0.3760 - tp: 1652.0000 - fp: 4337.0000 - tn: 52223.0000 - fn: 9660.0000 - precision: 0.2758 - recall: 0.1460 - auc: 0.6314 - prc: 0.2626 - val_loss: 0.2086 - val_categorical_accuracy: 0.5461 - val_tp: 117.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 5539.0000 - val_precision: 0.8182 - val_recall: 0.0207 - val_auc: 0.8611 - val_prc: 0.5724
Epoch 2/80
177/177 [==============================] - 6s 33ms/step - loss: 0.2154 - categorical_accuracy: 0.5610 - tp: 369.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 5287.0000 - precision: 0.7042 - recall: 0.0652 - auc: 0.8508 - prc: 0.5402 - val_loss: 0.1983 - val_categorical_accuracy: 0.5651 - val_tp: 737.0000 - val_fp: 106.0000 - val_tn: 28174.0000 - val_fn: 4919.0000 - val_precision: 0.8743 - val_recall: 0.1303 - val_auc: 0.8753 - val_prc: 0.6211
Epoch 3/80
177/177 [==============================] - 6s 33ms/step - loss: 0.2061 - categorical_accuracy: 0.5783 - tp: 592.0000 - fp: 171.0000 - tn: 28109.0000 - fn: 5064.0000 - precision: 0.7759 - recall: 0.1047 - auc: 0.8643 - prc: 0.5786 - val_loss: 0.1971 - val_categorical_accuracy: 0.5872 - val_tp: 1370.0000 - val_fp: 252.0000 - val_tn: 28028.0000 - val_fn: 4286.0000 - val_precision: 0.8446 - val_recall: 0.2422 - val_auc: 0.8782 - val_prc: 0.6426
Epoch 4/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1982 - categorical_accuracy: 0.5969 - tp: 790.0000 - fp: 184.0000 - tn: 28096.0000 - fn: 4866.0000 - precision: 0.8111 - recall: 0.1397 - auc: 0.8752 - prc: 0.6121 - val_loss: 0.1814 - val_categorical_accuracy: 0.6308 - val_tp: 523.0000 - val_fp: 20.0000 - val_tn: 28260.0000 - val_fn: 5133.0000 - val_precision: 0.9632 - val_recall: 0.0925 - val_auc: 0.8953 - val_prc: 0.6943
Epoch 5/80
177/177 [==============================] - 7s 41ms/step - loss: 0.1906 - categorical_accuracy: 0.6103 - tp: 979.0000 - fp: 185.0000 - tn: 28095.0000 - fn: 4677.0000 - precision: 0.8411 - recall: 0.1731 - auc: 0.8847 - prc: 0.6392 - val_loss: 0.1721 - val_categorical_accuracy: 0.6278 - val_tp: 1240.0000 - val_fp: 111.0000 - val_tn: 28169.0000 - val_fn: 4416.0000 - val_precision: 0.9178 - val_recall: 0.2192 - val_auc: 0.9055 - val_prc: 0.7099
Epoch 6/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1840 - categorical_accuracy: 0.6231 - tp: 1130.0000 - fp: 210.0000 - tn: 28070.0000 - fn: 4526.0000 - precision: 0.8433 - recall: 0.1998 - auc: 0.8948 - prc: 0.6608 - val_loss: 0.1695 - val_categorical_accuracy: 0.6388 - val_tp: 1339.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 4317.0000 - val_precision: 0.9017 - val_recall: 0.2367 - val_auc: 0.9079 - val_prc: 0.7232
Epoch 7/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1772 - categorical_accuracy: 0.6363 - tp: 1268.0000 - fp: 216.0000 - tn: 28064.0000 - fn: 4388.0000 - precision: 0.8544 - recall: 0.2242 - auc: 0.9021 - prc: 0.6841 - val_loss: 0.1622 - val_categorical_accuracy: 0.6644 - val_tp: 1611.0000 - val_fp: 135.0000 - val_tn: 28145.0000 - val_fn: 4045.0000 - val_precision: 0.9227 - val_recall: 0.2848 - val_auc: 0.9160 - val_prc: 0.7510
Epoch 8/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1742 - categorical_accuracy: 0.6471 - tp: 1425.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 4231.0000 - precision: 0.8605 - recall: 0.2519 - auc: 0.9066 - prc: 0.6959 - val_loss: 0.1535 - val_categorical_accuracy: 0.6713 - val_tp: 1935.0000 - val_fp: 149.0000 - val_tn: 28131.0000 - val_fn: 3721.0000 - val_precision: 0.9285 - val_recall: 0.3421 - val_auc: 0.9249 - val_prc: 0.7714
Epoch 9/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1708 - categorical_accuracy: 0.6482 - tp: 1470.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 4186.0000 - precision: 0.8576 - recall: 0.2599 - auc: 0.9100 - prc: 0.7063 - val_loss: 0.1562 - val_categorical_accuracy: 0.6586 - val_tp: 2077.0000 - val_fp: 247.0000 - val_tn: 28033.0000 - val_fn: 3579.0000 - val_precision: 0.8937 - val_recall: 0.3672 - val_auc: 0.9241 - val_prc: 0.7599
Epoch 10/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1646 - categorical_accuracy: 0.6664 - tp: 1653.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 4003.0000 - precision: 0.8774 - recall: 0.2923 - auc: 0.9162 - prc: 0.7297 - val_loss: 0.1489 - val_categorical_accuracy: 0.6805 - val_tp: 2368.0000 - val_fp: 271.0000 - val_tn: 28009.0000 - val_fn: 3288.0000 - val_precision: 0.8973 - val_recall: 0.4187 - val_auc: 0.9300 - val_prc: 0.7827
Epoch 11/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1652 - categorical_accuracy: 0.6650 - tp: 1677.0000 - fp: 257.0000 - tn: 28023.0000 - fn: 3979.0000 - precision: 0.8671 - recall: 0.2965 - auc: 0.9171 - prc: 0.7264 - val_loss: 0.1447 - val_categorical_accuracy: 0.7104 - val_tp: 1549.0000 - val_fp: 62.0000 - val_tn: 28218.0000 - val_fn: 4107.0000 - val_precision: 0.9615 - val_recall: 0.2739 - val_auc: 0.9390 - val_prc: 0.7972
Epoch 12/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1633 - categorical_accuracy: 0.6807 - tp: 1706.0000 - fp: 248.0000 - tn: 28032.0000 - fn: 3950.0000 - precision: 0.8731 - recall: 0.3016 - auc: 0.9190 - prc: 0.7329 - val_loss: 0.1429 - val_categorical_accuracy: 0.6920 - val_tp: 2305.0000 - val_fp: 243.0000 - val_tn: 28037.0000 - val_fn: 3351.0000 - val_precision: 0.9046 - val_recall: 0.4075 - val_auc: 0.9382 - val_prc: 0.7931
Epoch 13/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1567 - categorical_accuracy: 0.6890 - tp: 1826.0000 - fp: 257.0000 - tn: 28023.0000 - fn: 3830.0000 - precision: 0.8766 - recall: 0.3228 - auc: 0.9253 - prc: 0.7543 - val_loss: 0.1434 - val_categorical_accuracy: 0.7521 - val_tp: 1141.0000 - val_fp: 13.0000 - val_tn: 28267.0000 - val_fn: 4515.0000 - val_precision: 0.9887 - val_recall: 0.2017 - val_auc: 0.9446 - val_prc: 0.8256
Epoch 14/80
177/177 [==============================] - 7s 42ms/step - loss: 0.1553 - categorical_accuracy: 0.6908 - tp: 1862.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 3794.0000 - precision: 0.8841 - recall: 0.3292 - auc: 0.9268 - prc: 0.7577 - val_loss: 0.1412 - val_categorical_accuracy: 0.7385 - val_tp: 1505.0000 - val_fp: 47.0000 - val_tn: 28233.0000 - val_fn: 4151.0000 - val_precision: 0.9697 - val_recall: 0.2661 - val_auc: 0.9422 - val_prc: 0.8125
Epoch 15/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1522 - categorical_accuracy: 0.7063 - tp: 2022.0000 - fp: 260.0000 - tn: 28020.0000 - fn: 3634.0000 - precision: 0.8861 - recall: 0.3575 - auc: 0.9303 - prc: 0.7674 - val_loss: 0.1311 - val_categorical_accuracy: 0.7716 - val_tp: 1773.0000 - val_fp: 48.0000 - val_tn: 28232.0000 - val_fn: 3883.0000 - val_precision: 0.9736 - val_recall: 0.3135 - val_auc: 0.9501 - val_prc: 0.8434
Epoch 16/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1494 - categorical_accuracy: 0.7076 - tp: 2017.0000 - fp: 252.0000 - tn: 28028.0000 - fn: 3639.0000 - precision: 0.8889 - recall: 0.3566 - auc: 0.9338 - prc: 0.7759 - val_loss: 0.1301 - val_categorical_accuracy: 0.7290 - val_tp: 3023.0000 - val_fp: 347.0000 - val_tn: 27933.0000 - val_fn: 2633.0000 - val_precision: 0.8970 - val_recall: 0.5345 - val_auc: 0.9487 - val_prc: 0.8329
Epoch 17/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1487 - categorical_accuracy: 0.7093 - tp: 2070.0000 - fp: 267.0000 - tn: 28013.0000 - fn: 3586.0000 - precision: 0.8858 - recall: 0.3660 - auc: 0.9347 - prc: 0.7778 - val_loss: 0.1341 - val_categorical_accuracy: 0.8004 - val_tp: 1248.0000 - val_fp: 33.0000 - val_tn: 28247.0000 - val_fn: 4408.0000 - val_precision: 0.9742 - val_recall: 0.2207 - val_auc: 0.9562 - val_prc: 0.8537
Epoch 18/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1481 - categorical_accuracy: 0.7171 - tp: 2071.0000 - fp: 256.0000 - tn: 28024.0000 - fn: 3585.0000 - precision: 0.8900 - recall: 0.3662 - auc: 0.9352 - prc: 0.7802 - val_loss: 0.1249 - val_categorical_accuracy: 0.7465 - val_tp: 2479.0000 - val_fp: 162.0000 - val_tn: 28118.0000 - val_fn: 3177.0000 - val_precision: 0.9387 - val_recall: 0.4383 - val_auc: 0.9523 - val_prc: 0.8430
Epoch 19/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1455 - categorical_accuracy: 0.7226 - tp: 2135.0000 - fp: 250.0000 - tn: 28030.0000 - fn: 3521.0000 - precision: 0.8952 - recall: 0.3775 - auc: 0.9374 - prc: 0.7858 - val_loss: 0.1219 - val_categorical_accuracy: 0.7684 - val_tp: 2273.0000 - val_fp: 87.0000 - val_tn: 28193.0000 - val_fn: 3383.0000 - val_precision: 0.9631 - val_recall: 0.4019 - val_auc: 0.9552 - val_prc: 0.8577
Epoch 20/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1423 - categorical_accuracy: 0.7268 - tp: 2272.0000 - fp: 275.0000 - tn: 28005.0000 - fn: 3384.0000 - precision: 0.8920 - recall: 0.4017 - auc: 0.9402 - prc: 0.7966 - val_loss: 0.1125 - val_categorical_accuracy: 0.7981 - val_tp: 2346.0000 - val_fp: 62.0000 - val_tn: 28218.0000 - val_fn: 3310.0000 - val_precision: 0.9743 - val_recall: 0.4148 - val_auc: 0.9652 - val_prc: 0.8809
Epoch 21/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1430 - categorical_accuracy: 0.7242 - tp: 2238.0000 - fp: 249.0000 - tn: 28031.0000 - fn: 3418.0000 - precision: 0.8999 - recall: 0.3957 - auc: 0.9402 - prc: 0.7974 - val_loss: 0.1170 - val_categorical_accuracy: 0.7792 - val_tp: 2343.0000 - val_fp: 95.0000 - val_tn: 28185.0000 - val_fn: 3313.0000 - val_precision: 0.9610 - val_recall: 0.4143 - val_auc: 0.9614 - val_prc: 0.8649
Epoch 22/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1346 - categorical_accuracy: 0.7435 - tp: 2391.0000 - fp: 239.0000 - tn: 28041.0000 - fn: 3265.0000 - precision: 0.9091 - recall: 0.4227 - auc: 0.9465 - prc: 0.8175 - val_loss: 0.1300 - val_categorical_accuracy: 0.7371 - val_tp: 3143.0000 - val_fp: 530.0000 - val_tn: 27750.0000 - val_fn: 2513.0000 - val_precision: 0.8557 - val_recall: 0.5557 - val_auc: 0.9544 - val_prc: 0.8416
Epoch 23/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1347 - categorical_accuracy: 0.7399 - tp: 2413.0000 - fp: 258.0000 - tn: 28022.0000 - fn: 3243.0000 - precision: 0.9034 - recall: 0.4266 - auc: 0.9471 - prc: 0.8150 - val_loss: 0.1064 - val_categorical_accuracy: 0.7878 - val_tp: 2892.0000 - val_fp: 134.0000 - val_tn: 28146.0000 - val_fn: 2764.0000 - val_precision: 0.9557 - val_recall: 0.5113 - val_auc: 0.9668 - val_prc: 0.8842
Epoch 24/80
177/177 [==============================] - 7s 42ms/step - loss: 0.1325 - categorical_accuracy: 0.7443 - tp: 2478.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 3178.0000 - precision: 0.9134 - recall: 0.4381 - auc: 0.9483 - prc: 0.8219 - val_loss: 0.0969 - val_categorical_accuracy: 0.8361 - val_tp: 2827.0000 - val_fp: 79.0000 - val_tn: 28201.0000 - val_fn: 2829.0000 - val_precision: 0.9728 - val_recall: 0.4998 - val_auc: 0.9752 - val_prc: 0.9109
Epoch 25/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1347 - categorical_accuracy: 0.7394 - tp: 2460.0000 - fp: 257.0000 - tn: 28023.0000 - fn: 3196.0000 - precision: 0.9054 - recall: 0.4349 - auc: 0.9470 - prc: 0.8176 - val_loss: 0.1101 - val_categorical_accuracy: 0.8435 - val_tp: 2020.0000 - val_fp: 41.0000 - val_tn: 28239.0000 - val_fn: 3636.0000 - val_precision: 0.9801 - val_recall: 0.3571 - val_auc: 0.9725 - val_prc: 0.9014
Epoch 26/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1313 - categorical_accuracy: 0.7424 - tp: 2543.0000 - fp: 267.0000 - tn: 28013.0000 - fn: 3113.0000 - precision: 0.9050 - recall: 0.4496 - auc: 0.9504 - prc: 0.8254 - val_loss: 0.1067 - val_categorical_accuracy: 0.8002 - val_tp: 3036.0000 - val_fp: 187.0000 - val_tn: 28093.0000 - val_fn: 2620.0000 - val_precision: 0.9420 - val_recall: 0.5368 - val_auc: 0.9664 - val_prc: 0.8831
Epoch 27/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1278 - categorical_accuracy: 0.7595 - tp: 2592.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 3064.0000 - precision: 0.9201 - recall: 0.4583 - auc: 0.9520 - prc: 0.8351 - val_loss: 0.0976 - val_categorical_accuracy: 0.8027 - val_tp: 3405.0000 - val_fp: 226.0000 - val_tn: 28054.0000 - val_fn: 2251.0000 - val_precision: 0.9378 - val_recall: 0.6020 - val_auc: 0.9720 - val_prc: 0.8988
Epoch 28/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1282 - categorical_accuracy: 0.7597 - tp: 2608.0000 - fp: 259.0000 - tn: 28021.0000 - fn: 3048.0000 - precision: 0.9097 - recall: 0.4611 - auc: 0.9528 - prc: 0.8360 - val_loss: 0.0952 - val_categorical_accuracy: 0.8557 - val_tp: 2607.0000 - val_fp: 40.0000 - val_tn: 28240.0000 - val_fn: 3049.0000 - val_precision: 0.9849 - val_recall: 0.4609 - val_auc: 0.9781 - val_prc: 0.9230
Epoch 29/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1270 - categorical_accuracy: 0.7583 - tp: 2688.0000 - fp: 250.0000 - tn: 28030.0000 - fn: 2968.0000 - precision: 0.9149 - recall: 0.4752 - auc: 0.9534 - prc: 0.8382 - val_loss: 0.0936 - val_categorical_accuracy: 0.8699 - val_tp: 2616.0000 - val_fp: 45.0000 - val_tn: 28235.0000 - val_fn: 3040.0000 - val_precision: 0.9831 - val_recall: 0.4625 - val_auc: 0.9800 - val_prc: 0.9276
Epoch 30/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1245 - categorical_accuracy: 0.7661 - tp: 2737.0000 - fp: 261.0000 - tn: 28019.0000 - fn: 2919.0000 - precision: 0.9129 - recall: 0.4839 - auc: 0.9554 - prc: 0.8434 - val_loss: 0.0889 - val_categorical_accuracy: 0.8421 - val_tp: 3347.0000 - val_fp: 122.0000 - val_tn: 28158.0000 - val_fn: 2309.0000 - val_precision: 0.9648 - val_recall: 0.5918 - val_auc: 0.9766 - val_prc: 0.9193
Epoch 31/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1222 - categorical_accuracy: 0.7617 - tp: 2785.0000 - fp: 246.0000 - tn: 28034.0000 - fn: 2871.0000 - precision: 0.9188 - recall: 0.4924 - auc: 0.9567 - prc: 0.8482 - val_loss: 0.0839 - val_categorical_accuracy: 0.8485 - val_tp: 3282.0000 - val_fp: 102.0000 - val_tn: 28178.0000 - val_fn: 2374.0000 - val_precision: 0.9699 - val_recall: 0.5803 - val_auc: 0.9806 - val_prc: 0.9277
Epoch 32/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1219 - categorical_accuracy: 0.7753 - tp: 2812.0000 - fp: 258.0000 - tn: 28022.0000 - fn: 2844.0000 - precision: 0.9160 - recall: 0.4972 - auc: 0.9577 - prc: 0.8505 - val_loss: 0.0851 - val_categorical_accuracy: 0.8458 - val_tp: 3226.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 2430.0000 - val_precision: 0.9717 - val_recall: 0.5704 - val_auc: 0.9797 - val_prc: 0.9267
Epoch 33/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1211 - categorical_accuracy: 0.7689 - tp: 2859.0000 - fp: 268.0000 - tn: 28012.0000 - fn: 2797.0000 - precision: 0.9143 - recall: 0.5055 - auc: 0.9572 - prc: 0.8524 - val_loss: 0.1079 - val_categorical_accuracy: 0.8506 - val_tp: 2050.0000 - val_fp: 12.0000 - val_tn: 28268.0000 - val_fn: 3606.0000 - val_precision: 0.9942 - val_recall: 0.3624 - val_auc: 0.9732 - val_prc: 0.9184
Epoch 34/80
177/177 [==============================] - 7s 42ms/step - loss: 0.1219 - categorical_accuracy: 0.7717 - tp: 2837.0000 - fp: 246.0000 - tn: 28034.0000 - fn: 2819.0000 - precision: 0.9202 - recall: 0.5016 - auc: 0.9575 - prc: 0.8516 - val_loss: 0.0870 - val_categorical_accuracy: 0.8281 - val_tp: 3746.0000 - val_fp: 237.0000 - val_tn: 28043.0000 - val_fn: 1910.0000 - val_precision: 0.9405 - val_recall: 0.6623 - val_auc: 0.9775 - val_prc: 0.9186
Epoch 35/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1161 - categorical_accuracy: 0.7862 - tp: 2901.0000 - fp: 239.0000 - tn: 28041.0000 - fn: 2755.0000 - precision: 0.9239 - recall: 0.5129 - auc: 0.9614 - prc: 0.8635 - val_loss: 0.0826 - val_categorical_accuracy: 0.8556 - val_tp: 3605.0000 - val_fp: 108.0000 - val_tn: 28172.0000 - val_fn: 2051.0000 - val_precision: 0.9709 - val_recall: 0.6374 - val_auc: 0.9787 - val_prc: 0.9302
Epoch 36/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1160 - categorical_accuracy: 0.7799 - tp: 2958.0000 - fp: 251.0000 - tn: 28029.0000 - fn: 2698.0000 - precision: 0.9218 - recall: 0.5230 - auc: 0.9620 - prc: 0.8632 - val_loss: 0.0931 - val_categorical_accuracy: 0.8119 - val_tp: 3704.0000 - val_fp: 334.0000 - val_tn: 27946.0000 - val_fn: 1952.0000 - val_precision: 0.9173 - val_recall: 0.6549 - val_auc: 0.9766 - val_prc: 0.9089
Epoch 37/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1158 - categorical_accuracy: 0.7806 - tp: 3006.0000 - fp: 249.0000 - tn: 28031.0000 - fn: 2650.0000 - precision: 0.9235 - recall: 0.5315 - auc: 0.9616 - prc: 0.8639 - val_loss: 0.0765 - val_categorical_accuracy: 0.8999 - val_tp: 3099.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 2557.0000 - val_precision: 0.9904 - val_recall: 0.5479 - val_auc: 0.9874 - val_prc: 0.9559
Epoch 38/80
177/177 [==============================] - 6s 33ms/step - loss: 0.1140 - categorical_accuracy: 0.7884 - tp: 3000.0000 - fp: 278.0000 - tn: 28002.0000 - fn: 2656.0000 - precision: 0.9152 - recall: 0.5304 - auc: 0.9631 - prc: 0.8680 - val_loss: 0.0741 - val_categorical_accuracy: 0.9047 - val_tp: 3223.0000 - val_fp: 29.0000 - val_tn: 28251.0000 - val_fn: 2433.0000 - val_precision: 0.9911 - val_recall: 0.5698 - val_auc: 0.9876 - val_prc: 0.9576
Epoch 39/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1104 - categorical_accuracy: 0.8009 - tp: 3109.0000 - fp: 255.0000 - tn: 28025.0000 - fn: 2547.0000 - precision: 0.9242 - recall: 0.5497 - auc: 0.9652 - prc: 0.8758 - val_loss: 0.0674 - val_categorical_accuracy: 0.9104 - val_tp: 3550.0000 - val_fp: 23.0000 - val_tn: 28257.0000 - val_fn: 2106.0000 - val_precision: 0.9936 - val_recall: 0.6277 - val_auc: 0.9887 - val_prc: 0.9633
Epoch 40/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1114 - categorical_accuracy: 0.7967 - tp: 3101.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 2555.0000 - precision: 0.9307 - recall: 0.5483 - auc: 0.9647 - prc: 0.8757 - val_loss: 0.0797 - val_categorical_accuracy: 0.8487 - val_tp: 3777.0000 - val_fp: 187.0000 - val_tn: 28093.0000 - val_fn: 1879.0000 - val_precision: 0.9528 - val_recall: 0.6678 - val_auc: 0.9803 - val_prc: 0.9308
Epoch 41/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1076 - categorical_accuracy: 0.8004 - tp: 3175.0000 - fp: 250.0000 - tn: 28030.0000 - fn: 2481.0000 - precision: 0.9270 - recall: 0.5614 - auc: 0.9667 - prc: 0.8825 - val_loss: 0.0797 - val_categorical_accuracy: 0.8962 - val_tp: 3089.0000 - val_fp: 34.0000 - val_tn: 28246.0000 - val_fn: 2567.0000 - val_precision: 0.9891 - val_recall: 0.5461 - val_auc: 0.9852 - val_prc: 0.9502
Epoch 42/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1071 - categorical_accuracy: 0.7992 - tp: 3216.0000 - fp: 263.0000 - tn: 28017.0000 - fn: 2440.0000 - precision: 0.9244 - recall: 0.5686 - auc: 0.9669 - prc: 0.8821 - val_loss: 0.0666 - val_categorical_accuracy: 0.9286 - val_tp: 3366.0000 - val_fp: 18.0000 - val_tn: 28262.0000 - val_fn: 2290.0000 - val_precision: 0.9947 - val_recall: 0.5951 - val_auc: 0.9915 - val_prc: 0.9715
Epoch 43/80
177/177 [==============================] - 6s 36ms/step - loss: 0.1033 - categorical_accuracy: 0.8069 - tp: 3318.0000 - fp: 242.0000 - tn: 28038.0000 - fn: 2338.0000 - precision: 0.9320 - recall: 0.5866 - auc: 0.9690 - prc: 0.8902 - val_loss: 0.1146 - val_categorical_accuracy: 0.8082 - val_tp: 2468.0000 - val_fp: 71.0000 - val_tn: 28209.0000 - val_fn: 3188.0000 - val_precision: 0.9720 - val_recall: 0.4364 - val_auc: 0.9589 - val_prc: 0.8775
Epoch 44/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1028 - categorical_accuracy: 0.8106 - tp: 3279.0000 - fp: 248.0000 - tn: 28032.0000 - fn: 2377.0000 - precision: 0.9297 - recall: 0.5797 - auc: 0.9700 - prc: 0.8898 - val_loss: 0.0691 - val_categorical_accuracy: 0.8716 - val_tp: 4081.0000 - val_fp: 171.0000 - val_tn: 28109.0000 - val_fn: 1575.0000 - val_precision: 0.9598 - val_recall: 0.7215 - val_auc: 0.9857 - val_prc: 0.9471
Epoch 45/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1000 - categorical_accuracy: 0.8144 - tp: 3319.0000 - fp: 257.0000 - tn: 28023.0000 - fn: 2337.0000 - precision: 0.9281 - recall: 0.5868 - auc: 0.9714 - prc: 0.8953 - val_loss: 0.0974 - val_categorical_accuracy: 0.8492 - val_tp: 2677.0000 - val_fp: 12.0000 - val_tn: 28268.0000 - val_fn: 2979.0000 - val_precision: 0.9955 - val_recall: 0.4733 - val_auc: 0.9747 - val_prc: 0.9251
Epoch 46/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1032 - categorical_accuracy: 0.8177 - tp: 3362.0000 - fp: 261.0000 - tn: 28019.0000 - fn: 2294.0000 - precision: 0.9280 - recall: 0.5944 - auc: 0.9699 - prc: 0.8923 - val_loss: 0.0670 - val_categorical_accuracy: 0.8969 - val_tp: 3688.0000 - val_fp: 69.0000 - val_tn: 28211.0000 - val_fn: 1968.0000 - val_precision: 0.9816 - val_recall: 0.6521 - val_auc: 0.9881 - val_prc: 0.9559
Epoch 47/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1061 - categorical_accuracy: 0.8073 - tp: 3298.0000 - fp: 245.0000 - tn: 28035.0000 - fn: 2358.0000 - precision: 0.9308 - recall: 0.5831 - auc: 0.9685 - prc: 0.8865 - val_loss: 0.0611 - val_categorical_accuracy: 0.9082 - val_tp: 3907.0000 - val_fp: 43.0000 - val_tn: 28237.0000 - val_fn: 1749.0000 - val_precision: 0.9891 - val_recall: 0.6908 - val_auc: 0.9903 - val_prc: 0.9655
Epoch 48/80
177/177 [==============================] - 6s 34ms/step - loss: 0.1006 - categorical_accuracy: 0.8163 - tp: 3405.0000 - fp: 238.0000 - tn: 28042.0000 - fn: 2251.0000 - precision: 0.9347 - recall: 0.6020 - auc: 0.9710 - prc: 0.8967 - val_loss: 0.0668 - val_categorical_accuracy: 0.8683 - val_tp: 3994.0000 - val_fp: 188.0000 - val_tn: 28092.0000 - val_fn: 1662.0000 - val_precision: 0.9550 - val_recall: 0.7062 - val_auc: 0.9872 - val_prc: 0.9488
Epoch 49/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0968 - categorical_accuracy: 0.8202 - tp: 3484.0000 - fp: 242.0000 - tn: 28038.0000 - fn: 2172.0000 - precision: 0.9351 - recall: 0.6160 - auc: 0.9731 - prc: 0.9029 - val_loss: 0.0580 - val_categorical_accuracy: 0.9128 - val_tp: 4013.0000 - val_fp: 58.0000 - val_tn: 28222.0000 - val_fn: 1643.0000 - val_precision: 0.9858 - val_recall: 0.7095 - val_auc: 0.9909 - val_prc: 0.9675
Epoch 50/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0995 - categorical_accuracy: 0.8216 - tp: 3472.0000 - fp: 233.0000 - tn: 28047.0000 - fn: 2184.0000 - precision: 0.9371 - recall: 0.6139 - auc: 0.9717 - prc: 0.8994 - val_loss: 0.0567 - val_categorical_accuracy: 0.9226 - val_tp: 3884.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 1772.0000 - val_precision: 0.9936 - val_recall: 0.6867 - val_auc: 0.9925 - val_prc: 0.9734
Epoch 51/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0968 - categorical_accuracy: 0.8303 - tp: 3571.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 2085.0000 - precision: 0.9360 - recall: 0.6314 - auc: 0.9741 - prc: 0.9053 - val_loss: 0.0670 - val_categorical_accuracy: 0.9029 - val_tp: 3558.0000 - val_fp: 21.0000 - val_tn: 28259.0000 - val_fn: 2098.0000 - val_precision: 0.9941 - val_recall: 0.6291 - val_auc: 0.9897 - val_prc: 0.9634
Epoch 52/80
177/177 [==============================] - 8s 44ms/step - loss: 0.0940 - categorical_accuracy: 0.8255 - tp: 3611.0000 - fp: 242.0000 - tn: 28038.0000 - fn: 2045.0000 - precision: 0.9372 - recall: 0.6384 - auc: 0.9744 - prc: 0.9086 - val_loss: 0.0494 - val_categorical_accuracy: 0.9355 - val_tp: 4102.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 1554.0000 - val_precision: 0.9927 - val_recall: 0.7252 - val_auc: 0.9949 - val_prc: 0.9806
Epoch 53/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0985 - categorical_accuracy: 0.8234 - tp: 3523.0000 - fp: 265.0000 - tn: 28015.0000 - fn: 2133.0000 - precision: 0.9300 - recall: 0.6229 - auc: 0.9727 - prc: 0.9009 - val_loss: 0.0483 - val_categorical_accuracy: 0.9517 - val_tp: 4076.0000 - val_fp: 14.0000 - val_tn: 28266.0000 - val_fn: 1580.0000 - val_precision: 0.9966 - val_recall: 0.7207 - val_auc: 0.9955 - val_prc: 0.9848
Epoch 54/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0915 - categorical_accuracy: 0.8405 - tp: 3619.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 2037.0000 - precision: 0.9390 - recall: 0.6399 - auc: 0.9767 - prc: 0.9139 - val_loss: 0.0520 - val_categorical_accuracy: 0.9264 - val_tp: 4246.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 1410.0000 - val_precision: 0.9838 - val_recall: 0.7507 - val_auc: 0.9926 - val_prc: 0.9735
Epoch 55/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0908 - categorical_accuracy: 0.8350 - tp: 3698.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 1958.0000 - precision: 0.9383 - recall: 0.6538 - auc: 0.9767 - prc: 0.9148 - val_loss: 0.0844 - val_categorical_accuracy: 0.8814 - val_tp: 4144.0000 - val_fp: 300.0000 - val_tn: 27980.0000 - val_fn: 1512.0000 - val_precision: 0.9325 - val_recall: 0.7327 - val_auc: 0.9813 - val_prc: 0.9336
Epoch 56/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0916 - categorical_accuracy: 0.8386 - tp: 3627.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 2029.0000 - precision: 0.9370 - recall: 0.6413 - auc: 0.9755 - prc: 0.9142 - val_loss: 0.0991 - val_categorical_accuracy: 0.8278 - val_tp: 2646.0000 - val_fp: 6.0000 - val_tn: 28274.0000 - val_fn: 3010.0000 - val_precision: 0.9977 - val_recall: 0.4678 - val_auc: 0.9744 - val_prc: 0.9189
Epoch 57/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0931 - categorical_accuracy: 0.8365 - tp: 3675.0000 - fp: 238.0000 - tn: 28042.0000 - fn: 1981.0000 - precision: 0.9392 - recall: 0.6498 - auc: 0.9760 - prc: 0.9129 - val_loss: 0.0456 - val_categorical_accuracy: 0.9500 - val_tp: 4415.0000 - val_fp: 28.0000 - val_tn: 28252.0000 - val_fn: 1241.0000 - val_precision: 0.9937 - val_recall: 0.7806 - val_auc: 0.9942 - val_prc: 0.9823
Epoch 58/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0900 - categorical_accuracy: 0.8446 - tp: 3772.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 1884.0000 - precision: 0.9421 - recall: 0.6669 - auc: 0.9774 - prc: 0.9169 - val_loss: 0.0459 - val_categorical_accuracy: 0.9298 - val_tp: 4431.0000 - val_fp: 65.0000 - val_tn: 28215.0000 - val_fn: 1225.0000 - val_precision: 0.9855 - val_recall: 0.7834 - val_auc: 0.9941 - val_prc: 0.9782
Epoch 59/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0876 - categorical_accuracy: 0.8449 - tp: 3764.0000 - fp: 246.0000 - tn: 28034.0000 - fn: 1892.0000 - precision: 0.9387 - recall: 0.6655 - auc: 0.9780 - prc: 0.9202 - val_loss: 0.0831 - val_categorical_accuracy: 0.8732 - val_tp: 3130.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 2526.0000 - val_precision: 0.9921 - val_recall: 0.5534 - val_auc: 0.9828 - val_prc: 0.9435
Epoch 60/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0846 - categorical_accuracy: 0.8474 - tp: 3864.0000 - fp: 216.0000 - tn: 28064.0000 - fn: 1792.0000 - precision: 0.9471 - recall: 0.6832 - auc: 0.9796 - prc: 0.9261 - val_loss: 0.0452 - val_categorical_accuracy: 0.9394 - val_tp: 4472.0000 - val_fp: 43.0000 - val_tn: 28237.0000 - val_fn: 1184.0000 - val_precision: 0.9905 - val_recall: 0.7907 - val_auc: 0.9942 - val_prc: 0.9803
Epoch 61/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0889 - categorical_accuracy: 0.8476 - tp: 3776.0000 - fp: 264.0000 - tn: 28016.0000 - fn: 1880.0000 - precision: 0.9347 - recall: 0.6676 - auc: 0.9782 - prc: 0.9193 - val_loss: 0.0437 - val_categorical_accuracy: 0.9478 - val_tp: 4435.0000 - val_fp: 28.0000 - val_tn: 28252.0000 - val_fn: 1221.0000 - val_precision: 0.9937 - val_recall: 0.7841 - val_auc: 0.9952 - val_prc: 0.9844
Epoch 62/80
177/177 [==============================] - 7s 42ms/step - loss: 0.0860 - categorical_accuracy: 0.8487 - tp: 3834.0000 - fp: 227.0000 - tn: 28053.0000 - fn: 1822.0000 - precision: 0.9441 - recall: 0.6779 - auc: 0.9795 - prc: 0.9239 - val_loss: 0.0430 - val_categorical_accuracy: 0.9484 - val_tp: 4533.0000 - val_fp: 70.0000 - val_tn: 28210.0000 - val_fn: 1123.0000 - val_precision: 0.9848 - val_recall: 0.8014 - val_auc: 0.9949 - val_prc: 0.9817
Epoch 63/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0852 - categorical_accuracy: 0.8490 - tp: 3804.0000 - fp: 236.0000 - tn: 28044.0000 - fn: 1852.0000 - precision: 0.9416 - recall: 0.6726 - auc: 0.9794 - prc: 0.9248 - val_loss: 0.0526 - val_categorical_accuracy: 0.9381 - val_tp: 4056.0000 - val_fp: 4.0000 - val_tn: 28276.0000 - val_fn: 1600.0000 - val_precision: 0.9990 - val_recall: 0.7171 - val_auc: 0.9936 - val_prc: 0.9811
Epoch 64/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0859 - categorical_accuracy: 0.8462 - tp: 3861.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 1795.0000 - precision: 0.9426 - recall: 0.6826 - auc: 0.9791 - prc: 0.9237 - val_loss: 0.0344 - val_categorical_accuracy: 0.9669 - val_tp: 4655.0000 - val_fp: 14.0000 - val_tn: 28266.0000 - val_fn: 1001.0000 - val_precision: 0.9970 - val_recall: 0.8230 - val_auc: 0.9976 - val_prc: 0.9919
Epoch 65/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0864 - categorical_accuracy: 0.8508 - tp: 3901.0000 - fp: 245.0000 - tn: 28035.0000 - fn: 1755.0000 - precision: 0.9409 - recall: 0.6897 - auc: 0.9790 - prc: 0.9244 - val_loss: 0.0374 - val_categorical_accuracy: 0.9590 - val_tp: 4643.0000 - val_fp: 20.0000 - val_tn: 28260.0000 - val_fn: 1013.0000 - val_precision: 0.9957 - val_recall: 0.8209 - val_auc: 0.9965 - val_prc: 0.9887
Epoch 66/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0844 - categorical_accuracy: 0.8524 - tp: 3911.0000 - fp: 229.0000 - tn: 28051.0000 - fn: 1745.0000 - precision: 0.9447 - recall: 0.6915 - auc: 0.9800 - prc: 0.9276 - val_loss: 0.0340 - val_categorical_accuracy: 0.9636 - val_tp: 4705.0000 - val_fp: 17.0000 - val_tn: 28263.0000 - val_fn: 951.0000 - val_precision: 0.9964 - val_recall: 0.8319 - val_auc: 0.9973 - val_prc: 0.9907
Epoch 67/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0841 - categorical_accuracy: 0.8538 - tp: 3907.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 1749.0000 - precision: 0.9442 - recall: 0.6908 - auc: 0.9799 - prc: 0.9266 - val_loss: 0.0368 - val_categorical_accuracy: 0.9680 - val_tp: 4580.0000 - val_fp: 5.0000 - val_tn: 28275.0000 - val_fn: 1076.0000 - val_precision: 0.9989 - val_recall: 0.8098 - val_auc: 0.9973 - val_prc: 0.9921
Epoch 68/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0819 - categorical_accuracy: 0.8573 - tp: 3932.0000 - fp: 236.0000 - tn: 28044.0000 - fn: 1724.0000 - precision: 0.9434 - recall: 0.6952 - auc: 0.9809 - prc: 0.9302 - val_loss: 0.0348 - val_categorical_accuracy: 0.9684 - val_tp: 4682.0000 - val_fp: 16.0000 - val_tn: 28264.0000 - val_fn: 974.0000 - val_precision: 0.9966 - val_recall: 0.8278 - val_auc: 0.9971 - val_prc: 0.9908
Epoch 69/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0779 - categorical_accuracy: 0.8681 - tp: 4056.0000 - fp: 217.0000 - tn: 28063.0000 - fn: 1600.0000 - precision: 0.9492 - recall: 0.7171 - auc: 0.9827 - prc: 0.9372 - val_loss: 0.0575 - val_categorical_accuracy: 0.9275 - val_tp: 3934.0000 - val_fp: 6.0000 - val_tn: 28274.0000 - val_fn: 1722.0000 - val_precision: 0.9985 - val_recall: 0.6955 - val_auc: 0.9919 - val_prc: 0.9755
Epoch 70/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0760 - categorical_accuracy: 0.8658 - tp: 4068.0000 - fp: 224.0000 - tn: 28056.0000 - fn: 1588.0000 - precision: 0.9478 - recall: 0.7192 - auc: 0.9835 - prc: 0.9393 - val_loss: 0.0308 - val_categorical_accuracy: 0.9639 - val_tp: 4861.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 795.0000 - val_precision: 0.9949 - val_recall: 0.8594 - val_auc: 0.9975 - val_prc: 0.9914
Epoch 71/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0797 - categorical_accuracy: 0.8619 - tp: 4057.0000 - fp: 246.0000 - tn: 28034.0000 - fn: 1599.0000 - precision: 0.9428 - recall: 0.7173 - auc: 0.9822 - prc: 0.9350 - val_loss: 0.0434 - val_categorical_accuracy: 0.9454 - val_tp: 4529.0000 - val_fp: 65.0000 - val_tn: 28215.0000 - val_fn: 1127.0000 - val_precision: 0.9859 - val_recall: 0.8007 - val_auc: 0.9948 - val_prc: 0.9814
Epoch 72/80
177/177 [==============================] - 7s 42ms/step - loss: 0.0795 - categorical_accuracy: 0.8607 - tp: 4033.0000 - fp: 238.0000 - tn: 28042.0000 - fn: 1623.0000 - precision: 0.9443 - recall: 0.7130 - auc: 0.9822 - prc: 0.9345 - val_loss: 0.0333 - val_categorical_accuracy: 0.9689 - val_tp: 4705.0000 - val_fp: 4.0000 - val_tn: 28276.0000 - val_fn: 951.0000 - val_precision: 0.9992 - val_recall: 0.8319 - val_auc: 0.9976 - val_prc: 0.9928
Epoch 73/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0771 - categorical_accuracy: 0.8692 - tp: 4073.0000 - fp: 206.0000 - tn: 28074.0000 - fn: 1583.0000 - precision: 0.9519 - recall: 0.7201 - auc: 0.9836 - prc: 0.9389 - val_loss: 0.0452 - val_categorical_accuracy: 0.9272 - val_tp: 4697.0000 - val_fp: 118.0000 - val_tn: 28162.0000 - val_fn: 959.0000 - val_precision: 0.9755 - val_recall: 0.8304 - val_auc: 0.9936 - val_prc: 0.9769
Epoch 74/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0773 - categorical_accuracy: 0.8704 - tp: 4131.0000 - fp: 245.0000 - tn: 28035.0000 - fn: 1525.0000 - precision: 0.9440 - recall: 0.7304 - auc: 0.9836 - prc: 0.9394 - val_loss: 0.0389 - val_categorical_accuracy: 0.9394 - val_tp: 4713.0000 - val_fp: 111.0000 - val_tn: 28169.0000 - val_fn: 943.0000 - val_precision: 0.9770 - val_recall: 0.8333 - val_auc: 0.9962 - val_prc: 0.9834
Epoch 75/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0802 - categorical_accuracy: 0.8716 - tp: 4113.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 1543.0000 - precision: 0.9468 - recall: 0.7272 - auc: 0.9824 - prc: 0.9361 - val_loss: 0.0305 - val_categorical_accuracy: 0.9768 - val_tp: 4788.0000 - val_fp: 6.0000 - val_tn: 28274.0000 - val_fn: 868.0000 - val_precision: 0.9987 - val_recall: 0.8465 - val_auc: 0.9983 - val_prc: 0.9948
Epoch 76/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0746 - categorical_accuracy: 0.8695 - tp: 4155.0000 - fp: 227.0000 - tn: 28053.0000 - fn: 1501.0000 - precision: 0.9482 - recall: 0.7346 - auc: 0.9842 - prc: 0.9416 - val_loss: 0.0413 - val_categorical_accuracy: 0.9599 - val_tp: 4462.0000 - val_fp: 21.0000 - val_tn: 28259.0000 - val_fn: 1194.0000 - val_precision: 0.9953 - val_recall: 0.7889 - val_auc: 0.9965 - val_prc: 0.9889
Epoch 77/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0760 - categorical_accuracy: 0.8731 - tp: 4171.0000 - fp: 226.0000 - tn: 28054.0000 - fn: 1485.0000 - precision: 0.9486 - recall: 0.7374 - auc: 0.9846 - prc: 0.9416 - val_loss: 0.0259 - val_categorical_accuracy: 0.9760 - val_tp: 5004.0000 - val_fp: 24.0000 - val_tn: 28256.0000 - val_fn: 652.0000 - val_precision: 0.9952 - val_recall: 0.8847 - val_auc: 0.9985 - val_prc: 0.9949
Epoch 78/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0742 - categorical_accuracy: 0.8727 - tp: 4095.0000 - fp: 226.0000 - tn: 28054.0000 - fn: 1561.0000 - precision: 0.9477 - recall: 0.7240 - auc: 0.9846 - prc: 0.9421 - val_loss: 0.0404 - val_categorical_accuracy: 0.9480 - val_tp: 4742.0000 - val_fp: 98.0000 - val_tn: 28182.0000 - val_fn: 914.0000 - val_precision: 0.9798 - val_recall: 0.8384 - val_auc: 0.9954 - val_prc: 0.9821
Epoch 79/80
177/177 [==============================] - 6s 33ms/step - loss: 0.0730 - categorical_accuracy: 0.8750 - tp: 4173.0000 - fp: 208.0000 - tn: 28072.0000 - fn: 1483.0000 - precision: 0.9525 - recall: 0.7378 - auc: 0.9850 - prc: 0.9446 - val_loss: 0.0335 - val_categorical_accuracy: 0.9505 - val_tp: 4876.0000 - val_fp: 97.0000 - val_tn: 28183.0000 - val_fn: 780.0000 - val_precision: 0.9805 - val_recall: 0.8621 - val_auc: 0.9971 - val_prc: 0.9877
Epoch 80/80
177/177 [==============================] - 6s 34ms/step - loss: 0.0725 - categorical_accuracy: 0.8844 - tp: 4229.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 1427.0000 - precision: 0.9482 - recall: 0.7477 - auc: 0.9851 - prc: 0.9459 - val_loss: 0.0331 - val_categorical_accuracy: 0.9680 - val_tp: 4823.0000 - val_fp: 15.0000 - val_tn: 28265.0000 - val_fn: 833.0000 - val_precision: 0.9969 - val_recall: 0.8527 - val_auc: 0.9971 - val_prc: 0.9917

Analyze low capacity model, with dropout training history

In [186]:
# %reload_ext tensorboard
# %tensorboard --logdir ./tensorboard/low_capacity_reg_model --bind_all

Screen Shot 2023-03-19 at 4.08.16 AM.png

In [14]:
def evaluate_model_performance(model, on_validation, on_test):
    validation = model.evaluate(on_validation, batch_size=BATCH_SIZE)
    print(f"Validation AUC: {validation[8]:.3f}")
    print(f"Validation PRC: {validation[9]:.3f}")
    print(f"Validation categorical accuracy: {validation[1]:.3f}")
    test = model.evaluate(on_test, batch_size=BATCH_SIZE)
    print(f"Test AUC: {test[8]:.3f}")
    print(f"Test PRC: {test[9]:.3f}")
    print(f"Test categorical accuracy: {test[1]:.3f}")
    

Evaluate low capacity model, with dropout layer performance on TEST set

In [210]:
evaluate_model_performance(low_cap_model, ds_validation, ds_test)
177/177 [==============================] - 2s 11ms/step - loss: 0.0331 - categorical_accuracy: 0.9680 - tp: 4823.0000 - fp: 15.0000 - tn: 28265.0000 - fn: 833.0000 - precision: 0.9969 - recall: 0.8527 - auc: 0.9971 - prc: 0.9917
Validation AUC: 0.997
Validation PRC: 0.992
Validation categorical accuracy: 0.968
59/59 [==============================] - 0s 7ms/step - loss: 0.2347 - categorical_accuracy: 0.6027 - tp: 703.0000 - fp: 197.0000 - tn: 9228.0000 - fn: 1182.0000 - precision: 0.7811 - recall: 0.3729 - auc: 0.8805 - prc: 0.6439
Test AUC: 0.881
Test PRC: 0.644
Test categorical accuracy: 0.603

Large capacity with dropout

In [212]:
large_cap_model = get_model_with_dropout("large_capacity_reg_model", low_capacity=False, dropout=True)
large_cap_model.summary()
callbacks_list = [
    keras.callbacks.ModelCheckpoint(filepath=f"models/large_capacity_reg_model.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/large_capacity_reg_model") # path where callback writes logs
]
large_capacity_reg_history = large_cap_model.fit(
    ds_train,
    epochs=66,
    validation_data=ds_validation,
    callbacks=callbacks_list
)
Model: "large_capacity_reg_model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d_83 (Conv2D)          (None, 222, 222, 32)      896       
                                                                 
 max_pooling2d_62 (MaxPoolin  (None, 111, 111, 32)     0         
 g2D)                                                            
                                                                 
 conv2d_84 (Conv2D)          (None, 109, 109, 64)      18496     
                                                                 
 max_pooling2d_63 (MaxPoolin  (None, 54, 54, 64)       0         
 g2D)                                                            
                                                                 
 conv2d_85 (Conv2D)          (None, 52, 52, 128)       73856     
                                                                 
 max_pooling2d_64 (MaxPoolin  (None, 26, 26, 128)      0         
 g2D)                                                            
                                                                 
 conv2d_86 (Conv2D)          (None, 24, 24, 256)       295168    
                                                                 
 max_pooling2d_65 (MaxPoolin  (None, 12, 12, 256)      0         
 g2D)                                                            
                                                                 
 conv2d_87 (Conv2D)          (None, 10, 10, 512)       1180160   
                                                                 
 max_pooling2d_66 (MaxPoolin  (None, 5, 5, 512)        0         
 g2D)                                                            
                                                                 
 conv2d_88 (Conv2D)          (None, 3, 3, 1024)        4719616   
                                                                 
 dropout_10 (Dropout)        (None, 3, 3, 1024)        0         
                                                                 
 flatten_21 (Flatten)        (None, 9216)              0         
                                                                 
 dropout_11 (Dropout)        (None, 9216)              0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 55302     
                                                                 
=================================================================
Total params: 6,343,494
Trainable params: 6,343,494
Non-trainable params: 0
_________________________________________________________________
Epoch 1/66
2023-03-02 04:21:16.968791: E tensorflow/core/grappler/optimizers/meta_optimizer.cc:954] layout failed: INVALID_ARGUMENT: Size of values 0 does not match size of permutation 4 @ fanin shape inlarge_capacity_reg_model/dropout_10/dropout/SelectV2-2-TransposeNHWCToNCHW-LayoutOptimizer
177/177 [==============================] - 24s 118ms/step - loss: 0.2667 - categorical_accuracy: 0.4457 - tp: 121.0000 - fp: 125.0000 - tn: 48795.0000 - fn: 9663.0000 - precision: 0.4919 - recall: 0.0124 - auc: 0.7945 - prc: 0.3980 - val_loss: 0.2440 - val_categorical_accuracy: 0.4977 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8258 - val_prc: 0.5068
Epoch 2/66
177/177 [==============================] - 20s 114ms/step - loss: 0.2353 - categorical_accuracy: 0.4998 - tp: 97.0000 - fp: 66.0000 - tn: 28214.0000 - fn: 5559.0000 - precision: 0.5951 - recall: 0.0171 - auc: 0.8160 - prc: 0.4520 - val_loss: 0.2136 - val_categorical_accuracy: 0.5656 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8513 - val_prc: 0.5610
Epoch 3/66
177/177 [==============================] - 20s 116ms/step - loss: 0.2151 - categorical_accuracy: 0.5534 - tp: 148.0000 - fp: 67.0000 - tn: 28213.0000 - fn: 5508.0000 - precision: 0.6884 - recall: 0.0262 - auc: 0.8489 - prc: 0.5268 - val_loss: 0.2066 - val_categorical_accuracy: 0.5698 - val_tp: 30.0000 - val_fp: 6.0000 - val_tn: 28274.0000 - val_fn: 5626.0000 - val_precision: 0.8333 - val_recall: 0.0053 - val_auc: 0.8597 - val_prc: 0.5670
Epoch 4/66
177/177 [==============================] - 20s 114ms/step - loss: 0.2074 - categorical_accuracy: 0.5840 - tp: 281.0000 - fp: 128.0000 - tn: 28152.0000 - fn: 5375.0000 - precision: 0.6870 - recall: 0.0497 - auc: 0.8610 - prc: 0.5608 - val_loss: 0.1999 - val_categorical_accuracy: 0.6105 - val_tp: 61.0000 - val_fp: 18.0000 - val_tn: 28262.0000 - val_fn: 5595.0000 - val_precision: 0.7722 - val_recall: 0.0108 - val_auc: 0.8749 - val_prc: 0.5987
Epoch 5/66
177/177 [==============================] - 20s 115ms/step - loss: 0.2014 - categorical_accuracy: 0.6110 - tp: 488.0000 - fp: 195.0000 - tn: 28085.0000 - fn: 5168.0000 - precision: 0.7145 - recall: 0.0863 - auc: 0.8738 - prc: 0.5896 - val_loss: 0.1837 - val_categorical_accuracy: 0.6425 - val_tp: 277.0000 - val_fp: 66.0000 - val_tn: 28214.0000 - val_fn: 5379.0000 - val_precision: 0.8076 - val_recall: 0.0490 - val_auc: 0.8956 - val_prc: 0.6544
Epoch 6/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1940 - categorical_accuracy: 0.6195 - tp: 696.0000 - fp: 239.0000 - tn: 28041.0000 - fn: 4960.0000 - precision: 0.7444 - recall: 0.1231 - auc: 0.8816 - prc: 0.6104 - val_loss: 0.1824 - val_categorical_accuracy: 0.6406 - val_tp: 187.0000 - val_fp: 33.0000 - val_tn: 28247.0000 - val_fn: 5469.0000 - val_precision: 0.8500 - val_recall: 0.0331 - val_auc: 0.8948 - val_prc: 0.6578
Epoch 7/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1897 - categorical_accuracy: 0.6266 - tp: 932.0000 - fp: 240.0000 - tn: 28040.0000 - fn: 4724.0000 - precision: 0.7952 - recall: 0.1648 - auc: 0.8880 - prc: 0.6300 - val_loss: 0.1747 - val_categorical_accuracy: 0.6558 - val_tp: 1152.0000 - val_fp: 247.0000 - val_tn: 28033.0000 - val_fn: 4504.0000 - val_precision: 0.8234 - val_recall: 0.2037 - val_auc: 0.9051 - val_prc: 0.6846
Epoch 8/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1832 - categorical_accuracy: 0.6427 - tp: 1153.0000 - fp: 301.0000 - tn: 27979.0000 - fn: 4503.0000 - precision: 0.7930 - recall: 0.2039 - auc: 0.8956 - prc: 0.6520 - val_loss: 0.1767 - val_categorical_accuracy: 0.6547 - val_tp: 153.0000 - val_fp: 17.0000 - val_tn: 28263.0000 - val_fn: 5503.0000 - val_precision: 0.9000 - val_recall: 0.0271 - val_auc: 0.9066 - val_prc: 0.6874
Epoch 9/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1784 - categorical_accuracy: 0.6437 - tp: 1355.0000 - fp: 319.0000 - tn: 27961.0000 - fn: 4301.0000 - precision: 0.8094 - recall: 0.2396 - auc: 0.9022 - prc: 0.6715 - val_loss: 0.1745 - val_categorical_accuracy: 0.6521 - val_tp: 2772.0000 - val_fp: 677.0000 - val_tn: 27603.0000 - val_fn: 2884.0000 - val_precision: 0.8037 - val_recall: 0.4901 - val_auc: 0.9116 - val_prc: 0.7249
Epoch 10/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1763 - categorical_accuracy: 0.6510 - tp: 1470.0000 - fp: 288.0000 - tn: 27992.0000 - fn: 4186.0000 - precision: 0.8362 - recall: 0.2599 - auc: 0.9063 - prc: 0.6860 - val_loss: 0.1965 - val_categorical_accuracy: 0.6130 - val_tp: 2405.0000 - val_fp: 751.0000 - val_tn: 27529.0000 - val_fn: 3251.0000 - val_precision: 0.7620 - val_recall: 0.4252 - val_auc: 0.8874 - val_prc: 0.6635
Epoch 11/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1702 - categorical_accuracy: 0.6582 - tp: 1688.0000 - fp: 307.0000 - tn: 27973.0000 - fn: 3968.0000 - precision: 0.8461 - recall: 0.2984 - auc: 0.9121 - prc: 0.7077 - val_loss: 0.1487 - val_categorical_accuracy: 0.7028 - val_tp: 1415.0000 - val_fp: 92.0000 - val_tn: 28188.0000 - val_fn: 4241.0000 - val_precision: 0.9390 - val_recall: 0.2502 - val_auc: 0.9332 - val_prc: 0.7801
Epoch 12/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1640 - categorical_accuracy: 0.6719 - tp: 1888.0000 - fp: 293.0000 - tn: 27987.0000 - fn: 3768.0000 - precision: 0.8657 - recall: 0.3338 - auc: 0.9193 - prc: 0.7337 - val_loss: 0.1413 - val_categorical_accuracy: 0.7191 - val_tp: 2381.0000 - val_fp: 325.0000 - val_tn: 27955.0000 - val_fn: 3275.0000 - val_precision: 0.8799 - val_recall: 0.4210 - val_auc: 0.9402 - val_prc: 0.7957
Epoch 13/66
177/177 [==============================] - 20s 116ms/step - loss: 0.1559 - categorical_accuracy: 0.6913 - tp: 2091.0000 - fp: 279.0000 - tn: 28001.0000 - fn: 3565.0000 - precision: 0.8823 - recall: 0.3697 - auc: 0.9272 - prc: 0.7597 - val_loss: 0.1446 - val_categorical_accuracy: 0.6904 - val_tp: 2832.0000 - val_fp: 345.0000 - val_tn: 27935.0000 - val_fn: 2824.0000 - val_precision: 0.8914 - val_recall: 0.5007 - val_auc: 0.9364 - val_prc: 0.8035
Epoch 14/66
177/177 [==============================] - 21s 120ms/step - loss: 0.1493 - categorical_accuracy: 0.7033 - tp: 2309.0000 - fp: 288.0000 - tn: 27992.0000 - fn: 3347.0000 - precision: 0.8891 - recall: 0.4082 - auc: 0.9344 - prc: 0.7813 - val_loss: 0.1758 - val_categorical_accuracy: 0.6623 - val_tp: 1396.0000 - val_fp: 182.0000 - val_tn: 28098.0000 - val_fn: 4260.0000 - val_precision: 0.8847 - val_recall: 0.2468 - val_auc: 0.9024 - val_prc: 0.7065
Epoch 15/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1457 - categorical_accuracy: 0.7157 - tp: 2454.0000 - fp: 268.0000 - tn: 28012.0000 - fn: 3202.0000 - precision: 0.9015 - recall: 0.4339 - auc: 0.9380 - prc: 0.7942 - val_loss: 0.1019 - val_categorical_accuracy: 0.8057 - val_tp: 2872.0000 - val_fp: 64.0000 - val_tn: 28216.0000 - val_fn: 2784.0000 - val_precision: 0.9782 - val_recall: 0.5078 - val_auc: 0.9681 - val_prc: 0.8948
Epoch 16/66
177/177 [==============================] - 21s 121ms/step - loss: 0.1352 - categorical_accuracy: 0.7389 - tp: 2710.0000 - fp: 260.0000 - tn: 28020.0000 - fn: 2946.0000 - precision: 0.9125 - recall: 0.4791 - auc: 0.9470 - prc: 0.8230 - val_loss: 0.0948 - val_categorical_accuracy: 0.8237 - val_tp: 3050.0000 - val_fp: 75.0000 - val_tn: 28205.0000 - val_fn: 2606.0000 - val_precision: 0.9760 - val_recall: 0.5393 - val_auc: 0.9735 - val_prc: 0.9083
Epoch 17/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1299 - categorical_accuracy: 0.7521 - tp: 2839.0000 - fp: 266.0000 - tn: 28014.0000 - fn: 2817.0000 - precision: 0.9143 - recall: 0.5019 - auc: 0.9520 - prc: 0.8372 - val_loss: 0.0844 - val_categorical_accuracy: 0.8327 - val_tp: 3705.0000 - val_fp: 167.0000 - val_tn: 28113.0000 - val_fn: 1951.0000 - val_precision: 0.9569 - val_recall: 0.6551 - val_auc: 0.9781 - val_prc: 0.9225
Epoch 18/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1232 - categorical_accuracy: 0.7649 - tp: 3022.0000 - fp: 279.0000 - tn: 28001.0000 - fn: 2634.0000 - precision: 0.9155 - recall: 0.5343 - auc: 0.9570 - prc: 0.8535 - val_loss: 0.0858 - val_categorical_accuracy: 0.8566 - val_tp: 3067.0000 - val_fp: 40.0000 - val_tn: 28240.0000 - val_fn: 2589.0000 - val_precision: 0.9871 - val_recall: 0.5423 - val_auc: 0.9804 - val_prc: 0.9306
Epoch 19/66
177/177 [==============================] - 21s 118ms/step - loss: 0.1169 - categorical_accuracy: 0.7829 - tp: 3157.0000 - fp: 253.0000 - tn: 28027.0000 - fn: 2499.0000 - precision: 0.9258 - recall: 0.5582 - auc: 0.9611 - prc: 0.8664 - val_loss: 0.0933 - val_categorical_accuracy: 0.8315 - val_tp: 3444.0000 - val_fp: 190.0000 - val_tn: 28090.0000 - val_fn: 2212.0000 - val_precision: 0.9477 - val_recall: 0.6089 - val_auc: 0.9748 - val_prc: 0.9078
Epoch 20/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1157 - categorical_accuracy: 0.7859 - tp: 3254.0000 - fp: 278.0000 - tn: 28002.0000 - fn: 2402.0000 - precision: 0.9213 - recall: 0.5753 - auc: 0.9628 - prc: 0.8732 - val_loss: 0.0779 - val_categorical_accuracy: 0.8644 - val_tp: 3444.0000 - val_fp: 68.0000 - val_tn: 28212.0000 - val_fn: 2212.0000 - val_precision: 0.9806 - val_recall: 0.6089 - val_auc: 0.9843 - val_prc: 0.9407
Epoch 21/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1280 - categorical_accuracy: 0.7811 - tp: 3169.0000 - fp: 291.0000 - tn: 27989.0000 - fn: 2487.0000 - precision: 0.9159 - recall: 0.5603 - auc: 0.9602 - prc: 0.8607 - val_loss: 0.1076 - val_categorical_accuracy: 0.8142 - val_tp: 3269.0000 - val_fp: 256.0000 - val_tn: 28024.0000 - val_fn: 2387.0000 - val_precision: 0.9274 - val_recall: 0.5780 - val_auc: 0.9697 - val_prc: 0.8847
Epoch 22/66
177/177 [==============================] - 20s 116ms/step - loss: 0.1251 - categorical_accuracy: 0.7875 - tp: 3317.0000 - fp: 287.0000 - tn: 27993.0000 - fn: 2339.0000 - precision: 0.9204 - recall: 0.5865 - auc: 0.9625 - prc: 0.8706 - val_loss: 0.0796 - val_categorical_accuracy: 0.8624 - val_tp: 3524.0000 - val_fp: 73.0000 - val_tn: 28207.0000 - val_fn: 2132.0000 - val_precision: 0.9797 - val_recall: 0.6231 - val_auc: 0.9816 - val_prc: 0.9348
Epoch 23/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1122 - categorical_accuracy: 0.7988 - tp: 3432.0000 - fp: 271.0000 - tn: 28009.0000 - fn: 2224.0000 - precision: 0.9268 - recall: 0.6068 - auc: 0.9667 - prc: 0.8831 - val_loss: 0.0631 - val_categorical_accuracy: 0.8768 - val_tp: 4080.0000 - val_fp: 130.0000 - val_tn: 28150.0000 - val_fn: 1576.0000 - val_precision: 0.9691 - val_recall: 0.7214 - val_auc: 0.9885 - val_prc: 0.9558
Epoch 24/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1077 - categorical_accuracy: 0.8177 - tp: 3535.0000 - fp: 283.0000 - tn: 27997.0000 - fn: 2121.0000 - precision: 0.9259 - recall: 0.6250 - auc: 0.9701 - prc: 0.8949 - val_loss: 0.0853 - val_categorical_accuracy: 0.8621 - val_tp: 3706.0000 - val_fp: 111.0000 - val_tn: 28169.0000 - val_fn: 1950.0000 - val_precision: 0.9709 - val_recall: 0.6552 - val_auc: 0.9751 - val_prc: 0.9289
Epoch 25/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1120 - categorical_accuracy: 0.8091 - tp: 3560.0000 - fp: 284.0000 - tn: 27996.0000 - fn: 2096.0000 - precision: 0.9261 - recall: 0.6294 - auc: 0.9690 - prc: 0.8898 - val_loss: 0.0682 - val_categorical_accuracy: 0.8782 - val_tp: 4004.0000 - val_fp: 126.0000 - val_tn: 28154.0000 - val_fn: 1652.0000 - val_precision: 0.9695 - val_recall: 0.7079 - val_auc: 0.9872 - val_prc: 0.9508
Epoch 26/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1209 - categorical_accuracy: 0.7967 - tp: 3384.0000 - fp: 298.0000 - tn: 27982.0000 - fn: 2272.0000 - precision: 0.9191 - recall: 0.5983 - auc: 0.9654 - prc: 0.8795 - val_loss: 0.0678 - val_categorical_accuracy: 0.8819 - val_tp: 3998.0000 - val_fp: 103.0000 - val_tn: 28177.0000 - val_fn: 1658.0000 - val_precision: 0.9749 - val_recall: 0.7069 - val_auc: 0.9877 - val_prc: 0.9528
Epoch 27/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1139 - categorical_accuracy: 0.8046 - tp: 3497.0000 - fp: 297.0000 - tn: 27983.0000 - fn: 2159.0000 - precision: 0.9217 - recall: 0.6183 - auc: 0.9674 - prc: 0.8860 - val_loss: 0.0793 - val_categorical_accuracy: 0.8711 - val_tp: 3226.0000 - val_fp: 40.0000 - val_tn: 28240.0000 - val_fn: 2430.0000 - val_precision: 0.9878 - val_recall: 0.5704 - val_auc: 0.9842 - val_prc: 0.9433
Epoch 28/66
177/177 [==============================] - 20s 116ms/step - loss: 0.1087 - categorical_accuracy: 0.8106 - tp: 3642.0000 - fp: 294.0000 - tn: 27986.0000 - fn: 2014.0000 - precision: 0.9253 - recall: 0.6439 - auc: 0.9705 - prc: 0.8951 - val_loss: 0.0610 - val_categorical_accuracy: 0.8762 - val_tp: 4177.0000 - val_fp: 109.0000 - val_tn: 28171.0000 - val_fn: 1479.0000 - val_precision: 0.9746 - val_recall: 0.7385 - val_auc: 0.9895 - val_prc: 0.9587
Epoch 29/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1126 - categorical_accuracy: 0.8200 - tp: 3599.0000 - fp: 297.0000 - tn: 27983.0000 - fn: 2057.0000 - precision: 0.9238 - recall: 0.6363 - auc: 0.9704 - prc: 0.8949 - val_loss: 0.0944 - val_categorical_accuracy: 0.8626 - val_tp: 4218.0000 - val_fp: 258.0000 - val_tn: 28022.0000 - val_fn: 1438.0000 - val_precision: 0.9424 - val_recall: 0.7458 - val_auc: 0.9809 - val_prc: 0.9298
Epoch 30/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1062 - categorical_accuracy: 0.8220 - tp: 3764.0000 - fp: 312.0000 - tn: 27968.0000 - fn: 1892.0000 - precision: 0.9235 - recall: 0.6655 - auc: 0.9724 - prc: 0.9029 - val_loss: 0.1128 - val_categorical_accuracy: 0.8449 - val_tp: 1778.0000 - val_fp: 23.0000 - val_tn: 28257.0000 - val_fn: 3878.0000 - val_precision: 0.9872 - val_recall: 0.3144 - val_auc: 0.9775 - val_prc: 0.9164
Epoch 31/66
177/177 [==============================] - 21s 116ms/step - loss: 0.1268 - categorical_accuracy: 0.7986 - tp: 3426.0000 - fp: 305.0000 - tn: 27975.0000 - fn: 2230.0000 - precision: 0.9183 - recall: 0.6057 - auc: 0.9638 - prc: 0.8751 - val_loss: 0.0796 - val_categorical_accuracy: 0.8570 - val_tp: 4358.0000 - val_fp: 295.0000 - val_tn: 27985.0000 - val_fn: 1298.0000 - val_precision: 0.9366 - val_recall: 0.7705 - val_auc: 0.9827 - val_prc: 0.9387
Epoch 32/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1186 - categorical_accuracy: 0.8075 - tp: 3628.0000 - fp: 317.0000 - tn: 27963.0000 - fn: 2028.0000 - precision: 0.9196 - recall: 0.6414 - auc: 0.9684 - prc: 0.8894 - val_loss: 0.0506 - val_categorical_accuracy: 0.9114 - val_tp: 4595.0000 - val_fp: 133.0000 - val_tn: 28147.0000 - val_fn: 1061.0000 - val_precision: 0.9719 - val_recall: 0.8124 - val_auc: 0.9933 - val_prc: 0.9727
Epoch 33/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1145 - categorical_accuracy: 0.8190 - tp: 3739.0000 - fp: 311.0000 - tn: 27969.0000 - fn: 1917.0000 - precision: 0.9232 - recall: 0.6611 - auc: 0.9712 - prc: 0.8994 - val_loss: 0.1394 - val_categorical_accuracy: 0.7456 - val_tp: 2571.0000 - val_fp: 279.0000 - val_tn: 28001.0000 - val_fn: 3085.0000 - val_precision: 0.9021 - val_recall: 0.4546 - val_auc: 0.9424 - val_prc: 0.8174
Epoch 34/66
177/177 [==============================] - 20s 116ms/step - loss: 0.1258 - categorical_accuracy: 0.7981 - tp: 3457.0000 - fp: 323.0000 - tn: 27957.0000 - fn: 2199.0000 - precision: 0.9146 - recall: 0.6112 - auc: 0.9635 - prc: 0.8751 - val_loss: 0.0793 - val_categorical_accuracy: 0.8766 - val_tp: 4123.0000 - val_fp: 134.0000 - val_tn: 28146.0000 - val_fn: 1533.0000 - val_precision: 0.9685 - val_recall: 0.7290 - val_auc: 0.9849 - val_prc: 0.9452
Epoch 35/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1395 - categorical_accuracy: 0.7772 - tp: 3260.0000 - fp: 334.0000 - tn: 27946.0000 - fn: 2396.0000 - precision: 0.9071 - recall: 0.5764 - auc: 0.9579 - prc: 0.8575 - val_loss: 0.0840 - val_categorical_accuracy: 0.8623 - val_tp: 3520.0000 - val_fp: 104.0000 - val_tn: 28176.0000 - val_fn: 2136.0000 - val_precision: 0.9713 - val_recall: 0.6223 - val_auc: 0.9790 - val_prc: 0.9292
Epoch 36/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1267 - categorical_accuracy: 0.7983 - tp: 3491.0000 - fp: 333.0000 - tn: 27947.0000 - fn: 2165.0000 - precision: 0.9129 - recall: 0.6172 - auc: 0.9648 - prc: 0.8791 - val_loss: 0.0883 - val_categorical_accuracy: 0.8497 - val_tp: 3679.0000 - val_fp: 185.0000 - val_tn: 28095.0000 - val_fn: 1977.0000 - val_precision: 0.9521 - val_recall: 0.6505 - val_auc: 0.9792 - val_prc: 0.9246
Epoch 37/66
177/177 [==============================] - 20s 116ms/step - loss: 0.1165 - categorical_accuracy: 0.8145 - tp: 3633.0000 - fp: 313.0000 - tn: 27967.0000 - fn: 2023.0000 - precision: 0.9207 - recall: 0.6423 - auc: 0.9690 - prc: 0.8929 - val_loss: 0.0455 - val_categorical_accuracy: 0.9254 - val_tp: 4617.0000 - val_fp: 81.0000 - val_tn: 28199.0000 - val_fn: 1039.0000 - val_precision: 0.9828 - val_recall: 0.8163 - val_auc: 0.9943 - val_prc: 0.9775
Epoch 38/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1292 - categorical_accuracy: 0.7983 - tp: 3442.0000 - fp: 352.0000 - tn: 27928.0000 - fn: 2214.0000 - precision: 0.9072 - recall: 0.6086 - auc: 0.9625 - prc: 0.8743 - val_loss: 0.0860 - val_categorical_accuracy: 0.8635 - val_tp: 3271.0000 - val_fp: 83.0000 - val_tn: 28197.0000 - val_fn: 2385.0000 - val_precision: 0.9753 - val_recall: 0.5783 - val_auc: 0.9803 - val_prc: 0.9319
Epoch 39/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1417 - categorical_accuracy: 0.7755 - tp: 3302.0000 - fp: 329.0000 - tn: 27951.0000 - fn: 2354.0000 - precision: 0.9094 - recall: 0.5838 - auc: 0.9553 - prc: 0.8529 - val_loss: 0.1067 - val_categorical_accuracy: 0.8117 - val_tp: 2922.0000 - val_fp: 139.0000 - val_tn: 28141.0000 - val_fn: 2734.0000 - val_precision: 0.9546 - val_recall: 0.5166 - val_auc: 0.9675 - val_prc: 0.8868
Epoch 40/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1427 - categorical_accuracy: 0.7898 - tp: 3322.0000 - fp: 340.0000 - tn: 27940.0000 - fn: 2334.0000 - precision: 0.9072 - recall: 0.5873 - auc: 0.9599 - prc: 0.8625 - val_loss: 0.0956 - val_categorical_accuracy: 0.8559 - val_tp: 2600.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 3056.0000 - val_precision: 0.9886 - val_recall: 0.4597 - val_auc: 0.9808 - val_prc: 0.9302
Epoch 41/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1393 - categorical_accuracy: 0.7813 - tp: 3303.0000 - fp: 353.0000 - tn: 27927.0000 - fn: 2353.0000 - precision: 0.9034 - recall: 0.5840 - auc: 0.9573 - prc: 0.8580 - val_loss: 0.1046 - val_categorical_accuracy: 0.8204 - val_tp: 3739.0000 - val_fp: 260.0000 - val_tn: 28020.0000 - val_fn: 1917.0000 - val_precision: 0.9350 - val_recall: 0.6611 - val_auc: 0.9744 - val_prc: 0.9055
Epoch 42/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1427 - categorical_accuracy: 0.7744 - tp: 3226.0000 - fp: 349.0000 - tn: 27931.0000 - fn: 2430.0000 - precision: 0.9024 - recall: 0.5704 - auc: 0.9548 - prc: 0.8506 - val_loss: 0.0882 - val_categorical_accuracy: 0.8658 - val_tp: 2883.0000 - val_fp: 35.0000 - val_tn: 28245.0000 - val_fn: 2773.0000 - val_precision: 0.9880 - val_recall: 0.5097 - val_auc: 0.9842 - val_prc: 0.9386
Epoch 43/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1335 - categorical_accuracy: 0.7790 - tp: 3329.0000 - fp: 342.0000 - tn: 27938.0000 - fn: 2327.0000 - precision: 0.9068 - recall: 0.5886 - auc: 0.9593 - prc: 0.8607 - val_loss: 0.1714 - val_categorical_accuracy: 0.7861 - val_tp: 3714.0000 - val_fp: 489.0000 - val_tn: 27791.0000 - val_fn: 1942.0000 - val_precision: 0.8837 - val_recall: 0.6566 - val_auc: 0.9543 - val_prc: 0.8543
Epoch 44/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1521 - categorical_accuracy: 0.7675 - tp: 3078.0000 - fp: 335.0000 - tn: 27945.0000 - fn: 2578.0000 - precision: 0.9018 - recall: 0.5442 - auc: 0.9526 - prc: 0.8415 - val_loss: 0.0975 - val_categorical_accuracy: 0.8411 - val_tp: 3277.0000 - val_fp: 191.0000 - val_tn: 28089.0000 - val_fn: 2379.0000 - val_precision: 0.9449 - val_recall: 0.5794 - val_auc: 0.9759 - val_prc: 0.9101
Epoch 45/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1523 - categorical_accuracy: 0.7518 - tp: 2977.0000 - fp: 333.0000 - tn: 27947.0000 - fn: 2679.0000 - precision: 0.8994 - recall: 0.5263 - auc: 0.9471 - prc: 0.8295 - val_loss: 0.1388 - val_categorical_accuracy: 0.7042 - val_tp: 3009.0000 - val_fp: 267.0000 - val_tn: 28013.0000 - val_fn: 2647.0000 - val_precision: 0.9185 - val_recall: 0.5320 - val_auc: 0.9427 - val_prc: 0.8225
Epoch 46/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1533 - categorical_accuracy: 0.7525 - tp: 3020.0000 - fp: 368.0000 - tn: 27912.0000 - fn: 2636.0000 - precision: 0.8914 - recall: 0.5339 - auc: 0.9481 - prc: 0.8301 - val_loss: 0.0973 - val_categorical_accuracy: 0.8066 - val_tp: 3613.0000 - val_fp: 218.0000 - val_tn: 28062.0000 - val_fn: 2043.0000 - val_precision: 0.9431 - val_recall: 0.6388 - val_auc: 0.9729 - val_prc: 0.9048
Epoch 47/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1751 - categorical_accuracy: 0.7399 - tp: 2858.0000 - fp: 367.0000 - tn: 27913.0000 - fn: 2798.0000 - precision: 0.8862 - recall: 0.5053 - auc: 0.9407 - prc: 0.8070 - val_loss: 0.0821 - val_categorical_accuracy: 0.8492 - val_tp: 3851.0000 - val_fp: 167.0000 - val_tn: 28113.0000 - val_fn: 1805.0000 - val_precision: 0.9584 - val_recall: 0.6809 - val_auc: 0.9805 - val_prc: 0.9311
Epoch 48/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1583 - categorical_accuracy: 0.7343 - tp: 2944.0000 - fp: 342.0000 - tn: 27938.0000 - fn: 2712.0000 - precision: 0.8959 - recall: 0.5205 - auc: 0.9442 - prc: 0.8178 - val_loss: 0.2131 - val_categorical_accuracy: 0.5557 - val_tp: 367.0000 - val_fp: 23.0000 - val_tn: 28257.0000 - val_fn: 5289.0000 - val_precision: 0.9410 - val_recall: 0.0649 - val_auc: 0.8666 - val_prc: 0.6145
Epoch 49/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1494 - categorical_accuracy: 0.7516 - tp: 3086.0000 - fp: 348.0000 - tn: 27932.0000 - fn: 2570.0000 - precision: 0.8987 - recall: 0.5456 - auc: 0.9499 - prc: 0.8365 - val_loss: 0.0852 - val_categorical_accuracy: 0.8522 - val_tp: 4017.0000 - val_fp: 188.0000 - val_tn: 28092.0000 - val_fn: 1639.0000 - val_precision: 0.9553 - val_recall: 0.7102 - val_auc: 0.9817 - val_prc: 0.9343
Epoch 50/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1668 - categorical_accuracy: 0.7376 - tp: 2834.0000 - fp: 367.0000 - tn: 27913.0000 - fn: 2822.0000 - precision: 0.8853 - recall: 0.5011 - auc: 0.9416 - prc: 0.8106 - val_loss: 0.1221 - val_categorical_accuracy: 0.7488 - val_tp: 2734.0000 - val_fp: 87.0000 - val_tn: 28193.0000 - val_fn: 2922.0000 - val_precision: 0.9692 - val_recall: 0.4834 - val_auc: 0.9578 - val_prc: 0.8622
Epoch 51/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1595 - categorical_accuracy: 0.7350 - tp: 2840.0000 - fp: 362.0000 - tn: 27918.0000 - fn: 2816.0000 - precision: 0.8869 - recall: 0.5021 - auc: 0.9411 - prc: 0.8119 - val_loss: 0.1034 - val_categorical_accuracy: 0.8064 - val_tp: 3723.0000 - val_fp: 268.0000 - val_tn: 28012.0000 - val_fn: 1933.0000 - val_precision: 0.9328 - val_recall: 0.6582 - val_auc: 0.9700 - val_prc: 0.9010
Epoch 52/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1533 - categorical_accuracy: 0.7450 - tp: 2993.0000 - fp: 353.0000 - tn: 27927.0000 - fn: 2663.0000 - precision: 0.8945 - recall: 0.5292 - auc: 0.9468 - prc: 0.8273 - val_loss: 0.4307 - val_categorical_accuracy: 0.4949 - val_tp: 1897.0000 - val_fp: 1016.0000 - val_tn: 27264.0000 - val_fn: 3759.0000 - val_precision: 0.6512 - val_recall: 0.3354 - val_auc: 0.7961 - val_prc: 0.5318
Epoch 53/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1799 - categorical_accuracy: 0.7176 - tp: 2713.0000 - fp: 370.0000 - tn: 27910.0000 - fn: 2943.0000 - precision: 0.8800 - recall: 0.4797 - auc: 0.9343 - prc: 0.7902 - val_loss: 0.0959 - val_categorical_accuracy: 0.8299 - val_tp: 3414.0000 - val_fp: 171.0000 - val_tn: 28109.0000 - val_fn: 2242.0000 - val_precision: 0.9523 - val_recall: 0.6036 - val_auc: 0.9737 - val_prc: 0.9072
Epoch 54/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1691 - categorical_accuracy: 0.7148 - tp: 2672.0000 - fp: 332.0000 - tn: 27948.0000 - fn: 2984.0000 - precision: 0.8895 - recall: 0.4724 - auc: 0.9351 - prc: 0.7950 - val_loss: 0.1622 - val_categorical_accuracy: 0.7440 - val_tp: 1065.0000 - val_fp: 36.0000 - val_tn: 28244.0000 - val_fn: 4591.0000 - val_precision: 0.9673 - val_recall: 0.1883 - val_auc: 0.9408 - val_prc: 0.8052
Epoch 55/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1800 - categorical_accuracy: 0.6982 - tp: 2416.0000 - fp: 374.0000 - tn: 27906.0000 - fn: 3240.0000 - precision: 0.8659 - recall: 0.4272 - auc: 0.9243 - prc: 0.7630 - val_loss: 0.1403 - val_categorical_accuracy: 0.7353 - val_tp: 2387.0000 - val_fp: 209.0000 - val_tn: 28071.0000 - val_fn: 3269.0000 - val_precision: 0.9195 - val_recall: 0.4220 - val_auc: 0.9434 - val_prc: 0.8111
Epoch 56/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1812 - categorical_accuracy: 0.6932 - tp: 2358.0000 - fp: 344.0000 - tn: 27936.0000 - fn: 3298.0000 - precision: 0.8727 - recall: 0.4169 - auc: 0.9217 - prc: 0.7586 - val_loss: 0.1897 - val_categorical_accuracy: 0.6455 - val_tp: 631.0000 - val_fp: 45.0000 - val_tn: 28235.0000 - val_fn: 5025.0000 - val_precision: 0.9334 - val_recall: 0.1116 - val_auc: 0.8945 - val_prc: 0.6773
Epoch 57/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1942 - categorical_accuracy: 0.6664 - tp: 1999.0000 - fp: 375.0000 - tn: 27905.0000 - fn: 3657.0000 - precision: 0.8420 - recall: 0.3534 - auc: 0.9091 - prc: 0.7185 - val_loss: 0.1502 - val_categorical_accuracy: 0.6936 - val_tp: 2434.0000 - val_fp: 248.0000 - val_tn: 28032.0000 - val_fn: 3222.0000 - val_precision: 0.9075 - val_recall: 0.4303 - val_auc: 0.9319 - val_prc: 0.7902
Epoch 58/66
177/177 [==============================] - 20s 115ms/step - loss: 0.2035 - categorical_accuracy: 0.6612 - tp: 1927.0000 - fp: 352.0000 - tn: 27928.0000 - fn: 3729.0000 - precision: 0.8455 - recall: 0.3407 - auc: 0.9044 - prc: 0.7075 - val_loss: 0.1501 - val_categorical_accuracy: 0.7256 - val_tp: 2322.0000 - val_fp: 258.0000 - val_tn: 28022.0000 - val_fn: 3334.0000 - val_precision: 0.9000 - val_recall: 0.4105 - val_auc: 0.9387 - val_prc: 0.7994
Epoch 59/66
177/177 [==============================] - 20s 112ms/step - loss: 0.2113 - categorical_accuracy: 0.6353 - tp: 1888.0000 - fp: 370.0000 - tn: 27910.0000 - fn: 3768.0000 - precision: 0.8361 - recall: 0.3338 - auc: 0.8968 - prc: 0.6891 - val_loss: 0.1488 - val_categorical_accuracy: 0.7219 - val_tp: 1359.0000 - val_fp: 36.0000 - val_tn: 28244.0000 - val_fn: 4297.0000 - val_precision: 0.9742 - val_recall: 0.2403 - val_auc: 0.9409 - val_prc: 0.8098
Epoch 60/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1932 - categorical_accuracy: 0.6711 - tp: 2109.0000 - fp: 364.0000 - tn: 27916.0000 - fn: 3547.0000 - precision: 0.8528 - recall: 0.3729 - auc: 0.9114 - prc: 0.7271 - val_loss: 0.1527 - val_categorical_accuracy: 0.6908 - val_tp: 1327.0000 - val_fp: 33.0000 - val_tn: 28247.0000 - val_fn: 4329.0000 - val_precision: 0.9757 - val_recall: 0.2346 - val_auc: 0.9298 - val_prc: 0.7929
Epoch 61/66
177/177 [==============================] - 20s 115ms/step - loss: 0.1998 - categorical_accuracy: 0.6545 - tp: 1925.0000 - fp: 374.0000 - tn: 27906.0000 - fn: 3731.0000 - precision: 0.8373 - recall: 0.3403 - auc: 0.9028 - prc: 0.7051 - val_loss: 0.1959 - val_categorical_accuracy: 0.6867 - val_tp: 497.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 5159.0000 - val_precision: 0.9503 - val_recall: 0.0879 - val_auc: 0.9115 - val_prc: 0.7299
Epoch 62/66
177/177 [==============================] - 20s 113ms/step - loss: 0.2195 - categorical_accuracy: 0.6317 - tp: 1646.0000 - fp: 324.0000 - tn: 27956.0000 - fn: 4010.0000 - precision: 0.8355 - recall: 0.2910 - auc: 0.8908 - prc: 0.6738 - val_loss: 0.1691 - val_categorical_accuracy: 0.6650 - val_tp: 613.0000 - val_fp: 6.0000 - val_tn: 28274.0000 - val_fn: 5043.0000 - val_precision: 0.9903 - val_recall: 0.1084 - val_auc: 0.9201 - val_prc: 0.7641
Epoch 63/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1966 - categorical_accuracy: 0.6552 - tp: 1878.0000 - fp: 330.0000 - tn: 27950.0000 - fn: 3778.0000 - precision: 0.8505 - recall: 0.3320 - auc: 0.9028 - prc: 0.7065 - val_loss: 0.1570 - val_categorical_accuracy: 0.7251 - val_tp: 1005.0000 - val_fp: 8.0000 - val_tn: 28272.0000 - val_fn: 4651.0000 - val_precision: 0.9921 - val_recall: 0.1777 - val_auc: 0.9438 - val_prc: 0.8209
Epoch 64/66
177/177 [==============================] - 20s 114ms/step - loss: 0.1900 - categorical_accuracy: 0.6620 - tp: 2070.0000 - fp: 340.0000 - tn: 27940.0000 - fn: 3586.0000 - precision: 0.8589 - recall: 0.3660 - auc: 0.9096 - prc: 0.7262 - val_loss: 0.1566 - val_categorical_accuracy: 0.6729 - val_tp: 2379.0000 - val_fp: 301.0000 - val_tn: 27979.0000 - val_fn: 3277.0000 - val_precision: 0.8877 - val_recall: 0.4206 - val_auc: 0.9299 - val_prc: 0.7736
Epoch 65/66
177/177 [==============================] - 20s 113ms/step - loss: 0.1943 - categorical_accuracy: 0.6386 - tp: 1826.0000 - fp: 320.0000 - tn: 27960.0000 - fn: 3830.0000 - precision: 0.8509 - recall: 0.3228 - auc: 0.9014 - prc: 0.7023 - val_loss: 0.1665 - val_categorical_accuracy: 0.6232 - val_tp: 1104.0000 - val_fp: 34.0000 - val_tn: 28246.0000 - val_fn: 4552.0000 - val_precision: 0.9701 - val_recall: 0.1952 - val_auc: 0.9165 - val_prc: 0.7522
Epoch 66/66
177/177 [==============================] - 20s 113ms/step - loss: 0.2073 - categorical_accuracy: 0.6452 - tp: 1806.0000 - fp: 334.0000 - tn: 27946.0000 - fn: 3850.0000 - precision: 0.8439 - recall: 0.3193 - auc: 0.9008 - prc: 0.6948 - val_loss: 0.3102 - val_categorical_accuracy: 0.6655 - val_tp: 3416.0000 - val_fp: 1195.0000 - val_tn: 27085.0000 - val_fn: 2240.0000 - val_precision: 0.7408 - val_recall: 0.6040 - val_auc: 0.9135 - val_prc: 0.7513

Analyze large capacity model, with dropout training history

In [431]:
# %reload_ext tensorboard
# %tensorboard --logdir ./tensorboard/large_capacity_reg_model --bind_all

Screen Shot 2023-03-18 at 1.33.14 AM.png

Evaluate low capacity model, with dropout layer performance on TEST set

In [214]:
evaluate_model_performance(large_cap_model, ds_validation, ds_test)
177/177 [==============================] - 4s 22ms/step - loss: 0.3104 - categorical_accuracy: 0.6655 - tp: 3416.0000 - fp: 1195.0000 - tn: 27085.0000 - fn: 2240.0000 - precision: 0.7408 - recall: 0.6040 - auc: 0.9135 - prc: 0.7513
Validation AUC: 0.914
Validation PRC: 0.751
Validation categorical accuracy: 0.665
59/59 [==============================] - 1s 18ms/step - loss: 0.5354 - categorical_accuracy: 0.6133 - tp: 1052.0000 - fp: 514.0000 - tn: 8911.0000 - fn: 833.0000 - precision: 0.6718 - recall: 0.5581 - auc: 0.8726 - prc: 0.6295
Test AUC: 0.873
Test PRC: 0.630
Test categorical accuracy: 0.613
5.3.2 Evaluate dropout with varying capacity on TEST set¶
In [215]:
evaluate_model_performance(large_cap_model, ds_validation, ds_test)
177/177 [==============================] - 4s 23ms/step - loss: 0.3101 - categorical_accuracy: 0.6655 - tp: 3416.0000 - fp: 1195.0000 - tn: 27085.0000 - fn: 2240.0000 - precision: 0.7408 - recall: 0.6040 - auc: 0.9135 - prc: 0.7513
Validation AUC: 0.914
Validation PRC: 0.751
Validation categorical accuracy: 0.665
59/59 [==============================] - 1s 18ms/step - loss: 0.5354 - categorical_accuracy: 0.6133 - tp: 1052.0000 - fp: 514.0000 - tn: 8911.0000 - fn: 833.0000 - precision: 0.6718 - recall: 0.5581 - auc: 0.8726 - prc: 0.6295
Test AUC: 0.873
Test PRC: 0.630
Test categorical accuracy: 0.613
In [216]:
evaluate_model_performance(low_cap_model, ds_validation, ds_test)
177/177 [==============================] - 2s 11ms/step - loss: 0.0331 - categorical_accuracy: 0.9680 - tp: 4823.0000 - fp: 15.0000 - tn: 28265.0000 - fn: 833.0000 - precision: 0.9969 - recall: 0.8527 - auc: 0.9971 - prc: 0.9917
Validation AUC: 0.997
Validation PRC: 0.992
Validation categorical accuracy: 0.968
59/59 [==============================] - 0s 7ms/step - loss: 0.2347 - categorical_accuracy: 0.6027 - tp: 703.0000 - fp: 197.0000 - tn: 9228.0000 - fn: 1182.0000 - precision: 0.7811 - recall: 0.3729 - auc: 0.8805 - prc: 0.6439
Test AUC: 0.881
Test PRC: 0.644
Test categorical accuracy: 0.603
  • One thing is clear, both models with dropout perform better than the two baseline models without dropout
  • The low capacity model has higher variance of 0.348 between it's performance on validation and test set, but has higher PRC compared to the large capacity model
  • The large capacity model has lower variance of 0.121 between it's performance on validation and test set, but lower PRC compared to the low capacity model
  • The PRC difference between the two models on the test set is 0.014 which may not be very significant. Moving forward I will continue with low capacity model since it performs slightly better on other metrics as well
5.3.3 Adding Learning Rate Optimization¶
In [23]:
# For infos on LR scheduler please refer to https://www.kaggle.com/markwijkhuizen/tf-efficientnetb4-mixup-cutmix-gridmask-cv-0-90)
def lrfn(epoch, bs=BATCH_SIZE, epochs=66):

    LR_START = 1e-6
    LR_MAX = 2e-4
    LR_FINAL = 1e-6
    LR_RAMPUP_EPOCHS = 4
    LR_SUSTAIN_EPOCHS = 0
    DECAY_EPOCHS = epochs  - LR_RAMPUP_EPOCHS - LR_SUSTAIN_EPOCHS - 1
    LR_EXP_DECAY = (LR_FINAL / LR_MAX) ** (1 / (66 - LR_RAMPUP_EPOCHS - LR_SUSTAIN_EPOCHS - 1))

    if epoch < LR_RAMPUP_EPOCHS:
        lr = LR_START + (LR_MAX + LR_START) * (epoch / LR_RAMPUP_EPOCHS) ** 2.5
    elif epoch < LR_RAMPUP_EPOCHS + LR_SUSTAIN_EPOCHS:
        lr = LR_MAX
    else:
        epoch_diff = epoch - LR_RAMPUP_EPOCHS - LR_SUSTAIN_EPOCHS
        decay_factor = (epoch_diff / DECAY_EPOCHS) * math.pi
        decay_factor= (tf.math.cos(decay_factor).numpy() + 1) / 2        
        lr = LR_FINAL + (LR_MAX - LR_FINAL) * decay_factor

    return lr
In [228]:
modelname = "learning_rate_optimized"
# Add Learning rate scheduler callback
callbacks_list = [
    # keras.callbacks.EarlyStopping(monitor="val_prc", patience=5, restore_best_weights=True), # interrupts training when categorical accurcay has stopped improving for 5 epochs
    keras.callbacks.ModelCheckpoint(filepath=f"models/{modelname}.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/{modelname}"), # path where callback writes logs
    keras.callbacks.LearningRateScheduler(lambda epoch: lrfn(epoch), verbose=1) # Use learning rate opt function
]
# Get low capacity model with dropout
lr_optimized_model = get_model_with_dropout(modelname, low_capacity=True, dropout=True)
# Print model summary
lr_optimized_model.summary()
lr_optimized_model_history = lr_optimized_model.fit(
    ds_train,
    epochs=66,
    validation_data=ds_validation,
    callbacks=callbacks_list
)
Model: "learning_rate_optimized"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d_101 (Conv2D)         (None, 222, 222, 8)       224       
                                                                 
 max_pooling2d_75 (MaxPoolin  (None, 111, 111, 8)      0         
 g2D)                                                            
                                                                 
 conv2d_102 (Conv2D)         (None, 109, 109, 16)      1168      
                                                                 
 max_pooling2d_76 (MaxPoolin  (None, 54, 54, 16)       0         
 g2D)                                                            
                                                                 
 conv2d_103 (Conv2D)         (None, 52, 52, 32)        4640      
                                                                 
 dropout_20 (Dropout)        (None, 52, 52, 32)        0         
                                                                 
 flatten_26 (Flatten)        (None, 86528)             0         
                                                                 
 dropout_21 (Dropout)        (None, 86528)             0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 519174    
                                                                 
=================================================================
Total params: 525,206
Trainable params: 525,206
Non-trainable params: 0
_________________________________________________________________

Epoch 1: LearningRateScheduler setting learning rate to 1e-06.
Epoch 1/66
2023-03-02 05:26:14.056295: E tensorflow/core/grappler/optimizers/meta_optimizer.cc:954] layout failed: INVALID_ARGUMENT: Size of values 0 does not match size of permutation 4 @ fanin shape inlearning_rate_optimized/dropout_20/dropout/SelectV2-2-TransposeNHWCToNCHW-LayoutOptimizer
177/177 [==============================] - 8s 37ms/step - loss: 0.2938 - categorical_accuracy: 0.4771 - tp: 133.0000 - fp: 9.0000 - tn: 56551.0000 - fn: 11179.0000 - precision: 0.9366 - recall: 0.0118 - auc: 0.8035 - prc: 0.5181 - val_loss: 0.2546 - val_categorical_accuracy: 0.4699 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8057 - val_prc: 0.4394 - lr: 1.0000e-06

Epoch 2: LearningRateScheduler setting learning rate to 7.28125e-06.
Epoch 2/66
177/177 [==============================] - 6s 33ms/step - loss: 0.2523 - categorical_accuracy: 0.4284 - tp: 54.0000 - fp: 36.0000 - tn: 28244.0000 - fn: 5602.0000 - precision: 0.6000 - recall: 0.0095 - auc: 0.7887 - prc: 0.3961 - val_loss: 0.2342 - val_categorical_accuracy: 0.4703 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8133 - val_prc: 0.4593 - lr: 7.2812e-06

Epoch 3: LearningRateScheduler setting learning rate to 3.6532115754624014e-05.
Epoch 3/66
177/177 [==============================] - 6s 34ms/step - loss: 0.2390 - categorical_accuracy: 0.4620 - tp: 28.0000 - fp: 15.0000 - tn: 28265.0000 - fn: 5628.0000 - precision: 0.6512 - recall: 0.0050 - auc: 0.8045 - prc: 0.4304 - val_loss: 0.2255 - val_categorical_accuracy: 0.4889 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 28280.0000 - val_fn: 5656.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.8298 - val_prc: 0.5098 - lr: 3.6532e-05

Epoch 4: LearningRateScheduler setting learning rate to 9.89149972153781e-05.
Epoch 4/66
177/177 [==============================] - 6s 34ms/step - loss: 0.2253 - categorical_accuracy: 0.5136 - tp: 56.0000 - fp: 17.0000 - tn: 28263.0000 - fn: 5600.0000 - precision: 0.7671 - recall: 0.0099 - auc: 0.8288 - prc: 0.4978 - val_loss: 0.2121 - val_categorical_accuracy: 0.5313 - val_tp: 70.0000 - val_fp: 13.0000 - val_tn: 28267.0000 - val_fn: 5586.0000 - val_precision: 0.8434 - val_recall: 0.0124 - val_auc: 0.8534 - val_prc: 0.5574 - lr: 9.8915e-05

Epoch 5: LearningRateScheduler setting learning rate to 0.0002.
Epoch 5/66
177/177 [==============================] - 6s 34ms/step - loss: 0.2135 - categorical_accuracy: 0.5483 - tp: 213.0000 - fp: 86.0000 - tn: 28194.0000 - fn: 5443.0000 - precision: 0.7124 - recall: 0.0377 - auc: 0.8507 - prc: 0.5374 - val_loss: 0.2020 - val_categorical_accuracy: 0.5757 - val_tp: 618.0000 - val_fp: 144.0000 - val_tn: 28136.0000 - val_fn: 5038.0000 - val_precision: 0.8110 - val_recall: 0.1093 - val_auc: 0.8674 - val_prc: 0.5976 - lr: 2.0000e-04

Epoch 6: LearningRateScheduler setting learning rate to 0.00019986807242035866.
Epoch 6/66
177/177 [==============================] - 6s 33ms/step - loss: 0.2009 - categorical_accuracy: 0.5810 - tp: 487.0000 - fp: 116.0000 - tn: 28164.0000 - fn: 5169.0000 - precision: 0.8076 - recall: 0.0861 - auc: 0.8691 - prc: 0.5916 - val_loss: 0.1894 - val_categorical_accuracy: 0.6080 - val_tp: 333.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 5323.0000 - val_precision: 0.9148 - val_recall: 0.0589 - val_auc: 0.8853 - val_prc: 0.6477 - lr: 1.9987e-04

Epoch 7: LearningRateScheduler setting learning rate to 0.0001994726395905018.
Epoch 7/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1920 - categorical_accuracy: 0.6011 - tp: 647.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 5009.0000 - precision: 0.8169 - recall: 0.1144 - auc: 0.8822 - prc: 0.6243 - val_loss: 0.1804 - val_categorical_accuracy: 0.6057 - val_tp: 1110.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 4546.0000 - val_precision: 0.8838 - val_recall: 0.1963 - val_auc: 0.8975 - val_prc: 0.6731 - lr: 1.9947e-04

Epoch 8: LearningRateScheduler setting learning rate to 0.00019881474530696869.
Epoch 8/66
177/177 [==============================] - 9s 53ms/step - loss: 0.1855 - categorical_accuracy: 0.6149 - tp: 894.0000 - fp: 165.0000 - tn: 28115.0000 - fn: 4762.0000 - precision: 0.8442 - recall: 0.1581 - auc: 0.8908 - prc: 0.6478 - val_loss: 0.1851 - val_categorical_accuracy: 0.6609 - val_tp: 183.0000 - val_fp: 28.0000 - val_tn: 28252.0000 - val_fn: 5473.0000 - val_precision: 0.8673 - val_recall: 0.0324 - val_auc: 0.9035 - val_prc: 0.6797 - lr: 1.9881e-04

Epoch 9: LearningRateScheduler setting learning rate to 0.00019789613911509515.
Epoch 9/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1797 - categorical_accuracy: 0.6255 - tp: 1003.0000 - fp: 156.0000 - tn: 28124.0000 - fn: 4653.0000 - precision: 0.8654 - recall: 0.1773 - auc: 0.8978 - prc: 0.6698 - val_loss: 0.1695 - val_categorical_accuracy: 0.6634 - val_tp: 516.0000 - val_fp: 27.0000 - val_tn: 28253.0000 - val_fn: 5140.0000 - val_precision: 0.9503 - val_recall: 0.0912 - val_auc: 0.9159 - val_prc: 0.7279 - lr: 1.9790e-04

Epoch 10: LearningRateScheduler setting learning rate to 0.00019671925851702692.
Epoch 10/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1737 - categorical_accuracy: 0.6429 - tp: 1178.0000 - fp: 191.0000 - tn: 28089.0000 - fn: 4478.0000 - precision: 0.8605 - recall: 0.2083 - auc: 0.9056 - prc: 0.6882 - val_loss: 0.1620 - val_categorical_accuracy: 0.6777 - val_tp: 850.0000 - val_fp: 54.0000 - val_tn: 28226.0000 - val_fn: 4806.0000 - val_precision: 0.9403 - val_recall: 0.1503 - val_auc: 0.9188 - val_prc: 0.7454 - lr: 1.9672e-04

Epoch 11: LearningRateScheduler setting learning rate to 0.0001952872230410576.
Epoch 11/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1682 - categorical_accuracy: 0.6503 - tp: 1326.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 4330.0000 - precision: 0.8816 - recall: 0.2344 - auc: 0.9116 - prc: 0.7085 - val_loss: 0.1562 - val_categorical_accuracy: 0.6756 - val_tp: 1910.0000 - val_fp: 146.0000 - val_tn: 28134.0000 - val_fn: 3746.0000 - val_precision: 0.9290 - val_recall: 0.3377 - val_auc: 0.9206 - val_prc: 0.7646 - lr: 1.9529e-04

Epoch 12: LearningRateScheduler setting learning rate to 0.00019360382238030435.
Epoch 12/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1648 - categorical_accuracy: 0.6620 - tp: 1422.0000 - fp: 207.0000 - tn: 28073.0000 - fn: 4234.0000 - precision: 0.8729 - recall: 0.2514 - auc: 0.9157 - prc: 0.7197 - val_loss: 0.1502 - val_categorical_accuracy: 0.6964 - val_tp: 1257.0000 - val_fp: 57.0000 - val_tn: 28223.0000 - val_fn: 4399.0000 - val_precision: 0.9566 - val_recall: 0.2222 - val_auc: 0.9319 - val_prc: 0.7785 - lr: 1.9360e-04

Epoch 13: LearningRateScheduler setting learning rate to 0.0001916735341846943.
Epoch 13/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1596 - categorical_accuracy: 0.6697 - tp: 1527.0000 - fp: 190.0000 - tn: 28090.0000 - fn: 4129.0000 - precision: 0.8893 - recall: 0.2700 - auc: 0.9210 - prc: 0.7366 - val_loss: 0.1510 - val_categorical_accuracy: 0.6678 - val_tp: 2309.0000 - val_fp: 337.0000 - val_tn: 27943.0000 - val_fn: 3347.0000 - val_precision: 0.8726 - val_recall: 0.4082 - val_auc: 0.9306 - val_prc: 0.7711 - lr: 1.9167e-04

Epoch 14: LearningRateScheduler setting learning rate to 0.00018950146475434304.
Epoch 14/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1540 - categorical_accuracy: 0.6819 - tp: 1698.0000 - fp: 204.0000 - tn: 28076.0000 - fn: 3958.0000 - precision: 0.8927 - recall: 0.3002 - auc: 0.9268 - prc: 0.7524 - val_loss: 0.1441 - val_categorical_accuracy: 0.7558 - val_tp: 1000.0000 - val_fp: 18.0000 - val_tn: 28262.0000 - val_fn: 4656.0000 - val_precision: 0.9823 - val_recall: 0.1768 - val_auc: 0.9434 - val_prc: 0.8192 - lr: 1.8950e-04

Epoch 15: LearningRateScheduler setting learning rate to 0.00018709338462352755.
Epoch 15/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1502 - categorical_accuracy: 0.6945 - tp: 1748.0000 - fp: 200.0000 - tn: 28080.0000 - fn: 3908.0000 - precision: 0.8973 - recall: 0.3091 - auc: 0.9309 - prc: 0.7640 - val_loss: 0.1454 - val_categorical_accuracy: 0.6791 - val_tp: 2536.0000 - val_fp: 447.0000 - val_tn: 27833.0000 - val_fn: 3120.0000 - val_precision: 0.8502 - val_recall: 0.4484 - val_auc: 0.9402 - val_prc: 0.7872 - lr: 1.8709e-04

Epoch 16: LearningRateScheduler setting learning rate to 0.00018445566925406456.
Epoch 16/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1475 - categorical_accuracy: 0.6987 - tp: 1876.0000 - fp: 203.0000 - tn: 28077.0000 - fn: 3780.0000 - precision: 0.9024 - recall: 0.3317 - auc: 0.9339 - prc: 0.7712 - val_loss: 0.1291 - val_categorical_accuracy: 0.7760 - val_tp: 1545.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 4111.0000 - val_precision: 0.9810 - val_recall: 0.2732 - val_auc: 0.9553 - val_prc: 0.8497 - lr: 1.8446e-04

Epoch 17: LearningRateScheduler setting learning rate to 0.00018159532275795938.
Epoch 17/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1435 - categorical_accuracy: 0.7104 - tp: 1885.0000 - fp: 211.0000 - tn: 28069.0000 - fn: 3771.0000 - precision: 0.8993 - recall: 0.3333 - auc: 0.9377 - prc: 0.7818 - val_loss: 0.1242 - val_categorical_accuracy: 0.7664 - val_tp: 2070.0000 - val_fp: 65.0000 - val_tn: 28215.0000 - val_fn: 3586.0000 - val_precision: 0.9696 - val_recall: 0.3660 - val_auc: 0.9560 - val_prc: 0.8493 - lr: 1.8160e-04

Epoch 18: LearningRateScheduler setting learning rate to 0.00017851992452144625.
Epoch 18/66
177/177 [==============================] - 7s 42ms/step - loss: 0.1383 - categorical_accuracy: 0.7222 - tp: 2033.0000 - fp: 200.0000 - tn: 28080.0000 - fn: 3623.0000 - precision: 0.9104 - recall: 0.3594 - auc: 0.9420 - prc: 0.7976 - val_loss: 0.1258 - val_categorical_accuracy: 0.7946 - val_tp: 1558.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 4098.0000 - val_precision: 0.9842 - val_recall: 0.2755 - val_auc: 0.9603 - val_prc: 0.8630 - lr: 1.7852e-04

Epoch 19: LearningRateScheduler setting learning rate to 0.00017523762920498848.
Epoch 19/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1361 - categorical_accuracy: 0.7291 - tp: 2112.0000 - fp: 203.0000 - tn: 28077.0000 - fn: 3544.0000 - precision: 0.9123 - recall: 0.3734 - auc: 0.9444 - prc: 0.8028 - val_loss: 0.1238 - val_categorical_accuracy: 0.7519 - val_tp: 2517.0000 - val_fp: 149.0000 - val_tn: 28131.0000 - val_fn: 3139.0000 - val_precision: 0.9441 - val_recall: 0.4450 - val_auc: 0.9524 - val_prc: 0.8393 - lr: 1.7524e-04

Epoch 20: LearningRateScheduler setting learning rate to 0.0001717571430206299.
Epoch 20/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1334 - categorical_accuracy: 0.7348 - tp: 2211.0000 - fp: 210.0000 - tn: 28070.0000 - fn: 3445.0000 - precision: 0.9133 - recall: 0.3909 - auc: 0.9464 - prc: 0.8114 - val_loss: 0.1167 - val_categorical_accuracy: 0.8181 - val_tp: 1891.0000 - val_fp: 44.0000 - val_tn: 28236.0000 - val_fn: 3765.0000 - val_precision: 0.9773 - val_recall: 0.3343 - val_auc: 0.9658 - val_prc: 0.8807 - lr: 1.7176e-04

Epoch 21: LearningRateScheduler setting learning rate to 0.00016808769407868386.
Epoch 21/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1304 - categorical_accuracy: 0.7458 - tp: 2234.0000 - fp: 216.0000 - tn: 28064.0000 - fn: 3422.0000 - precision: 0.9118 - recall: 0.3950 - auc: 0.9495 - prc: 0.8175 - val_loss: 0.1219 - val_categorical_accuracy: 0.7298 - val_tp: 2928.0000 - val_fp: 300.0000 - val_tn: 27980.0000 - val_fn: 2728.0000 - val_precision: 0.9071 - val_recall: 0.5177 - val_auc: 0.9550 - val_prc: 0.8438 - lr: 1.6809e-04

Epoch 22: LearningRateScheduler setting learning rate to 0.00016423902052640916.
Epoch 22/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1259 - categorical_accuracy: 0.7574 - tp: 2379.0000 - fp: 203.0000 - tn: 28077.0000 - fn: 3277.0000 - precision: 0.9214 - recall: 0.4206 - auc: 0.9528 - prc: 0.8306 - val_loss: 0.1081 - val_categorical_accuracy: 0.7875 - val_tp: 2681.0000 - val_fp: 136.0000 - val_tn: 28144.0000 - val_fn: 2975.0000 - val_precision: 0.9517 - val_recall: 0.4740 - val_auc: 0.9665 - val_prc: 0.8764 - lr: 1.6424e-04

Epoch 23: LearningRateScheduler setting learning rate to 0.00016022132310271263.
Epoch 23/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1254 - categorical_accuracy: 0.7548 - tp: 2381.0000 - fp: 202.0000 - tn: 28078.0000 - fn: 3275.0000 - precision: 0.9218 - recall: 0.4210 - auc: 0.9533 - prc: 0.8312 - val_loss: 0.1075 - val_categorical_accuracy: 0.8149 - val_tp: 2418.0000 - val_fp: 76.0000 - val_tn: 28204.0000 - val_fn: 3238.0000 - val_precision: 0.9695 - val_recall: 0.4275 - val_auc: 0.9702 - val_prc: 0.8888 - lr: 1.6022e-04

Epoch 24: LearningRateScheduler setting learning rate to 0.0001560452473461628.
Epoch 24/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1210 - categorical_accuracy: 0.7668 - tp: 2533.0000 - fp: 208.0000 - tn: 28072.0000 - fn: 3123.0000 - precision: 0.9241 - recall: 0.4478 - auc: 0.9569 - prc: 0.8431 - val_loss: 0.1036 - val_categorical_accuracy: 0.8485 - val_tp: 2263.0000 - val_fp: 21.0000 - val_tn: 28259.0000 - val_fn: 3393.0000 - val_precision: 0.9908 - val_recall: 0.4001 - val_auc: 0.9742 - val_prc: 0.9120 - lr: 1.5605e-04

Epoch 25: LearningRateScheduler setting learning rate to 0.00015172188359498977.
Epoch 25/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1205 - categorical_accuracy: 0.7680 - tp: 2488.0000 - fp: 213.0000 - tn: 28067.0000 - fn: 3168.0000 - precision: 0.9211 - recall: 0.4399 - auc: 0.9575 - prc: 0.8435 - val_loss: 0.0987 - val_categorical_accuracy: 0.8506 - val_tp: 2549.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 3107.0000 - val_precision: 0.9884 - val_recall: 0.4507 - val_auc: 0.9751 - val_prc: 0.9165 - lr: 1.5172e-04

Epoch 26: LearningRateScheduler setting learning rate to 0.00014726268988847732.
Epoch 26/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1185 - categorical_accuracy: 0.7709 - tp: 2617.0000 - fp: 212.0000 - tn: 28068.0000 - fn: 3039.0000 - precision: 0.9251 - recall: 0.4627 - auc: 0.9584 - prc: 0.8494 - val_loss: 0.0969 - val_categorical_accuracy: 0.8573 - val_tp: 2564.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 3092.0000 - val_precision: 0.9903 - val_recall: 0.4533 - val_auc: 0.9764 - val_prc: 0.9200 - lr: 1.4726e-04

Epoch 27: LearningRateScheduler setting learning rate to 0.00014267948603630066.
Epoch 27/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1152 - categorical_accuracy: 0.7822 - tp: 2662.0000 - fp: 207.0000 - tn: 28073.0000 - fn: 2994.0000 - precision: 0.9278 - recall: 0.4707 - auc: 0.9609 - prc: 0.8580 - val_loss: 0.0977 - val_categorical_accuracy: 0.8089 - val_tp: 3056.0000 - val_fp: 151.0000 - val_tn: 28129.0000 - val_fn: 2600.0000 - val_precision: 0.9529 - val_recall: 0.5403 - val_auc: 0.9727 - val_prc: 0.8958 - lr: 1.4268e-04

Epoch 28: LearningRateScheduler setting learning rate to 0.00013798442989587783.
Epoch 28/66
177/177 [==============================] - 7s 42ms/step - loss: 0.1133 - categorical_accuracy: 0.7859 - tp: 2702.0000 - fp: 209.0000 - tn: 28071.0000 - fn: 2954.0000 - precision: 0.9282 - recall: 0.4777 - auc: 0.9625 - prc: 0.8615 - val_loss: 0.0906 - val_categorical_accuracy: 0.8338 - val_tp: 3232.0000 - val_fp: 126.0000 - val_tn: 28154.0000 - val_fn: 2424.0000 - val_precision: 0.9625 - val_recall: 0.5714 - val_auc: 0.9768 - val_prc: 0.9144 - lr: 1.3798e-04

Epoch 29: LearningRateScheduler setting learning rate to 0.0001331899728924036.
Epoch 29/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1134 - categorical_accuracy: 0.7802 - tp: 2818.0000 - fp: 215.0000 - tn: 28065.0000 - fn: 2838.0000 - precision: 0.9291 - recall: 0.4982 - auc: 0.9626 - prc: 0.8615 - val_loss: 0.0910 - val_categorical_accuracy: 0.8803 - val_tp: 2627.0000 - val_fp: 22.0000 - val_tn: 28258.0000 - val_fn: 3029.0000 - val_precision: 0.9917 - val_recall: 0.4645 - val_auc: 0.9823 - val_prc: 0.9372 - lr: 1.3319e-04

Epoch 30: LearningRateScheduler setting learning rate to 0.00012830882443487645.
Epoch 30/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1092 - categorical_accuracy: 0.7938 - tp: 2815.0000 - fp: 202.0000 - tn: 28078.0000 - fn: 2841.0000 - precision: 0.9330 - recall: 0.4977 - auc: 0.9655 - prc: 0.8708 - val_loss: 0.1129 - val_categorical_accuracy: 0.8115 - val_tp: 2017.0000 - val_fp: 28.0000 - val_tn: 28252.0000 - val_fn: 3639.0000 - val_precision: 0.9863 - val_recall: 0.3566 - val_auc: 0.9668 - val_prc: 0.8864 - lr: 1.2831e-04

Epoch 31: LearningRateScheduler setting learning rate to 0.00012335393412411214.
Epoch 31/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1058 - categorical_accuracy: 0.8006 - tp: 2935.0000 - fp: 194.0000 - tn: 28086.0000 - fn: 2721.0000 - precision: 0.9380 - recall: 0.5189 - auc: 0.9674 - prc: 0.8786 - val_loss: 0.0937 - val_categorical_accuracy: 0.8768 - val_tp: 2497.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 3159.0000 - val_precision: 0.9897 - val_recall: 0.4415 - val_auc: 0.9804 - val_prc: 0.9322 - lr: 1.2335e-04

Epoch 32: LearningRateScheduler setting learning rate to 0.0001183384413421154.
Epoch 32/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1040 - categorical_accuracy: 0.8048 - tp: 2963.0000 - fp: 198.0000 - tn: 28082.0000 - fn: 2693.0000 - precision: 0.9374 - recall: 0.5239 - auc: 0.9685 - prc: 0.8837 - val_loss: 0.0798 - val_categorical_accuracy: 0.8685 - val_tp: 3382.0000 - val_fp: 74.0000 - val_tn: 28206.0000 - val_fn: 2274.0000 - val_precision: 0.9786 - val_recall: 0.5979 - val_auc: 0.9831 - val_prc: 0.9379 - lr: 1.1834e-04

Epoch 33: LearningRateScheduler setting learning rate to 0.00011327564263343811.
Epoch 33/66
177/177 [==============================] - 6s 34ms/step - loss: 0.1028 - categorical_accuracy: 0.8037 - tp: 2963.0000 - fp: 220.0000 - tn: 28060.0000 - fn: 2693.0000 - precision: 0.9309 - recall: 0.5239 - auc: 0.9697 - prc: 0.8840 - val_loss: 0.0846 - val_categorical_accuracy: 0.8379 - val_tp: 3458.0000 - val_fp: 138.0000 - val_tn: 28142.0000 - val_fn: 2198.0000 - val_precision: 0.9616 - val_recall: 0.6114 - val_auc: 0.9793 - val_prc: 0.9229 - lr: 1.1328e-04

Epoch 34: LearningRateScheduler setting learning rate to 0.00010817895315587521.
Epoch 34/66
177/177 [==============================] - 6s 33ms/step - loss: 0.1001 - categorical_accuracy: 0.8126 - tp: 3045.0000 - fp: 204.0000 - tn: 28076.0000 - fn: 2611.0000 - precision: 0.9372 - recall: 0.5384 - auc: 0.9712 - prc: 0.8911 - val_loss: 0.0758 - val_categorical_accuracy: 0.8881 - val_tp: 3407.0000 - val_fp: 52.0000 - val_tn: 28228.0000 - val_fn: 2249.0000 - val_precision: 0.9850 - val_recall: 0.6024 - val_auc: 0.9858 - val_prc: 0.9476 - lr: 1.0818e-04

Epoch 35: LearningRateScheduler setting learning rate to 0.00010306191279646009.
Epoch 35/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0997 - categorical_accuracy: 0.8154 - tp: 3109.0000 - fp: 206.0000 - tn: 28074.0000 - fn: 2547.0000 - precision: 0.9379 - recall: 0.5497 - auc: 0.9712 - prc: 0.8919 - val_loss: 0.0741 - val_categorical_accuracy: 0.9093 - val_tp: 3305.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 2351.0000 - val_precision: 0.9922 - val_recall: 0.5843 - val_auc: 0.9882 - val_prc: 0.9587 - lr: 1.0306e-04

Epoch 36: LearningRateScheduler setting learning rate to 9.793807867821306e-05.
Epoch 36/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0974 - categorical_accuracy: 0.8177 - tp: 3147.0000 - fp: 207.0000 - tn: 28073.0000 - fn: 2509.0000 - precision: 0.9383 - recall: 0.5564 - auc: 0.9728 - prc: 0.8958 - val_loss: 0.0755 - val_categorical_accuracy: 0.9059 - val_tp: 3253.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 2403.0000 - val_precision: 0.9909 - val_recall: 0.5751 - val_auc: 0.9878 - val_prc: 0.9574 - lr: 9.7938e-05

Epoch 37: LearningRateScheduler setting learning rate to 9.282103794813156e-05.
Epoch 37/66
177/177 [==============================] - 8s 43ms/step - loss: 0.0944 - categorical_accuracy: 0.8262 - tp: 3234.0000 - fp: 199.0000 - tn: 28081.0000 - fn: 2422.0000 - precision: 0.9420 - recall: 0.5718 - auc: 0.9746 - prc: 0.9032 - val_loss: 0.0734 - val_categorical_accuracy: 0.8854 - val_tp: 3492.0000 - val_fp: 72.0000 - val_tn: 28208.0000 - val_fn: 2164.0000 - val_precision: 0.9798 - val_recall: 0.6174 - val_auc: 0.9864 - val_prc: 0.9486 - lr: 9.2821e-05

Epoch 38: LearningRateScheduler setting learning rate to 8.772436033189297e-05.
Epoch 38/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0941 - categorical_accuracy: 0.8280 - tp: 3287.0000 - fp: 168.0000 - tn: 28112.0000 - fn: 2369.0000 - precision: 0.9514 - recall: 0.5812 - auc: 0.9745 - prc: 0.9042 - val_loss: 0.0702 - val_categorical_accuracy: 0.9169 - val_tp: 3455.0000 - val_fp: 23.0000 - val_tn: 28257.0000 - val_fn: 2201.0000 - val_precision: 0.9934 - val_recall: 0.6109 - val_auc: 0.9896 - val_prc: 0.9637 - lr: 8.7724e-05

Epoch 39: LearningRateScheduler setting learning rate to 8.266156162321568e-05.
Epoch 39/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0904 - categorical_accuracy: 0.8365 - tp: 3329.0000 - fp: 184.0000 - tn: 28096.0000 - fn: 2327.0000 - precision: 0.9476 - recall: 0.5886 - auc: 0.9768 - prc: 0.9110 - val_loss: 0.0720 - val_categorical_accuracy: 0.8773 - val_tp: 3639.0000 - val_fp: 94.0000 - val_tn: 28186.0000 - val_fn: 2017.0000 - val_precision: 0.9748 - val_recall: 0.6434 - val_auc: 0.9860 - val_prc: 0.9466 - lr: 8.2662e-05

Epoch 40: LearningRateScheduler setting learning rate to 7.764606884121895e-05.
Epoch 40/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0914 - categorical_accuracy: 0.8301 - tp: 3317.0000 - fp: 207.0000 - tn: 28073.0000 - fn: 2339.0000 - precision: 0.9413 - recall: 0.5865 - auc: 0.9760 - prc: 0.9080 - val_loss: 0.0667 - val_categorical_accuracy: 0.9173 - val_tp: 3631.0000 - val_fp: 42.0000 - val_tn: 28238.0000 - val_fn: 2025.0000 - val_precision: 0.9886 - val_recall: 0.6420 - val_auc: 0.9903 - val_prc: 0.9648 - lr: 7.7646e-05

Epoch 41: LearningRateScheduler setting learning rate to 7.269117853045463e-05.
Epoch 41/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0902 - categorical_accuracy: 0.8389 - tp: 3352.0000 - fp: 205.0000 - tn: 28075.0000 - fn: 2304.0000 - precision: 0.9424 - recall: 0.5926 - auc: 0.9769 - prc: 0.9109 - val_loss: 0.0665 - val_categorical_accuracy: 0.9026 - val_tp: 3678.0000 - val_fp: 45.0000 - val_tn: 28235.0000 - val_fn: 1978.0000 - val_precision: 0.9879 - val_recall: 0.6503 - val_auc: 0.9897 - val_prc: 0.9613 - lr: 7.2691e-05

Epoch 42: LearningRateScheduler setting learning rate to 6.781003303825856e-05.
Epoch 42/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0871 - categorical_accuracy: 0.8377 - tp: 3430.0000 - fp: 174.0000 - tn: 28106.0000 - fn: 2226.0000 - precision: 0.9517 - recall: 0.6064 - auc: 0.9785 - prc: 0.9169 - val_loss: 0.0648 - val_categorical_accuracy: 0.9197 - val_tp: 3717.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 1939.0000 - val_precision: 0.9931 - val_recall: 0.6572 - val_auc: 0.9907 - val_prc: 0.9677 - lr: 6.7810e-05

Epoch 43: LearningRateScheduler setting learning rate to 6.301557603478432e-05.
Epoch 43/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0875 - categorical_accuracy: 0.8411 - tp: 3447.0000 - fp: 187.0000 - tn: 28093.0000 - fn: 2209.0000 - precision: 0.9485 - recall: 0.6094 - auc: 0.9784 - prc: 0.9165 - val_loss: 0.0635 - val_categorical_accuracy: 0.9075 - val_tp: 3784.0000 - val_fp: 51.0000 - val_tn: 28229.0000 - val_fn: 1872.0000 - val_precision: 0.9867 - val_recall: 0.6690 - val_auc: 0.9903 - val_prc: 0.9636 - lr: 6.3016e-05

Epoch 44: LearningRateScheduler setting learning rate to 5.83205198943615e-05.
Epoch 44/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0856 - categorical_accuracy: 0.8455 - tp: 3472.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 2184.0000 - precision: 0.9512 - recall: 0.6139 - auc: 0.9790 - prc: 0.9206 - val_loss: 0.0617 - val_categorical_accuracy: 0.9257 - val_tp: 3803.0000 - val_fp: 26.0000 - val_tn: 28254.0000 - val_fn: 1853.0000 - val_precision: 0.9932 - val_recall: 0.6724 - val_auc: 0.9920 - val_prc: 0.9719 - lr: 5.8321e-05

Epoch 45: LearningRateScheduler setting learning rate to 5.373731604218483e-05.
Epoch 45/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0847 - categorical_accuracy: 0.8457 - tp: 3523.0000 - fp: 190.0000 - tn: 28090.0000 - fn: 2133.0000 - precision: 0.9488 - recall: 0.6229 - auc: 0.9793 - prc: 0.9215 - val_loss: 0.0661 - val_categorical_accuracy: 0.9234 - val_tp: 3593.0000 - val_fp: 20.0000 - val_tn: 28260.0000 - val_fn: 2063.0000 - val_precision: 0.9945 - val_recall: 0.6353 - val_auc: 0.9908 - val_prc: 0.9689 - lr: 5.3737e-05

Epoch 46: LearningRateScheduler setting learning rate to 4.927811640501022e-05.
Epoch 46/66
177/177 [==============================] - 8s 43ms/step - loss: 0.0844 - categorical_accuracy: 0.8471 - tp: 3504.0000 - fp: 180.0000 - tn: 28100.0000 - fn: 2152.0000 - precision: 0.9511 - recall: 0.6195 - auc: 0.9795 - prc: 0.9221 - val_loss: 0.0619 - val_categorical_accuracy: 0.9256 - val_tp: 3806.0000 - val_fp: 32.0000 - val_tn: 28248.0000 - val_fn: 1850.0000 - val_precision: 0.9917 - val_recall: 0.6729 - val_auc: 0.9916 - val_prc: 0.9707 - lr: 4.9278e-05

Epoch 47: LearningRateScheduler setting learning rate to 4.49547526538372e-05.
Epoch 47/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0820 - categorical_accuracy: 0.8490 - tp: 3562.0000 - fp: 171.0000 - tn: 28109.0000 - fn: 2094.0000 - precision: 0.9542 - recall: 0.6298 - auc: 0.9810 - prc: 0.9266 - val_loss: 0.0605 - val_categorical_accuracy: 0.9144 - val_tp: 3900.0000 - val_fp: 48.0000 - val_tn: 28232.0000 - val_fn: 1756.0000 - val_precision: 0.9878 - val_recall: 0.6895 - val_auc: 0.9915 - val_prc: 0.9678 - lr: 4.4955e-05

Epoch 48: LearningRateScheduler setting learning rate to 4.077867689728737e-05.
Epoch 48/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0836 - categorical_accuracy: 0.8490 - tp: 3556.0000 - fp: 181.0000 - tn: 28099.0000 - fn: 2100.0000 - precision: 0.9516 - recall: 0.6287 - auc: 0.9803 - prc: 0.9232 - val_loss: 0.0608 - val_categorical_accuracy: 0.9372 - val_tp: 3750.0000 - val_fp: 15.0000 - val_tn: 28265.0000 - val_fn: 1906.0000 - val_precision: 0.9960 - val_recall: 0.6630 - val_auc: 0.9930 - val_prc: 0.9754 - lr: 4.0779e-05

Epoch 49: LearningRateScheduler setting learning rate to 3.676097947359085e-05.
Epoch 49/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0809 - categorical_accuracy: 0.8598 - tp: 3609.0000 - fp: 184.0000 - tn: 28096.0000 - fn: 2047.0000 - precision: 0.9515 - recall: 0.6381 - auc: 0.9818 - prc: 0.9284 - val_loss: 0.0582 - val_categorical_accuracy: 0.9275 - val_tp: 3941.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 1715.0000 - val_precision: 0.9922 - val_recall: 0.6968 - val_auc: 0.9927 - val_prc: 0.9738 - lr: 3.6761e-05

Epoch 50: LearningRateScheduler setting learning rate to 3.2912305921316145e-05.
Epoch 50/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0808 - categorical_accuracy: 0.8540 - tp: 3614.0000 - fp: 176.0000 - tn: 28104.0000 - fn: 2042.0000 - precision: 0.9536 - recall: 0.6390 - auc: 0.9817 - prc: 0.9290 - val_loss: 0.0620 - val_categorical_accuracy: 0.8952 - val_tp: 4008.0000 - val_fp: 90.0000 - val_tn: 28190.0000 - val_fn: 1648.0000 - val_precision: 0.9780 - val_recall: 0.7086 - val_auc: 0.9895 - val_prc: 0.9601 - lr: 3.2912e-05

Epoch 51: LearningRateScheduler setting learning rate to 2.9242862910032273e-05.
Epoch 51/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0814 - categorical_accuracy: 0.8541 - tp: 3607.0000 - fp: 182.0000 - tn: 28098.0000 - fn: 2049.0000 - precision: 0.9520 - recall: 0.6377 - auc: 0.9811 - prc: 0.9275 - val_loss: 0.0570 - val_categorical_accuracy: 0.9256 - val_tp: 3967.0000 - val_fp: 36.0000 - val_tn: 28244.0000 - val_fn: 1689.0000 - val_precision: 0.9910 - val_recall: 0.7014 - val_auc: 0.9927 - val_prc: 0.9728 - lr: 2.9243e-05

Epoch 52: LearningRateScheduler setting learning rate to 2.576237672567368e-05.
Epoch 52/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0797 - categorical_accuracy: 0.8580 - tp: 3660.0000 - fp: 181.0000 - tn: 28099.0000 - fn: 1996.0000 - precision: 0.9529 - recall: 0.6471 - auc: 0.9821 - prc: 0.9307 - val_loss: 0.0568 - val_categorical_accuracy: 0.9254 - val_tp: 4004.0000 - val_fp: 32.0000 - val_tn: 28248.0000 - val_fn: 1652.0000 - val_precision: 0.9921 - val_recall: 0.7079 - val_auc: 0.9927 - val_prc: 0.9732 - lr: 2.5762e-05

Epoch 53: LearningRateScheduler setting learning rate to 2.2480081409215928e-05.
Epoch 53/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0795 - categorical_accuracy: 0.8564 - tp: 3640.0000 - fp: 180.0000 - tn: 28100.0000 - fn: 2016.0000 - precision: 0.9529 - recall: 0.6436 - auc: 0.9821 - prc: 0.9310 - val_loss: 0.0576 - val_categorical_accuracy: 0.9167 - val_tp: 4015.0000 - val_fp: 54.0000 - val_tn: 28226.0000 - val_fn: 1641.0000 - val_precision: 0.9867 - val_recall: 0.7099 - val_auc: 0.9919 - val_prc: 0.9691 - lr: 2.2480e-05

Epoch 54: LearningRateScheduler setting learning rate to 1.9404683172702792e-05.
Epoch 54/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0781 - categorical_accuracy: 0.8591 - tp: 3679.0000 - fp: 179.0000 - tn: 28101.0000 - fn: 1977.0000 - precision: 0.9536 - recall: 0.6505 - auc: 0.9830 - prc: 0.9336 - val_loss: 0.0553 - val_categorical_accuracy: 0.9321 - val_tp: 4008.0000 - val_fp: 25.0000 - val_tn: 28255.0000 - val_fn: 1648.0000 - val_precision: 0.9938 - val_recall: 0.7086 - val_auc: 0.9936 - val_prc: 0.9769 - lr: 1.9405e-05

Epoch 55: LearningRateScheduler setting learning rate to 1.6544336676597598e-05.
Epoch 55/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0774 - categorical_accuracy: 0.8621 - tp: 3699.0000 - fp: 181.0000 - tn: 28099.0000 - fn: 1957.0000 - precision: 0.9534 - recall: 0.6540 - auc: 0.9831 - prc: 0.9343 - val_loss: 0.0557 - val_categorical_accuracy: 0.9371 - val_tp: 4038.0000 - val_fp: 16.0000 - val_tn: 28264.0000 - val_fn: 1618.0000 - val_precision: 0.9961 - val_recall: 0.7139 - val_auc: 0.9935 - val_prc: 0.9778 - lr: 1.6544e-05

Epoch 56: LearningRateScheduler setting learning rate to 1.390662130713463e-05.
Epoch 56/66
177/177 [==============================] - 7s 42ms/step - loss: 0.0791 - categorical_accuracy: 0.8573 - tp: 3656.0000 - fp: 179.0000 - tn: 28101.0000 - fn: 2000.0000 - precision: 0.9533 - recall: 0.6464 - auc: 0.9823 - prc: 0.9314 - val_loss: 0.0554 - val_categorical_accuracy: 0.9257 - val_tp: 4050.0000 - val_fp: 38.0000 - val_tn: 28242.0000 - val_fn: 1606.0000 - val_precision: 0.9907 - val_recall: 0.7161 - val_auc: 0.9930 - val_prc: 0.9737 - lr: 1.3907e-05

Epoch 57: LearningRateScheduler setting learning rate to 1.1498535245656969e-05.
Epoch 57/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0769 - categorical_accuracy: 0.8653 - tp: 3706.0000 - fp: 181.0000 - tn: 28099.0000 - fn: 1950.0000 - precision: 0.9534 - recall: 0.6552 - auc: 0.9833 - prc: 0.9355 - val_loss: 0.0548 - val_categorical_accuracy: 0.9356 - val_tp: 4049.0000 - val_fp: 20.0000 - val_tn: 28260.0000 - val_fn: 1607.0000 - val_precision: 0.9951 - val_recall: 0.7159 - val_auc: 0.9937 - val_prc: 0.9779 - lr: 1.1499e-05

Epoch 58: LearningRateScheduler setting learning rate to 9.326471745967866e-06.
Epoch 58/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0760 - categorical_accuracy: 0.8651 - tp: 3737.0000 - fp: 167.0000 - tn: 28113.0000 - fn: 1919.0000 - precision: 0.9572 - recall: 0.6607 - auc: 0.9838 - prc: 0.9370 - val_loss: 0.0546 - val_categorical_accuracy: 0.9379 - val_tp: 4045.0000 - val_fp: 22.0000 - val_tn: 28258.0000 - val_fn: 1611.0000 - val_precision: 0.9946 - val_recall: 0.7152 - val_auc: 0.9938 - val_prc: 0.9784 - lr: 9.3265e-06

Epoch 59: LearningRateScheduler setting learning rate to 7.396177619695664e-06.
Epoch 59/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0761 - categorical_accuracy: 0.8709 - tp: 3738.0000 - fp: 168.0000 - tn: 28112.0000 - fn: 1918.0000 - precision: 0.9570 - recall: 0.6609 - auc: 0.9836 - prc: 0.9370 - val_loss: 0.0547 - val_categorical_accuracy: 0.9277 - val_tp: 4090.0000 - val_fp: 37.0000 - val_tn: 28243.0000 - val_fn: 1566.0000 - val_precision: 0.9910 - val_recall: 0.7231 - val_auc: 0.9931 - val_prc: 0.9743 - lr: 7.3962e-06

Epoch 60: LearningRateScheduler setting learning rate to 5.712782889604569e-06.
Epoch 60/66
177/177 [==============================] - 6s 33ms/step - loss: 0.0796 - categorical_accuracy: 0.8612 - tp: 3737.0000 - fp: 191.0000 - tn: 28089.0000 - fn: 1919.0000 - precision: 0.9514 - recall: 0.6607 - auc: 0.9820 - prc: 0.9308 - val_loss: 0.0542 - val_categorical_accuracy: 0.9307 - val_tp: 4079.0000 - val_fp: 37.0000 - val_tn: 28243.0000 - val_fn: 1577.0000 - val_precision: 0.9910 - val_recall: 0.7212 - val_auc: 0.9935 - val_prc: 0.9760 - lr: 5.7128e-06

Epoch 61: LearningRateScheduler setting learning rate to 4.280741482973099e-06.
Epoch 61/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0761 - categorical_accuracy: 0.8644 - tp: 3738.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 1918.0000 - precision: 0.9597 - recall: 0.6609 - auc: 0.9836 - prc: 0.9368 - val_loss: 0.0539 - val_categorical_accuracy: 0.9333 - val_tp: 4101.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 1555.0000 - val_precision: 0.9925 - val_recall: 0.7251 - val_auc: 0.9937 - val_prc: 0.9770 - lr: 4.2807e-06

Epoch 62: LearningRateScheduler setting learning rate to 3.103854954242706e-06.
Epoch 62/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0747 - categorical_accuracy: 0.8678 - tp: 3742.0000 - fp: 165.0000 - tn: 28115.0000 - fn: 1914.0000 - precision: 0.9578 - recall: 0.6616 - auc: 0.9844 - prc: 0.9393 - val_loss: 0.0537 - val_categorical_accuracy: 0.9353 - val_tp: 4088.0000 - val_fp: 28.0000 - val_tn: 28252.0000 - val_fn: 1568.0000 - val_precision: 0.9932 - val_recall: 0.7228 - val_auc: 0.9939 - val_prc: 0.9779 - lr: 3.1039e-06

Epoch 63: LearningRateScheduler setting learning rate to 2.1852546930313114e-06.
Epoch 63/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0750 - categorical_accuracy: 0.8688 - tp: 3759.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 1897.0000 - precision: 0.9577 - recall: 0.6646 - auc: 0.9844 - prc: 0.9388 - val_loss: 0.0536 - val_categorical_accuracy: 0.9346 - val_tp: 4101.0000 - val_fp: 31.0000 - val_tn: 28249.0000 - val_fn: 1555.0000 - val_precision: 0.9925 - val_recall: 0.7251 - val_auc: 0.9938 - val_prc: 0.9774 - lr: 2.1853e-06

Epoch 64: LearningRateScheduler setting learning rate to 1.5273604094982149e-06.
Epoch 64/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0749 - categorical_accuracy: 0.8725 - tp: 3760.0000 - fp: 161.0000 - tn: 28119.0000 - fn: 1896.0000 - precision: 0.9589 - recall: 0.6648 - auc: 0.9845 - prc: 0.9391 - val_loss: 0.0536 - val_categorical_accuracy: 0.9332 - val_tp: 4103.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 1553.0000 - val_precision: 0.9927 - val_recall: 0.7254 - val_auc: 0.9937 - val_prc: 0.9772 - lr: 1.5274e-06

Epoch 65: LearningRateScheduler setting learning rate to 1.1319275796413421e-06.
Epoch 65/66
177/177 [==============================] - 8s 43ms/step - loss: 0.0735 - categorical_accuracy: 0.8678 - tp: 3792.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 1864.0000 - precision: 0.9664 - recall: 0.6704 - auc: 0.9848 - prc: 0.9420 - val_loss: 0.0536 - val_categorical_accuracy: 0.9332 - val_tp: 4103.0000 - val_fp: 30.0000 - val_tn: 28250.0000 - val_fn: 1553.0000 - val_precision: 0.9927 - val_recall: 0.7254 - val_auc: 0.9937 - val_prc: 0.9773 - lr: 1.1319e-06

Epoch 66: LearningRateScheduler setting learning rate to 1e-06.
Epoch 66/66
177/177 [==============================] - 6s 34ms/step - loss: 0.0743 - categorical_accuracy: 0.8725 - tp: 3784.0000 - fp: 163.0000 - tn: 28117.0000 - fn: 1872.0000 - precision: 0.9587 - recall: 0.6690 - auc: 0.9846 - prc: 0.9399 - val_loss: 0.0534 - val_categorical_accuracy: 0.9355 - val_tp: 4093.0000 - val_fp: 27.0000 - val_tn: 28253.0000 - val_fn: 1563.0000 - val_precision: 0.9934 - val_recall: 0.7237 - val_auc: 0.9940 - val_prc: 0.9785 - lr: 1.0000e-06
5.3.4 Evaluate learning rate optimization on TEST set¶
In [231]:
evaluate_model_performance(lr_optimized_model, ds_validation, ds_test)
177/177 [==============================] - 2s 11ms/step - loss: 0.0534 - categorical_accuracy: 0.9355 - tp: 4093.0000 - fp: 27.0000 - tn: 28253.0000 - fn: 1563.0000 - precision: 0.9934 - recall: 0.7237 - auc: 0.9940 - prc: 0.9785
Validation AUC: 0.994
Validation PRC: 0.979
Validation categorical accuracy: 0.935
59/59 [==============================] - 0s 7ms/step - loss: 0.2105 - categorical_accuracy: 0.6403 - tp: 835.0000 - fp: 256.0000 - tn: 9169.0000 - fn: 1050.0000 - precision: 0.7654 - recall: 0.4430 - auc: 0.9030 - prc: 0.6800
Test AUC: 0.903
Test PRC: 0.680
Test categorical accuracy: 0.640

The takeaway here is that learning rate optimization improved the model performance slightly.

5.3.5 Adding Data Augmentation¶
In [34]:
# Fresh data
(train_ds, val_ds, test_ds), info = tfds.load('cassava', 
                                         split=['train', 'validation', 'test'],
                                         shuffle_files=True,
                                         as_supervised=True,
                                         with_info=True)
augmented_train = prepare(train_ds, shuffle=True, augment=True)
val_ds = prepare(val_ds)
test_ds = prepare(test_ds)
In [15]:
def data_augmentations(image,label):
    
    image = tf.image.random_flip_left_right(image)
    image = tf.image.random_flip_up_down(image)
    
    if tf.random.uniform([], 0, 1.0, dtype = tf.float32) > 0.75:
        image = tf.image.transpose(image)
    
    probablity_rotation = tf.random.uniform([], 0, 1.0, dtype = tf.float32)
    if probablity_rotation > 0.75:
        image = tf.image.rot90(image, k = 3)
    elif probablity_rotation > 0.5:
        image = tf.image.rot90(image, k = 2)
    elif probablity_rotation > 0.25:
        image = tf.image.rot90(image, k = 1)
        
    if tf.random.uniform([], 0, 1.0, dtype = tf.float32) >= 0.4:
        image = tf.image.random_saturation(image, lower = 0.8, upper = 1.2)
    if tf.random.uniform([], 0, 1.0, dtype = tf.float32) >= 0.4:
        image = tf.image.random_contrast(image, lower = 0.8, upper = 1.2)
    if tf.random.uniform([], 0, 1.0, dtype = tf.float32) >= 0.4:
        image = tf.image.random_brightness(image, max_delta = 0.1)
    
#     probability_cropping = tf.random.uniform([], 0, 1.0, dtype = tf.float32)
#     if probability_cropping > 0.7:
#         if probability_cropping > 0.9:
#             image = tf.image.central_crop(image, central_fraction = 0.7)
#         elif probability_cropping > 0.8:
#             image = tf.image.central_crop(image, central_fraction = 0.8)
#         else:
#             image = tf.image.central_crop(image, central_fraction = 0.9)
#     elif probability_cropping > 0.5:
#         image = tf.image.central_crop(image, central_fraction = 0.6)
#         # image = tf.image.random_flip_up_down(image)
    
    return image, label
In [16]:
# re-write data preprocessing function
def prepare(ds, shuffle=False, augment=False, img_size=(224,224)):
    
    # Resize and rescale all datasets.
    ds = ds.map(data_preprocessing, 
              num_parallel_calls=tf.data.AUTOTUNE)

    if shuffle:
        ds = ds.cache()
        ds = ds.shuffle(tf.data.experimental.cardinality(ds).numpy())

    # Batch all datasets.
    ds = ds.batch(BATCH_SIZE)

    # Use data augmentation only on the training set.
    if augment:
        ds = ds.map(data_augmentations, 
                    num_parallel_calls=tf.data.AUTOTUNE)
    
    
    # Use buffered prefetching on all datasets.
    return ds.prefetch(buffer_size=tf.data.AUTOTUNE)
In [31]:
modelname = "data_augmented_model"
# Add Learning rate scheduler callback
callbacks_list = [
    # keras.callbacks.EarlyStopping(monitor="val_prc", patience=5, restore_best_weights=True), # interrupts training when categorical accurcay has stopped improving for 5 epochs
    keras.callbacks.ModelCheckpoint(filepath=f"models/{modelname}.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/{modelname}"), # path where callback writes logs
    # keras.callbacks.LearningRateScheduler(lambda epoch: lrfn(epoch), verbose=1) # Use learning rate opt function
]
# Get low capacity model with dropout
data_augmented_model = get_model_with_dropout(modelname, low_capacity=True, dropout=True)
# Print model summary
data_augmented_model.summary()
data_augmented_model_history = data_augmented_model.fit(
    augmented_train,
    epochs=66,
    validation_data=val_ds,
    callbacks=callbacks_list
)
Model: "data_augmented_model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 conv2d (Conv2D)             (None, 222, 222, 8)       224       
                                                                 
 max_pooling2d (MaxPooling2D  (None, 111, 111, 8)      0         
 )                                                               
                                                                 
 conv2d_1 (Conv2D)           (None, 109, 109, 16)      1168      
                                                                 
 max_pooling2d_1 (MaxPooling  (None, 54, 54, 16)       0         
 2D)                                                             
                                                                 
 conv2d_2 (Conv2D)           (None, 52, 52, 32)        4640      
                                                                 
 dropout (Dropout)           (None, 52, 52, 32)        0         
                                                                 
 flatten (Flatten)           (None, 86528)             0         
                                                                 
 dropout_1 (Dropout)         (None, 86528)             0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 519174    
                                                                 
=================================================================
Total params: 525,206
Trainable params: 525,206
Non-trainable params: 0
_________________________________________________________________
Epoch 1/66
2023-03-02 06:25:52.266826: E tensorflow/core/grappler/optimizers/meta_optimizer.cc:954] layout failed: INVALID_ARGUMENT: Size of values 0 does not match size of permutation 4 @ fanin shape indata_augmented_model/dropout/dropout/SelectV2-2-TransposeNHWCToNCHW-LayoutOptimizer
2023-03-02 06:26:02.959764: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:428] Loaded cuDNN version 8200
2023-03-02 06:26:07.825110: I tensorflow/compiler/xla/service/service.cc:173] XLA service 0x7f739007d060 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2023-03-02 06:26:07.825160: I tensorflow/compiler/xla/service/service.cc:181]   StreamExecutor device (0): Tesla T4, Compute Capability 7.5
2023-03-02 06:26:07.887403: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
2023-03-02 06:26:08.766327: I tensorflow/compiler/jit/xla_compilation_cache.cc:477] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
177/177 [==============================] - 28s 44ms/step - loss: 0.2353 - categorical_accuracy: 0.4981 - tp: 139.0000 - fp: 80.0000 - tn: 28200.0000 - fn: 5517.0000 - precision: 0.6347 - recall: 0.0246 - auc: 0.8172 - prc: 0.4694 - val_loss: 0.2199 - val_categorical_accuracy: 0.5580 - val_tp: 431.0000 - val_fp: 165.0000 - val_tn: 9280.0000 - val_fn: 1458.0000 - val_precision: 0.7232 - val_recall: 0.2282 - val_auc: 0.8565 - val_prc: 0.5656
Epoch 2/66
177/177 [==============================] - 7s 36ms/step - loss: 0.2151 - categorical_accuracy: 0.5629 - tp: 277.0000 - fp: 107.0000 - tn: 28173.0000 - fn: 5379.0000 - precision: 0.7214 - recall: 0.0490 - auc: 0.8507 - prc: 0.5427 - val_loss: 0.2062 - val_categorical_accuracy: 0.5781 - val_tp: 188.0000 - val_fp: 48.0000 - val_tn: 9397.0000 - val_fn: 1701.0000 - val_precision: 0.7966 - val_recall: 0.0995 - val_auc: 0.8626 - val_prc: 0.5816
Epoch 3/66
177/177 [==============================] - 7s 37ms/step - loss: 0.2101 - categorical_accuracy: 0.5758 - tp: 377.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5279.0000 - precision: 0.7278 - recall: 0.0667 - auc: 0.8579 - prc: 0.5592 - val_loss: 0.2032 - val_categorical_accuracy: 0.5887 - val_tp: 48.0000 - val_fp: 13.0000 - val_tn: 9432.0000 - val_fn: 1841.0000 - val_precision: 0.7869 - val_recall: 0.0254 - val_auc: 0.8659 - val_prc: 0.5815
Epoch 4/66
177/177 [==============================] - 6s 36ms/step - loss: 0.2098 - categorical_accuracy: 0.5812 - tp: 437.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 5219.0000 - precision: 0.7394 - recall: 0.0773 - auc: 0.8586 - prc: 0.5606 - val_loss: 0.2027 - val_categorical_accuracy: 0.5844 - val_tp: 53.0000 - val_fp: 19.0000 - val_tn: 9426.0000 - val_fn: 1836.0000 - val_precision: 0.7361 - val_recall: 0.0281 - val_auc: 0.8673 - val_prc: 0.5828
Epoch 5/66
177/177 [==============================] - 6s 36ms/step - loss: 0.2084 - categorical_accuracy: 0.5838 - tp: 412.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5244.0000 - precision: 0.7318 - recall: 0.0728 - auc: 0.8606 - prc: 0.5662 - val_loss: 0.2015 - val_categorical_accuracy: 0.5876 - val_tp: 75.0000 - val_fp: 28.0000 - val_tn: 9417.0000 - val_fn: 1814.0000 - val_precision: 0.7282 - val_recall: 0.0397 - val_auc: 0.8690 - val_prc: 0.5824
Epoch 6/66
177/177 [==============================] - 7s 38ms/step - loss: 0.2087 - categorical_accuracy: 0.5879 - tp: 480.0000 - fp: 171.0000 - tn: 28109.0000 - fn: 5176.0000 - precision: 0.7373 - recall: 0.0849 - auc: 0.8615 - prc: 0.5671 - val_loss: 0.2103 - val_categorical_accuracy: 0.5797 - val_tp: 8.0000 - val_fp: 3.0000 - val_tn: 9442.0000 - val_fn: 1881.0000 - val_precision: 0.7273 - val_recall: 0.0042 - val_auc: 0.8598 - val_prc: 0.5676
Epoch 7/66
177/177 [==============================] - 7s 37ms/step - loss: 0.2064 - categorical_accuracy: 0.5873 - tp: 532.0000 - fp: 180.0000 - tn: 28100.0000 - fn: 5124.0000 - precision: 0.7472 - recall: 0.0941 - auc: 0.8641 - prc: 0.5770 - val_loss: 0.2029 - val_categorical_accuracy: 0.5823 - val_tp: 88.0000 - val_fp: 22.0000 - val_tn: 9423.0000 - val_fn: 1801.0000 - val_precision: 0.8000 - val_recall: 0.0466 - val_auc: 0.8690 - val_prc: 0.5925
Epoch 8/66
177/177 [==============================] - 9s 50ms/step - loss: 0.2044 - categorical_accuracy: 0.5902 - tp: 597.0000 - fp: 168.0000 - tn: 28112.0000 - fn: 5059.0000 - precision: 0.7804 - recall: 0.1056 - auc: 0.8662 - prc: 0.5887 - val_loss: 0.2037 - val_categorical_accuracy: 0.6014 - val_tp: 13.0000 - val_fp: 2.0000 - val_tn: 9443.0000 - val_fn: 1876.0000 - val_precision: 0.8667 - val_recall: 0.0069 - val_auc: 0.8676 - val_prc: 0.6003
Epoch 9/66
177/177 [==============================] - 6s 36ms/step - loss: 0.2042 - categorical_accuracy: 0.5987 - tp: 605.0000 - fp: 182.0000 - tn: 28098.0000 - fn: 5051.0000 - precision: 0.7687 - recall: 0.1070 - auc: 0.8683 - prc: 0.5839 - val_loss: 0.1994 - val_categorical_accuracy: 0.6003 - val_tp: 155.0000 - val_fp: 38.0000 - val_tn: 9407.0000 - val_fn: 1734.0000 - val_precision: 0.8031 - val_recall: 0.0821 - val_auc: 0.8695 - val_prc: 0.6074
Epoch 10/66
177/177 [==============================] - 6s 36ms/step - loss: 0.2027 - categorical_accuracy: 0.5999 - tp: 604.0000 - fp: 197.0000 - tn: 28083.0000 - fn: 5052.0000 - precision: 0.7541 - recall: 0.1068 - auc: 0.8697 - prc: 0.5902 - val_loss: 0.1984 - val_categorical_accuracy: 0.5982 - val_tp: 447.0000 - val_fp: 124.0000 - val_tn: 9321.0000 - val_fn: 1442.0000 - val_precision: 0.7828 - val_recall: 0.2366 - val_auc: 0.8777 - val_prc: 0.6124
Epoch 11/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1997 - categorical_accuracy: 0.6040 - tp: 681.0000 - fp: 187.0000 - tn: 28093.0000 - fn: 4975.0000 - precision: 0.7846 - recall: 0.1204 - auc: 0.8741 - prc: 0.6022 - val_loss: 0.1921 - val_categorical_accuracy: 0.6151 - val_tp: 139.0000 - val_fp: 28.0000 - val_tn: 9417.0000 - val_fn: 1750.0000 - val_precision: 0.8323 - val_recall: 0.0736 - val_auc: 0.8818 - val_prc: 0.6262
Epoch 12/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1988 - categorical_accuracy: 0.6025 - tp: 743.0000 - fp: 189.0000 - tn: 28091.0000 - fn: 4913.0000 - precision: 0.7972 - recall: 0.1314 - auc: 0.8748 - prc: 0.6074 - val_loss: 0.1949 - val_categorical_accuracy: 0.6167 - val_tp: 476.0000 - val_fp: 134.0000 - val_tn: 9311.0000 - val_fn: 1413.0000 - val_precision: 0.7803 - val_recall: 0.2520 - val_auc: 0.8822 - val_prc: 0.6282
Epoch 13/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1977 - categorical_accuracy: 0.5983 - tp: 721.0000 - fp: 220.0000 - tn: 28060.0000 - fn: 4935.0000 - precision: 0.7662 - recall: 0.1275 - auc: 0.8779 - prc: 0.6070 - val_loss: 0.1921 - val_categorical_accuracy: 0.6173 - val_tp: 156.0000 - val_fp: 34.0000 - val_tn: 9411.0000 - val_fn: 1733.0000 - val_precision: 0.8211 - val_recall: 0.0826 - val_auc: 0.8821 - val_prc: 0.6235
Epoch 14/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1976 - categorical_accuracy: 0.6022 - tp: 779.0000 - fp: 205.0000 - tn: 28075.0000 - fn: 4877.0000 - precision: 0.7917 - recall: 0.1377 - auc: 0.8774 - prc: 0.6127 - val_loss: 0.2037 - val_categorical_accuracy: 0.6019 - val_tp: 352.0000 - val_fp: 115.0000 - val_tn: 9330.0000 - val_fn: 1537.0000 - val_precision: 0.7537 - val_recall: 0.1863 - val_auc: 0.8717 - val_prc: 0.5999
Epoch 15/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1998 - categorical_accuracy: 0.6071 - tp: 743.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4913.0000 - precision: 0.7676 - recall: 0.1314 - auc: 0.8756 - prc: 0.6042 - val_loss: 0.1986 - val_categorical_accuracy: 0.6130 - val_tp: 119.0000 - val_fp: 29.0000 - val_tn: 9416.0000 - val_fn: 1770.0000 - val_precision: 0.8041 - val_recall: 0.0630 - val_auc: 0.8790 - val_prc: 0.6082
Epoch 16/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1982 - categorical_accuracy: 0.6093 - tp: 801.0000 - fp: 238.0000 - tn: 28042.0000 - fn: 4855.0000 - precision: 0.7709 - recall: 0.1416 - auc: 0.8775 - prc: 0.6072 - val_loss: 0.1949 - val_categorical_accuracy: 0.6130 - val_tp: 58.0000 - val_fp: 16.0000 - val_tn: 9429.0000 - val_fn: 1831.0000 - val_precision: 0.7838 - val_recall: 0.0307 - val_auc: 0.8821 - val_prc: 0.6220
Epoch 17/66
177/177 [==============================] - 9s 49ms/step - loss: 0.1973 - categorical_accuracy: 0.6117 - tp: 755.0000 - fp: 218.0000 - tn: 28062.0000 - fn: 4901.0000 - precision: 0.7760 - recall: 0.1335 - auc: 0.8786 - prc: 0.6101 - val_loss: 0.1968 - val_categorical_accuracy: 0.6210 - val_tp: 63.0000 - val_fp: 22.0000 - val_tn: 9423.0000 - val_fn: 1826.0000 - val_precision: 0.7412 - val_recall: 0.0334 - val_auc: 0.8826 - val_prc: 0.6012
Epoch 18/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1965 - categorical_accuracy: 0.6132 - tp: 825.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 4831.0000 - precision: 0.7725 - recall: 0.1459 - auc: 0.8804 - prc: 0.6144 - val_loss: 0.1892 - val_categorical_accuracy: 0.6236 - val_tp: 135.0000 - val_fp: 29.0000 - val_tn: 9416.0000 - val_fn: 1754.0000 - val_precision: 0.8232 - val_recall: 0.0715 - val_auc: 0.8873 - val_prc: 0.6327
Epoch 19/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1965 - categorical_accuracy: 0.6112 - tp: 822.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 4834.0000 - precision: 0.7777 - recall: 0.1453 - auc: 0.8808 - prc: 0.6126 - val_loss: 0.1881 - val_categorical_accuracy: 0.6289 - val_tp: 148.0000 - val_fp: 35.0000 - val_tn: 9410.0000 - val_fn: 1741.0000 - val_precision: 0.8087 - val_recall: 0.0783 - val_auc: 0.8885 - val_prc: 0.6409
Epoch 20/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1947 - categorical_accuracy: 0.6064 - tp: 899.0000 - fp: 233.0000 - tn: 28047.0000 - fn: 4757.0000 - precision: 0.7942 - recall: 0.1589 - auc: 0.8827 - prc: 0.6229 - val_loss: 0.1916 - val_categorical_accuracy: 0.6278 - val_tp: 142.0000 - val_fp: 41.0000 - val_tn: 9404.0000 - val_fn: 1747.0000 - val_precision: 0.7760 - val_recall: 0.0752 - val_auc: 0.8859 - val_prc: 0.6230
Epoch 21/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1938 - categorical_accuracy: 0.6091 - tp: 922.0000 - fp: 221.0000 - tn: 28059.0000 - fn: 4734.0000 - precision: 0.8066 - recall: 0.1630 - auc: 0.8829 - prc: 0.6280 - val_loss: 0.1900 - val_categorical_accuracy: 0.6226 - val_tp: 62.0000 - val_fp: 14.0000 - val_tn: 9431.0000 - val_fn: 1827.0000 - val_precision: 0.8158 - val_recall: 0.0328 - val_auc: 0.8901 - val_prc: 0.6422
Epoch 22/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1950 - categorical_accuracy: 0.6169 - tp: 886.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 4770.0000 - precision: 0.7904 - recall: 0.1566 - auc: 0.8840 - prc: 0.6223 - val_loss: 0.1883 - val_categorical_accuracy: 0.6183 - val_tp: 376.0000 - val_fp: 88.0000 - val_tn: 9357.0000 - val_fn: 1513.0000 - val_precision: 0.8103 - val_recall: 0.1990 - val_auc: 0.8893 - val_prc: 0.6428
Epoch 23/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1954 - categorical_accuracy: 0.6146 - tp: 877.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 4779.0000 - precision: 0.7830 - recall: 0.1551 - auc: 0.8832 - prc: 0.6202 - val_loss: 0.1980 - val_categorical_accuracy: 0.6162 - val_tp: 162.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 1727.0000 - val_precision: 0.7013 - val_recall: 0.0858 - val_auc: 0.8799 - val_prc: 0.5959
Epoch 24/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1932 - categorical_accuracy: 0.6163 - tp: 928.0000 - fp: 218.0000 - tn: 28062.0000 - fn: 4728.0000 - precision: 0.8098 - recall: 0.1641 - auc: 0.8858 - prc: 0.6312 - val_loss: 0.1894 - val_categorical_accuracy: 0.6236 - val_tp: 227.0000 - val_fp: 39.0000 - val_tn: 9406.0000 - val_fn: 1662.0000 - val_precision: 0.8534 - val_recall: 0.1202 - val_auc: 0.8875 - val_prc: 0.6452
Epoch 25/66
177/177 [==============================] - 10s 54ms/step - loss: 0.1941 - categorical_accuracy: 0.6114 - tp: 931.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 4725.0000 - precision: 0.7985 - recall: 0.1646 - auc: 0.8843 - prc: 0.6282 - val_loss: 0.1889 - val_categorical_accuracy: 0.6321 - val_tp: 90.0000 - val_fp: 22.0000 - val_tn: 9423.0000 - val_fn: 1799.0000 - val_precision: 0.8036 - val_recall: 0.0476 - val_auc: 0.8895 - val_prc: 0.6401
Epoch 26/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1953 - categorical_accuracy: 0.6144 - tp: 874.0000 - fp: 247.0000 - tn: 28033.0000 - fn: 4782.0000 - precision: 0.7797 - recall: 0.1545 - auc: 0.8835 - prc: 0.6215 - val_loss: 0.1882 - val_categorical_accuracy: 0.6289 - val_tp: 129.0000 - val_fp: 27.0000 - val_tn: 9418.0000 - val_fn: 1760.0000 - val_precision: 0.8269 - val_recall: 0.0683 - val_auc: 0.8895 - val_prc: 0.6420
Epoch 27/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1913 - categorical_accuracy: 0.6156 - tp: 955.0000 - fp: 217.0000 - tn: 28063.0000 - fn: 4701.0000 - precision: 0.8148 - recall: 0.1688 - auc: 0.8864 - prc: 0.6370 - val_loss: 0.1872 - val_categorical_accuracy: 0.6347 - val_tp: 172.0000 - val_fp: 37.0000 - val_tn: 9408.0000 - val_fn: 1717.0000 - val_precision: 0.8230 - val_recall: 0.0911 - val_auc: 0.8905 - val_prc: 0.6455
Epoch 28/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1928 - categorical_accuracy: 0.6158 - tp: 964.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 4692.0000 - precision: 0.8060 - recall: 0.1704 - auc: 0.8860 - prc: 0.6336 - val_loss: 0.1958 - val_categorical_accuracy: 0.6183 - val_tp: 69.0000 - val_fp: 16.0000 - val_tn: 9429.0000 - val_fn: 1820.0000 - val_precision: 0.8118 - val_recall: 0.0365 - val_auc: 0.8841 - val_prc: 0.6190
Epoch 29/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1945 - categorical_accuracy: 0.6197 - tp: 896.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 4760.0000 - precision: 0.7950 - recall: 0.1584 - auc: 0.8841 - prc: 0.6273 - val_loss: 0.1908 - val_categorical_accuracy: 0.6273 - val_tp: 208.0000 - val_fp: 59.0000 - val_tn: 9386.0000 - val_fn: 1681.0000 - val_precision: 0.7790 - val_recall: 0.1101 - val_auc: 0.8872 - val_prc: 0.6286
Epoch 30/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1928 - categorical_accuracy: 0.6128 - tp: 951.0000 - fp: 221.0000 - tn: 28059.0000 - fn: 4705.0000 - precision: 0.8114 - recall: 0.1681 - auc: 0.8858 - prc: 0.6313 - val_loss: 0.2109 - val_categorical_accuracy: 0.5823 - val_tp: 112.0000 - val_fp: 54.0000 - val_tn: 9391.0000 - val_fn: 1777.0000 - val_precision: 0.6747 - val_recall: 0.0593 - val_auc: 0.8612 - val_prc: 0.5507
Epoch 31/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1919 - categorical_accuracy: 0.6223 - tp: 899.0000 - fp: 246.0000 - tn: 28034.0000 - fn: 4757.0000 - precision: 0.7852 - recall: 0.1589 - auc: 0.8864 - prc: 0.6338 - val_loss: 0.1869 - val_categorical_accuracy: 0.6125 - val_tp: 391.0000 - val_fp: 73.0000 - val_tn: 9372.0000 - val_fn: 1498.0000 - val_precision: 0.8427 - val_recall: 0.2070 - val_auc: 0.8910 - val_prc: 0.6542
Epoch 32/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1896 - categorical_accuracy: 0.6225 - tp: 970.0000 - fp: 224.0000 - tn: 28056.0000 - fn: 4686.0000 - precision: 0.8124 - recall: 0.1715 - auc: 0.8895 - prc: 0.6434 - val_loss: 0.1853 - val_categorical_accuracy: 0.6321 - val_tp: 161.0000 - val_fp: 32.0000 - val_tn: 9413.0000 - val_fn: 1728.0000 - val_precision: 0.8342 - val_recall: 0.0852 - val_auc: 0.8938 - val_prc: 0.6508
Epoch 33/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1907 - categorical_accuracy: 0.6176 - tp: 985.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4671.0000 - precision: 0.8140 - recall: 0.1742 - auc: 0.8889 - prc: 0.6389 - val_loss: 0.1854 - val_categorical_accuracy: 0.6316 - val_tp: 443.0000 - val_fp: 95.0000 - val_tn: 9350.0000 - val_fn: 1446.0000 - val_precision: 0.8234 - val_recall: 0.2345 - val_auc: 0.8921 - val_prc: 0.6633
Epoch 34/66
177/177 [==============================] - 9s 50ms/step - loss: 0.1915 - categorical_accuracy: 0.6147 - tp: 1016.0000 - fp: 229.0000 - tn: 28051.0000 - fn: 4640.0000 - precision: 0.8161 - recall: 0.1796 - auc: 0.8876 - prc: 0.6390 - val_loss: 0.1867 - val_categorical_accuracy: 0.6427 - val_tp: 130.0000 - val_fp: 33.0000 - val_tn: 9412.0000 - val_fn: 1759.0000 - val_precision: 0.7975 - val_recall: 0.0688 - val_auc: 0.8951 - val_prc: 0.6447
Epoch 35/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1889 - categorical_accuracy: 0.6209 - tp: 1026.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4630.0000 - precision: 0.8201 - recall: 0.1814 - auc: 0.8905 - prc: 0.6473 - val_loss: 0.1870 - val_categorical_accuracy: 0.6294 - val_tp: 346.0000 - val_fp: 78.0000 - val_tn: 9367.0000 - val_fn: 1543.0000 - val_precision: 0.8160 - val_recall: 0.1832 - val_auc: 0.8914 - val_prc: 0.6481
Epoch 36/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1898 - categorical_accuracy: 0.6163 - tp: 1023.0000 - fp: 242.0000 - tn: 28038.0000 - fn: 4633.0000 - precision: 0.8087 - recall: 0.1809 - auc: 0.8895 - prc: 0.6438 - val_loss: 0.1839 - val_categorical_accuracy: 0.6300 - val_tp: 385.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 1504.0000 - val_precision: 0.8370 - val_recall: 0.2038 - val_auc: 0.8943 - val_prc: 0.6628
Epoch 37/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1902 - categorical_accuracy: 0.6250 - tp: 1018.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 4638.0000 - precision: 0.8073 - recall: 0.1800 - auc: 0.8885 - prc: 0.6446 - val_loss: 0.1928 - val_categorical_accuracy: 0.6252 - val_tp: 86.0000 - val_fp: 27.0000 - val_tn: 9418.0000 - val_fn: 1803.0000 - val_precision: 0.7611 - val_recall: 0.0455 - val_auc: 0.8868 - val_prc: 0.6229
Epoch 38/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1879 - categorical_accuracy: 0.6264 - tp: 1031.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 4625.0000 - precision: 0.8163 - recall: 0.1823 - auc: 0.8910 - prc: 0.6508 - val_loss: 0.1833 - val_categorical_accuracy: 0.6353 - val_tp: 180.0000 - val_fp: 36.0000 - val_tn: 9409.0000 - val_fn: 1709.0000 - val_precision: 0.8333 - val_recall: 0.0953 - val_auc: 0.8968 - val_prc: 0.6570
Epoch 39/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1897 - categorical_accuracy: 0.6169 - tp: 1041.0000 - fp: 249.0000 - tn: 28031.0000 - fn: 4615.0000 - precision: 0.8070 - recall: 0.1841 - auc: 0.8900 - prc: 0.6457 - val_loss: 0.1840 - val_categorical_accuracy: 0.6331 - val_tp: 304.0000 - val_fp: 60.0000 - val_tn: 9385.0000 - val_fn: 1585.0000 - val_precision: 0.8352 - val_recall: 0.1609 - val_auc: 0.8950 - val_prc: 0.6579
Epoch 40/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1883 - categorical_accuracy: 0.6335 - tp: 1118.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 4538.0000 - precision: 0.8288 - recall: 0.1977 - auc: 0.8928 - prc: 0.6549 - val_loss: 0.1801 - val_categorical_accuracy: 0.6368 - val_tp: 391.0000 - val_fp: 55.0000 - val_tn: 9390.0000 - val_fn: 1498.0000 - val_precision: 0.8767 - val_recall: 0.2070 - val_auc: 0.8982 - val_prc: 0.6809
Epoch 41/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1888 - categorical_accuracy: 0.6231 - tp: 1063.0000 - fp: 241.0000 - tn: 28039.0000 - fn: 4593.0000 - precision: 0.8152 - recall: 0.1879 - auc: 0.8907 - prc: 0.6506 - val_loss: 0.1820 - val_categorical_accuracy: 0.6278 - val_tp: 332.0000 - val_fp: 48.0000 - val_tn: 9397.0000 - val_fn: 1557.0000 - val_precision: 0.8737 - val_recall: 0.1758 - val_auc: 0.8976 - val_prc: 0.6757
Epoch 42/66
177/177 [==============================] - 7s 40ms/step - loss: 0.1886 - categorical_accuracy: 0.6178 - tp: 1091.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 4565.0000 - precision: 0.8178 - recall: 0.1929 - auc: 0.8918 - prc: 0.6508 - val_loss: 0.1831 - val_categorical_accuracy: 0.6432 - val_tp: 240.0000 - val_fp: 44.0000 - val_tn: 9401.0000 - val_fn: 1649.0000 - val_precision: 0.8451 - val_recall: 0.1271 - val_auc: 0.8959 - val_prc: 0.6590
Epoch 43/66
177/177 [==============================] - 8s 45ms/step - loss: 0.1873 - categorical_accuracy: 0.6232 - tp: 1099.0000 - fp: 231.0000 - tn: 28049.0000 - fn: 4557.0000 - precision: 0.8263 - recall: 0.1943 - auc: 0.8932 - prc: 0.6552 - val_loss: 0.1873 - val_categorical_accuracy: 0.6300 - val_tp: 497.0000 - val_fp: 112.0000 - val_tn: 9333.0000 - val_fn: 1392.0000 - val_precision: 0.8161 - val_recall: 0.2631 - val_auc: 0.8920 - val_prc: 0.6616
Epoch 44/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1900 - categorical_accuracy: 0.6211 - tp: 1096.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 4560.0000 - precision: 0.8179 - recall: 0.1938 - auc: 0.8905 - prc: 0.6513 - val_loss: 0.1829 - val_categorical_accuracy: 0.6379 - val_tp: 484.0000 - val_fp: 94.0000 - val_tn: 9351.0000 - val_fn: 1405.0000 - val_precision: 0.8374 - val_recall: 0.2562 - val_auc: 0.8985 - val_prc: 0.6775
Epoch 45/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1866 - categorical_accuracy: 0.6218 - tp: 1123.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 4533.0000 - precision: 0.8288 - recall: 0.1986 - auc: 0.8930 - prc: 0.6613 - val_loss: 0.1800 - val_categorical_accuracy: 0.6400 - val_tp: 409.0000 - val_fp: 82.0000 - val_tn: 9363.0000 - val_fn: 1480.0000 - val_precision: 0.8330 - val_recall: 0.2165 - val_auc: 0.8989 - val_prc: 0.6771
Epoch 46/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1877 - categorical_accuracy: 0.6197 - tp: 1087.0000 - fp: 262.0000 - tn: 28018.0000 - fn: 4569.0000 - precision: 0.8058 - recall: 0.1922 - auc: 0.8937 - prc: 0.6546 - val_loss: 0.1803 - val_categorical_accuracy: 0.6416 - val_tp: 381.0000 - val_fp: 67.0000 - val_tn: 9378.0000 - val_fn: 1508.0000 - val_precision: 0.8504 - val_recall: 0.2017 - val_auc: 0.8995 - val_prc: 0.6761
Epoch 47/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1880 - categorical_accuracy: 0.6223 - tp: 1088.0000 - fp: 220.0000 - tn: 28060.0000 - fn: 4568.0000 - precision: 0.8318 - recall: 0.1924 - auc: 0.8921 - prc: 0.6564 - val_loss: 0.1821 - val_categorical_accuracy: 0.6501 - val_tp: 224.0000 - val_fp: 48.0000 - val_tn: 9397.0000 - val_fn: 1665.0000 - val_precision: 0.8235 - val_recall: 0.1186 - val_auc: 0.8973 - val_prc: 0.6666
Epoch 48/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1869 - categorical_accuracy: 0.6204 - tp: 1182.0000 - fp: 233.0000 - tn: 28047.0000 - fn: 4474.0000 - precision: 0.8353 - recall: 0.2090 - auc: 0.8939 - prc: 0.6608 - val_loss: 0.1768 - val_categorical_accuracy: 0.6496 - val_tp: 336.0000 - val_fp: 45.0000 - val_tn: 9400.0000 - val_fn: 1553.0000 - val_precision: 0.8819 - val_recall: 0.1779 - val_auc: 0.9026 - val_prc: 0.6908
Epoch 49/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1870 - categorical_accuracy: 0.6174 - tp: 1134.0000 - fp: 223.0000 - tn: 28057.0000 - fn: 4522.0000 - precision: 0.8357 - recall: 0.2005 - auc: 0.8943 - prc: 0.6587 - val_loss: 0.1821 - val_categorical_accuracy: 0.6342 - val_tp: 339.0000 - val_fp: 41.0000 - val_tn: 9404.0000 - val_fn: 1550.0000 - val_precision: 0.8921 - val_recall: 0.1795 - val_auc: 0.8976 - val_prc: 0.6786
Epoch 50/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1877 - categorical_accuracy: 0.6215 - tp: 1127.0000 - fp: 226.0000 - tn: 28054.0000 - fn: 4529.0000 - precision: 0.8330 - recall: 0.1993 - auc: 0.8916 - prc: 0.6601 - val_loss: 0.1845 - val_categorical_accuracy: 0.6384 - val_tp: 503.0000 - val_fp: 109.0000 - val_tn: 9336.0000 - val_fn: 1386.0000 - val_precision: 0.8219 - val_recall: 0.2663 - val_auc: 0.8957 - val_prc: 0.6725
Epoch 51/66
177/177 [==============================] - 9s 51ms/step - loss: 0.1856 - categorical_accuracy: 0.6236 - tp: 1207.0000 - fp: 217.0000 - tn: 28063.0000 - fn: 4449.0000 - precision: 0.8476 - recall: 0.2134 - auc: 0.8946 - prc: 0.6655 - val_loss: 0.1765 - val_categorical_accuracy: 0.6464 - val_tp: 406.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 1483.0000 - val_precision: 0.8547 - val_recall: 0.2149 - val_auc: 0.9033 - val_prc: 0.6934
Epoch 52/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1859 - categorical_accuracy: 0.6278 - tp: 1205.0000 - fp: 220.0000 - tn: 28060.0000 - fn: 4451.0000 - precision: 0.8456 - recall: 0.2130 - auc: 0.8950 - prc: 0.6689 - val_loss: 0.1843 - val_categorical_accuracy: 0.6342 - val_tp: 334.0000 - val_fp: 53.0000 - val_tn: 9392.0000 - val_fn: 1555.0000 - val_precision: 0.8630 - val_recall: 0.1768 - val_auc: 0.8944 - val_prc: 0.6704
Epoch 53/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1873 - categorical_accuracy: 0.6204 - tp: 1141.0000 - fp: 209.0000 - tn: 28071.0000 - fn: 4515.0000 - precision: 0.8452 - recall: 0.2017 - auc: 0.8942 - prc: 0.6633 - val_loss: 0.1785 - val_categorical_accuracy: 0.6458 - val_tp: 399.0000 - val_fp: 73.0000 - val_tn: 9372.0000 - val_fn: 1490.0000 - val_precision: 0.8453 - val_recall: 0.2112 - val_auc: 0.9010 - val_prc: 0.6806
Epoch 54/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1853 - categorical_accuracy: 0.6312 - tp: 1214.0000 - fp: 243.0000 - tn: 28037.0000 - fn: 4442.0000 - precision: 0.8332 - recall: 0.2146 - auc: 0.8956 - prc: 0.6682 - val_loss: 0.1785 - val_categorical_accuracy: 0.6506 - val_tp: 235.0000 - val_fp: 22.0000 - val_tn: 9423.0000 - val_fn: 1654.0000 - val_precision: 0.9144 - val_recall: 0.1244 - val_auc: 0.9029 - val_prc: 0.6949
Epoch 55/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1867 - categorical_accuracy: 0.6248 - tp: 1173.0000 - fp: 218.0000 - tn: 28062.0000 - fn: 4483.0000 - precision: 0.8433 - recall: 0.2074 - auc: 0.8936 - prc: 0.6635 - val_loss: 0.1839 - val_categorical_accuracy: 0.6337 - val_tp: 493.0000 - val_fp: 96.0000 - val_tn: 9349.0000 - val_fn: 1396.0000 - val_precision: 0.8370 - val_recall: 0.2610 - val_auc: 0.8953 - val_prc: 0.6765
Epoch 56/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1869 - categorical_accuracy: 0.6243 - tp: 1158.0000 - fp: 235.0000 - tn: 28045.0000 - fn: 4498.0000 - precision: 0.8313 - recall: 0.2047 - auc: 0.8926 - prc: 0.6646 - val_loss: 0.1836 - val_categorical_accuracy: 0.6337 - val_tp: 688.0000 - val_fp: 152.0000 - val_tn: 9293.0000 - val_fn: 1201.0000 - val_precision: 0.8190 - val_recall: 0.3642 - val_auc: 0.9023 - val_prc: 0.6962
Epoch 57/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1866 - categorical_accuracy: 0.6236 - tp: 1205.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4451.0000 - precision: 0.8427 - recall: 0.2130 - auc: 0.8932 - prc: 0.6651 - val_loss: 0.1772 - val_categorical_accuracy: 0.6432 - val_tp: 448.0000 - val_fp: 74.0000 - val_tn: 9371.0000 - val_fn: 1441.0000 - val_precision: 0.8582 - val_recall: 0.2372 - val_auc: 0.9032 - val_prc: 0.6981
Epoch 58/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1863 - categorical_accuracy: 0.6200 - tp: 1183.0000 - fp: 218.0000 - tn: 28062.0000 - fn: 4473.0000 - precision: 0.8444 - recall: 0.2092 - auc: 0.8947 - prc: 0.6662 - val_loss: 0.1825 - val_categorical_accuracy: 0.6432 - val_tp: 488.0000 - val_fp: 84.0000 - val_tn: 9361.0000 - val_fn: 1401.0000 - val_precision: 0.8531 - val_recall: 0.2583 - val_auc: 0.8966 - val_prc: 0.6773
Epoch 59/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1886 - categorical_accuracy: 0.6294 - tp: 1181.0000 - fp: 227.0000 - tn: 28053.0000 - fn: 4475.0000 - precision: 0.8388 - recall: 0.2088 - auc: 0.8920 - prc: 0.6641 - val_loss: 0.1899 - val_categorical_accuracy: 0.6151 - val_tp: 219.0000 - val_fp: 40.0000 - val_tn: 9405.0000 - val_fn: 1670.0000 - val_precision: 0.8456 - val_recall: 0.1159 - val_auc: 0.8880 - val_prc: 0.6430
Epoch 60/66
177/177 [==============================] - 9s 51ms/step - loss: 0.1888 - categorical_accuracy: 0.6273 - tp: 1154.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4502.0000 - precision: 0.8368 - recall: 0.2040 - auc: 0.8915 - prc: 0.6629 - val_loss: 0.1794 - val_categorical_accuracy: 0.6416 - val_tp: 321.0000 - val_fp: 35.0000 - val_tn: 9410.0000 - val_fn: 1568.0000 - val_precision: 0.9017 - val_recall: 0.1699 - val_auc: 0.8987 - val_prc: 0.6919
Epoch 61/66
177/177 [==============================] - 7s 37ms/step - loss: 0.1863 - categorical_accuracy: 0.6300 - tp: 1224.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 4432.0000 - precision: 0.8407 - recall: 0.2164 - auc: 0.8944 - prc: 0.6696 - val_loss: 0.2052 - val_categorical_accuracy: 0.6173 - val_tp: 698.0000 - val_fp: 183.0000 - val_tn: 9262.0000 - val_fn: 1191.0000 - val_precision: 0.7923 - val_recall: 0.3695 - val_auc: 0.8922 - val_prc: 0.6622
Epoch 62/66
177/177 [==============================] - 6s 36ms/step - loss: 0.1864 - categorical_accuracy: 0.6262 - tp: 1220.0000 - fp: 232.0000 - tn: 28048.0000 - fn: 4436.0000 - precision: 0.8402 - recall: 0.2157 - auc: 0.8947 - prc: 0.6701 - val_loss: 0.1824 - val_categorical_accuracy: 0.6326 - val_tp: 369.0000 - val_fp: 57.0000 - val_tn: 9388.0000 - val_fn: 1520.0000 - val_precision: 0.8662 - val_recall: 0.1953 - val_auc: 0.8950 - val_prc: 0.6771
Epoch 63/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1869 - categorical_accuracy: 0.6292 - tp: 1215.0000 - fp: 230.0000 - tn: 28050.0000 - fn: 4441.0000 - precision: 0.8408 - recall: 0.2148 - auc: 0.8937 - prc: 0.6673 - val_loss: 0.1877 - val_categorical_accuracy: 0.6226 - val_tp: 599.0000 - val_fp: 120.0000 - val_tn: 9325.0000 - val_fn: 1290.0000 - val_precision: 0.8331 - val_recall: 0.3171 - val_auc: 0.8946 - val_prc: 0.6774
Epoch 64/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1858 - categorical_accuracy: 0.6278 - tp: 1245.0000 - fp: 210.0000 - tn: 28070.0000 - fn: 4411.0000 - precision: 0.8557 - recall: 0.2201 - auc: 0.8950 - prc: 0.6726 - val_loss: 0.1800 - val_categorical_accuracy: 0.6411 - val_tp: 256.0000 - val_fp: 36.0000 - val_tn: 9409.0000 - val_fn: 1633.0000 - val_precision: 0.8767 - val_recall: 0.1355 - val_auc: 0.8992 - val_prc: 0.6832
Epoch 65/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1887 - categorical_accuracy: 0.6183 - tp: 1220.0000 - fp: 238.0000 - tn: 28042.0000 - fn: 4436.0000 - precision: 0.8368 - recall: 0.2157 - auc: 0.8926 - prc: 0.6610 - val_loss: 0.1870 - val_categorical_accuracy: 0.6533 - val_tp: 123.0000 - val_fp: 21.0000 - val_tn: 9424.0000 - val_fn: 1766.0000 - val_precision: 0.8542 - val_recall: 0.0651 - val_auc: 0.8950 - val_prc: 0.6650
Epoch 66/66
177/177 [==============================] - 7s 36ms/step - loss: 0.1917 - categorical_accuracy: 0.6248 - tp: 1120.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 4536.0000 - precision: 0.8327 - recall: 0.1980 - auc: 0.8910 - prc: 0.6577 - val_loss: 0.1995 - val_categorical_accuracy: 0.6268 - val_tp: 702.0000 - val_fp: 178.0000 - val_tn: 9267.0000 - val_fn: 1187.0000 - val_precision: 0.7977 - val_recall: 0.3716 - val_auc: 0.8936 - val_prc: 0.6771
In [433]:
%reload_ext tensorboard
%tensorboard --logdir ./tensorboard/data_augmented_model --bind_all

Screen Shot 2023-03-18 at 1.35.36 AM.png

5.3.6 Evaluate data augmentation on TEST set¶
In [38]:
evaluate_model_performance(data_augmented_model, val_ds, test_ds)
60/60 [==============================] - 2s 28ms/step - loss: 0.1993 - categorical_accuracy: 0.6268 - tp: 702.0000 - fp: 178.0000 - tn: 9267.0000 - fn: 1187.0000 - precision: 0.7977 - recall: 0.3716 - auc: 0.8936 - prc: 0.6771
Validation AUC: 0.894
Validation PRC: 0.677
Validation categorical accuracy: 0.627
59/59 [==============================] - 2s 37ms/step - loss: 0.1916 - categorical_accuracy: 0.6159 - tp: 684.0000 - fp: 142.0000 - tn: 9283.0000 - fn: 1201.0000 - precision: 0.8281 - recall: 0.3629 - auc: 0.9009 - prc: 0.6931
Test AUC: 0.901
Test PRC: 0.693
Test categorical accuracy: 0.616

Key takeways from this experiement is that data augmentation improves generalization on unseen test data compared to previous models without data augmentation.

  • The model is able to learn more robust features
  • Overfitting is reduced.

We can observe that the discrepancy between test and validation sets are no longer as significant as in the previously trained models

5.4 Using pretrained models and architectures¶

Usually a highly effective approach to deep learning on small image datasets is to use a pretrained model. In this case we can leverage the CropNet model from TensorFlow Hub. This is a key advantage because of the reusability and portability of learned features.

In this section, we will begin with feature extraction. The process is as follows

  1. Take the convolutional base of the model Cropnet model
  2. Run our dataset through it
  3. Train a new classifier on top of the output
5.4.1 Leveraging CropNet model for feature extraction¶

In this section, we measure the accuracy of our classifier on a split of the dataset.

In [35]:
pretrained_classifier = hub.KerasLayer(handle='https://tfhub.dev/google/cropnet/classifier/cassava_disease_V1/2')
In [36]:
pretrained_classifier.get_config()
Out[36]:
{'name': 'keras_layer',
 'trainable': False,
 'dtype': 'float32',
 'handle': 'https://tfhub.dev/google/cropnet/classifier/cassava_disease_V1/2'}
In [37]:
print("weights:", len(pretrained_classifier.weights))
print("trainable_weights:", len(pretrained_classifier.trainable_weights))
print("non_trainable_weights:", len(pretrained_classifier.non_trainable_weights))
weights: 266
trainable_weights: 0
non_trainable_weights: 266
In [17]:
def get_pretrained_model_features(modelname):
    # Model architecture
    inputs = keras.Input(shape=(224, 224, 3), name='preprocessedimage')
    outputs = hub.KerasLayer(handle='https://tfhub.dev/google/cropnet/classifier/cassava_disease_V1/2')(inputs)
    model = keras.Model(inputs=inputs, outputs=outputs, name=modelname)
    # Compile model
    model.compile(optimizer='rmsprop',
                  loss=tfa.losses.SigmoidFocalCrossEntropy(),
                  metrics=METRICS)
    return model
In [40]:
feature_ext_model = get_pretrained_model_features("feature_extraction_model_no_augmentation")
feature_ext_model.summary()
WARNING:tensorflow:From /opt/conda/lib/python3.7/site-packages/tensorflow/python/autograph/pyct/static_analysis/liveness.py:83: Analyzer.lamba_check (from tensorflow.python.autograph.pyct.static_analysis.liveness) is deprecated and will be removed after 2023-09-23.
Instructions for updating:
Lambda fuctions will be no more assumed to be used in the statement where they are used, or at least in the same block. https://github.com/tensorflow/tensorflow/issues/56089
WARNING:tensorflow:From /opt/conda/lib/python3.7/site-packages/tensorflow/python/autograph/pyct/static_analysis/liveness.py:83: Analyzer.lamba_check (from tensorflow.python.autograph.pyct.static_analysis.liveness) is deprecated and will be removed after 2023-09-23.
Instructions for updating:
Lambda fuctions will be no more assumed to be used in the statement where they are used, or at least in the same block. https://github.com/tensorflow/tensorflow/issues/56089
Model: "feature_extraction_model_no_augmentation"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 preprocessedimage (InputLay  [(None, 224, 224, 3)]    0         
 er)                                                             
                                                                 
 keras_layer_1 (KerasLayer)  (None, 6)                 4234118   
                                                                 
=================================================================
Total params: 4,234,118
Trainable params: 0
Non-trainable params: 4,234,118
_________________________________________________________________
In [46]:
evaluate_model_performance(feature_ext_model, val_ds, test_ds)
2023-03-16 23:53:15.628417: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:428] Loaded cuDNN version 8200
60/60 [==============================] - 11s 40ms/step - loss: 0.1014 - categorical_accuracy: 0.9058 - tp: 1686.0000 - fp: 150.0000 - tn: 9295.0000 - fn: 203.0000 - precision: 0.9183 - recall: 0.8925 - auc: 0.9918 - prc: 0.9686
Validation AUC: 0.992
Validation PRC: 0.969
Validation categorical accuracy: 0.906
59/59 [==============================] - 3s 44ms/step - loss: 0.1928 - categorical_accuracy: 0.8764 - tp: 1636.0000 - fp: 212.0000 - tn: 9213.0000 - fn: 249.0000 - precision: 0.8853 - recall: 0.8679 - auc: 0.9798 - prc: 0.9398
Test AUC: 0.980
Test PRC: 0.940
Test categorical accuracy: 0.876

This is a game changing improvement!

5.4.2 Leveraging EfficientNet-B4 architecture for Transfer Learning¶

EfficientNets achieve state-of-the-art accuracy on ImageNet with an order of magnitude better efficiency compared to other architectures.

EfficientNet-B4 specifically is categorized within the middle accuracy regime of the family of EfficientNet models. [https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet]

Let's begin with feature extraction. params.png

In [143]:
# Fresh data
(train_ds, val_ds, test_ds), info = tfds.load('cassava', 
                                         split=['train', 'validation', 'test'],
                                         shuffle_files=True,
                                         as_supervised=True,
                                         with_info=True)
train_ds = prepare(train_ds)
val_ds = prepare(val_ds)
test_ds = prepare(test_ds)
In [19]:
# https://www.tensorflow.org/api_docs/python/tf/keras/applications/efficientnet/EfficientNetB4

# Instantiate model with imagenet weights
efficientnetb4_conv_base = keras.applications.efficientnet.EfficientNetB4(
    weights="imagenet",
    include_top=False,
    input_shape=(224,224,3)
)

First, instantiate a base model with pre-trained weights. Weights are downloaded automatically when instantiating a model. They are stored at ~/.keras/models/.

In [50]:
# efficientnetb4_conv_base.get_config()
print("weights:", len(efficientnetb4_conv_base.weights))
print("trainable_weights:", len(efficientnetb4_conv_base.trainable_weights))
print("non_trainable_weights:", len(efficientnetb4_conv_base.non_trainable_weights))
weights: 611
trainable_weights: 416
non_trainable_weights: 195

Before we create a new model on top of the output of this efficientnet model we need to Freeze all layers in the base model by setting trainable = False.

In [51]:
efficientnetb4_conv_base.trainable = False
In [52]:
print("weights:", len(efficientnetb4_conv_base.weights))
print("trainable_weights:", len(efficientnetb4_conv_base.trainable_weights))
print("non_trainable_weights:", len(efficientnetb4_conv_base.non_trainable_weights))
weights: 611
trainable_weights: 0
non_trainable_weights: 611
In [20]:
def build_model_efficientnet():
    # inputs = layers.Input(shape=(224, 224, 3))
    inputs = keras.Input(shape=(224, 224, 3))
    # x = img_augmentation(inputs)
    # model = EfficientNetB0(include_top=False, input_tensor=x, weights="imagenet")
    model = keras.applications.efficientnet.EfficientNetB4(
    weights="imagenet",
    include_top=False,
    input_shape=(224,224,3)
)

    # Freeze the pretrained weights
    model.trainable = False
    
    # Rebuild top
    x = model(inputs, training=False)
    x = layers.GlobalAveragePooling2D(name="avg_pool")(x)
    x = layers.BatchNormalization()(x)

    top_dropout_rate = 0.2
    x = layers.Dropout(top_dropout_rate, name="top_dropout")(x)
    outputs = layers.Dense(6, activation="softmax", name="pred")(x)

    # Compile
    model = tf.keras.Model(inputs, outputs, name="EfficientNet")
    optimizer = tf.keras.optimizers.Adam(learning_rate=1e-2)
    model.compile(
        optimizer=optimizer, loss=tfa.losses.SigmoidFocalCrossEntropy(), metrics=METRICS
    )
    return model
In [37]:
efficientnetb4_transfer_learning = build_model_efficientnet()
callbacks = [
    keras.callbacks.ModelCheckpoint(
      filepath="efficientnetb4_transfer_learning.keras",
      save_best_only=True,
      monitor="val_loss"),
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/efficientnetb4_transfer_learning")
]
# efficientnetb4_transfer_learning.summary()
if os.path.exists("efficientnetb4_transfer_learning.keras"):
    model = keras.models.load_model("efficientnetb4_transfer_learning.keras")
    efficientnetb4_transfer_learning_history = model.fit(
        # train_features, train_labels,
        train_ds,
        epochs=150,
        # validation_data=(val_features, val_labels),
        validation_data=val_ds,
        callbacks=callbacks)    
else:
    
    efficientnetb4_transfer_learning_history = efficientnetb4_transfer_learning.fit(
        # train_features, train_labels,
        train_ds,
        epochs=150,
        # validation_data=(val_features, val_labels),
        validation_data=val_ds,
        callbacks=callbacks)
Epoch 1/150
177/177 [==============================] - 47s 205ms/step - loss: 0.3122 - categorical_accuracy: 0.4566 - tp: 1062.0000 - fp: 629.0000 - tn: 37096.0000 - fn: 6483.0000 - precision: 0.6280 - recall: 0.1408 - auc: 0.7890 - prc: 0.4366 - val_loss: 0.2754 - val_categorical_accuracy: 0.4690 - val_tp: 36.0000 - val_fp: 8.0000 - val_tn: 9437.0000 - val_fn: 1853.0000 - val_precision: 0.8182 - val_recall: 0.0191 - val_auc: 0.8190 - val_prc: 0.5119
Epoch 2/150
177/177 [==============================] - 34s 192ms/step - loss: 0.2511 - categorical_accuracy: 0.4880 - tp: 532.0000 - fp: 239.0000 - tn: 28041.0000 - fn: 5124.0000 - precision: 0.6900 - recall: 0.0941 - auc: 0.8176 - prc: 0.4779 - val_loss: 0.2519 - val_categorical_accuracy: 0.4696 - val_tp: 1.0000 - val_fp: 1.0000 - val_tn: 9444.0000 - val_fn: 1888.0000 - val_precision: 0.5000 - val_recall: 5.2938e-04 - val_auc: 0.8011 - val_prc: 0.4841
Epoch 3/150
177/177 [==============================] - 34s 193ms/step - loss: 0.2614 - categorical_accuracy: 0.4692 - tp: 615.0000 - fp: 318.0000 - tn: 27962.0000 - fn: 5041.0000 - precision: 0.6592 - recall: 0.1087 - auc: 0.8047 - prc: 0.4587 - val_loss: 0.2303 - val_categorical_accuracy: 0.5257 - val_tp: 274.0000 - val_fp: 80.0000 - val_tn: 9365.0000 - val_fn: 1615.0000 - val_precision: 0.7740 - val_recall: 0.1451 - val_auc: 0.8305 - val_prc: 0.5344
Epoch 4/150
177/177 [==============================] - 34s 190ms/step - loss: 0.2372 - categorical_accuracy: 0.4913 - tp: 429.0000 - fp: 178.0000 - tn: 28102.0000 - fn: 5227.0000 - precision: 0.7068 - recall: 0.0758 - auc: 0.8226 - prc: 0.4869 - val_loss: 0.2614 - val_categorical_accuracy: 0.5114 - val_tp: 548.0000 - val_fp: 253.0000 - val_tn: 9192.0000 - val_fn: 1341.0000 - val_precision: 0.6841 - val_recall: 0.2901 - val_auc: 0.8193 - val_prc: 0.5278
Epoch 5/150
177/177 [==============================] - 34s 193ms/step - loss: 0.2362 - categorical_accuracy: 0.5016 - tp: 471.0000 - fp: 185.0000 - tn: 28095.0000 - fn: 5185.0000 - precision: 0.7180 - recall: 0.0833 - auc: 0.8266 - prc: 0.4928 - val_loss: 0.2393 - val_categorical_accuracy: 0.4950 - val_tp: 416.0000 - val_fp: 153.0000 - val_tn: 9292.0000 - val_fn: 1473.0000 - val_precision: 0.7311 - val_recall: 0.2202 - val_auc: 0.8426 - val_prc: 0.5353
Epoch 6/150
177/177 [==============================] - 35s 196ms/step - loss: 0.2319 - categorical_accuracy: 0.5134 - tp: 433.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5223.0000 - precision: 0.7414 - recall: 0.0766 - auc: 0.8333 - prc: 0.5060 - val_loss: 0.2524 - val_categorical_accuracy: 0.5124 - val_tp: 490.0000 - val_fp: 187.0000 - val_tn: 9258.0000 - val_fn: 1399.0000 - val_precision: 0.7238 - val_recall: 0.2594 - val_auc: 0.8341 - val_prc: 0.5383
Epoch 7/150
177/177 [==============================] - 36s 202ms/step - loss: 0.2284 - categorical_accuracy: 0.5262 - tp: 401.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5255.0000 - precision: 0.7358 - recall: 0.0709 - auc: 0.8354 - prc: 0.5159 - val_loss: 0.2263 - val_categorical_accuracy: 0.5352 - val_tp: 426.0000 - val_fp: 144.0000 - val_tn: 9301.0000 - val_fn: 1463.0000 - val_precision: 0.7474 - val_recall: 0.2255 - val_auc: 0.8472 - val_prc: 0.5555
Epoch 8/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2247 - categorical_accuracy: 0.5313 - tp: 391.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5265.0000 - precision: 0.7227 - recall: 0.0691 - auc: 0.8382 - prc: 0.5187 - val_loss: 0.2303 - val_categorical_accuracy: 0.5262 - val_tp: 405.0000 - val_fp: 140.0000 - val_tn: 9305.0000 - val_fn: 1484.0000 - val_precision: 0.7431 - val_recall: 0.2144 - val_auc: 0.8398 - val_prc: 0.5446
Epoch 9/150
177/177 [==============================] - 35s 200ms/step - loss: 0.2216 - categorical_accuracy: 0.5341 - tp: 336.0000 - fp: 129.0000 - tn: 28151.0000 - fn: 5320.0000 - precision: 0.7226 - recall: 0.0594 - auc: 0.8397 - prc: 0.5216 - val_loss: 0.2251 - val_categorical_accuracy: 0.5394 - val_tp: 344.0000 - val_fp: 97.0000 - val_tn: 9348.0000 - val_fn: 1545.0000 - val_precision: 0.7800 - val_recall: 0.1821 - val_auc: 0.8395 - val_prc: 0.5482
Epoch 10/150
177/177 [==============================] - 35s 199ms/step - loss: 0.2198 - categorical_accuracy: 0.5348 - tp: 405.0000 - fp: 126.0000 - tn: 28154.0000 - fn: 5251.0000 - precision: 0.7627 - recall: 0.0716 - auc: 0.8407 - prc: 0.5276 - val_loss: 0.2167 - val_categorical_accuracy: 0.5474 - val_tp: 178.0000 - val_fp: 44.0000 - val_tn: 9401.0000 - val_fn: 1711.0000 - val_precision: 0.8018 - val_recall: 0.0942 - val_auc: 0.8431 - val_prc: 0.5548
Epoch 11/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2179 - categorical_accuracy: 0.5384 - tp: 376.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5280.0000 - precision: 0.7402 - recall: 0.0665 - auc: 0.8440 - prc: 0.5325 - val_loss: 0.2272 - val_categorical_accuracy: 0.5124 - val_tp: 402.0000 - val_fp: 122.0000 - val_tn: 9323.0000 - val_fn: 1487.0000 - val_precision: 0.7672 - val_recall: 0.2128 - val_auc: 0.8448 - val_prc: 0.5477
Epoch 12/150
177/177 [==============================] - 37s 209ms/step - loss: 0.2182 - categorical_accuracy: 0.5368 - tp: 372.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5284.0000 - precision: 0.7294 - recall: 0.0658 - auc: 0.8434 - prc: 0.5304 - val_loss: 0.2136 - val_categorical_accuracy: 0.5574 - val_tp: 102.0000 - val_fp: 26.0000 - val_tn: 9419.0000 - val_fn: 1787.0000 - val_precision: 0.7969 - val_recall: 0.0540 - val_auc: 0.8484 - val_prc: 0.5634
Epoch 13/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2188 - categorical_accuracy: 0.5389 - tp: 364.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 5292.0000 - precision: 0.7505 - recall: 0.0644 - auc: 0.8421 - prc: 0.5291 - val_loss: 0.2202 - val_categorical_accuracy: 0.5336 - val_tp: 257.0000 - val_fp: 62.0000 - val_tn: 9383.0000 - val_fn: 1632.0000 - val_precision: 0.8056 - val_recall: 0.1361 - val_auc: 0.8425 - val_prc: 0.5505
Epoch 14/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2182 - categorical_accuracy: 0.5366 - tp: 388.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5268.0000 - precision: 0.7419 - recall: 0.0686 - auc: 0.8431 - prc: 0.5301 - val_loss: 0.2184 - val_categorical_accuracy: 0.5320 - val_tp: 309.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 1580.0000 - val_precision: 0.8047 - val_recall: 0.1636 - val_auc: 0.8489 - val_prc: 0.5550
Epoch 15/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2179 - categorical_accuracy: 0.5341 - tp: 348.0000 - fp: 124.0000 - tn: 28156.0000 - fn: 5308.0000 - precision: 0.7373 - recall: 0.0615 - auc: 0.8436 - prc: 0.5335 - val_loss: 0.2146 - val_categorical_accuracy: 0.5606 - val_tp: 205.0000 - val_fp: 51.0000 - val_tn: 9394.0000 - val_fn: 1684.0000 - val_precision: 0.8008 - val_recall: 0.1085 - val_auc: 0.8486 - val_prc: 0.5618
Epoch 16/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2181 - categorical_accuracy: 0.5389 - tp: 355.0000 - fp: 124.0000 - tn: 28156.0000 - fn: 5301.0000 - precision: 0.7411 - recall: 0.0628 - auc: 0.8433 - prc: 0.5297 - val_loss: 0.2184 - val_categorical_accuracy: 0.5416 - val_tp: 275.0000 - val_fp: 70.0000 - val_tn: 9375.0000 - val_fn: 1614.0000 - val_precision: 0.7971 - val_recall: 0.1456 - val_auc: 0.8457 - val_prc: 0.5543
Epoch 17/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2232 - categorical_accuracy: 0.5285 - tp: 433.0000 - fp: 167.0000 - tn: 28113.0000 - fn: 5223.0000 - precision: 0.7217 - recall: 0.0766 - auc: 0.8395 - prc: 0.5186 - val_loss: 0.2454 - val_categorical_accuracy: 0.4992 - val_tp: 532.0000 - val_fp: 224.0000 - val_tn: 9221.0000 - val_fn: 1357.0000 - val_precision: 0.7037 - val_recall: 0.2816 - val_auc: 0.8360 - val_prc: 0.5377
Epoch 18/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2205 - categorical_accuracy: 0.5269 - tp: 408.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 5248.0000 - precision: 0.7570 - recall: 0.0721 - auc: 0.8408 - prc: 0.5277 - val_loss: 0.2310 - val_categorical_accuracy: 0.5024 - val_tp: 438.0000 - val_fp: 138.0000 - val_tn: 9307.0000 - val_fn: 1451.0000 - val_precision: 0.7604 - val_recall: 0.2319 - val_auc: 0.8428 - val_prc: 0.5432
Epoch 19/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2195 - categorical_accuracy: 0.5295 - tp: 407.0000 - fp: 140.0000 - tn: 28140.0000 - fn: 5249.0000 - precision: 0.7441 - recall: 0.0720 - auc: 0.8431 - prc: 0.5308 - val_loss: 0.2242 - val_categorical_accuracy: 0.5267 - val_tp: 310.0000 - val_fp: 80.0000 - val_tn: 9365.0000 - val_fn: 1579.0000 - val_precision: 0.7949 - val_recall: 0.1641 - val_auc: 0.8379 - val_prc: 0.5461
Epoch 20/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2189 - categorical_accuracy: 0.5304 - tp: 386.0000 - fp: 140.0000 - tn: 28140.0000 - fn: 5270.0000 - precision: 0.7338 - recall: 0.0682 - auc: 0.8423 - prc: 0.5287 - val_loss: 0.2230 - val_categorical_accuracy: 0.5442 - val_tp: 326.0000 - val_fp: 86.0000 - val_tn: 9359.0000 - val_fn: 1563.0000 - val_precision: 0.7913 - val_recall: 0.1726 - val_auc: 0.8390 - val_prc: 0.5527
Epoch 21/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2192 - categorical_accuracy: 0.5304 - tp: 372.0000 - fp: 130.0000 - tn: 28150.0000 - fn: 5284.0000 - precision: 0.7410 - recall: 0.0658 - auc: 0.8412 - prc: 0.5288 - val_loss: 0.2150 - val_categorical_accuracy: 0.5479 - val_tp: 245.0000 - val_fp: 58.0000 - val_tn: 9387.0000 - val_fn: 1644.0000 - val_precision: 0.8086 - val_recall: 0.1297 - val_auc: 0.8491 - val_prc: 0.5593
Epoch 22/150
177/177 [==============================] - 36s 204ms/step - loss: 0.2170 - categorical_accuracy: 0.5423 - tp: 386.0000 - fp: 140.0000 - tn: 28140.0000 - fn: 5270.0000 - precision: 0.7338 - recall: 0.0682 - auc: 0.8451 - prc: 0.5347 - val_loss: 0.2343 - val_categorical_accuracy: 0.4966 - val_tp: 460.0000 - val_fp: 158.0000 - val_tn: 9287.0000 - val_fn: 1429.0000 - val_precision: 0.7443 - val_recall: 0.2435 - val_auc: 0.8416 - val_prc: 0.5471
Epoch 23/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2183 - categorical_accuracy: 0.5320 - tp: 428.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 5228.0000 - precision: 0.7549 - recall: 0.0757 - auc: 0.8427 - prc: 0.5328 - val_loss: 0.2443 - val_categorical_accuracy: 0.4876 - val_tp: 543.0000 - val_fp: 228.0000 - val_tn: 9217.0000 - val_fn: 1346.0000 - val_precision: 0.7043 - val_recall: 0.2875 - val_auc: 0.8397 - val_prc: 0.5422
Epoch 24/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2216 - categorical_accuracy: 0.5271 - tp: 434.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5222.0000 - precision: 0.7509 - recall: 0.0767 - auc: 0.8391 - prc: 0.5190 - val_loss: 0.2516 - val_categorical_accuracy: 0.4902 - val_tp: 595.0000 - val_fp: 286.0000 - val_tn: 9159.0000 - val_fn: 1294.0000 - val_precision: 0.6754 - val_recall: 0.3150 - val_auc: 0.8425 - val_prc: 0.5437
Epoch 25/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2211 - categorical_accuracy: 0.5255 - tp: 445.0000 - fp: 148.0000 - tn: 28132.0000 - fn: 5211.0000 - precision: 0.7504 - recall: 0.0787 - auc: 0.8397 - prc: 0.5245 - val_loss: 0.2259 - val_categorical_accuracy: 0.5447 - val_tp: 279.0000 - val_fp: 63.0000 - val_tn: 9382.0000 - val_fn: 1610.0000 - val_precision: 0.8158 - val_recall: 0.1477 - val_auc: 0.8314 - val_prc: 0.5469
Epoch 26/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2206 - categorical_accuracy: 0.5237 - tp: 451.0000 - fp: 133.0000 - tn: 28147.0000 - fn: 5205.0000 - precision: 0.7723 - recall: 0.0797 - auc: 0.8399 - prc: 0.5269 - val_loss: 0.2198 - val_categorical_accuracy: 0.5241 - val_tp: 272.0000 - val_fp: 60.0000 - val_tn: 9385.0000 - val_fn: 1617.0000 - val_precision: 0.8193 - val_recall: 0.1440 - val_auc: 0.8449 - val_prc: 0.5506
Epoch 27/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2181 - categorical_accuracy: 0.5370 - tp: 430.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 5226.0000 - precision: 0.7517 - recall: 0.0760 - auc: 0.8426 - prc: 0.5353 - val_loss: 0.2176 - val_categorical_accuracy: 0.5389 - val_tp: 284.0000 - val_fp: 65.0000 - val_tn: 9380.0000 - val_fn: 1605.0000 - val_precision: 0.8138 - val_recall: 0.1503 - val_auc: 0.8467 - val_prc: 0.5582
Epoch 28/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2175 - categorical_accuracy: 0.5366 - tp: 399.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 5257.0000 - precision: 0.7416 - recall: 0.0705 - auc: 0.8438 - prc: 0.5367 - val_loss: 0.2203 - val_categorical_accuracy: 0.5453 - val_tp: 321.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 1568.0000 - val_precision: 0.8106 - val_recall: 0.1699 - val_auc: 0.8434 - val_prc: 0.5567
Epoch 29/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2178 - categorical_accuracy: 0.5378 - tp: 414.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 5242.0000 - precision: 0.7596 - recall: 0.0732 - auc: 0.8437 - prc: 0.5344 - val_loss: 0.2431 - val_categorical_accuracy: 0.4939 - val_tp: 556.0000 - val_fp: 232.0000 - val_tn: 9213.0000 - val_fn: 1333.0000 - val_precision: 0.7056 - val_recall: 0.2943 - val_auc: 0.8452 - val_prc: 0.5473
Epoch 30/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2178 - categorical_accuracy: 0.5341 - tp: 396.0000 - fp: 119.0000 - tn: 28161.0000 - fn: 5260.0000 - precision: 0.7689 - recall: 0.0700 - auc: 0.8435 - prc: 0.5332 - val_loss: 0.2270 - val_categorical_accuracy: 0.5193 - val_tp: 415.0000 - val_fp: 133.0000 - val_tn: 9312.0000 - val_fn: 1474.0000 - val_precision: 0.7573 - val_recall: 0.2197 - val_auc: 0.8434 - val_prc: 0.5483
Epoch 31/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2193 - categorical_accuracy: 0.5339 - tp: 413.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5243.0000 - precision: 0.7388 - recall: 0.0730 - auc: 0.8422 - prc: 0.5289 - val_loss: 0.2227 - val_categorical_accuracy: 0.5087 - val_tp: 307.0000 - val_fp: 78.0000 - val_tn: 9367.0000 - val_fn: 1582.0000 - val_precision: 0.7974 - val_recall: 0.1625 - val_auc: 0.8451 - val_prc: 0.5480
Epoch 32/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2213 - categorical_accuracy: 0.5246 - tp: 397.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5259.0000 - precision: 0.7434 - recall: 0.0702 - auc: 0.8389 - prc: 0.5228 - val_loss: 0.2319 - val_categorical_accuracy: 0.5034 - val_tp: 425.0000 - val_fp: 137.0000 - val_tn: 9308.0000 - val_fn: 1464.0000 - val_precision: 0.7562 - val_recall: 0.2250 - val_auc: 0.8367 - val_prc: 0.5424
Epoch 33/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2191 - categorical_accuracy: 0.5276 - tp: 419.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5237.0000 - precision: 0.7563 - recall: 0.0741 - auc: 0.8417 - prc: 0.5317 - val_loss: 0.2340 - val_categorical_accuracy: 0.5251 - val_tp: 472.0000 - val_fp: 163.0000 - val_tn: 9282.0000 - val_fn: 1417.0000 - val_precision: 0.7433 - val_recall: 0.2499 - val_auc: 0.8368 - val_prc: 0.5476
Epoch 34/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2172 - categorical_accuracy: 0.5279 - tp: 415.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5241.0000 - precision: 0.7545 - recall: 0.0734 - auc: 0.8445 - prc: 0.5361 - val_loss: 0.2361 - val_categorical_accuracy: 0.5093 - val_tp: 468.0000 - val_fp: 171.0000 - val_tn: 9274.0000 - val_fn: 1421.0000 - val_precision: 0.7324 - val_recall: 0.2478 - val_auc: 0.8353 - val_prc: 0.5424
Epoch 35/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2186 - categorical_accuracy: 0.5387 - tp: 452.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5204.0000 - precision: 0.7700 - recall: 0.0799 - auc: 0.8433 - prc: 0.5332 - val_loss: 0.2498 - val_categorical_accuracy: 0.4955 - val_tp: 577.0000 - val_fp: 257.0000 - val_tn: 9188.0000 - val_fn: 1312.0000 - val_precision: 0.6918 - val_recall: 0.3055 - val_auc: 0.8358 - val_prc: 0.5419
Epoch 36/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2222 - categorical_accuracy: 0.5170 - tp: 451.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 5205.0000 - precision: 0.7418 - recall: 0.0797 - auc: 0.8377 - prc: 0.5198 - val_loss: 0.2214 - val_categorical_accuracy: 0.5506 - val_tp: 397.0000 - val_fp: 123.0000 - val_tn: 9322.0000 - val_fn: 1492.0000 - val_precision: 0.7635 - val_recall: 0.2102 - val_auc: 0.8504 - val_prc: 0.5612
Epoch 37/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2199 - categorical_accuracy: 0.5249 - tp: 408.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 5248.0000 - precision: 0.7405 - recall: 0.0721 - auc: 0.8408 - prc: 0.5297 - val_loss: 0.2440 - val_categorical_accuracy: 0.4891 - val_tp: 575.0000 - val_fp: 256.0000 - val_tn: 9189.0000 - val_fn: 1314.0000 - val_precision: 0.6919 - val_recall: 0.3044 - val_auc: 0.8491 - val_prc: 0.5517
Epoch 38/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2185 - categorical_accuracy: 0.5341 - tp: 407.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5249.0000 - precision: 0.7551 - recall: 0.0720 - auc: 0.8428 - prc: 0.5305 - val_loss: 0.2405 - val_categorical_accuracy: 0.4918 - val_tp: 509.0000 - val_fp: 190.0000 - val_tn: 9255.0000 - val_fn: 1380.0000 - val_precision: 0.7282 - val_recall: 0.2695 - val_auc: 0.8375 - val_prc: 0.5414
Epoch 39/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2200 - categorical_accuracy: 0.5311 - tp: 391.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5265.0000 - precision: 0.7214 - recall: 0.0691 - auc: 0.8410 - prc: 0.5253 - val_loss: 0.2201 - val_categorical_accuracy: 0.5400 - val_tp: 292.0000 - val_fp: 70.0000 - val_tn: 9375.0000 - val_fn: 1597.0000 - val_precision: 0.8066 - val_recall: 0.1546 - val_auc: 0.8431 - val_prc: 0.5520
Epoch 40/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2225 - categorical_accuracy: 0.5196 - tp: 478.0000 - fp: 169.0000 - tn: 28111.0000 - fn: 5178.0000 - precision: 0.7388 - recall: 0.0845 - auc: 0.8390 - prc: 0.5201 - val_loss: 0.2433 - val_categorical_accuracy: 0.4569 - val_tp: 32.0000 - val_fp: 8.0000 - val_tn: 9437.0000 - val_fn: 1857.0000 - val_precision: 0.8000 - val_recall: 0.0169 - val_auc: 0.7991 - val_prc: 0.4605
Epoch 41/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2206 - categorical_accuracy: 0.5290 - tp: 441.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 5215.0000 - precision: 0.7412 - recall: 0.0780 - auc: 0.8405 - prc: 0.5263 - val_loss: 0.2313 - val_categorical_accuracy: 0.5416 - val_tp: 397.0000 - val_fp: 122.0000 - val_tn: 9323.0000 - val_fn: 1492.0000 - val_precision: 0.7649 - val_recall: 0.2102 - val_auc: 0.8319 - val_prc: 0.5462
Epoch 42/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2224 - categorical_accuracy: 0.5276 - tp: 442.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 5214.0000 - precision: 0.7556 - recall: 0.0781 - auc: 0.8376 - prc: 0.5209 - val_loss: 0.2267 - val_categorical_accuracy: 0.5103 - val_tp: 405.0000 - val_fp: 128.0000 - val_tn: 9317.0000 - val_fn: 1484.0000 - val_precision: 0.7598 - val_recall: 0.2144 - val_auc: 0.8460 - val_prc: 0.5474
Epoch 43/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2205 - categorical_accuracy: 0.5309 - tp: 416.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 5240.0000 - precision: 0.7455 - recall: 0.0736 - auc: 0.8392 - prc: 0.5279 - val_loss: 0.2289 - val_categorical_accuracy: 0.5071 - val_tp: 391.0000 - val_fp: 114.0000 - val_tn: 9331.0000 - val_fn: 1498.0000 - val_precision: 0.7743 - val_recall: 0.2070 - val_auc: 0.8386 - val_prc: 0.5440
Epoch 44/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2222 - categorical_accuracy: 0.5309 - tp: 447.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 5209.0000 - precision: 0.7450 - recall: 0.0790 - auc: 0.8402 - prc: 0.5223 - val_loss: 0.2227 - val_categorical_accuracy: 0.5474 - val_tp: 354.0000 - val_fp: 106.0000 - val_tn: 9339.0000 - val_fn: 1535.0000 - val_precision: 0.7696 - val_recall: 0.1874 - val_auc: 0.8423 - val_prc: 0.5578
Epoch 45/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2212 - categorical_accuracy: 0.5216 - tp: 436.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5220.0000 - precision: 0.7517 - recall: 0.0771 - auc: 0.8387 - prc: 0.5256 - val_loss: 0.2410 - val_categorical_accuracy: 0.4955 - val_tp: 537.0000 - val_fp: 221.0000 - val_tn: 9224.0000 - val_fn: 1352.0000 - val_precision: 0.7084 - val_recall: 0.2843 - val_auc: 0.8427 - val_prc: 0.5458
Epoch 46/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2204 - categorical_accuracy: 0.5290 - tp: 435.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5221.0000 - precision: 0.7592 - recall: 0.0769 - auc: 0.8400 - prc: 0.5255 - val_loss: 0.2192 - val_categorical_accuracy: 0.5532 - val_tp: 332.0000 - val_fp: 90.0000 - val_tn: 9355.0000 - val_fn: 1557.0000 - val_precision: 0.7867 - val_recall: 0.1758 - val_auc: 0.8489 - val_prc: 0.5617
Epoch 47/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2189 - categorical_accuracy: 0.5253 - tp: 412.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 5244.0000 - precision: 0.7546 - recall: 0.0728 - auc: 0.8423 - prc: 0.5296 - val_loss: 0.2245 - val_categorical_accuracy: 0.5262 - val_tp: 427.0000 - val_fp: 130.0000 - val_tn: 9315.0000 - val_fn: 1462.0000 - val_precision: 0.7666 - val_recall: 0.2260 - val_auc: 0.8475 - val_prc: 0.5537
Epoch 48/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2183 - categorical_accuracy: 0.5354 - tp: 425.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5231.0000 - precision: 0.7549 - recall: 0.0751 - auc: 0.8436 - prc: 0.5336 - val_loss: 0.2505 - val_categorical_accuracy: 0.4902 - val_tp: 589.0000 - val_fp: 272.0000 - val_tn: 9173.0000 - val_fn: 1300.0000 - val_precision: 0.6841 - val_recall: 0.3118 - val_auc: 0.8435 - val_prc: 0.5459
Epoch 49/150
177/177 [==============================] - 36s 204ms/step - loss: 0.2305 - categorical_accuracy: 0.5152 - tp: 523.0000 - fp: 193.0000 - tn: 28087.0000 - fn: 5133.0000 - precision: 0.7304 - recall: 0.0925 - auc: 0.8316 - prc: 0.5082 - val_loss: 0.2229 - val_categorical_accuracy: 0.5437 - val_tp: 179.0000 - val_fp: 49.0000 - val_tn: 9396.0000 - val_fn: 1710.0000 - val_precision: 0.7851 - val_recall: 0.0948 - val_auc: 0.8449 - val_prc: 0.5451
Epoch 50/150
177/177 [==============================] - 35s 196ms/step - loss: 0.2231 - categorical_accuracy: 0.5219 - tp: 447.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5209.0000 - precision: 0.7538 - recall: 0.0790 - auc: 0.8365 - prc: 0.5183 - val_loss: 0.2440 - val_categorical_accuracy: 0.4881 - val_tp: 531.0000 - val_fp: 214.0000 - val_tn: 9231.0000 - val_fn: 1358.0000 - val_precision: 0.7128 - val_recall: 0.2811 - val_auc: 0.8397 - val_prc: 0.5399
Epoch 51/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2212 - categorical_accuracy: 0.5225 - tp: 408.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 5248.0000 - precision: 0.7459 - recall: 0.0721 - auc: 0.8395 - prc: 0.5216 - val_loss: 0.2174 - val_categorical_accuracy: 0.5357 - val_tp: 339.0000 - val_fp: 89.0000 - val_tn: 9356.0000 - val_fn: 1550.0000 - val_precision: 0.7921 - val_recall: 0.1795 - val_auc: 0.8524 - val_prc: 0.5593
Epoch 52/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2178 - categorical_accuracy: 0.5375 - tp: 431.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5225.0000 - precision: 0.7615 - recall: 0.0762 - auc: 0.8442 - prc: 0.5323 - val_loss: 0.2530 - val_categorical_accuracy: 0.4833 - val_tp: 623.0000 - val_fp: 303.0000 - val_tn: 9142.0000 - val_fn: 1266.0000 - val_precision: 0.6728 - val_recall: 0.3298 - val_auc: 0.8473 - val_prc: 0.5474
Epoch 53/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2228 - categorical_accuracy: 0.5230 - tp: 434.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5222.0000 - precision: 0.7419 - recall: 0.0767 - auc: 0.8376 - prc: 0.5188 - val_loss: 0.2359 - val_categorical_accuracy: 0.5003 - val_tp: 502.0000 - val_fp: 186.0000 - val_tn: 9259.0000 - val_fn: 1387.0000 - val_precision: 0.7297 - val_recall: 0.2657 - val_auc: 0.8448 - val_prc: 0.5463
Epoch 54/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2219 - categorical_accuracy: 0.5225 - tp: 421.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 5235.0000 - precision: 0.7386 - recall: 0.0744 - auc: 0.8383 - prc: 0.5193 - val_loss: 0.2311 - val_categorical_accuracy: 0.5267 - val_tp: 465.0000 - val_fp: 171.0000 - val_tn: 9274.0000 - val_fn: 1424.0000 - val_precision: 0.7311 - val_recall: 0.2462 - val_auc: 0.8449 - val_prc: 0.5489
Epoch 55/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2190 - categorical_accuracy: 0.5295 - tp: 393.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 5263.0000 - precision: 0.7457 - recall: 0.0695 - auc: 0.8422 - prc: 0.5293 - val_loss: 0.2301 - val_categorical_accuracy: 0.5024 - val_tp: 450.0000 - val_fp: 147.0000 - val_tn: 9298.0000 - val_fn: 1439.0000 - val_precision: 0.7538 - val_recall: 0.2382 - val_auc: 0.8443 - val_prc: 0.5494
Epoch 56/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2184 - categorical_accuracy: 0.5338 - tp: 419.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5237.0000 - precision: 0.7442 - recall: 0.0741 - auc: 0.8428 - prc: 0.5324 - val_loss: 0.2365 - val_categorical_accuracy: 0.5236 - val_tp: 444.0000 - val_fp: 150.0000 - val_tn: 9295.0000 - val_fn: 1445.0000 - val_precision: 0.7475 - val_recall: 0.2350 - val_auc: 0.8300 - val_prc: 0.5414
Epoch 57/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2211 - categorical_accuracy: 0.5313 - tp: 432.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5224.0000 - precision: 0.7619 - recall: 0.0764 - auc: 0.8409 - prc: 0.5268 - val_loss: 0.2235 - val_categorical_accuracy: 0.5135 - val_tp: 337.0000 - val_fp: 90.0000 - val_tn: 9355.0000 - val_fn: 1552.0000 - val_precision: 0.7892 - val_recall: 0.1784 - val_auc: 0.8454 - val_prc: 0.5491
Epoch 58/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2196 - categorical_accuracy: 0.5341 - tp: 440.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5216.0000 - precision: 0.7612 - recall: 0.0778 - auc: 0.8418 - prc: 0.5302 - val_loss: 0.2362 - val_categorical_accuracy: 0.4981 - val_tp: 486.0000 - val_fp: 172.0000 - val_tn: 9273.0000 - val_fn: 1403.0000 - val_precision: 0.7386 - val_recall: 0.2573 - val_auc: 0.8390 - val_prc: 0.5441
Epoch 59/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2200 - categorical_accuracy: 0.5324 - tp: 413.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 5243.0000 - precision: 0.7375 - recall: 0.0730 - auc: 0.8405 - prc: 0.5275 - val_loss: 0.2365 - val_categorical_accuracy: 0.4886 - val_tp: 498.0000 - val_fp: 180.0000 - val_tn: 9265.0000 - val_fn: 1391.0000 - val_precision: 0.7345 - val_recall: 0.2636 - val_auc: 0.8449 - val_prc: 0.5468
Epoch 60/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2177 - categorical_accuracy: 0.5394 - tp: 402.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 5254.0000 - precision: 0.7500 - recall: 0.0711 - auc: 0.8439 - prc: 0.5349 - val_loss: 0.2526 - val_categorical_accuracy: 0.4780 - val_tp: 579.0000 - val_fp: 259.0000 - val_tn: 9186.0000 - val_fn: 1310.0000 - val_precision: 0.6909 - val_recall: 0.3065 - val_auc: 0.8387 - val_prc: 0.5404
Epoch 61/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2187 - categorical_accuracy: 0.5308 - tp: 424.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5232.0000 - precision: 0.7544 - recall: 0.0750 - auc: 0.8426 - prc: 0.5312 - val_loss: 0.2459 - val_categorical_accuracy: 0.5040 - val_tp: 552.0000 - val_fp: 226.0000 - val_tn: 9219.0000 - val_fn: 1337.0000 - val_precision: 0.7095 - val_recall: 0.2922 - val_auc: 0.8338 - val_prc: 0.5424
Epoch 62/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2185 - categorical_accuracy: 0.5255 - tp: 425.0000 - fp: 128.0000 - tn: 28152.0000 - fn: 5231.0000 - precision: 0.7685 - recall: 0.0751 - auc: 0.8427 - prc: 0.5311 - val_loss: 0.2323 - val_categorical_accuracy: 0.5183 - val_tp: 500.0000 - val_fp: 179.0000 - val_tn: 9266.0000 - val_fn: 1389.0000 - val_precision: 0.7364 - val_recall: 0.2647 - val_auc: 0.8470 - val_prc: 0.5527
Epoch 63/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2202 - categorical_accuracy: 0.5359 - tp: 423.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5233.0000 - precision: 0.7382 - recall: 0.0748 - auc: 0.8408 - prc: 0.5257 - val_loss: 0.2162 - val_categorical_accuracy: 0.5363 - val_tp: 264.0000 - val_fp: 62.0000 - val_tn: 9383.0000 - val_fn: 1625.0000 - val_precision: 0.8098 - val_recall: 0.1398 - val_auc: 0.8507 - val_prc: 0.5614
Epoch 64/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2185 - categorical_accuracy: 0.5345 - tp: 428.0000 - fp: 131.0000 - tn: 28149.0000 - fn: 5228.0000 - precision: 0.7657 - recall: 0.0757 - auc: 0.8426 - prc: 0.5340 - val_loss: 0.2311 - val_categorical_accuracy: 0.5077 - val_tp: 481.0000 - val_fp: 168.0000 - val_tn: 9277.0000 - val_fn: 1408.0000 - val_precision: 0.7411 - val_recall: 0.2546 - val_auc: 0.8466 - val_prc: 0.5513
Epoch 65/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2193 - categorical_accuracy: 0.5262 - tp: 436.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 5220.0000 - precision: 0.7390 - recall: 0.0771 - auc: 0.8420 - prc: 0.5284 - val_loss: 0.2244 - val_categorical_accuracy: 0.5447 - val_tp: 392.0000 - val_fp: 112.0000 - val_tn: 9333.0000 - val_fn: 1497.0000 - val_precision: 0.7778 - val_recall: 0.2075 - val_auc: 0.8417 - val_prc: 0.5560
Epoch 66/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2185 - categorical_accuracy: 0.5401 - tp: 421.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5235.0000 - precision: 0.7545 - recall: 0.0744 - auc: 0.8431 - prc: 0.5349 - val_loss: 0.2277 - val_categorical_accuracy: 0.5193 - val_tp: 450.0000 - val_fp: 153.0000 - val_tn: 9292.0000 - val_fn: 1439.0000 - val_precision: 0.7463 - val_recall: 0.2382 - val_auc: 0.8479 - val_prc: 0.5525
Epoch 67/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2224 - categorical_accuracy: 0.5253 - tp: 437.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5219.0000 - precision: 0.7445 - recall: 0.0773 - auc: 0.8387 - prc: 0.5216 - val_loss: 0.2259 - val_categorical_accuracy: 0.5495 - val_tp: 375.0000 - val_fp: 111.0000 - val_tn: 9334.0000 - val_fn: 1514.0000 - val_precision: 0.7716 - val_recall: 0.1985 - val_auc: 0.8400 - val_prc: 0.5564
Epoch 68/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2229 - categorical_accuracy: 0.5320 - tp: 443.0000 - fp: 162.0000 - tn: 28118.0000 - fn: 5213.0000 - precision: 0.7322 - recall: 0.0783 - auc: 0.8380 - prc: 0.5238 - val_loss: 0.2229 - val_categorical_accuracy: 0.5034 - val_tp: 247.0000 - val_fp: 57.0000 - val_tn: 9388.0000 - val_fn: 1642.0000 - val_precision: 0.8125 - val_recall: 0.1308 - val_auc: 0.8359 - val_prc: 0.5431
Epoch 69/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2204 - categorical_accuracy: 0.5265 - tp: 412.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 5244.0000 - precision: 0.7305 - recall: 0.0728 - auc: 0.8401 - prc: 0.5248 - val_loss: 0.2336 - val_categorical_accuracy: 0.5024 - val_tp: 486.0000 - val_fp: 173.0000 - val_tn: 9272.0000 - val_fn: 1403.0000 - val_precision: 0.7375 - val_recall: 0.2573 - val_auc: 0.8441 - val_prc: 0.5503
Epoch 70/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2200 - categorical_accuracy: 0.5304 - tp: 392.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5264.0000 - precision: 0.7410 - recall: 0.0693 - auc: 0.8409 - prc: 0.5269 - val_loss: 0.2219 - val_categorical_accuracy: 0.5230 - val_tp: 415.0000 - val_fp: 123.0000 - val_tn: 9322.0000 - val_fn: 1474.0000 - val_precision: 0.7714 - val_recall: 0.2197 - val_auc: 0.8505 - val_prc: 0.5576
Epoch 71/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2179 - categorical_accuracy: 0.5362 - tp: 449.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5207.0000 - precision: 0.7483 - recall: 0.0794 - auc: 0.8443 - prc: 0.5330 - val_loss: 0.2476 - val_categorical_accuracy: 0.5040 - val_tp: 564.0000 - val_fp: 241.0000 - val_tn: 9204.0000 - val_fn: 1325.0000 - val_precision: 0.7006 - val_recall: 0.2986 - val_auc: 0.8363 - val_prc: 0.5429
Epoch 72/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2199 - categorical_accuracy: 0.5320 - tp: 419.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5237.0000 - precision: 0.7563 - recall: 0.0741 - auc: 0.8409 - prc: 0.5271 - val_loss: 0.2245 - val_categorical_accuracy: 0.5373 - val_tp: 360.0000 - val_fp: 93.0000 - val_tn: 9352.0000 - val_fn: 1529.0000 - val_precision: 0.7947 - val_recall: 0.1906 - val_auc: 0.8383 - val_prc: 0.5527
Epoch 73/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2186 - categorical_accuracy: 0.5339 - tp: 455.0000 - fp: 163.0000 - tn: 28117.0000 - fn: 5201.0000 - precision: 0.7362 - recall: 0.0804 - auc: 0.8434 - prc: 0.5333 - val_loss: 0.2160 - val_categorical_accuracy: 0.5416 - val_tp: 264.0000 - val_fp: 61.0000 - val_tn: 9384.0000 - val_fn: 1625.0000 - val_precision: 0.8123 - val_recall: 0.1398 - val_auc: 0.8489 - val_prc: 0.5613
Epoch 74/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2185 - categorical_accuracy: 0.5304 - tp: 427.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 5229.0000 - precision: 0.7792 - recall: 0.0755 - auc: 0.8424 - prc: 0.5350 - val_loss: 0.2208 - val_categorical_accuracy: 0.5289 - val_tp: 353.0000 - val_fp: 101.0000 - val_tn: 9344.0000 - val_fn: 1536.0000 - val_precision: 0.7775 - val_recall: 0.1869 - val_auc: 0.8482 - val_prc: 0.5566
Epoch 75/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2178 - categorical_accuracy: 0.5385 - tp: 421.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5235.0000 - precision: 0.7531 - recall: 0.0744 - auc: 0.8434 - prc: 0.5353 - val_loss: 0.2352 - val_categorical_accuracy: 0.4987 - val_tp: 482.0000 - val_fp: 165.0000 - val_tn: 9280.0000 - val_fn: 1407.0000 - val_precision: 0.7450 - val_recall: 0.2552 - val_auc: 0.8405 - val_prc: 0.5437
Epoch 76/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2173 - categorical_accuracy: 0.5384 - tp: 411.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5245.0000 - precision: 0.7486 - recall: 0.0727 - auc: 0.8445 - prc: 0.5365 - val_loss: 0.2554 - val_categorical_accuracy: 0.4823 - val_tp: 626.0000 - val_fp: 309.0000 - val_tn: 9136.0000 - val_fn: 1263.0000 - val_precision: 0.6695 - val_recall: 0.3314 - val_auc: 0.8429 - val_prc: 0.5476
Epoch 77/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2196 - categorical_accuracy: 0.5281 - tp: 437.0000 - fp: 145.0000 - tn: 28135.0000 - fn: 5219.0000 - precision: 0.7509 - recall: 0.0773 - auc: 0.8418 - prc: 0.5277 - val_loss: 0.2475 - val_categorical_accuracy: 0.5119 - val_tp: 557.0000 - val_fp: 242.0000 - val_tn: 9203.0000 - val_fn: 1332.0000 - val_precision: 0.6971 - val_recall: 0.2949 - val_auc: 0.8349 - val_prc: 0.5409
Epoch 83/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2209 - categorical_accuracy: 0.5279 - tp: 432.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 5224.0000 - precision: 0.7526 - recall: 0.0764 - auc: 0.8399 - prc: 0.5211 - val_loss: 0.2306 - val_categorical_accuracy: 0.5066 - val_tp: 501.0000 - val_fp: 180.0000 - val_tn: 9265.0000 - val_fn: 1388.0000 - val_precision: 0.7357 - val_recall: 0.2652 - val_auc: 0.8488 - val_prc: 0.5544
Epoch 84/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2172 - categorical_accuracy: 0.5339 - tp: 416.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 5240.0000 - precision: 0.7536 - recall: 0.0736 - auc: 0.8446 - prc: 0.5378 - val_loss: 0.2510 - val_categorical_accuracy: 0.4817 - val_tp: 594.0000 - val_fp: 281.0000 - val_tn: 9164.0000 - val_fn: 1295.0000 - val_precision: 0.6789 - val_recall: 0.3145 - val_auc: 0.8419 - val_prc: 0.5455
Epoch 85/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2203 - categorical_accuracy: 0.5272 - tp: 437.0000 - fp: 142.0000 - tn: 28138.0000 - fn: 5219.0000 - precision: 0.7547 - recall: 0.0773 - auc: 0.8417 - prc: 0.5287 - val_loss: 0.2247 - val_categorical_accuracy: 0.5416 - val_tp: 446.0000 - val_fp: 156.0000 - val_tn: 9289.0000 - val_fn: 1443.0000 - val_precision: 0.7409 - val_recall: 0.2361 - val_auc: 0.8498 - val_prc: 0.5627
Epoch 86/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2235 - categorical_accuracy: 0.5171 - tp: 466.0000 - fp: 161.0000 - tn: 28119.0000 - fn: 5190.0000 - precision: 0.7432 - recall: 0.0824 - auc: 0.8366 - prc: 0.5210 - val_loss: 0.2210 - val_categorical_accuracy: 0.5199 - val_tp: 381.0000 - val_fp: 116.0000 - val_tn: 9329.0000 - val_fn: 1508.0000 - val_precision: 0.7666 - val_recall: 0.2017 - val_auc: 0.8494 - val_prc: 0.5591
Epoch 87/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2266 - categorical_accuracy: 0.5156 - tp: 489.0000 - fp: 186.0000 - tn: 28094.0000 - fn: 5167.0000 - precision: 0.7244 - recall: 0.0865 - auc: 0.8347 - prc: 0.5125 - val_loss: 0.2163 - val_categorical_accuracy: 0.5437 - val_tp: 312.0000 - val_fp: 80.0000 - val_tn: 9365.0000 - val_fn: 1577.0000 - val_precision: 0.7959 - val_recall: 0.1652 - val_auc: 0.8544 - val_prc: 0.5620
Epoch 88/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2258 - categorical_accuracy: 0.5186 - tp: 459.0000 - fp: 172.0000 - tn: 28108.0000 - fn: 5197.0000 - precision: 0.7274 - recall: 0.0812 - auc: 0.8344 - prc: 0.5115 - val_loss: 0.2318 - val_categorical_accuracy: 0.5379 - val_tp: 483.0000 - val_fp: 173.0000 - val_tn: 9272.0000 - val_fn: 1406.0000 - val_precision: 0.7363 - val_recall: 0.2557 - val_auc: 0.8447 - val_prc: 0.5557
Epoch 89/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2208 - categorical_accuracy: 0.5256 - tp: 415.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5241.0000 - precision: 0.7345 - recall: 0.0734 - auc: 0.8395 - prc: 0.5234 - val_loss: 0.2286 - val_categorical_accuracy: 0.5066 - val_tp: 376.0000 - val_fp: 109.0000 - val_tn: 9336.0000 - val_fn: 1513.0000 - val_precision: 0.7753 - val_recall: 0.1990 - val_auc: 0.8365 - val_prc: 0.5420
Epoch 90/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2191 - categorical_accuracy: 0.5309 - tp: 394.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5262.0000 - precision: 0.7420 - recall: 0.0697 - auc: 0.8419 - prc: 0.5302 - val_loss: 0.2252 - val_categorical_accuracy: 0.5124 - val_tp: 354.0000 - val_fp: 91.0000 - val_tn: 9354.0000 - val_fn: 1535.0000 - val_precision: 0.7955 - val_recall: 0.1874 - val_auc: 0.8400 - val_prc: 0.5489
Epoch 91/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2187 - categorical_accuracy: 0.5352 - tp: 431.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5225.0000 - precision: 0.7418 - recall: 0.0762 - auc: 0.8433 - prc: 0.5308 - val_loss: 0.2243 - val_categorical_accuracy: 0.5034 - val_tp: 357.0000 - val_fp: 99.0000 - val_tn: 9346.0000 - val_fn: 1532.0000 - val_precision: 0.7829 - val_recall: 0.1890 - val_auc: 0.8442 - val_prc: 0.5504
Epoch 92/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2184 - categorical_accuracy: 0.5288 - tp: 440.0000 - fp: 126.0000 - tn: 28154.0000 - fn: 5216.0000 - precision: 0.7774 - recall: 0.0778 - auc: 0.8429 - prc: 0.5333 - val_loss: 0.2305 - val_categorical_accuracy: 0.5156 - val_tp: 449.0000 - val_fp: 150.0000 - val_tn: 9295.0000 - val_fn: 1440.0000 - val_precision: 0.7496 - val_recall: 0.2377 - val_auc: 0.8402 - val_prc: 0.5492
Epoch 93/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2181 - categorical_accuracy: 0.5347 - tp: 420.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 5236.0000 - precision: 0.7554 - recall: 0.0743 - auc: 0.8439 - prc: 0.5297 - val_loss: 0.2382 - val_categorical_accuracy: 0.5066 - val_tp: 553.0000 - val_fp: 236.0000 - val_tn: 9209.0000 - val_fn: 1336.0000 - val_precision: 0.7009 - val_recall: 0.2927 - val_auc: 0.8466 - val_prc: 0.5488
Epoch 94/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2196 - categorical_accuracy: 0.5347 - tp: 433.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 5223.0000 - precision: 0.7570 - recall: 0.0766 - auc: 0.8417 - prc: 0.5275 - val_loss: 0.2269 - val_categorical_accuracy: 0.5056 - val_tp: 420.0000 - val_fp: 129.0000 - val_tn: 9316.0000 - val_fn: 1469.0000 - val_precision: 0.7650 - val_recall: 0.2223 - val_auc: 0.8482 - val_prc: 0.5509
Epoch 95/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2204 - categorical_accuracy: 0.5308 - tp: 476.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 5180.0000 - precision: 0.7568 - recall: 0.0842 - auc: 0.8412 - prc: 0.5297 - val_loss: 0.2360 - val_categorical_accuracy: 0.5267 - val_tp: 470.0000 - val_fp: 168.0000 - val_tn: 9277.0000 - val_fn: 1419.0000 - val_precision: 0.7367 - val_recall: 0.2488 - val_auc: 0.8367 - val_prc: 0.5429
Epoch 96/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2186 - categorical_accuracy: 0.5309 - tp: 417.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5239.0000 - precision: 0.7407 - recall: 0.0737 - auc: 0.8434 - prc: 0.5290 - val_loss: 0.2333 - val_categorical_accuracy: 0.5077 - val_tp: 449.0000 - val_fp: 145.0000 - val_tn: 9300.0000 - val_fn: 1440.0000 - val_precision: 0.7559 - val_recall: 0.2377 - val_auc: 0.8363 - val_prc: 0.5418
Epoch 97/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2197 - categorical_accuracy: 0.5316 - tp: 462.0000 - fp: 156.0000 - tn: 28124.0000 - fn: 5194.0000 - precision: 0.7476 - recall: 0.0817 - auc: 0.8420 - prc: 0.5291 - val_loss: 0.2523 - val_categorical_accuracy: 0.4923 - val_tp: 586.0000 - val_fp: 268.0000 - val_tn: 9177.0000 - val_fn: 1303.0000 - val_precision: 0.6862 - val_recall: 0.3102 - val_auc: 0.8392 - val_prc: 0.5422
Epoch 98/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2193 - categorical_accuracy: 0.5288 - tp: 431.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 5225.0000 - precision: 0.7601 - recall: 0.0762 - auc: 0.8417 - prc: 0.5285 - val_loss: 0.2381 - val_categorical_accuracy: 0.5019 - val_tp: 528.0000 - val_fp: 199.0000 - val_tn: 9246.0000 - val_fn: 1361.0000 - val_precision: 0.7263 - val_recall: 0.2795 - val_auc: 0.8431 - val_prc: 0.5454
Epoch 99/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2197 - categorical_accuracy: 0.5334 - tp: 411.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 5245.0000 - precision: 0.7514 - recall: 0.0727 - auc: 0.8409 - prc: 0.5276 - val_loss: 0.2267 - val_categorical_accuracy: 0.5347 - val_tp: 369.0000 - val_fp: 112.0000 - val_tn: 9333.0000 - val_fn: 1520.0000 - val_precision: 0.7672 - val_recall: 0.1953 - val_auc: 0.8370 - val_prc: 0.5478
Epoch 100/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2378 - categorical_accuracy: 0.4981 - tp: 498.0000 - fp: 226.0000 - tn: 28054.0000 - fn: 5158.0000 - precision: 0.6878 - recall: 0.0880 - auc: 0.8232 - prc: 0.4819 - val_loss: 0.2523 - val_categorical_accuracy: 0.5056 - val_tp: 615.0000 - val_fp: 309.0000 - val_tn: 9136.0000 - val_fn: 1274.0000 - val_precision: 0.6656 - val_recall: 0.3256 - val_auc: 0.8436 - val_prc: 0.5456
Epoch 101/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2224 - categorical_accuracy: 0.5272 - tp: 493.0000 - fp: 157.0000 - tn: 28123.0000 - fn: 5163.0000 - precision: 0.7585 - recall: 0.0872 - auc: 0.8402 - prc: 0.5270 - val_loss: 0.2329 - val_categorical_accuracy: 0.5040 - val_tp: 436.0000 - val_fp: 140.0000 - val_tn: 9305.0000 - val_fn: 1453.0000 - val_precision: 0.7569 - val_recall: 0.2308 - val_auc: 0.8378 - val_prc: 0.5444
Epoch 102/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2208 - categorical_accuracy: 0.5304 - tp: 437.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 5219.0000 - precision: 0.7407 - recall: 0.0773 - auc: 0.8400 - prc: 0.5260 - val_loss: 0.2145 - val_categorical_accuracy: 0.5659 - val_tp: 199.0000 - val_fp: 47.0000 - val_tn: 9398.0000 - val_fn: 1690.0000 - val_precision: 0.8089 - val_recall: 0.1053 - val_auc: 0.8466 - val_prc: 0.5658
Epoch 103/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2182 - categorical_accuracy: 0.5345 - tp: 414.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 5242.0000 - precision: 0.7353 - recall: 0.0732 - auc: 0.8438 - prc: 0.5294 - val_loss: 0.2395 - val_categorical_accuracy: 0.5034 - val_tp: 537.0000 - val_fp: 197.0000 - val_tn: 9248.0000 - val_fn: 1352.0000 - val_precision: 0.7316 - val_recall: 0.2843 - val_auc: 0.8418 - val_prc: 0.5462
Epoch 104/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2175 - categorical_accuracy: 0.5361 - tp: 373.0000 - fp: 120.0000 - tn: 28160.0000 - fn: 5283.0000 - precision: 0.7566 - recall: 0.0659 - auc: 0.8438 - prc: 0.5340 - val_loss: 0.2171 - val_categorical_accuracy: 0.5400 - val_tp: 280.0000 - val_fp: 66.0000 - val_tn: 9379.0000 - val_fn: 1609.0000 - val_precision: 0.8092 - val_recall: 0.1482 - val_auc: 0.8479 - val_prc: 0.5562
Epoch 105/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2183 - categorical_accuracy: 0.5359 - tp: 415.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5241.0000 - precision: 0.7587 - recall: 0.0734 - auc: 0.8433 - prc: 0.5317 - val_loss: 0.2302 - val_categorical_accuracy: 0.5199 - val_tp: 418.0000 - val_fp: 136.0000 - val_tn: 9309.0000 - val_fn: 1471.0000 - val_precision: 0.7545 - val_recall: 0.2213 - val_auc: 0.8363 - val_prc: 0.5458
Epoch 106/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2170 - categorical_accuracy: 0.5359 - tp: 396.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 5260.0000 - precision: 0.7472 - recall: 0.0700 - auc: 0.8452 - prc: 0.5368 - val_loss: 0.2246 - val_categorical_accuracy: 0.5167 - val_tp: 360.0000 - val_fp: 96.0000 - val_tn: 9349.0000 - val_fn: 1529.0000 - val_precision: 0.7895 - val_recall: 0.1906 - val_auc: 0.8393 - val_prc: 0.5483
Epoch 107/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2190 - categorical_accuracy: 0.5336 - tp: 389.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5267.0000 - precision: 0.7340 - recall: 0.0688 - auc: 0.8424 - prc: 0.5293 - val_loss: 0.2281 - val_categorical_accuracy: 0.5220 - val_tp: 365.0000 - val_fp: 106.0000 - val_tn: 9339.0000 - val_fn: 1524.0000 - val_precision: 0.7749 - val_recall: 0.1932 - val_auc: 0.8338 - val_prc: 0.5471
Epoch 108/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2194 - categorical_accuracy: 0.5354 - tp: 424.0000 - fp: 136.0000 - tn: 28144.0000 - fn: 5232.0000 - precision: 0.7571 - recall: 0.0750 - auc: 0.8416 - prc: 0.5292 - val_loss: 0.2247 - val_categorical_accuracy: 0.4971 - val_tp: 370.0000 - val_fp: 110.0000 - val_tn: 9335.0000 - val_fn: 1519.0000 - val_precision: 0.7708 - val_recall: 0.1959 - val_auc: 0.8486 - val_prc: 0.5518
Epoch 109/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2203 - categorical_accuracy: 0.5226 - tp: 419.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5237.0000 - precision: 0.7604 - recall: 0.0741 - auc: 0.8404 - prc: 0.5239 - val_loss: 0.2269 - val_categorical_accuracy: 0.5056 - val_tp: 466.0000 - val_fp: 154.0000 - val_tn: 9291.0000 - val_fn: 1423.0000 - val_precision: 0.7516 - val_recall: 0.2467 - val_auc: 0.8476 - val_prc: 0.5559
Epoch 110/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2196 - categorical_accuracy: 0.5285 - tp: 455.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5201.0000 - precision: 0.7634 - recall: 0.0804 - auc: 0.8414 - prc: 0.5298 - val_loss: 0.2308 - val_categorical_accuracy: 0.5225 - val_tp: 407.0000 - val_fp: 132.0000 - val_tn: 9313.0000 - val_fn: 1482.0000 - val_precision: 0.7551 - val_recall: 0.2155 - val_auc: 0.8344 - val_prc: 0.5446
Epoch 111/150
177/177 [==============================] - 36s 204ms/step - loss: 0.2189 - categorical_accuracy: 0.5302 - tp: 436.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5220.0000 - precision: 0.7491 - recall: 0.0771 - auc: 0.8428 - prc: 0.5302 - val_loss: 0.2378 - val_categorical_accuracy: 0.5087 - val_tp: 511.0000 - val_fp: 185.0000 - val_tn: 9260.0000 - val_fn: 1378.0000 - val_precision: 0.7342 - val_recall: 0.2705 - val_auc: 0.8382 - val_prc: 0.5438
Epoch 112/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2178 - categorical_accuracy: 0.5385 - tp: 391.0000 - fp: 128.0000 - tn: 28152.0000 - fn: 5265.0000 - precision: 0.7534 - recall: 0.0691 - auc: 0.8436 - prc: 0.5348 - val_loss: 0.2264 - val_categorical_accuracy: 0.5167 - val_tp: 427.0000 - val_fp: 136.0000 - val_tn: 9309.0000 - val_fn: 1462.0000 - val_precision: 0.7584 - val_recall: 0.2260 - val_auc: 0.8459 - val_prc: 0.5517
Epoch 113/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2169 - categorical_accuracy: 0.5391 - tp: 420.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5236.0000 - precision: 0.7527 - recall: 0.0743 - auc: 0.8452 - prc: 0.5376 - val_loss: 0.2251 - val_categorical_accuracy: 0.5273 - val_tp: 374.0000 - val_fp: 108.0000 - val_tn: 9337.0000 - val_fn: 1515.0000 - val_precision: 0.7759 - val_recall: 0.1980 - val_auc: 0.8419 - val_prc: 0.5501
Epoch 114/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2185 - categorical_accuracy: 0.5292 - tp: 433.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 5223.0000 - precision: 0.7466 - recall: 0.0766 - auc: 0.8427 - prc: 0.5318 - val_loss: 0.2257 - val_categorical_accuracy: 0.5188 - val_tp: 404.0000 - val_fp: 119.0000 - val_tn: 9326.0000 - val_fn: 1485.0000 - val_precision: 0.7725 - val_recall: 0.2139 - val_auc: 0.8411 - val_prc: 0.5509
Epoch 115/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2197 - categorical_accuracy: 0.5244 - tp: 419.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5237.0000 - precision: 0.7482 - recall: 0.0741 - auc: 0.8417 - prc: 0.5251 - val_loss: 0.2232 - val_categorical_accuracy: 0.5400 - val_tp: 425.0000 - val_fp: 136.0000 - val_tn: 9309.0000 - val_fn: 1464.0000 - val_precision: 0.7576 - val_recall: 0.2250 - val_auc: 0.8499 - val_prc: 0.5566
Epoch 116/150
177/177 [==============================] - 36s 204ms/step - loss: 0.2195 - categorical_accuracy: 0.5239 - tp: 449.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5207.0000 - precision: 0.7572 - recall: 0.0794 - auc: 0.8413 - prc: 0.5305 - val_loss: 0.2313 - val_categorical_accuracy: 0.5241 - val_tp: 480.0000 - val_fp: 165.0000 - val_tn: 9280.0000 - val_fn: 1409.0000 - val_precision: 0.7442 - val_recall: 0.2541 - val_auc: 0.8436 - val_prc: 0.5517
Epoch 117/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2203 - categorical_accuracy: 0.5288 - tp: 441.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 5215.0000 - precision: 0.7437 - recall: 0.0780 - auc: 0.8405 - prc: 0.5262 - val_loss: 0.2574 - val_categorical_accuracy: 0.4854 - val_tp: 607.0000 - val_fp: 293.0000 - val_tn: 9152.0000 - val_fn: 1282.0000 - val_precision: 0.6744 - val_recall: 0.3213 - val_auc: 0.8408 - val_prc: 0.5404
Epoch 118/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2204 - categorical_accuracy: 0.5256 - tp: 438.0000 - fp: 140.0000 - tn: 28140.0000 - fn: 5218.0000 - precision: 0.7578 - recall: 0.0774 - auc: 0.8405 - prc: 0.5265 - val_loss: 0.2181 - val_categorical_accuracy: 0.5304 - val_tp: 245.0000 - val_fp: 61.0000 - val_tn: 9384.0000 - val_fn: 1644.0000 - val_precision: 0.8007 - val_recall: 0.1297 - val_auc: 0.8446 - val_prc: 0.5522
Epoch 119/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2180 - categorical_accuracy: 0.5384 - tp: 429.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 5227.0000 - precision: 0.7500 - recall: 0.0758 - auc: 0.8438 - prc: 0.5346 - val_loss: 0.2453 - val_categorical_accuracy: 0.4934 - val_tp: 559.0000 - val_fp: 229.0000 - val_tn: 9216.0000 - val_fn: 1330.0000 - val_precision: 0.7094 - val_recall: 0.2959 - val_auc: 0.8391 - val_prc: 0.5431
Epoch 120/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2208 - categorical_accuracy: 0.5274 - tp: 426.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5230.0000 - precision: 0.7634 - recall: 0.0753 - auc: 0.8397 - prc: 0.5230 - val_loss: 0.2236 - val_categorical_accuracy: 0.5363 - val_tp: 366.0000 - val_fp: 104.0000 - val_tn: 9341.0000 - val_fn: 1523.0000 - val_precision: 0.7787 - val_recall: 0.1938 - val_auc: 0.8409 - val_prc: 0.5535
Epoch 121/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2182 - categorical_accuracy: 0.5308 - tp: 391.0000 - fp: 127.0000 - tn: 28153.0000 - fn: 5265.0000 - precision: 0.7548 - recall: 0.0691 - auc: 0.8431 - prc: 0.5312 - val_loss: 0.2183 - val_categorical_accuracy: 0.5246 - val_tp: 352.0000 - val_fp: 100.0000 - val_tn: 9345.0000 - val_fn: 1537.0000 - val_precision: 0.7788 - val_recall: 0.1863 - val_auc: 0.8516 - val_prc: 0.5573
Epoch 122/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2196 - categorical_accuracy: 0.5239 - tp: 453.0000 - fp: 149.0000 - tn: 28131.0000 - fn: 5203.0000 - precision: 0.7525 - recall: 0.0801 - auc: 0.8416 - prc: 0.5256 - val_loss: 0.2339 - val_categorical_accuracy: 0.5087 - val_tp: 520.0000 - val_fp: 199.0000 - val_tn: 9246.0000 - val_fn: 1369.0000 - val_precision: 0.7232 - val_recall: 0.2753 - val_auc: 0.8478 - val_prc: 0.5515
Epoch 123/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2253 - categorical_accuracy: 0.5249 - tp: 475.0000 - fp: 167.0000 - tn: 28113.0000 - fn: 5181.0000 - precision: 0.7399 - recall: 0.0840 - auc: 0.8375 - prc: 0.5221 - val_loss: 0.2244 - val_categorical_accuracy: 0.5516 - val_tp: 410.0000 - val_fp: 123.0000 - val_tn: 9322.0000 - val_fn: 1479.0000 - val_precision: 0.7692 - val_recall: 0.2170 - val_auc: 0.8459 - val_prc: 0.5599
Epoch 124/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2200 - categorical_accuracy: 0.5293 - tp: 423.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 5233.0000 - precision: 0.7382 - recall: 0.0748 - auc: 0.8406 - prc: 0.5259 - val_loss: 0.2547 - val_categorical_accuracy: 0.4860 - val_tp: 617.0000 - val_fp: 316.0000 - val_tn: 9129.0000 - val_fn: 1272.0000 - val_precision: 0.6613 - val_recall: 0.3266 - val_auc: 0.8462 - val_prc: 0.5456
Epoch 125/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2208 - categorical_accuracy: 0.5281 - tp: 417.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5239.0000 - precision: 0.7407 - recall: 0.0737 - auc: 0.8397 - prc: 0.5241 - val_loss: 0.2484 - val_categorical_accuracy: 0.4870 - val_tp: 587.0000 - val_fp: 264.0000 - val_tn: 9181.0000 - val_fn: 1302.0000 - val_precision: 0.6898 - val_recall: 0.3107 - val_auc: 0.8453 - val_prc: 0.5449
Epoch 126/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2190 - categorical_accuracy: 0.5265 - tp: 414.0000 - fp: 143.0000 - tn: 28137.0000 - fn: 5242.0000 - precision: 0.7433 - recall: 0.0732 - auc: 0.8420 - prc: 0.5288 - val_loss: 0.2350 - val_categorical_accuracy: 0.4971 - val_tp: 513.0000 - val_fp: 192.0000 - val_tn: 9253.0000 - val_fn: 1376.0000 - val_precision: 0.7277 - val_recall: 0.2716 - val_auc: 0.8465 - val_prc: 0.5487
Epoch 127/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2198 - categorical_accuracy: 0.5285 - tp: 430.0000 - fp: 132.0000 - tn: 28148.0000 - fn: 5226.0000 - precision: 0.7651 - recall: 0.0760 - auc: 0.8413 - prc: 0.5269 - val_loss: 0.2320 - val_categorical_accuracy: 0.4971 - val_tp: 479.0000 - val_fp: 169.0000 - val_tn: 9276.0000 - val_fn: 1410.0000 - val_precision: 0.7392 - val_recall: 0.2536 - val_auc: 0.8489 - val_prc: 0.5507
Epoch 128/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2195 - categorical_accuracy: 0.5315 - tp: 408.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5248.0000 - precision: 0.7365 - recall: 0.0721 - auc: 0.8416 - prc: 0.5288 - val_loss: 0.2390 - val_categorical_accuracy: 0.4929 - val_tp: 514.0000 - val_fp: 195.0000 - val_tn: 9250.0000 - val_fn: 1375.0000 - val_precision: 0.7250 - val_recall: 0.2721 - val_auc: 0.8453 - val_prc: 0.5447
Epoch 129/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2219 - categorical_accuracy: 0.5260 - tp: 485.0000 - fp: 174.0000 - tn: 28106.0000 - fn: 5171.0000 - precision: 0.7360 - recall: 0.0857 - auc: 0.8414 - prc: 0.5216 - val_loss: 0.2307 - val_categorical_accuracy: 0.5214 - val_tp: 501.0000 - val_fp: 188.0000 - val_tn: 9257.0000 - val_fn: 1388.0000 - val_precision: 0.7271 - val_recall: 0.2652 - val_auc: 0.8512 - val_prc: 0.5555
Epoch 130/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2228 - categorical_accuracy: 0.5271 - tp: 470.0000 - fp: 173.0000 - tn: 28107.0000 - fn: 5186.0000 - precision: 0.7309 - recall: 0.0831 - auc: 0.8371 - prc: 0.5218 - val_loss: 0.2291 - val_categorical_accuracy: 0.5061 - val_tp: 429.0000 - val_fp: 140.0000 - val_tn: 9305.0000 - val_fn: 1460.0000 - val_precision: 0.7540 - val_recall: 0.2271 - val_auc: 0.8441 - val_prc: 0.5470
Epoch 131/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2186 - categorical_accuracy: 0.5348 - tp: 416.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5240.0000 - precision: 0.7523 - recall: 0.0736 - auc: 0.8423 - prc: 0.5326 - val_loss: 0.2345 - val_categorical_accuracy: 0.5040 - val_tp: 484.0000 - val_fp: 172.0000 - val_tn: 9273.0000 - val_fn: 1405.0000 - val_precision: 0.7378 - val_recall: 0.2562 - val_auc: 0.8411 - val_prc: 0.5452
Epoch 132/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2191 - categorical_accuracy: 0.5295 - tp: 394.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 5262.0000 - precision: 0.7392 - recall: 0.0697 - auc: 0.8416 - prc: 0.5286 - val_loss: 0.2573 - val_categorical_accuracy: 0.4870 - val_tp: 587.0000 - val_fp: 270.0000 - val_tn: 9175.0000 - val_fn: 1302.0000 - val_precision: 0.6849 - val_recall: 0.3107 - val_auc: 0.8297 - val_prc: 0.5337
Epoch 133/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2185 - categorical_accuracy: 0.5332 - tp: 420.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 5236.0000 - precision: 0.7356 - recall: 0.0743 - auc: 0.8433 - prc: 0.5303 - val_loss: 0.2618 - val_categorical_accuracy: 0.4929 - val_tp: 640.0000 - val_fp: 321.0000 - val_tn: 9124.0000 - val_fn: 1249.0000 - val_precision: 0.6660 - val_recall: 0.3388 - val_auc: 0.8371 - val_prc: 0.5426
Epoch 134/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2198 - categorical_accuracy: 0.5301 - tp: 422.0000 - fp: 144.0000 - tn: 28136.0000 - fn: 5234.0000 - precision: 0.7456 - recall: 0.0746 - auc: 0.8415 - prc: 0.5268 - val_loss: 0.2388 - val_categorical_accuracy: 0.4870 - val_tp: 530.0000 - val_fp: 202.0000 - val_tn: 9243.0000 - val_fn: 1359.0000 - val_precision: 0.7240 - val_recall: 0.2806 - val_auc: 0.8459 - val_prc: 0.5490
Epoch 135/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2171 - categorical_accuracy: 0.5357 - tp: 404.0000 - fp: 134.0000 - tn: 28146.0000 - fn: 5252.0000 - precision: 0.7509 - recall: 0.0714 - auc: 0.8453 - prc: 0.5369 - val_loss: 0.2406 - val_categorical_accuracy: 0.5024 - val_tp: 499.0000 - val_fp: 183.0000 - val_tn: 9262.0000 - val_fn: 1390.0000 - val_precision: 0.7317 - val_recall: 0.2642 - val_auc: 0.8317 - val_prc: 0.5403
Epoch 136/150
177/177 [==============================] - 35s 196ms/step - loss: 0.2195 - categorical_accuracy: 0.5336 - tp: 415.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 5241.0000 - precision: 0.7281 - recall: 0.0734 - auc: 0.8422 - prc: 0.5271 - val_loss: 0.2153 - val_categorical_accuracy: 0.5474 - val_tp: 266.0000 - val_fp: 65.0000 - val_tn: 9380.0000 - val_fn: 1623.0000 - val_precision: 0.8036 - val_recall: 0.1408 - val_auc: 0.8498 - val_prc: 0.5617
Epoch 137/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2210 - categorical_accuracy: 0.5196 - tp: 438.0000 - fp: 146.0000 - tn: 28134.0000 - fn: 5218.0000 - precision: 0.7500 - recall: 0.0774 - auc: 0.8390 - prc: 0.5233 - val_loss: 0.2376 - val_categorical_accuracy: 0.4939 - val_tp: 546.0000 - val_fp: 224.0000 - val_tn: 9221.0000 - val_fn: 1343.0000 - val_precision: 0.7091 - val_recall: 0.2890 - val_auc: 0.8487 - val_prc: 0.5509
Epoch 138/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2212 - categorical_accuracy: 0.5251 - tp: 451.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 5205.0000 - precision: 0.7479 - recall: 0.0797 - auc: 0.8389 - prc: 0.5237 - val_loss: 0.2242 - val_categorical_accuracy: 0.5273 - val_tp: 428.0000 - val_fp: 138.0000 - val_tn: 9307.0000 - val_fn: 1461.0000 - val_precision: 0.7562 - val_recall: 0.2266 - val_auc: 0.8503 - val_prc: 0.5546
Epoch 139/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2177 - categorical_accuracy: 0.5354 - tp: 412.0000 - fp: 138.0000 - tn: 28142.0000 - fn: 5244.0000 - precision: 0.7491 - recall: 0.0728 - auc: 0.8436 - prc: 0.5333 - val_loss: 0.2351 - val_categorical_accuracy: 0.4976 - val_tp: 438.0000 - val_fp: 142.0000 - val_tn: 9303.0000 - val_fn: 1451.0000 - val_precision: 0.7552 - val_recall: 0.2319 - val_auc: 0.8323 - val_prc: 0.5368
Epoch 140/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2189 - categorical_accuracy: 0.5384 - tp: 454.0000 - fp: 153.0000 - tn: 28127.0000 - fn: 5202.0000 - precision: 0.7479 - recall: 0.0803 - auc: 0.8428 - prc: 0.5305 - val_loss: 0.2496 - val_categorical_accuracy: 0.4950 - val_tp: 574.0000 - val_fp: 259.0000 - val_tn: 9186.0000 - val_fn: 1315.0000 - val_precision: 0.6891 - val_recall: 0.3039 - val_auc: 0.8375 - val_prc: 0.5397
Epoch 141/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2227 - categorical_accuracy: 0.5194 - tp: 425.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 5231.0000 - precision: 0.7366 - recall: 0.0751 - auc: 0.8374 - prc: 0.5186 - val_loss: 0.2522 - val_categorical_accuracy: 0.5140 - val_tp: 541.0000 - val_fp: 223.0000 - val_tn: 9222.0000 - val_fn: 1348.0000 - val_precision: 0.7081 - val_recall: 0.2864 - val_auc: 0.8252 - val_prc: 0.5345
Epoch 142/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2223 - categorical_accuracy: 0.5168 - tp: 457.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5199.0000 - precision: 0.7642 - recall: 0.0808 - auc: 0.8374 - prc: 0.5223 - val_loss: 0.2233 - val_categorical_accuracy: 0.5119 - val_tp: 347.0000 - val_fp: 95.0000 - val_tn: 9350.0000 - val_fn: 1542.0000 - val_precision: 0.7851 - val_recall: 0.1837 - val_auc: 0.8448 - val_prc: 0.5500
Epoch 143/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2191 - categorical_accuracy: 0.5331 - tp: 399.0000 - fp: 128.0000 - tn: 28152.0000 - fn: 5257.0000 - precision: 0.7571 - recall: 0.0705 - auc: 0.8414 - prc: 0.5297 - val_loss: 0.2261 - val_categorical_accuracy: 0.5146 - val_tp: 410.0000 - val_fp: 130.0000 - val_tn: 9315.0000 - val_fn: 1479.0000 - val_precision: 0.7593 - val_recall: 0.2170 - val_auc: 0.8448 - val_prc: 0.5471
Epoch 144/150
177/177 [==============================] - 36s 205ms/step - loss: 0.2210 - categorical_accuracy: 0.5286 - tp: 435.0000 - fp: 152.0000 - tn: 28128.0000 - fn: 5221.0000 - precision: 0.7411 - recall: 0.0769 - auc: 0.8406 - prc: 0.5225 - val_loss: 0.2261 - val_categorical_accuracy: 0.5294 - val_tp: 372.0000 - val_fp: 118.0000 - val_tn: 9327.0000 - val_fn: 1517.0000 - val_precision: 0.7592 - val_recall: 0.1969 - val_auc: 0.8418 - val_prc: 0.5466
Epoch 145/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2213 - categorical_accuracy: 0.5309 - tp: 455.0000 - fp: 160.0000 - tn: 28120.0000 - fn: 5201.0000 - precision: 0.7398 - recall: 0.0804 - auc: 0.8394 - prc: 0.5243 - val_loss: 0.2537 - val_categorical_accuracy: 0.4886 - val_tp: 608.0000 - val_fp: 290.0000 - val_tn: 9155.0000 - val_fn: 1281.0000 - val_precision: 0.6771 - val_recall: 0.3219 - val_auc: 0.8459 - val_prc: 0.5469
Epoch 146/150
177/177 [==============================] - 35s 195ms/step - loss: 0.2235 - categorical_accuracy: 0.5145 - tp: 417.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 5239.0000 - precision: 0.7290 - recall: 0.0737 - auc: 0.8356 - prc: 0.5140 - val_loss: 0.2261 - val_categorical_accuracy: 0.5336 - val_tp: 396.0000 - val_fp: 125.0000 - val_tn: 9320.0000 - val_fn: 1493.0000 - val_precision: 0.7601 - val_recall: 0.2096 - val_auc: 0.8415 - val_prc: 0.5517
Epoch 147/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2181 - categorical_accuracy: 0.5377 - tp: 405.0000 - fp: 135.0000 - tn: 28145.0000 - fn: 5251.0000 - precision: 0.7500 - recall: 0.0716 - auc: 0.8428 - prc: 0.5360 - val_loss: 0.2183 - val_categorical_accuracy: 0.5183 - val_tp: 273.0000 - val_fp: 67.0000 - val_tn: 9378.0000 - val_fn: 1616.0000 - val_precision: 0.8029 - val_recall: 0.1445 - val_auc: 0.8485 - val_prc: 0.5542
Epoch 148/150
177/177 [==============================] - 34s 195ms/step - loss: 0.2190 - categorical_accuracy: 0.5309 - tp: 403.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 5253.0000 - precision: 0.7327 - recall: 0.0713 - auc: 0.8422 - prc: 0.5297 - val_loss: 0.2367 - val_categorical_accuracy: 0.5034 - val_tp: 521.0000 - val_fp: 196.0000 - val_tn: 9249.0000 - val_fn: 1368.0000 - val_precision: 0.7266 - val_recall: 0.2758 - val_auc: 0.8447 - val_prc: 0.5455
Epoch 149/150
177/177 [==============================] - 35s 196ms/step - loss: 0.2194 - categorical_accuracy: 0.5288 - tp: 395.0000 - fp: 137.0000 - tn: 28143.0000 - fn: 5261.0000 - precision: 0.7425 - recall: 0.0698 - auc: 0.8418 - prc: 0.5273 - val_loss: 0.2304 - val_categorical_accuracy: 0.5410 - val_tp: 452.0000 - val_fp: 151.0000 - val_tn: 9294.0000 - val_fn: 1437.0000 - val_precision: 0.7496 - val_recall: 0.2393 - val_auc: 0.8428 - val_prc: 0.5534
Epoch 150/150
177/177 [==============================] - 34s 194ms/step - loss: 0.2200 - categorical_accuracy: 0.5292 - tp: 422.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 5234.0000 - precision: 0.7496 - recall: 0.0746 - auc: 0.8408 - prc: 0.5291 - val_loss: 0.2328 - val_categorical_accuracy: 0.4992 - val_tp: 489.0000 - val_fp: 176.0000 - val_tn: 9269.0000 - val_fn: 1400.0000 - val_precision: 0.7353 - val_recall: 0.2589 - val_auc: 0.8483 - val_prc: 0.5476

Transfer Learning

In [ ]:
evaluate_model_performance(efficientnetb4_transfer_learning, val_ds, test_ds)
60/60 [==============================] - 9s 143ms/step - loss: 0.2329 - categorical_accuracy: 0.4992 - tp: 489.0000 - fp: 176.0000 - tn: 9269.0000 - fn: 1400.0000 - precision: 0.7353 - recall: 0.2589 - auc: 0.8483 - prc: 0.5476
Validation AUC: 0.848
Validation PRC: 0.548
Validation categorical accuracy: 0.499
59/59 [==============================] - 10s 161ms/step - loss: 0.2300 - categorical_accuracy: 0.4992 - tp: 504.0000 - fp: 161.0000 - tn: 9264.0000 - fn: 1381.0000 - precision: 0.7579 - recall: 0.2674 - auc: 0.8496 - prc: 0.5554
Test AUC: 0.850
Test PRC: 0.555
Test categorical accuracy: 0.499

Terrible performance here.

5.4.3 Fast feature extraction without data augmentation¶
In [29]:
def get_features_and_labels(dataset):
    all_features = []
    all_labels = []
    for images, labels in dataset:
        preprocessed_images = keras.applications.efficientnet.preprocess_input(images)
        features = efficientnetb4_conv_base.predict(preprocessed_images)
        all_features.append(features)
        all_labels.append(labels)
    return np.concatenate(all_features), np.concatenate(all_labels)

train_features, train_labels =  get_features_and_labels(train_ds)
val_features, val_labels =  get_features_and_labels(val_ds)
test_features, test_labels =  get_features_and_labels(test_ds)
2023-03-19 02:45:32.901102: I tensorflow/stream_executor/cuda/cuda_dnn.cc:368] Loaded cuDNN version 8200
In [68]:
train_features.shape
Out[68]:
(5656, 7, 7, 1792)
In [69]:
val_labels.shape
Out[69]:
(1889, 6)
5.4.5 Defining and training the densely connected classifier¶
In [128]:
inputs = keras.Input(shape=(7, 7, 1792))
x = layers.Flatten()(inputs)
x = layers.Dense(256)(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(6, activation='softmax', name='softmax_layer')(x)
efficientnetb4_fe_model = keras.Model(inputs, outputs, name="efficientnetb4_fe_model")
efficientnetb4_fe_model.compile(loss=tfa.losses.SigmoidFocalCrossEntropy(),
              optimizer="rmsprop",
              metrics=METRICS)

callbacks = [
    keras.callbacks.ModelCheckpoint(
      filepath="efficientnetb4_feature_extraction.keras",
      save_best_only=True,
      monitor="val_loss"),
    keras.callbacks.TensorBoard(log_dir=f"./tensorboard/efficientnetb4_fe_model")
]
efficientnetb4_fe_model.summary()
efficientnetb4_fe_history = efficientnetb4_fe_model.fit(
    train_features, train_labels,
    epochs=66,
    validation_data=(val_features, val_labels),
    callbacks=callbacks)
Model: "efficientnetb4_fe_model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_6 (InputLayer)        [(None, 7, 7, 1792)]      0         
                                                                 
 flatten_4 (Flatten)         (None, 87808)             0         
                                                                 
 dense_4 (Dense)             (None, 256)               22479104  
                                                                 
 dropout_4 (Dropout)         (None, 256)               0         
                                                                 
 softmax_layer (Dense)       (None, 6)                 1542      
                                                                 
=================================================================
Total params: 22,480,646
Trainable params: 22,480,646
Non-trainable params: 0
_________________________________________________________________
Epoch 1/66
177/177 [==============================] - 6s 27ms/step - loss: 8.1498 - categorical_accuracy: 0.3618 - tp: 2728.0000 - fp: 4806.0000 - tn: 32919.0000 - fn: 4817.0000 - precision: 0.3621 - recall: 0.3616 - auc: 0.6177 - prc: 0.2667 - val_loss: 8.1084 - val_categorical_accuracy: 0.4696 - val_tp: 887.0000 - val_fp: 1002.0000 - val_tn: 8443.0000 - val_fn: 1002.0000 - val_precision: 0.4696 - val_recall: 0.4696 - val_auc: 0.6817 - val_prc: 0.3459
Epoch 2/66
177/177 [==============================] - 2s 13ms/step - loss: 8.7673 - categorical_accuracy: 0.4287 - tp: 2425.0000 - fp: 3231.0000 - tn: 25049.0000 - fn: 3231.0000 - precision: 0.4287 - recall: 0.4287 - auc: 0.6572 - prc: 0.3135 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 3/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4375 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 4/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 5/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 6/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 7/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 8/66
177/177 [==============================] - 10s 58ms/step - loss: 11.4356 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 9/66
177/177 [==============================] - 4s 20ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 10/66
177/177 [==============================] - 2s 12ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 11/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 12/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 13/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 14/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 15/66
177/177 [==============================] - 2s 12ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 16/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 17/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4356 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 18/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 19/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 20/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 21/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 22/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 23/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4363 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 24/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 25/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 26/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 27/66
177/177 [==============================] - 2s 12ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 28/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 29/66
177/177 [==============================] - 3s 18ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 30/66
177/177 [==============================] - 4s 23ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 31/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 32/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 33/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 34/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4356 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 35/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 36/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 37/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4363 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 38/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 39/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4383 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 40/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 41/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 42/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 43/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 44/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4383 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 45/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4363 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 46/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 47/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4363 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 48/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 49/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4410 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 50/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4369 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 51/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4383 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 52/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 53/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 54/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 55/66
177/177 [==============================] - 5s 26ms/step - loss: 11.4410 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 56/66
177/177 [==============================] - 3s 15ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 57/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 58/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 59/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 60/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4376 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 61/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4410 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 62/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 63/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4390 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 64/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4424 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 65/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4397 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044
Epoch 66/66
177/177 [==============================] - 2s 13ms/step - loss: 11.4403 - categorical_accuracy: 0.2551 - tp: 1443.0000 - fp: 4213.0000 - tn: 24067.0000 - fn: 4213.0000 - precision: 0.2551 - recall: 0.2551 - auc: 0.5531 - prc: 0.2044 - val_loss: 11.3968 - val_categorical_accuracy: 0.2552 - val_tp: 482.0000 - val_fp: 1407.0000 - val_tn: 8038.0000 - val_fn: 1407.0000 - val_precision: 0.2552 - val_recall: 0.2552 - val_auc: 0.5531 - val_prc: 0.2044

Experiment on VGG16

In [ ]:
def get_features_and_labels(dataset):
    all_features = []
    all_labels = []
    for images, labels in dataset:
        preprocessed_images = keras.applications.vgg16.preprocess_input(images)
        features = BASE.predict(preprocessed_images)
        all_features.append(features)
        all_labels.append(labels)
    return np.concatenate(all_features), np.concatenate(all_labels)

vgg_train_features, vgg_train_labels =  get_features_and_labels(train_ds)
vgg_val_features, vgg_val_labels =  get_features_and_labels(val_ds)
vgg_test_features, vgg_test_labels =  get_features_and_labels(test_ds)
In [75]:
BASE = keras.applications.vgg16.VGG16(
weights="imagenet",
include_top=False,
input_shape=(224,224,3)
)

BASE.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58889256/58889256 [==============================] - 0s 0us/step
Model: "vgg16"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_2 (InputLayer)        [(None, 224, 224, 3)]     0         
                                                                 
 block1_conv1 (Conv2D)       (None, 224, 224, 64)      1792      
                                                                 
 block1_conv2 (Conv2D)       (None, 224, 224, 64)      36928     
                                                                 
 block1_pool (MaxPooling2D)  (None, 112, 112, 64)      0         
                                                                 
 block2_conv1 (Conv2D)       (None, 112, 112, 128)     73856     
                                                                 
 block2_conv2 (Conv2D)       (None, 112, 112, 128)     147584    
                                                                 
 block2_pool (MaxPooling2D)  (None, 56, 56, 128)       0         
                                                                 
 block3_conv1 (Conv2D)       (None, 56, 56, 256)       295168    
                                                                 
 block3_conv2 (Conv2D)       (None, 56, 56, 256)       590080    
                                                                 
 block3_conv3 (Conv2D)       (None, 56, 56, 256)       590080    
                                                                 
 block3_pool (MaxPooling2D)  (None, 28, 28, 256)       0         
                                                                 
 block4_conv1 (Conv2D)       (None, 28, 28, 512)       1180160   
                                                                 
 block4_conv2 (Conv2D)       (None, 28, 28, 512)       2359808   
                                                                 
 block4_conv3 (Conv2D)       (None, 28, 28, 512)       2359808   
                                                                 
 block4_pool (MaxPooling2D)  (None, 14, 14, 512)       0         
                                                                 
 block5_conv1 (Conv2D)       (None, 14, 14, 512)       2359808   
                                                                 
 block5_conv2 (Conv2D)       (None, 14, 14, 512)       2359808   
                                                                 
 block5_conv3 (Conv2D)       (None, 14, 14, 512)       2359808   
                                                                 
 block5_pool (MaxPooling2D)  (None, 7, 7, 512)         0         
                                                                 
=================================================================
Total params: 14,714,688
Trainable params: 14,714,688
Non-trainable params: 0
_________________________________________________________________
5.4.7 Finetuning workflow¶
In [114]:
from functools import partial
In [27]:
# Scale normalization batch as normalization layer is after rescaling layer in b4
def get_normalization_batch(tfr, batchsize, img_size=(512, 512)):
    return prepare(ds=tfr, img_size=img_size)

def get_tfrecord_size(tfrecord):
    return sum(1 for _ in tfrecord)
In [21]:
# https://arxiv.org/pdf/1911.04252.pdf
def efficientnetb4(weights="kerasapps/efficientnets/noisy.student.notop-b4.h5/noisy.student.notop-b4.h5",
                   dropout=(0.4, 0.5)):
    
    keras.backend.reset_uids()
    
    efficientnet = keras.applications.efficientnet.EfficientNetB4(weights=weights, include_top=False,
                                                                  drop_connect_rate=dropout[0])
    
    for layer in reversed(efficientnet.layers):
        if isinstance(layer, tf.keras.layers.BatchNormalization):
            layer.trainable = False
        else:
            layer.trainable = True
    
    inputs = keras.layers.Input(shape=(224, 224, 3))
    
    efficientnet = efficientnet(inputs)
    pooling = keras.layers.GlobalAveragePooling2D()(efficientnet)
    dropout = keras.layers.Dropout(dropout[1])(pooling)
    outputs = keras.layers.Dense(6, activation="softmax")(dropout)
    
    model = keras.Model(inputs=inputs, outputs=outputs, name="efficientnetb4-noisystudent")

    return model
In [ ]:
emodel = efficientnetb4()
# emodel.get_layer('efficientnetb4').get_layer('normalization').adapt(get_normalization_batch(train_ds, BATCH_SIZE, (512, 512)))

opt = keras.optimizers.Adam()
                                                                    
# opt="rmsprop"                                                                    
emodel.compile(loss=tfa.losses.SigmoidFocalCrossEntropy(), optimizer=opt, metrics=METRICS)

emodel.summary()

cb_checkpoint = tf.keras.callbacks.ModelCheckpoint("models/emodel-noisy.student.h5",monitor="val_categorical_accuracy", verbose=1, save_best_only=True)
# cb_earlystop = keras.callbacks.EarlyStopping(monitor='val_categorical_accuracy', mode='max', 
#                    patience=4, restore_best_weights=True, verbose=1)
cb_tensorboard = keras.callbacks.TensorBoard(log_dir=f"./tensorboard/efficientnetb4_noisystudent_model")
# cb_lr = tf.keras.callbacks.LearningRateScheduler(lambda epoch: lrfn(epoch), verbose=1)

params = {"epochs":66,
        "validation_data": val_ds,
        "callbacks": [cb_checkpoint, cb_tensorboard]} 

# print(f"Fold: {fold}, {train_size} train images {validation_size} validation images")

emodel_history = emodel.fit(train_ds, **params)

# cv_history.append(history.history)
In [105]:
emodel.summary()
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_2 (InputLayer)        [(None, 224, 224, 3)]     0         
                                                                 
 efficientnetb4 (Functional)  (None, None, None, 1792)  17673823 
                                                                 
 global_average_pooling2d (G  (None, 1792)             0         
 lobalAveragePooling2D)                                          
                                                                 
 dropout (Dropout)           (None, 1792)              0         
                                                                 
 dense (Dense)               (None, 5)                 8965      
                                                                 
=================================================================
Total params: 17,682,788
Trainable params: 17,432,381
Non-trainable params: 250,407
_________________________________________________________________
In [439]:
%reload_ext tensorboard
%tensorboard --logdir ./tensorboard/efficientnetb4_noisystudent_model --bind_all

Screen Shot 2023-03-18 at 1.41.38 AM.png

The plot represents the training history on Finetuning EfficientNet architecture. It seems like the noisy student weights are not all that great, or, perhaps I destroyed them. I did not proceed here as a result of extreme hardware requirements to go through this finetuning process. I then thought about Ensembles as a better way to put all my previously trained models to work.

5.5 Ensemble with Majority Voting¶

The idea of ensembling is that you can more likely outperform single carefully devised models with less specialized but multiple good models. In this section I will test an ensemble made up of

  1. the best performing SOTA model (pretrained cropnet classifier)
  2. four feature vecotrs (cropnet_cassava, cropnet_concat, cropnet_imagenet, and mobilenet_v3_large_100_224).

Ensembling is not only a way to gain predictive performance, but it is also a teaming strategy. The idea behind this section of the report is also to demonstrate understanding of the important skill of collaboration that teams leverage in competitions in order to provide flexibility to other people to extend and improve my work. I find this will be one of the necessary strategies Africa needs to deploy to accelerate and scale AI-for-good developments faster across the continent in record times. My ensembling technique aims to implement a majority voting layer.

Instantiate models

At this point we need to predict on the test set for each model. To do this, we will be using the mode function from SciPy

In [440]:
m1_cropnet_cassava ='https://tfhub.dev/google/cropnet/feature_vector/cassava_disease_V1/1'
m2_cropnet_concat='https://tfhub.dev/google/cropnet/feature_vector/concat/1'
m3_cropnet_imagenet='https://tfhub.dev/google/cropnet/feature_vector/imagenet/1'
m4_mobilenet_v3_large_100_224='https://tfhub.dev/google/imagenet/mobilenet_v3_large_100_224/feature_vector/5'
m0 = hub.KerasLayer(handle='https://tfhub.dev/google/cropnet/classifier/cassava_disease_V1/2')
m1 = hub.KerasLayer(m1_cropnet_cassava)
m2 = hub.KerasLayer(m2_cropnet_concat)
m3 = hub.KerasLayer(m3_cropnet_imagenet)
m4 = hub.KerasLayer(m4_mobilenet_v3_large_100_224)
5.5.1 Ensemble 1¶
In [22]:
def get_label_index(img, label):
    pred = m1_cropnet_cassava_model.predict(img)
    return pred, label
In [23]:
def get_int_labels(ds):    
    y_tensor = ds.map(get_label_index)
    y_int = np.array([record.argmax(axis=1) for record in y_tensor.as_numpy_iterator()])
    return y_int
In [24]:
# Fresh data
(train_ds, val_ds, test_ds), info = tfds.load('cassava', 
                                         split=['train', 'validation', 'test'],
                                         shuffle_files=True,
                                         as_supervised=True,
                                         with_info=True)
train_ds = prepare(train_ds, shuffle=True, augment=False)
val_ds = prepare(val_ds)
test_ds = prepare(test_ds)
In [25]:
def get_ensemble_model(model_name, model_handle):
    # Model architecture
    inputs = keras.Input(shape=(224, 224, 3), name='preprocessedimage')
    x = hub.KerasLayer(handle=model_handle)(inputs)
    x = layers.Flatten()(x)
    outputs = layers.Dense(6, activation='softmax', name='softmax_layer')(x)
    model = keras.Model(inputs=inputs, outputs=outputs, name=model_name)
    # Compile model
    model.compile(optimizer='rmsprop',
                  loss=tfa.losses.SigmoidFocalCrossEntropy(),
                  metrics=METRICS)
    return model
In [26]:
def train_ensemble_model(model, model_name, e=15):

    callbacks_list = [
        keras.callbacks.EarlyStopping(monitor="val_loss", patience=15), # interrupts training when categorical accurcay has stopped improving for 5 epochs
        keras.callbacks.ModelCheckpoint(filepath=f"models/{model_name}.keras", monitor="val_prc", save_best_only=True), # prevents overwriting model file unless validation loss has improved
        keras.callbacks.TensorBoard(log_dir=f"./tensorboard/{model_name}") # path where callback writes logs
    ]
    model_history = model.fit(
        train_ds,
        epochs=e,
        validation_data=val_ds,
        callbacks=callbacks_list
    )
In [246]:
m1_cropnet_cassava_model = get_ensemble_model(model_name='m1_cropnet_cassava', model_handle=m1_cropnet_cassava)
train_ensemble_model(m1_cropnet_cassava_model, model_name='m1_cropnet_cassava')
Epoch 1/15
177/177 [==============================] - 26s 72ms/step - loss: 0.0626 - categorical_accuracy: 0.8986 - tp: 5924.0000 - fp: 296.0000 - tn: 37409.0000 - fn: 1617.0000 - precision: 0.9524 - recall: 0.7856 - auc: 0.9880 - prc: 0.9540 - val_loss: 0.0492 - val_categorical_accuracy: 0.9169 - val_tp: 1569.0000 - val_fp: 50.0000 - val_tn: 9395.0000 - val_fn: 320.0000 - val_precision: 0.9691 - val_recall: 0.8306 - val_auc: 0.9933 - val_prc: 0.9724
Epoch 2/15
177/177 [==============================] - 9s 52ms/step - loss: 0.0507 - categorical_accuracy: 0.9215 - tp: 4715.0000 - fp: 177.0000 - tn: 28103.0000 - fn: 941.0000 - precision: 0.9638 - recall: 0.8336 - auc: 0.9928 - prc: 0.9709 - val_loss: 0.0539 - val_categorical_accuracy: 0.9148 - val_tp: 1591.0000 - val_fp: 72.0000 - val_tn: 9373.0000 - val_fn: 298.0000 - val_precision: 0.9567 - val_recall: 0.8422 - val_auc: 0.9924 - val_prc: 0.9689
Epoch 3/15
177/177 [==============================] - 9s 52ms/step - loss: 0.0485 - categorical_accuracy: 0.9229 - tp: 4772.0000 - fp: 188.0000 - tn: 28092.0000 - fn: 884.0000 - precision: 0.9621 - recall: 0.8437 - auc: 0.9935 - prc: 0.9733 - val_loss: 0.0534 - val_categorical_accuracy: 0.9111 - val_tp: 1600.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 289.0000 - val_precision: 0.9552 - val_recall: 0.8470 - val_auc: 0.9925 - val_prc: 0.9689
Epoch 4/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0471 - categorical_accuracy: 0.9256 - tp: 4812.0000 - fp: 186.0000 - tn: 28094.0000 - fn: 844.0000 - precision: 0.9628 - recall: 0.8508 - auc: 0.9939 - prc: 0.9747 - val_loss: 0.0505 - val_categorical_accuracy: 0.9153 - val_tp: 1594.0000 - val_fp: 66.0000 - val_tn: 9379.0000 - val_fn: 295.0000 - val_precision: 0.9602 - val_recall: 0.8438 - val_auc: 0.9932 - val_prc: 0.9714
Epoch 5/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0457 - categorical_accuracy: 0.9238 - tp: 4844.0000 - fp: 166.0000 - tn: 28114.0000 - fn: 812.0000 - precision: 0.9669 - recall: 0.8564 - auc: 0.9942 - prc: 0.9761 - val_loss: 0.0533 - val_categorical_accuracy: 0.9158 - val_tp: 1628.0000 - val_fp: 72.0000 - val_tn: 9373.0000 - val_fn: 261.0000 - val_precision: 0.9576 - val_recall: 0.8618 - val_auc: 0.9930 - val_prc: 0.9709
Epoch 6/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0446 - categorical_accuracy: 0.9273 - tp: 4877.0000 - fp: 170.0000 - tn: 28110.0000 - fn: 779.0000 - precision: 0.9663 - recall: 0.8623 - auc: 0.9945 - prc: 0.9772 - val_loss: 0.0518 - val_categorical_accuracy: 0.9201 - val_tp: 1591.0000 - val_fp: 65.0000 - val_tn: 9380.0000 - val_fn: 298.0000 - val_precision: 0.9607 - val_recall: 0.8422 - val_auc: 0.9928 - val_prc: 0.9702
Epoch 7/15
177/177 [==============================] - 12s 70ms/step - loss: 0.0436 - categorical_accuracy: 0.9302 - tp: 4866.0000 - fp: 164.0000 - tn: 28116.0000 - fn: 790.0000 - precision: 0.9674 - recall: 0.8603 - auc: 0.9947 - prc: 0.9783 - val_loss: 0.0519 - val_categorical_accuracy: 0.9148 - val_tp: 1604.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 285.0000 - val_precision: 0.9588 - val_recall: 0.8491 - val_auc: 0.9926 - val_prc: 0.9703
Epoch 8/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0428 - categorical_accuracy: 0.9291 - tp: 4906.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 750.0000 - precision: 0.9694 - recall: 0.8674 - auc: 0.9949 - prc: 0.9789 - val_loss: 0.0505 - val_categorical_accuracy: 0.9190 - val_tp: 1613.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 276.0000 - val_precision: 0.9590 - val_recall: 0.8539 - val_auc: 0.9932 - val_prc: 0.9720
Epoch 9/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0423 - categorical_accuracy: 0.9295 - tp: 4914.0000 - fp: 161.0000 - tn: 28119.0000 - fn: 742.0000 - precision: 0.9683 - recall: 0.8688 - auc: 0.9950 - prc: 0.9795 - val_loss: 0.0536 - val_categorical_accuracy: 0.9201 - val_tp: 1620.0000 - val_fp: 72.0000 - val_tn: 9373.0000 - val_fn: 269.0000 - val_precision: 0.9574 - val_recall: 0.8576 - val_auc: 0.9928 - val_prc: 0.9703
Epoch 10/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0415 - categorical_accuracy: 0.9316 - tp: 4896.0000 - fp: 141.0000 - tn: 28139.0000 - fn: 760.0000 - precision: 0.9720 - recall: 0.8656 - auc: 0.9952 - prc: 0.9801 - val_loss: 0.0535 - val_categorical_accuracy: 0.9201 - val_tp: 1617.0000 - val_fp: 77.0000 - val_tn: 9368.0000 - val_fn: 272.0000 - val_precision: 0.9545 - val_recall: 0.8560 - val_auc: 0.9928 - val_prc: 0.9705
Epoch 11/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0403 - categorical_accuracy: 0.9321 - tp: 4938.0000 - fp: 150.0000 - tn: 28130.0000 - fn: 718.0000 - precision: 0.9705 - recall: 0.8731 - auc: 0.9955 - prc: 0.9812 - val_loss: 0.0531 - val_categorical_accuracy: 0.9132 - val_tp: 1610.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 279.0000 - val_precision: 0.9555 - val_recall: 0.8523 - val_auc: 0.9926 - val_prc: 0.9701
Epoch 12/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0400 - categorical_accuracy: 0.9346 - tp: 4940.0000 - fp: 147.0000 - tn: 28133.0000 - fn: 716.0000 - precision: 0.9711 - recall: 0.8734 - auc: 0.9956 - prc: 0.9815 - val_loss: 0.0572 - val_categorical_accuracy: 0.9153 - val_tp: 1635.0000 - val_fp: 82.0000 - val_tn: 9363.0000 - val_fn: 254.0000 - val_precision: 0.9522 - val_recall: 0.8655 - val_auc: 0.9923 - val_prc: 0.9691
Epoch 13/15
177/177 [==============================] - 12s 66ms/step - loss: 0.0401 - categorical_accuracy: 0.9355 - tp: 4938.0000 - fp: 155.0000 - tn: 28125.0000 - fn: 718.0000 - precision: 0.9696 - recall: 0.8731 - auc: 0.9955 - prc: 0.9815 - val_loss: 0.0534 - val_categorical_accuracy: 0.9153 - val_tp: 1615.0000 - val_fp: 75.0000 - val_tn: 9370.0000 - val_fn: 274.0000 - val_precision: 0.9556 - val_recall: 0.8549 - val_auc: 0.9925 - val_prc: 0.9695
Epoch 14/15
177/177 [==============================] - 10s 56ms/step - loss: 0.0391 - categorical_accuracy: 0.9326 - tp: 4952.0000 - fp: 151.0000 - tn: 28129.0000 - fn: 704.0000 - precision: 0.9704 - recall: 0.8755 - auc: 0.9957 - prc: 0.9822 - val_loss: 0.0552 - val_categorical_accuracy: 0.9084 - val_tp: 1615.0000 - val_fp: 85.0000 - val_tn: 9360.0000 - val_fn: 274.0000 - val_precision: 0.9500 - val_recall: 0.8549 - val_auc: 0.9920 - val_prc: 0.9682
Epoch 15/15
177/177 [==============================] - 9s 50ms/step - loss: 0.0382 - categorical_accuracy: 0.9362 - tp: 4975.0000 - fp: 139.0000 - tn: 28141.0000 - fn: 681.0000 - precision: 0.9728 - recall: 0.8796 - auc: 0.9959 - prc: 0.9830 - val_loss: 0.0525 - val_categorical_accuracy: 0.9116 - val_tp: 1629.0000 - val_fp: 79.0000 - val_tn: 9366.0000 - val_fn: 260.0000 - val_precision: 0.9537 - val_recall: 0.8624 - val_auc: 0.9929 - val_prc: 0.9713
In [288]:
evaluate_model_performance(m1_cropnet_cassava_model, val_ds, test_ds)
60/60 [==============================] - 2s 38ms/step - loss: 0.0525 - categorical_accuracy: 0.9116 - tp: 1629.0000 - fp: 79.0000 - tn: 9366.0000 - fn: 260.0000 - precision: 0.9537 - recall: 0.8624 - auc: 0.9929 - prc: 0.9713
Validation AUC: 0.993
Validation PRC: 0.971
Validation categorical accuracy: 0.912
59/59 [==============================] - 2s 37ms/step - loss: 0.0780 - categorical_accuracy: 0.8939 - tp: 1574.0000 - fp: 123.0000 - tn: 9302.0000 - fn: 311.0000 - precision: 0.9275 - recall: 0.8350 - auc: 0.9861 - prc: 0.9502
Test AUC: 0.986
Test PRC: 0.950
Test categorical accuracy: 0.894
5.5.2 Ensemble 2¶
In [34]:
input_shape=(224, 224, 3)
# Define the input layers for each feature vector
input_cassava = tf.keras.layers.Input(shape=input_shape, name='cassava_disease_input')

# Define the feature extraction layers for each input
cassava_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/cassava_disease_V1/1', trainable=False)(input_cassava)
concat_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/concat/1', trainable=False)(input_cassava)
imagenet_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/imagenet/1', trainable=False)(input_cassava)
mobilenet_features = hub.KerasLayer('https://tfhub.dev/google/imagenet/mobilenet_v3_large_100_224/feature_vector/5', trainable=False)(input_cassava)

# Concatenate the features into a single vector
concatenated = tf.keras.layers.concatenate([cassava_features, concat_features, imagenet_features, mobilenet_features], name='concatenated')

# Define the output layer for classification
output = tf.keras.layers.Dense(units=6, activation='softmax', name='output')(concatenated)

# Define the model with the input and output layers
model = tf.keras.Model(inputs=input_cassava, outputs=output)

# Compile the model with an appropriate optimizer, loss function, and evaluation metrics
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss=tf.keras.losses.CategoricalCrossentropy(),
              metrics=METRICS)
In [35]:
keras.utils.plot_model(model, "4_cropnet_feature_vectors.png")
Out[35]:

This model takes four feature vectors and concatenates them into a single vector using the tf.keras.layers.concatenate layer. The concatenated vector is then fed into a single output layer with softmax activation for classification. The model is compiled with the Adam optimizer, categorical cross-entropy loss function, and accuracy metrics. Note that the feature vectors are loaded as Keras layers using TensorFlow Hub, with trainable=False to freeze the weights of the pre-trained feature extraction layers.

In [468]:
train_ensemble_model(model, model_name='cropnet_feature_vec', e=30)
Epoch 1/30
177/177 [==============================] - 62s 189ms/step - loss: 123.2704 - categorical_accuracy: 0.7010 - tp: 5144.0000 - fp: 2090.0000 - tn: 35615.0000 - fn: 2397.0000 - precision: 0.7111 - recall: 0.6821 - auc: 0.8382 - prc: 0.5850 - val_loss: 137.0878 - val_categorical_accuracy: 0.7464 - val_tp: 1408.0000 - val_fp: 477.0000 - val_tn: 8968.0000 - val_fn: 481.0000 - val_precision: 0.7469 - val_recall: 0.7454 - val_auc: 0.8670 - val_prc: 0.6672
Epoch 2/30
177/177 [==============================] - 29s 163ms/step - loss: 144.9534 - categorical_accuracy: 0.7309 - tp: 4129.0000 - fp: 1516.0000 - tn: 26764.0000 - fn: 1527.0000 - precision: 0.7314 - recall: 0.7300 - auc: 0.8605 - prc: 0.6550 - val_loss: 151.3244 - val_categorical_accuracy: 0.7692 - val_tp: 1453.0000 - val_fp: 435.0000 - val_tn: 9010.0000 - val_fn: 436.0000 - val_precision: 0.7696 - val_recall: 0.7692 - val_auc: 0.8755 - val_prc: 0.6898
Epoch 3/30
177/177 [==============================] - 29s 166ms/step - loss: 91.2772 - categorical_accuracy: 0.7696 - tp: 4351.0000 - fp: 1299.0000 - tn: 26981.0000 - fn: 1305.0000 - precision: 0.7701 - recall: 0.7693 - auc: 0.8794 - prc: 0.6983 - val_loss: 132.5305 - val_categorical_accuracy: 0.7845 - val_tp: 1482.0000 - val_fp: 407.0000 - val_tn: 9038.0000 - val_fn: 407.0000 - val_precision: 0.7845 - val_recall: 0.7845 - val_auc: 0.8815 - val_prc: 0.7086
Epoch 4/30
177/177 [==============================] - 29s 162ms/step - loss: 94.6897 - categorical_accuracy: 0.7827 - tp: 4427.0000 - fp: 1224.0000 - tn: 27056.0000 - fn: 1229.0000 - precision: 0.7834 - recall: 0.7827 - auc: 0.8847 - prc: 0.7119 - val_loss: 161.6780 - val_categorical_accuracy: 0.7962 - val_tp: 1504.0000 - val_fp: 385.0000 - val_tn: 9060.0000 - val_fn: 385.0000 - val_precision: 0.7962 - val_recall: 0.7962 - val_auc: 0.8883 - val_prc: 0.7222
Epoch 5/30
177/177 [==============================] - 29s 166ms/step - loss: 79.7797 - categorical_accuracy: 0.8087 - tp: 4574.0000 - fp: 1080.0000 - tn: 27200.0000 - fn: 1082.0000 - precision: 0.8090 - recall: 0.8087 - auc: 0.8971 - prc: 0.7413 - val_loss: 174.9648 - val_categorical_accuracy: 0.7777 - val_tp: 1469.0000 - val_fp: 420.0000 - val_tn: 9025.0000 - val_fn: 420.0000 - val_precision: 0.7777 - val_recall: 0.7777 - val_auc: 0.8768 - val_prc: 0.6954
Epoch 6/30
177/177 [==============================] - 30s 170ms/step - loss: 93.6682 - categorical_accuracy: 0.8022 - tp: 4536.0000 - fp: 1119.0000 - tn: 27161.0000 - fn: 1120.0000 - precision: 0.8021 - recall: 0.8020 - auc: 0.8917 - prc: 0.7290 - val_loss: 179.7065 - val_categorical_accuracy: 0.7565 - val_tp: 1429.0000 - val_fp: 460.0000 - val_tn: 8985.0000 - val_fn: 460.0000 - val_precision: 0.7565 - val_recall: 0.7565 - val_auc: 0.8635 - val_prc: 0.6639
Epoch 7/30
177/177 [==============================] - 29s 166ms/step - loss: 92.7132 - categorical_accuracy: 0.8122 - tp: 4594.0000 - fp: 1061.0000 - tn: 27219.0000 - fn: 1062.0000 - precision: 0.8124 - recall: 0.8122 - auc: 0.8970 - prc: 0.7407 - val_loss: 227.5612 - val_categorical_accuracy: 0.7745 - val_tp: 1462.0000 - val_fp: 426.0000 - val_tn: 9019.0000 - val_fn: 427.0000 - val_precision: 0.7744 - val_recall: 0.7740 - val_auc: 0.8728 - val_prc: 0.6895
Epoch 8/30
177/177 [==============================] - 29s 162ms/step - loss: 108.4444 - categorical_accuracy: 0.8198 - tp: 4636.0000 - fp: 1018.0000 - tn: 27262.0000 - fn: 1020.0000 - precision: 0.8200 - recall: 0.8197 - auc: 0.9009 - prc: 0.7498 - val_loss: 154.9166 - val_categorical_accuracy: 0.8020 - val_tp: 1515.0000 - val_fp: 373.0000 - val_tn: 9072.0000 - val_fn: 374.0000 - val_precision: 0.8024 - val_recall: 0.8020 - val_auc: 0.8900 - val_prc: 0.7260
Epoch 9/30
177/177 [==============================] - 29s 165ms/step - loss: 70.0210 - categorical_accuracy: 0.8455 - tp: 4782.0000 - fp: 873.0000 - tn: 27407.0000 - fn: 874.0000 - precision: 0.8456 - recall: 0.8455 - auc: 0.9156 - prc: 0.7847 - val_loss: 138.6201 - val_categorical_accuracy: 0.7951 - val_tp: 1502.0000 - val_fp: 387.0000 - val_tn: 9058.0000 - val_fn: 387.0000 - val_precision: 0.7951 - val_recall: 0.7951 - val_auc: 0.8861 - val_prc: 0.7188
Epoch 10/30
177/177 [==============================] - 29s 162ms/step - loss: 88.2827 - categorical_accuracy: 0.8379 - tp: 4738.0000 - fp: 917.0000 - tn: 27363.0000 - fn: 918.0000 - precision: 0.8378 - recall: 0.8377 - auc: 0.9103 - prc: 0.7722 - val_loss: 200.0543 - val_categorical_accuracy: 0.7972 - val_tp: 1506.0000 - val_fp: 382.0000 - val_tn: 9063.0000 - val_fn: 383.0000 - val_precision: 0.7977 - val_recall: 0.7972 - val_auc: 0.8850 - val_prc: 0.7153
Epoch 11/30
177/177 [==============================] - 29s 165ms/step - loss: 88.8364 - categorical_accuracy: 0.8495 - tp: 4805.0000 - fp: 850.0000 - tn: 27430.0000 - fn: 851.0000 - precision: 0.8497 - recall: 0.8495 - auc: 0.9174 - prc: 0.7890 - val_loss: 160.2347 - val_categorical_accuracy: 0.8195 - val_tp: 1548.0000 - val_fp: 341.0000 - val_tn: 9104.0000 - val_fn: 341.0000 - val_precision: 0.8195 - val_recall: 0.8195 - val_auc: 0.8972 - val_prc: 0.7444
Epoch 12/30
177/177 [==============================] - 29s 162ms/step - loss: 63.8109 - categorical_accuracy: 0.8520 - tp: 4819.0000 - fp: 837.0000 - tn: 27443.0000 - fn: 837.0000 - precision: 0.8520 - recall: 0.8520 - auc: 0.9179 - prc: 0.7899 - val_loss: 167.5037 - val_categorical_accuracy: 0.7988 - val_tp: 1509.0000 - val_fp: 380.0000 - val_tn: 9065.0000 - val_fn: 380.0000 - val_precision: 0.7988 - val_recall: 0.7988 - val_auc: 0.8847 - val_prc: 0.7162
Epoch 13/30
177/177 [==============================] - 29s 165ms/step - loss: 61.1459 - categorical_accuracy: 0.8605 - tp: 4867.0000 - fp: 789.0000 - tn: 27491.0000 - fn: 789.0000 - precision: 0.8605 - recall: 0.8605 - auc: 0.9241 - prc: 0.8051 - val_loss: 165.1913 - val_categorical_accuracy: 0.7650 - val_tp: 1445.0000 - val_fp: 443.0000 - val_tn: 9002.0000 - val_fn: 444.0000 - val_precision: 0.7654 - val_recall: 0.7650 - val_auc: 0.8652 - val_prc: 0.6724
Epoch 14/30
177/177 [==============================] - 29s 162ms/step - loss: 97.8662 - categorical_accuracy: 0.8510 - tp: 4813.0000 - fp: 842.0000 - tn: 27438.0000 - fn: 843.0000 - precision: 0.8511 - recall: 0.8510 - auc: 0.9175 - prc: 0.7898 - val_loss: 196.7064 - val_categorical_accuracy: 0.7967 - val_tp: 1505.0000 - val_fp: 383.0000 - val_tn: 9062.0000 - val_fn: 384.0000 - val_precision: 0.7971 - val_recall: 0.7967 - val_auc: 0.8838 - val_prc: 0.7116
Epoch 15/30
177/177 [==============================] - 30s 171ms/step - loss: 91.7232 - categorical_accuracy: 0.8552 - tp: 4837.0000 - fp: 819.0000 - tn: 27461.0000 - fn: 819.0000 - precision: 0.8552 - recall: 0.8552 - auc: 0.9210 - prc: 0.7975 - val_loss: 152.0739 - val_categorical_accuracy: 0.8216 - val_tp: 1552.0000 - val_fp: 337.0000 - val_tn: 9108.0000 - val_fn: 337.0000 - val_precision: 0.8216 - val_recall: 0.8216 - val_auc: 0.8977 - val_prc: 0.7441
Epoch 16/30
177/177 [==============================] - 29s 162ms/step - loss: 72.2875 - categorical_accuracy: 0.8545 - tp: 4833.0000 - fp: 823.0000 - tn: 27457.0000 - fn: 823.0000 - precision: 0.8545 - recall: 0.8545 - auc: 0.9192 - prc: 0.7928 - val_loss: 227.6128 - val_categorical_accuracy: 0.8174 - val_tp: 1544.0000 - val_fp: 345.0000 - val_tn: 9100.0000 - val_fn: 345.0000 - val_precision: 0.8174 - val_recall: 0.8174 - val_auc: 0.8982 - val_prc: 0.7441
Epoch 17/30
177/177 [==============================] - 30s 169ms/step - loss: 81.6810 - categorical_accuracy: 0.8614 - tp: 4872.0000 - fp: 783.0000 - tn: 27497.0000 - fn: 784.0000 - precision: 0.8615 - recall: 0.8614 - auc: 0.9235 - prc: 0.8038 - val_loss: 195.3600 - val_categorical_accuracy: 0.8094 - val_tp: 1529.0000 - val_fp: 360.0000 - val_tn: 9085.0000 - val_fn: 360.0000 - val_precision: 0.8094 - val_recall: 0.8094 - val_auc: 0.8923 - val_prc: 0.7318
Epoch 18/30
177/177 [==============================] - 29s 162ms/step - loss: 73.5337 - categorical_accuracy: 0.8518 - tp: 4818.0000 - fp: 838.0000 - tn: 27442.0000 - fn: 838.0000 - precision: 0.8518 - recall: 0.8518 - auc: 0.9179 - prc: 0.7901 - val_loss: 217.4397 - val_categorical_accuracy: 0.8137 - val_tp: 1536.0000 - val_fp: 352.0000 - val_tn: 9093.0000 - val_fn: 353.0000 - val_precision: 0.8136 - val_recall: 0.8131 - val_auc: 0.8939 - val_prc: 0.7348
In [470]:
evaluate_model_performance(model, val_ds, test_ds)
60/60 [==============================] - 9s 153ms/step - loss: 217.4397 - categorical_accuracy: 0.8137 - tp: 1536.0000 - fp: 352.0000 - tn: 9093.0000 - fn: 353.0000 - precision: 0.8136 - recall: 0.8131 - auc: 0.8939 - prc: 0.7348
Validation AUC: 0.894
Validation PRC: 0.735
Validation categorical accuracy: 0.814
59/59 [==============================] - 7s 121ms/step - loss: 180.6257 - categorical_accuracy: 0.8111 - tp: 1529.0000 - fp: 356.0000 - tn: 9069.0000 - fn: 356.0000 - precision: 0.8111 - recall: 0.8111 - auc: 0.8928 - prc: 0.7311
Test AUC: 0.893
Test PRC: 0.731
Test categorical accuracy: 0.811
5.5.3 Ensemble 3¶
In [27]:
# Define the input shape for the feature vectors
input_shape = (224, 224,3)

# Define the input layers for each feature vector
input_cassava = tf.keras.layers.Input(shape=input_shape, name='cassava_disease_input')
# input_concat = tf.keras.layers.Input(shape=input_shape, name='concat_input')
# input_imagenet = tf.keras.layers.Input(shape=input_shape, name='imagenet_input')
# input_mobilenet = tf.keras.layers.Input(shape=input_shape, name='mobilenet_input')

# Define the feature extraction layers for each input
cassava_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/cassava_disease_V1/1', trainable=False)(input_cassava)
concat_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/concat/1', trainable=False)(input_cassava)
imagenet_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/imagenet/1', trainable=False)(input_cassava)
mobilenet_features = hub.KerasLayer('https://tfhub.dev/google/imagenet/mobilenet_v3_large_100_224/feature_vector/5', trainable=False)(input_cassava)

# Define the ensemble model by concatenating the features and adding a fully connected layer
concatenated = tf.keras.layers.concatenate([cassava_features, concat_features, imagenet_features, mobilenet_features], name='concatenated')
ensemble_output = tf.keras.layers.Dense(units=128, activation='relu', name='ensemble_layer')(concatenated)
output = tf.keras.layers.Dense(units=6, activation='softmax', name='output')(ensemble_output)

# Define the models for each input feature vector
cassava_model = tf.keras.Model(inputs=input_cassava, outputs=cassava_features)
concat_model = tf.keras.Model(inputs=input_cassava, outputs=concat_features)
imagenet_model = tf.keras.Model(inputs=input_cassava, outputs=imagenet_features)
mobilenet_model = tf.keras.Model(inputs=input_cassava, outputs=mobilenet_features)

# Freeze the weights of the feature extraction layers in the individual models
for model in [cassava_model, concat_model, imagenet_model, mobilenet_model]:
    model.trainable = False

# Combine the individual models into a single ensemble model
# ensemble_inputs = [input_cassava, input_concat, input_imagenet, input_mobilenet]
ensemble_inputs = input_cassava
ensemble_models = [cassava_model, concat_model, imagenet_model, mobilenet_model]
ensemble_outputs = [model.output for model in ensemble_models]
ensemble_model = tf.keras.Model(inputs=ensemble_inputs, outputs=output)

# Compile the ensemble model with an appropriate optimizer, loss function, and evaluation metrics
ensemble_model.compile(optimizer=tf.keras.optimizers.Adam(),
                       loss=tf.keras.losses.CategoricalCrossentropy(),
                       metrics=METRICS)
In [29]:
ensemble_model.summary()
Model: "model_4"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 cassava_disease_input (InputLa  [(None, 224, 224, 3  0          []                               
 yer)                           )]                                                                
                                                                                                  
 keras_layer (KerasLayer)       (None, 1280)         4234118     ['cassava_disease_input[0][0]']  
                                                                                                  
 keras_layer_1 (KerasLayer)     (None, 1280)         15581216    ['cassava_disease_input[0][0]']  
                                                                                                  
 keras_layer_2 (KerasLayer)     (None, 1280)         5507432     ['cassava_disease_input[0][0]']  
                                                                                                  
 keras_layer_3 (KerasLayer)     (None, 1280)         4226432     ['cassava_disease_input[0][0]']  
                                                                                                  
 concatenated (Concatenate)     (None, 5120)         0           ['keras_layer[0][0]',            
                                                                  'keras_layer_1[0][0]',          
                                                                  'keras_layer_2[0][0]',          
                                                                  'keras_layer_3[0][0]']          
                                                                                                  
 ensemble_layer (Dense)         (None, 128)          655488      ['concatenated[0][0]']           
                                                                                                  
 output (Dense)                 (None, 6)            774         ['ensemble_layer[0][0]']         
                                                                                                  
==================================================================================================
Total params: 30,205,460
Trainable params: 656,262
Non-trainable params: 29,549,198
__________________________________________________________________________________________________
In [28]:
keras.utils.plot_model(ensemble_model, "ensemble_classifier.png")
Out[28]:
In [30]:
train_ensemble_model(ensemble_model, model_name='ensemble_model', e=30)
Epoch 1/30
2023-03-19 03:20:18.692481: I tensorflow/stream_executor/cuda/cuda_dnn.cc:368] Loaded cuDNN version 8200
177/177 [==============================] - 79s 195ms/step - loss: 57.7150 - categorical_accuracy: 0.6186 - tp: 2939.0000 - fp: 1325.0000 - tn: 26955.0000 - fn: 2717.0000 - precision: 0.6893 - recall: 0.5196 - auc: 0.8243 - prc: 0.5219 - val_loss: 5.6161 - val_categorical_accuracy: 0.7110 - val_tp: 909.0000 - val_fp: 145.0000 - val_tn: 9300.0000 - val_fn: 980.0000 - val_precision: 0.8624 - val_recall: 0.4812 - val_auc: 0.9064 - val_prc: 0.7195
Epoch 2/30
177/177 [==============================] - 30s 168ms/step - loss: 1.2738 - categorical_accuracy: 0.7438 - tp: 3062.0000 - fp: 358.0000 - tn: 27922.0000 - fn: 2594.0000 - precision: 0.8953 - recall: 0.5414 - auc: 0.9344 - prc: 0.7987 - val_loss: 1.8867 - val_categorical_accuracy: 0.7459 - val_tp: 1113.0000 - val_fp: 184.0000 - val_tn: 9261.0000 - val_fn: 776.0000 - val_precision: 0.8581 - val_recall: 0.5892 - val_auc: 0.9334 - val_prc: 0.8054
Epoch 3/30
177/177 [==============================] - 29s 164ms/step - loss: 0.9144 - categorical_accuracy: 0.7686 - tp: 3301.0000 - fp: 301.0000 - tn: 27979.0000 - fn: 2355.0000 - precision: 0.9164 - recall: 0.5836 - auc: 0.9507 - prc: 0.8453 - val_loss: 0.8515 - val_categorical_accuracy: 0.7697 - val_tp: 1093.0000 - val_fp: 108.0000 - val_tn: 9337.0000 - val_fn: 796.0000 - val_precision: 0.9101 - val_recall: 0.5786 - val_auc: 0.9450 - val_prc: 0.8352
Epoch 4/30
177/177 [==============================] - 30s 168ms/step - loss: 0.6848 - categorical_accuracy: 0.7825 - tp: 3485.0000 - fp: 298.0000 - tn: 27982.0000 - fn: 2171.0000 - precision: 0.9212 - recall: 0.6162 - auc: 0.9600 - prc: 0.8716 - val_loss: 0.8227 - val_categorical_accuracy: 0.7697 - val_tp: 1138.0000 - val_fp: 130.0000 - val_tn: 9315.0000 - val_fn: 751.0000 - val_precision: 0.8975 - val_recall: 0.6024 - val_auc: 0.9477 - val_prc: 0.8447
Epoch 5/30
177/177 [==============================] - 29s 164ms/step - loss: 0.6621 - categorical_accuracy: 0.7960 - tp: 3601.0000 - fp: 233.0000 - tn: 28047.0000 - fn: 2055.0000 - precision: 0.9392 - recall: 0.6367 - auc: 0.9654 - prc: 0.8877 - val_loss: 0.8169 - val_categorical_accuracy: 0.7692 - val_tp: 1140.0000 - val_fp: 130.0000 - val_tn: 9315.0000 - val_fn: 749.0000 - val_precision: 0.8976 - val_recall: 0.6035 - val_auc: 0.9493 - val_prc: 0.8476
Epoch 6/30
177/177 [==============================] - 30s 167ms/step - loss: 0.6639 - categorical_accuracy: 0.7749 - tp: 3467.0000 - fp: 284.0000 - tn: 27996.0000 - fn: 2189.0000 - precision: 0.9243 - recall: 0.6130 - auc: 0.9623 - prc: 0.8748 - val_loss: 0.8350 - val_categorical_accuracy: 0.7560 - val_tp: 1125.0000 - val_fp: 139.0000 - val_tn: 9306.0000 - val_fn: 764.0000 - val_precision: 0.8900 - val_recall: 0.5956 - val_auc: 0.9477 - val_prc: 0.8427
Epoch 7/30
177/177 [==============================] - 29s 164ms/step - loss: 0.5915 - categorical_accuracy: 0.7894 - tp: 3571.0000 - fp: 225.0000 - tn: 28055.0000 - fn: 2085.0000 - precision: 0.9407 - recall: 0.6314 - auc: 0.9672 - prc: 0.8911 - val_loss: 0.7762 - val_categorical_accuracy: 0.7639 - val_tp: 1118.0000 - val_fp: 131.0000 - val_tn: 9314.0000 - val_fn: 771.0000 - val_precision: 0.8951 - val_recall: 0.5918 - val_auc: 0.9500 - val_prc: 0.8487
Epoch 8/30
177/177 [==============================] - 30s 168ms/step - loss: 0.5723 - categorical_accuracy: 0.7988 - tp: 3620.0000 - fp: 217.0000 - tn: 28063.0000 - fn: 2036.0000 - precision: 0.9434 - recall: 0.6400 - auc: 0.9687 - prc: 0.8969 - val_loss: 0.7916 - val_categorical_accuracy: 0.7612 - val_tp: 1144.0000 - val_fp: 147.0000 - val_tn: 9298.0000 - val_fn: 745.0000 - val_precision: 0.8861 - val_recall: 0.6056 - val_auc: 0.9486 - val_prc: 0.8449
Epoch 9/30
177/177 [==============================] - 29s 164ms/step - loss: 0.5393 - categorical_accuracy: 0.8060 - tp: 3708.0000 - fp: 208.0000 - tn: 28072.0000 - fn: 1948.0000 - precision: 0.9469 - recall: 0.6556 - auc: 0.9721 - prc: 0.9051 - val_loss: 0.9190 - val_categorical_accuracy: 0.7681 - val_tp: 1184.0000 - val_fp: 151.0000 - val_tn: 9294.0000 - val_fn: 705.0000 - val_precision: 0.8869 - val_recall: 0.6268 - val_auc: 0.9488 - val_prc: 0.8478
Epoch 10/30
177/177 [==============================] - 30s 167ms/step - loss: 0.6995 - categorical_accuracy: 0.8122 - tp: 3761.0000 - fp: 193.0000 - tn: 28087.0000 - fn: 1895.0000 - precision: 0.9512 - recall: 0.6650 - auc: 0.9728 - prc: 0.9081 - val_loss: 0.7604 - val_categorical_accuracy: 0.7724 - val_tp: 1202.0000 - val_fp: 140.0000 - val_tn: 9305.0000 - val_fn: 687.0000 - val_precision: 0.8957 - val_recall: 0.6363 - val_auc: 0.9528 - val_prc: 0.8596
Epoch 11/30
177/177 [==============================] - 29s 164ms/step - loss: 0.5007 - categorical_accuracy: 0.8182 - tp: 3832.0000 - fp: 189.0000 - tn: 28091.0000 - fn: 1824.0000 - precision: 0.9530 - recall: 0.6775 - auc: 0.9753 - prc: 0.9160 - val_loss: 0.7951 - val_categorical_accuracy: 0.7665 - val_tp: 1180.0000 - val_fp: 152.0000 - val_tn: 9293.0000 - val_fn: 709.0000 - val_precision: 0.8859 - val_recall: 0.6247 - val_auc: 0.9495 - val_prc: 0.8504
Epoch 12/30
177/177 [==============================] - 29s 167ms/step - loss: 0.4954 - categorical_accuracy: 0.8204 - tp: 3847.0000 - fp: 154.0000 - tn: 28126.0000 - fn: 1809.0000 - precision: 0.9615 - recall: 0.6802 - auc: 0.9752 - prc: 0.9158 - val_loss: 0.8251 - val_categorical_accuracy: 0.7618 - val_tp: 1105.0000 - val_fp: 133.0000 - val_tn: 9312.0000 - val_fn: 784.0000 - val_precision: 0.8926 - val_recall: 0.5850 - val_auc: 0.9453 - val_prc: 0.8405
Epoch 13/30
177/177 [==============================] - 29s 163ms/step - loss: 0.4990 - categorical_accuracy: 0.8218 - tp: 3751.0000 - fp: 117.0000 - tn: 28163.0000 - fn: 1905.0000 - precision: 0.9698 - recall: 0.6632 - auc: 0.9753 - prc: 0.9160 - val_loss: 0.7653 - val_categorical_accuracy: 0.7713 - val_tp: 1163.0000 - val_fp: 125.0000 - val_tn: 9320.0000 - val_fn: 726.0000 - val_precision: 0.9030 - val_recall: 0.6157 - val_auc: 0.9510 - val_prc: 0.8560
Epoch 14/30
177/177 [==============================] - 30s 167ms/step - loss: 0.4692 - categorical_accuracy: 0.8278 - tp: 3860.0000 - fp: 121.0000 - tn: 28159.0000 - fn: 1796.0000 - precision: 0.9696 - recall: 0.6825 - auc: 0.9774 - prc: 0.9216 - val_loss: 0.7859 - val_categorical_accuracy: 0.7702 - val_tp: 1185.0000 - val_fp: 144.0000 - val_tn: 9301.0000 - val_fn: 704.0000 - val_precision: 0.8916 - val_recall: 0.6273 - val_auc: 0.9501 - val_prc: 0.8525
Epoch 15/30
177/177 [==============================] - 29s 164ms/step - loss: 0.4607 - categorical_accuracy: 0.8301 - tp: 3859.0000 - fp: 99.0000 - tn: 28181.0000 - fn: 1797.0000 - precision: 0.9750 - recall: 0.6823 - auc: 0.9782 - prc: 0.9241 - val_loss: 0.7781 - val_categorical_accuracy: 0.7803 - val_tp: 1198.0000 - val_fp: 122.0000 - val_tn: 9323.0000 - val_fn: 691.0000 - val_precision: 0.9076 - val_recall: 0.6342 - val_auc: 0.9520 - val_prc: 0.8589
Epoch 16/30
177/177 [==============================] - 29s 166ms/step - loss: 0.4752 - categorical_accuracy: 0.8327 - tp: 3876.0000 - fp: 117.0000 - tn: 28163.0000 - fn: 1780.0000 - precision: 0.9707 - recall: 0.6853 - auc: 0.9780 - prc: 0.9243 - val_loss: 0.8001 - val_categorical_accuracy: 0.7740 - val_tp: 1189.0000 - val_fp: 137.0000 - val_tn: 9308.0000 - val_fn: 700.0000 - val_precision: 0.8967 - val_recall: 0.6294 - val_auc: 0.9506 - val_prc: 0.8563
Epoch 17/30
177/177 [==============================] - 29s 164ms/step - loss: 0.4767 - categorical_accuracy: 0.8241 - tp: 3809.0000 - fp: 112.0000 - tn: 28168.0000 - fn: 1847.0000 - precision: 0.9714 - recall: 0.6734 - auc: 0.9763 - prc: 0.9189 - val_loss: 1.0847 - val_categorical_accuracy: 0.7681 - val_tp: 1163.0000 - val_fp: 135.0000 - val_tn: 9310.0000 - val_fn: 726.0000 - val_precision: 0.8960 - val_recall: 0.6157 - val_auc: 0.9479 - val_prc: 0.8474
Epoch 18/30
177/177 [==============================] - 30s 167ms/step - loss: 0.5473 - categorical_accuracy: 0.8264 - tp: 3808.0000 - fp: 106.0000 - tn: 28174.0000 - fn: 1848.0000 - precision: 0.9729 - recall: 0.6733 - auc: 0.9762 - prc: 0.9196 - val_loss: 1.0710 - val_categorical_accuracy: 0.7702 - val_tp: 1120.0000 - val_fp: 134.0000 - val_tn: 9311.0000 - val_fn: 769.0000 - val_precision: 0.8931 - val_recall: 0.5929 - val_auc: 0.9470 - val_prc: 0.8434
Epoch 19/30
177/177 [==============================] - 29s 163ms/step - loss: 0.4762 - categorical_accuracy: 0.8313 - tp: 3767.0000 - fp: 70.0000 - tn: 28210.0000 - fn: 1889.0000 - precision: 0.9818 - recall: 0.6660 - auc: 0.9765 - prc: 0.9203 - val_loss: 0.8629 - val_categorical_accuracy: 0.7697 - val_tp: 1148.0000 - val_fp: 137.0000 - val_tn: 9308.0000 - val_fn: 741.0000 - val_precision: 0.8934 - val_recall: 0.6077 - val_auc: 0.9487 - val_prc: 0.8500
Epoch 20/30
177/177 [==============================] - 30s 167ms/step - loss: 0.4698 - categorical_accuracy: 0.8333 - tp: 3793.0000 - fp: 57.0000 - tn: 28223.0000 - fn: 1863.0000 - precision: 0.9852 - recall: 0.6706 - auc: 0.9779 - prc: 0.9239 - val_loss: 0.8649 - val_categorical_accuracy: 0.7734 - val_tp: 1159.0000 - val_fp: 115.0000 - val_tn: 9330.0000 - val_fn: 730.0000 - val_precision: 0.9097 - val_recall: 0.6136 - val_auc: 0.9489 - val_prc: 0.8488
Epoch 21/30
177/177 [==============================] - 29s 164ms/step - loss: 0.4569 - categorical_accuracy: 0.8292 - tp: 3804.0000 - fp: 83.0000 - tn: 28197.0000 - fn: 1852.0000 - precision: 0.9786 - recall: 0.6726 - auc: 0.9781 - prc: 0.9239 - val_loss: 1.0245 - val_categorical_accuracy: 0.7687 - val_tp: 1163.0000 - val_fp: 155.0000 - val_tn: 9290.0000 - val_fn: 726.0000 - val_precision: 0.8824 - val_recall: 0.6157 - val_auc: 0.9466 - val_prc: 0.8467
Epoch 22/30
177/177 [==============================] - 30s 167ms/step - loss: 0.4728 - categorical_accuracy: 0.8294 - tp: 3823.0000 - fp: 96.0000 - tn: 28184.0000 - fn: 1833.0000 - precision: 0.9755 - recall: 0.6759 - auc: 0.9772 - prc: 0.9220 - val_loss: 0.8758 - val_categorical_accuracy: 0.7729 - val_tp: 1181.0000 - val_fp: 134.0000 - val_tn: 9311.0000 - val_fn: 708.0000 - val_precision: 0.8981 - val_recall: 0.6252 - val_auc: 0.9518 - val_prc: 0.8573
Epoch 23/30
177/177 [==============================] - 29s 164ms/step - loss: 0.4418 - categorical_accuracy: 0.8343 - tp: 3876.0000 - fp: 94.0000 - tn: 28186.0000 - fn: 1780.0000 - precision: 0.9763 - recall: 0.6853 - auc: 0.9794 - prc: 0.9275 - val_loss: 0.8508 - val_categorical_accuracy: 0.7729 - val_tp: 1176.0000 - val_fp: 139.0000 - val_tn: 9306.0000 - val_fn: 713.0000 - val_precision: 0.8943 - val_recall: 0.6226 - val_auc: 0.9510 - val_prc: 0.8548
Epoch 24/30
177/177 [==============================] - 30s 167ms/step - loss: 0.4418 - categorical_accuracy: 0.8366 - tp: 3938.0000 - fp: 83.0000 - tn: 28197.0000 - fn: 1718.0000 - precision: 0.9794 - recall: 0.6963 - auc: 0.9796 - prc: 0.9295 - val_loss: 0.8733 - val_categorical_accuracy: 0.7708 - val_tp: 1196.0000 - val_fp: 150.0000 - val_tn: 9295.0000 - val_fn: 693.0000 - val_precision: 0.8886 - val_recall: 0.6331 - val_auc: 0.9495 - val_prc: 0.8512
Epoch 25/30
177/177 [==============================] - 29s 164ms/step - loss: 0.4340 - categorical_accuracy: 0.8407 - tp: 3952.0000 - fp: 66.0000 - tn: 28214.0000 - fn: 1704.0000 - precision: 0.9836 - recall: 0.6987 - auc: 0.9803 - prc: 0.9317 - val_loss: 0.8415 - val_categorical_accuracy: 0.7718 - val_tp: 1190.0000 - val_fp: 130.0000 - val_tn: 9315.0000 - val_fn: 699.0000 - val_precision: 0.9015 - val_recall: 0.6300 - val_auc: 0.9527 - val_prc: 0.8584
In [31]:
evaluate_model_performance(ensemble_model, val_ds, test_ds)
60/60 [==============================] - 7s 119ms/step - loss: 0.8415 - categorical_accuracy: 0.7718 - tp: 1190.0000 - fp: 130.0000 - tn: 9315.0000 - fn: 699.0000 - precision: 0.9015 - recall: 0.6300 - auc: 0.9527 - prc: 0.8584
Validation AUC: 0.953
Validation PRC: 0.858
Validation categorical accuracy: 0.772
59/59 [==============================] - 8s 129ms/step - loss: 0.9579 - categorical_accuracy: 0.7618 - tp: 1144.0000 - fp: 172.0000 - tn: 9253.0000 - fn: 741.0000 - precision: 0.8693 - recall: 0.6069 - auc: 0.9423 - prc: 0.8339
Test AUC: 0.942
Test PRC: 0.834
Test categorical accuracy: 0.762
5.5.4 Final Ensemble¶

Since there is no Voting layer in Tensorflow or Tensorflow Addons, I had to create a custom voting layer using Tensorflow operations such as tf.reduce_sum and tf.math.argmax. ChatGPT played a crucial role in pair programming this solution. The Voting layer supports two voting modes: "majority" and "average". For "majority" mode, it first computes the class index with the highest probability for each input tensor, and then combines the results by summing the one-hot encoded class indices along the specified axis. Finally, it returns the class index with the highest vote count. For "average" mode, it computes the element-wise average of the input tensors along the specified axis, and returns the class index with the highest average probability.

In [166]:
# Fresh data
(train_ds, val_ds, test_ds), info = tfds.load('cassava', 
                                         split=['train', 'validation', 'test'],
                                         shuffle_files=True,
                                         as_supervised=True,
                                         with_info=True)
train_ds = prepare(train_ds, augment=True, shuffle=True) # apply image augmentations
val_ds = prepare(val_ds)
test_ds = prepare(test_ds)
In [54]:
class Voting(tf.keras.layers.Layer):
    def __init__(self, mode, voting_weights=None, **kwargs):
        super().__init__(**kwargs)
        self.mode = mode
        self.voting_weights = voting_weights

    def call(self, inputs):
        if self.mode == 'majority':
            votes = tf.argmax(inputs, axis=-1)
            outputs = tf.one_hot(votes,  6)
            if self.voting_weights is not None:
                outputs = tf.multiply(outputs, self.voting_weights)
            outputs = tf.reduce_sum(outputs, axis=1)
            outputs = tf.argmax(outputs, axis=-1)
        elif self.mode == 'average':
            if self.voting_weights is not None:
                inputs = tf.multiply(inputs, self.voting_weights)
            outputs = tf.reduce_mean(inputs, axis=1)
            outputs = tf.argmax(outputs, axis=-1)
        return outputs
    def get_config(self):
        config = super(Voting, self).get_config()
        config.update({'mode': self.mode,
                       'weights': self.weights,
                       'axis': self.axis})
        return config
In [175]:
# Define the input shape for the feature vectors
input_shape = (224, 224,3)

# Define the input layers for each feature vector
input_cassava = tf.keras.layers.Input(shape=input_shape, name='cassava_disease_input')


# Define the feature extraction layers for each input
cassava_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/cassava_disease_V1/1', trainable=False)(input_cassava)
concat_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/concat/1', trainable=False)(input_cassava)
imagenet_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/feature_vector/imagenet/1', trainable=False)(input_cassava)
mobilenet_features = hub.KerasLayer('https://tfhub.dev/google/imagenet/mobilenet_v3_large_100_224/feature_vector/5', trainable=False)(input_cassava)
pretrained_cropnet_features = hub.KerasLayer('https://tfhub.dev/google/cropnet/classifier/cassava_disease_V1/2', trainable=False)(input_cassava)

# Define the ensemble model by concatenating the features and adding a fully connected layer
concatenated = tf.keras.layers.concatenate([cassava_features, concat_features, imagenet_features, mobilenet_features], name='concatenated')
ensemble_output = tf.keras.layers.Dense(units=128, activation='relu', name='ensemble_layer')(concatenated)
fv_output = tf.keras.layers.Dense(units=6, activation='softmax', name='fv_output')(ensemble_output)
# votes = tf.keras.layers.concatenate([fv_output, pretrained_cropnet_features], name='majority_vote_layer')
votes = Voting(mode='majority', voting_weights=None, name="majority_vote_layer")([fv_output, pretrained_cropnet_features])
output = tf.keras.layers.Dense(units=6, activation='softmax', name='output')(votes)


# Define the models for each input feature vector
cassava_model = tf.keras.Model(inputs=input_cassava, outputs=cassava_features)
concat_model = tf.keras.Model(inputs=input_cassava, outputs=concat_features)
imagenet_model = tf.keras.Model(inputs=input_cassava, outputs=imagenet_features)
mobilenet_model = tf.keras.Model(inputs=input_cassava, outputs=mobilenet_features)
pretrained_cropnet_model = tf.keras.Model(inputs=input_cassava, outputs=pretrained_cropnet_features)


# Freeze the weights of the feature extraction layers in the individual models
for model in [cassava_model, concat_model, imagenet_model, mobilenet_model, pretrained_cropnet_model]:
    model.trainable = False

# Combine the individual models into a single ensemble model
ensemble_inputs = input_cassava
ensemble_models = [cassava_model, concat_model, imagenet_model, mobilenet_model, pretrained_cropnet_model]
ensemble_outputs = [model.output for model in ensemble_models]
final_ensemble_model = tf.keras.Model(inputs=ensemble_inputs, outputs=output)


# Compile the ensemble model with an appropriate optimizer, loss function, and evaluation metrics
final_ensemble_model.compile(optimizer=tf.keras.optimizers.Adam(),
                       loss=tf.keras.losses.CategoricalCrossentropy(),
                       metrics=METRICS)
In [179]:
keras.utils.plot_model(final_ensemble_model, "final_ensemble_classifier.png")
Out[179]:
In [176]:
train_ensemble_model(final_ensemble_model, model_name='final_ensemble_model', e=30)
Epoch 1/30
177/177 [==============================] - 82s 246ms/step - loss: 1.3062 - categorical_accuracy: 0.7701 - tp: 1144.0000 - fp: 172.0000 - tn: 37533.0000 - fn: 6397.0000 - precision: 0.8693 - recall: 0.1517 - auc: 0.9101 - prc: 0.7552 - val_loss: 1.1290 - val_categorical_accuracy: 0.8115 - val_tp: 0.0000e+00 - val_fp: 0.0000e+00 - val_tn: 9445.0000 - val_fn: 1889.0000 - val_precision: 0.0000e+00 - val_recall: 0.0000e+00 - val_auc: 0.9389 - val_prc: 0.8600
Epoch 2/30
177/177 [==============================] - 37s 207ms/step - loss: 1.0521 - categorical_accuracy: 0.7779 - tp: 1761.0000 - fp: 65.0000 - tn: 28215.0000 - fn: 3895.0000 - precision: 0.9644 - recall: 0.3114 - auc: 0.9351 - prc: 0.8335 - val_loss: 0.9365 - val_categorical_accuracy: 0.8052 - val_tp: 960.0000 - val_fp: 29.0000 - val_tn: 9416.0000 - val_fn: 929.0000 - val_precision: 0.9707 - val_recall: 0.5082 - val_auc: 0.9487 - val_prc: 0.8695
Epoch 3/30
177/177 [==============================] - 38s 211ms/step - loss: 0.8962 - categorical_accuracy: 0.7802 - tp: 3013.0000 - fp: 160.0000 - tn: 28120.0000 - fn: 2643.0000 - precision: 0.9496 - recall: 0.5327 - auc: 0.9495 - prc: 0.8554 - val_loss: 0.8156 - val_categorical_accuracy: 0.8015 - val_tp: 1090.0000 - val_fp: 45.0000 - val_tn: 9400.0000 - val_fn: 799.0000 - val_precision: 0.9604 - val_recall: 0.5770 - val_auc: 0.9580 - val_prc: 0.8821
Epoch 4/30
177/177 [==============================] - 36s 205ms/step - loss: 0.8037 - categorical_accuracy: 0.7893 - tp: 3219.0000 - fp: 204.0000 - tn: 28076.0000 - fn: 2437.0000 - precision: 0.9404 - recall: 0.5691 - auc: 0.9564 - prc: 0.8693 - val_loss: 0.7347 - val_categorical_accuracy: 0.8041 - val_tp: 1159.0000 - val_fp: 61.0000 - val_tn: 9384.0000 - val_fn: 730.0000 - val_precision: 0.9500 - val_recall: 0.6136 - val_auc: 0.9661 - val_prc: 0.8970
Epoch 5/30
177/177 [==============================] - 37s 210ms/step - loss: 0.7347 - categorical_accuracy: 0.7893 - tp: 3468.0000 - fp: 244.0000 - tn: 28036.0000 - fn: 2188.0000 - precision: 0.9343 - recall: 0.6132 - auc: 0.9627 - prc: 0.8810 - val_loss: 0.6522 - val_categorical_accuracy: 0.8105 - val_tp: 1340.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 549.0000 - val_precision: 0.9510 - val_recall: 0.7094 - val_auc: 0.9737 - val_prc: 0.9150
Epoch 6/30
177/177 [==============================] - 38s 213ms/step - loss: 0.6782 - categorical_accuracy: 0.7931 - tp: 3922.0000 - fp: 294.0000 - tn: 27986.0000 - fn: 1734.0000 - precision: 0.9303 - recall: 0.6934 - auc: 0.9681 - prc: 0.8962 - val_loss: 0.5905 - val_categorical_accuracy: 0.8311 - val_tp: 1424.0000 - val_fp: 69.0000 - val_tn: 9376.0000 - val_fn: 465.0000 - val_precision: 0.9538 - val_recall: 0.7538 - val_auc: 0.9792 - val_prc: 0.9297
Epoch 7/30
177/177 [==============================] - 36s 204ms/step - loss: 0.6206 - categorical_accuracy: 0.8290 - tp: 4107.0000 - fp: 303.0000 - tn: 27977.0000 - fn: 1549.0000 - precision: 0.9313 - recall: 0.7261 - auc: 0.9735 - prc: 0.9106 - val_loss: 0.5468 - val_categorical_accuracy: 0.8862 - val_tp: 1441.0000 - val_fp: 71.0000 - val_tn: 9374.0000 - val_fn: 448.0000 - val_precision: 0.9530 - val_recall: 0.7628 - val_auc: 0.9824 - val_prc: 0.9399
Epoch 8/30
177/177 [==============================] - 37s 211ms/step - loss: 0.5774 - categorical_accuracy: 0.8591 - tp: 4174.0000 - fp: 321.0000 - tn: 27959.0000 - fn: 1482.0000 - precision: 0.9286 - recall: 0.7380 - auc: 0.9759 - prc: 0.9188 - val_loss: 0.5039 - val_categorical_accuracy: 0.9063 - val_tp: 1466.0000 - val_fp: 77.0000 - val_tn: 9368.0000 - val_fn: 423.0000 - val_precision: 0.9501 - val_recall: 0.7761 - val_auc: 0.9840 - val_prc: 0.9459
Epoch 9/30
177/177 [==============================] - 37s 208ms/step - loss: 0.5491 - categorical_accuracy: 0.8667 - tp: 4264.0000 - fp: 341.0000 - tn: 27939.0000 - fn: 1392.0000 - precision: 0.9260 - recall: 0.7539 - auc: 0.9768 - prc: 0.9219 - val_loss: 0.4741 - val_categorical_accuracy: 0.9068 - val_tp: 1503.0000 - val_fp: 92.0000 - val_tn: 9353.0000 - val_fn: 386.0000 - val_precision: 0.9423 - val_recall: 0.7957 - val_auc: 0.9837 - val_prc: 0.9447
Epoch 10/30
177/177 [==============================] - 37s 207ms/step - loss: 0.5279 - categorical_accuracy: 0.8676 - tp: 4344.0000 - fp: 358.0000 - tn: 27922.0000 - fn: 1312.0000 - precision: 0.9239 - recall: 0.7680 - auc: 0.9772 - prc: 0.9231 - val_loss: 0.4427 - val_categorical_accuracy: 0.9037 - val_tp: 1535.0000 - val_fp: 97.0000 - val_tn: 9348.0000 - val_fn: 354.0000 - val_precision: 0.9406 - val_recall: 0.8126 - val_auc: 0.9856 - val_prc: 0.9506
Epoch 11/30
177/177 [==============================] - 37s 211ms/step - loss: 0.4927 - categorical_accuracy: 0.8741 - tp: 4466.0000 - fp: 373.0000 - tn: 27907.0000 - fn: 1190.0000 - precision: 0.9229 - recall: 0.7896 - auc: 0.9794 - prc: 0.9306 - val_loss: 0.4162 - val_categorical_accuracy: 0.9089 - val_tp: 1564.0000 - val_fp: 98.0000 - val_tn: 9347.0000 - val_fn: 325.0000 - val_precision: 0.9410 - val_recall: 0.8280 - val_auc: 0.9863 - val_prc: 0.9534
Epoch 12/30
177/177 [==============================] - 36s 204ms/step - loss: 0.4847 - categorical_accuracy: 0.8750 - tp: 4489.0000 - fp: 394.0000 - tn: 27886.0000 - fn: 1167.0000 - precision: 0.9193 - recall: 0.7937 - auc: 0.9787 - prc: 0.9281 - val_loss: 0.4000 - val_categorical_accuracy: 0.9137 - val_tp: 1582.0000 - val_fp: 97.0000 - val_tn: 9348.0000 - val_fn: 307.0000 - val_precision: 0.9422 - val_recall: 0.8375 - val_auc: 0.9868 - val_prc: 0.9527
Epoch 13/30
177/177 [==============================] - 37s 210ms/step - loss: 0.4678 - categorical_accuracy: 0.8789 - tp: 4534.0000 - fp: 409.0000 - tn: 27871.0000 - fn: 1122.0000 - precision: 0.9173 - recall: 0.8016 - auc: 0.9802 - prc: 0.9318 - val_loss: 0.3902 - val_categorical_accuracy: 0.9111 - val_tp: 1591.0000 - val_fp: 100.0000 - val_tn: 9345.0000 - val_fn: 298.0000 - val_precision: 0.9409 - val_recall: 0.8422 - val_auc: 0.9873 - val_prc: 0.9535
Epoch 14/30
177/177 [==============================] - 38s 213ms/step - loss: 0.4560 - categorical_accuracy: 0.8778 - tp: 4582.0000 - fp: 411.0000 - tn: 27869.0000 - fn: 1074.0000 - precision: 0.9177 - recall: 0.8101 - auc: 0.9800 - prc: 0.9315 - val_loss: 0.3744 - val_categorical_accuracy: 0.9121 - val_tp: 1632.0000 - val_fp: 101.0000 - val_tn: 9344.0000 - val_fn: 257.0000 - val_precision: 0.9417 - val_recall: 0.8639 - val_auc: 0.9872 - val_prc: 0.9543
Epoch 15/30
177/177 [==============================] - 36s 205ms/step - loss: 0.4432 - categorical_accuracy: 0.8800 - tp: 4677.0000 - fp: 421.0000 - tn: 27859.0000 - fn: 979.0000 - precision: 0.9174 - recall: 0.8269 - auc: 0.9805 - prc: 0.9331 - val_loss: 0.3627 - val_categorical_accuracy: 0.9132 - val_tp: 1649.0000 - val_fp: 99.0000 - val_tn: 9346.0000 - val_fn: 240.0000 - val_precision: 0.9434 - val_recall: 0.8729 - val_auc: 0.9878 - val_prc: 0.9559
Epoch 16/30
177/177 [==============================] - 37s 209ms/step - loss: 0.4380 - categorical_accuracy: 0.8792 - tp: 4713.0000 - fp: 447.0000 - tn: 27833.0000 - fn: 943.0000 - precision: 0.9134 - recall: 0.8333 - auc: 0.9808 - prc: 0.9348 - val_loss: 0.3516 - val_categorical_accuracy: 0.9148 - val_tp: 1657.0000 - val_fp: 99.0000 - val_tn: 9346.0000 - val_fn: 232.0000 - val_precision: 0.9436 - val_recall: 0.8772 - val_auc: 0.9889 - val_prc: 0.9587
Epoch 17/30
177/177 [==============================] - 36s 204ms/step - loss: 0.4271 - categorical_accuracy: 0.8812 - tp: 4761.0000 - fp: 454.0000 - tn: 27826.0000 - fn: 895.0000 - precision: 0.9129 - recall: 0.8418 - auc: 0.9812 - prc: 0.9353 - val_loss: 0.3449 - val_categorical_accuracy: 0.9148 - val_tp: 1675.0000 - val_fp: 106.0000 - val_tn: 9339.0000 - val_fn: 214.0000 - val_precision: 0.9405 - val_recall: 0.8867 - val_auc: 0.9882 - val_prc: 0.9576
Epoch 18/30
177/177 [==============================] - 37s 211ms/step - loss: 0.4193 - categorical_accuracy: 0.8824 - tp: 4760.0000 - fp: 464.0000 - tn: 27816.0000 - fn: 896.0000 - precision: 0.9112 - recall: 0.8416 - auc: 0.9809 - prc: 0.9333 - val_loss: 0.3381 - val_categorical_accuracy: 0.9127 - val_tp: 1676.0000 - val_fp: 107.0000 - val_tn: 9338.0000 - val_fn: 213.0000 - val_precision: 0.9400 - val_recall: 0.8872 - val_auc: 0.9883 - val_prc: 0.9580
Epoch 19/30
177/177 [==============================] - 37s 211ms/step - loss: 0.4126 - categorical_accuracy: 0.8817 - tp: 4800.0000 - fp: 472.0000 - tn: 27808.0000 - fn: 856.0000 - precision: 0.9105 - recall: 0.8487 - auc: 0.9813 - prc: 0.9360 - val_loss: 0.3238 - val_categorical_accuracy: 0.9148 - val_tp: 1685.0000 - val_fp: 106.0000 - val_tn: 9339.0000 - val_fn: 204.0000 - val_precision: 0.9408 - val_recall: 0.8920 - val_auc: 0.9893 - val_prc: 0.9604
Epoch 20/30
177/177 [==============================] - 36s 204ms/step - loss: 0.4077 - categorical_accuracy: 0.8800 - tp: 4792.0000 - fp: 486.0000 - tn: 27794.0000 - fn: 864.0000 - precision: 0.9079 - recall: 0.8472 - auc: 0.9820 - prc: 0.9371 - val_loss: 0.3215 - val_categorical_accuracy: 0.9142 - val_tp: 1688.0000 - val_fp: 112.0000 - val_tn: 9333.0000 - val_fn: 201.0000 - val_precision: 0.9378 - val_recall: 0.8936 - val_auc: 0.9889 - val_prc: 0.9594
Epoch 21/30
177/177 [==============================] - 37s 211ms/step - loss: 0.4073 - categorical_accuracy: 0.8766 - tp: 4796.0000 - fp: 493.0000 - tn: 27787.0000 - fn: 860.0000 - precision: 0.9068 - recall: 0.8479 - auc: 0.9817 - prc: 0.9377 - val_loss: 0.3147 - val_categorical_accuracy: 0.9148 - val_tp: 1687.0000 - val_fp: 107.0000 - val_tn: 9338.0000 - val_fn: 202.0000 - val_precision: 0.9404 - val_recall: 0.8931 - val_auc: 0.9892 - val_prc: 0.9599
Epoch 22/30
177/177 [==============================] - 38s 214ms/step - loss: 0.4050 - categorical_accuracy: 0.8800 - tp: 4818.0000 - fp: 495.0000 - tn: 27785.0000 - fn: 838.0000 - precision: 0.9068 - recall: 0.8518 - auc: 0.9818 - prc: 0.9371 - val_loss: 0.3110 - val_categorical_accuracy: 0.9185 - val_tp: 1694.0000 - val_fp: 111.0000 - val_tn: 9334.0000 - val_fn: 195.0000 - val_precision: 0.9385 - val_recall: 0.8968 - val_auc: 0.9894 - val_prc: 0.9604
Epoch 23/30
177/177 [==============================] - 36s 204ms/step - loss: 0.3961 - categorical_accuracy: 0.8805 - tp: 4818.0000 - fp: 491.0000 - tn: 27789.0000 - fn: 838.0000 - precision: 0.9075 - recall: 0.8518 - auc: 0.9821 - prc: 0.9383 - val_loss: 0.3087 - val_categorical_accuracy: 0.9169 - val_tp: 1692.0000 - val_fp: 112.0000 - val_tn: 9333.0000 - val_fn: 197.0000 - val_precision: 0.9379 - val_recall: 0.8957 - val_auc: 0.9895 - val_prc: 0.9604
Epoch 24/30
177/177 [==============================] - 37s 211ms/step - loss: 0.4024 - categorical_accuracy: 0.8778 - tp: 4821.0000 - fp: 500.0000 - tn: 27780.0000 - fn: 835.0000 - precision: 0.9060 - recall: 0.8524 - auc: 0.9816 - prc: 0.9361 - val_loss: 0.3120 - val_categorical_accuracy: 0.9127 - val_tp: 1688.0000 - val_fp: 111.0000 - val_tn: 9334.0000 - val_fn: 201.0000 - val_precision: 0.9383 - val_recall: 0.8936 - val_auc: 0.9888 - val_prc: 0.9588
Epoch 25/30
177/177 [==============================] - 36s 204ms/step - loss: 0.3869 - categorical_accuracy: 0.8849 - tp: 4845.0000 - fp: 483.0000 - tn: 27797.0000 - fn: 811.0000 - precision: 0.9093 - recall: 0.8566 - auc: 0.9827 - prc: 0.9387 - val_loss: 0.3096 - val_categorical_accuracy: 0.9127 - val_tp: 1685.0000 - val_fp: 114.0000 - val_tn: 9331.0000 - val_fn: 204.0000 - val_precision: 0.9366 - val_recall: 0.8920 - val_auc: 0.9889 - val_prc: 0.9598
Epoch 26/30
177/177 [==============================] - 37s 210ms/step - loss: 0.4006 - categorical_accuracy: 0.8784 - tp: 4830.0000 - fp: 532.0000 - tn: 27748.0000 - fn: 826.0000 - precision: 0.9008 - recall: 0.8540 - auc: 0.9815 - prc: 0.9358 - val_loss: 0.3041 - val_categorical_accuracy: 0.9153 - val_tp: 1697.0000 - val_fp: 118.0000 - val_tn: 9327.0000 - val_fn: 192.0000 - val_precision: 0.9350 - val_recall: 0.8984 - val_auc: 0.9892 - val_prc: 0.9602
Epoch 27/30
177/177 [==============================] - 37s 211ms/step - loss: 0.3935 - categorical_accuracy: 0.8798 - tp: 4832.0000 - fp: 512.0000 - tn: 27768.0000 - fn: 824.0000 - precision: 0.9042 - recall: 0.8543 - auc: 0.9824 - prc: 0.9383 - val_loss: 0.2982 - val_categorical_accuracy: 0.9158 - val_tp: 1700.0000 - val_fp: 118.0000 - val_tn: 9327.0000 - val_fn: 189.0000 - val_precision: 0.9351 - val_recall: 0.8999 - val_auc: 0.9899 - val_prc: 0.9616
Epoch 28/30
177/177 [==============================] - 36s 205ms/step - loss: 0.3929 - categorical_accuracy: 0.8801 - tp: 4854.0000 - fp: 531.0000 - tn: 27749.0000 - fn: 802.0000 - precision: 0.9014 - recall: 0.8582 - auc: 0.9819 - prc: 0.9353 - val_loss: 0.2946 - val_categorical_accuracy: 0.9153 - val_tp: 1696.0000 - val_fp: 116.0000 - val_tn: 9329.0000 - val_fn: 193.0000 - val_precision: 0.9360 - val_recall: 0.8978 - val_auc: 0.9900 - val_prc: 0.9617
Epoch 29/30
177/177 [==============================] - 37s 211ms/step - loss: 0.3898 - categorical_accuracy: 0.8815 - tp: 4831.0000 - fp: 514.0000 - tn: 27766.0000 - fn: 825.0000 - precision: 0.9038 - recall: 0.8541 - auc: 0.9824 - prc: 0.9388 - val_loss: 0.3086 - val_categorical_accuracy: 0.9116 - val_tp: 1676.0000 - val_fp: 120.0000 - val_tn: 9325.0000 - val_fn: 213.0000 - val_precision: 0.9332 - val_recall: 0.8872 - val_auc: 0.9889 - val_prc: 0.9590
Epoch 30/30
177/177 [==============================] - 37s 211ms/step - loss: 0.3958 - categorical_accuracy: 0.8798 - tp: 4834.0000 - fp: 524.0000 - tn: 27756.0000 - fn: 822.0000 - precision: 0.9022 - recall: 0.8547 - auc: 0.9812 - prc: 0.9352 - val_loss: 0.2961 - val_categorical_accuracy: 0.9179 - val_tp: 1696.0000 - val_fp: 111.0000 - val_tn: 9334.0000 - val_fn: 193.0000 - val_precision: 0.9386 - val_recall: 0.8978 - val_auc: 0.9894 - val_prc: 0.9600
In [177]:
evaluate_model_performance(final_ensemble_model, val_ds, test_ds)
60/60 [==============================] - 9s 146ms/step - loss: 0.2961 - categorical_accuracy: 0.9179 - tp: 1696.0000 - fp: 111.0000 - tn: 9334.0000 - fn: 193.0000 - precision: 0.9386 - recall: 0.8978 - auc: 0.9894 - prc: 0.9600
Validation AUC: 0.989
Validation PRC: 0.960
Validation categorical accuracy: 0.918
59/59 [==============================] - 9s 150ms/step - loss: 0.3881 - categorical_accuracy: 0.8891 - tp: 1646.0000 - fp: 172.0000 - tn: 9253.0000 - fn: 239.0000 - precision: 0.9054 - recall: 0.8732 - auc: 0.9809 - prc: 0.9322
Test AUC: 0.981
Test PRC: 0.932
Test categorical accuracy: 0.889
In [183]:
# %reload_ext tensorboard
# %tensorboard --logdir ./tensorboard/final_ensemble_model --bind_all

Screen Shot 2023-03-19 at 3.37.31 AM.png

Screen Shot 2023-03-19 at 3.39.46 AM.png

6. Interpreting what Convnets learn¶

In [335]:
model = keras.applications.xception.Xception(weights="imagenet")
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/xception/xception_weights_tf_dim_ordering_tf_kernels.h5
91889664/91884032 [==============================] - 2s 0us/step
91897856/91884032 [==============================] - 2s 0us/step
6.1 Preprocessing an input for Xception¶
In [379]:
img_path = '/home/jupyter/cassava.jpeg'
In [380]:
def get_img_array(img_path, target_size):
    img = keras.utils.load_img(img_path, target_size=target_size)
    array = keras.utils.img_to_array(img)
    array = np.expand_dims(array, axis=0)
    array = keras.applications.xception.preprocess_input(array)
    return array

img_array = get_img_array(img_path, target_size=(299, 299))
6.2 Get the last convolutional output¶
In [383]:
last_conv_layer_name = "block14_sepconv2_act"
classifier_layer_names = [
    "avg_pool",
    "predictions",
]
last_conv_layer = model.get_layer(last_conv_layer_name)
last_conv_layer_model = keras.Model(model.inputs, last_conv_layer.output)
6.3 Reapply classifier on last convolutional output¶
In [384]:
classifier_input = keras.Input(shape=last_conv_layer.output.shape[1:])
x = classifier_input
for layer_name in classifier_layer_names:
    x = model.get_layer(layer_name)(x)
classifier_model = keras.Model(classifier_input, x)
6.4 Retrieving the gradients of the top predicted class¶
In [385]:
import tensorflow as tf

with tf.GradientTape() as tape:
    last_conv_layer_output = last_conv_layer_model(img_array)
    tape.watch(last_conv_layer_output)
    preds = classifier_model(last_conv_layer_output)
    top_pred_index = tf.argmax(preds[0])
    top_class_channel = preds[:, top_pred_index]

grads = tape.gradient(top_class_channel, last_conv_layer_output)
In [386]:
pooled_grads = tf.reduce_mean(grads, axis=(0, 1, 2)).numpy()
last_conv_layer_output = last_conv_layer_output.numpy()[0]
for i in range(pooled_grads.shape[-1]):
    last_conv_layer_output[:, :, i] *= pooled_grads[i]
heatmap = np.mean(last_conv_layer_output, axis=-1)
6.4 Heatmap post-processing¶
In [387]:
heatmap = np.maximum(heatmap, 0)
heatmap /= np.max(heatmap)
plt.matshow(heatmap)
Out[387]:
<matplotlib.image.AxesImage at 0x7efc95526b10>
In [405]:
import matplotlib.cm as cm

img = keras.utils.load_img(img_path, target_size=(1000,1000))
# print(img)
img = keras.utils.img_to_array(img)

heatmap = np.uint8(255 * heatmap)

jet = cm.get_cmap("jet")
jet_colors = jet(np.arange(256))[:, :3]
jet_heatmap = jet_colors[heatmap]

jet_heatmap = keras.utils.array_to_img(jet_heatmap)
# print(img.shape[1])
# print(img.shape[0])
jet_heatmap = jet_heatmap.resize((1000, 1000))
jet_heatmap = keras.utils.img_to_array(jet_heatmap)
# print(img.shape)
superimposed_img = jet_heatmap * 0.4 + img
superimposed_img = keras.utils.array_to_img(superimposed_img)

# save_path = "cassava_heatmap_cw2.jpg"
# superimposed_img.save(save_path)
superimposed_img
Out[405]:

7. Conclusion¶

7.1 Summary of model performance¶

This table shows the summary of model performance on the TEST dataset. The key metrics being monitored are Categorical accuracy and PRC

summaryperformance.png

7.2 Key takeaways¶
  1. It is possible to train Convnets from scratch even on small datasets like the iCassava dataset I used in this notebook. Their performance is decent as shown with the two Baseline Classifiers

  2. On a small dataset, overfitting became the main issue I was struggling with. Data augmentation was the most powerful way to fight overfitting. This is evident from the results of the subsequent regularized models, with the best performing one being the one with Data augmentation.

  3. In this notebook, I also demonstrate that it is easy to reuse existing Convnets on a new dataset via feature extraction. This was a valuable technique for working with small image datasets. I discovered a pretrained Cropnet classifier which skyrocketed my model performance. Other architectures such as EfficientNetB4 (with imagenet, and with noisystudent weights) did not perform as well as expected, eventhough they were used in Kaggle competitions. Further exploration is required to fully understand how they were implemented successfully.

  4. As a complement to feature extraction, I used ensembling techniques to try and push the performance a bit further. In Deep Learning even an improvement of 1% can be significant.

  5. Finally, I attempted to generate visualizations of heatmaps of class activity in order to understand the reasoning the model had behind it's predictions.

7.3 Future considerations¶
  1. Data curation: In future iterations of the project I believe focusing on better data curation is going to help improve performance of the model. Since the dataset is noisy and contains ambiguous and uncertain features, the input feature space is associated with multiple classes at the same time. This means that class labels have no objective boundaries so the same picture might be classified differently by different human labelers. Better data curation may help solve this problem.

  2. Imbalanced dataset: The dataset class distribution was skewed. In future iterations, I would move beyond monitoring specific metrics such as PRC, and apply other techniques such as oversampling or undersampling in order to experiment on performance if the dataset were to be balanced.

8. References¶

  1. E. Mwebaze, T. Gebru, A. Frome, S. Nsumba, and J. Tusubira, “ICASSAVA 2019 fine-grained visual categorization challenge,” arXiv.org, 24-Dec-2019. [Online]. Available: https://arxiv.org/abs/1908.02900. [Accessed: 20-Mar-2023]

  2. “National five year development plan - Tanzania,” Clearinghouse. [Online]. Available: https://smartdatafinance.org/resources?topic_id%5B0%5D=48&countries%5B0%5D=219&page=1. [Accessed: 20-Mar-2023]

  3. Deep learning with python. New York: Manning Publications, 2017.

  4. Fchollet, “Fchollet/design-details-with-python-notebooks: Jupyter Notebooks for the code samples of the book ‘Deep learning with python,’” GitHub. [Online]. Available: https://github.com/fchollet/design-details-with-python-notebooks. [Accessed: 20-Mar-2023]

  5. “CropNet: Cassava disease detection  :  Tensorflow hub,” TensorFlow. [Online]. Available: https://www.tensorflow.org/hub/tutorials/cropnet_cassava. [Accessed: 20-Mar-2023]

  6. A. An, “Decision Tree Learning,” York University: CSE-4412, Data Mining Winter 2020. [Online]. Available: https://www.eecs.yorku.ca/course_archive/2019-20/W/4412/lecnotes/notes.html. [Accessed: 19-Mar-2023]

  7. “Fine tuning models for Plant Disease Detection  :   tensorflow hub,” TensorFlow. [Online]. Available: https://www.tensorflow.org/hub/tutorials/cropnet_on_device. [Accessed: 20-Mar-2023]

  8. “1st Place Solution: Cassava leaf disease classification,” Kaggle. [Online]. Available: https://www.kaggle.com/c/cassava-leaf-disease-classification/discussion/221957. [Accessed: 20-Mar-2023]

  9. Jannish, “Final version inference,” Kaggle, 24-Feb-2021. [Online]. Available: https://www.kaggle.com/code/jannish/final-version-inference. [Accessed: 20-Mar-2023]

  10. Jannish, “Cassava leaf disease efficientnetb4 TPU,” Kaggle, 23-Feb-2021. [Online]. Available: https://www.kaggle.com/code/jannish/cassava-leaf-disease-efficientnetb4-tpu. [Accessed: 20-Mar-2023]

  11. Markwijkhuizen, “TF efficientnetb4 Mixup Cutmix gridmask CV 0.90,” Kaggle, 19-Feb-2021. [Online]. Available: https://www.kaggle.com/markwijkhuizen/tf-efficientnetb4-mixup-cutmix-gridmask-cv-0-90. [Accessed: 20-Mar-2023]

  12. PacktPublishing, “Ensembling with Blending and Stacking Solutions· packtpublishing/the-kaggle-book,” GitHub, 18-Mar-2022. [Online]. Available: https://github.com/PacktPublishing/The-Kaggle-Book/blob/main/chapter_09/ensembling.ipynb. [Accessed: 20-Mar-2023]

  13. “Classification on imbalanced data  :   Tensorflow Core,” TensorFlow. [Online]. Available: https://www.tensorflow.org/tutorials/structured_data/imbalanced_data. [Accessed: 20-Mar-2023]

  14. “Tensorflow addons,” TensorFlow. [Online]. Available: https://www.tensorflow.org/addons. [Accessed: 20-Mar-2023]

In [190]:
!jupyter nbconvert --to html TzCropNet2.ipynb
[NbConvertApp] Converting notebook TzCropNet2.ipynb to html
/opt/conda/lib/python3.7/site-packages/bleach/sanitizer.py:168: NoCssSanitizerWarning: 'style' attribute specified, but css_sanitizer not set.
  category=NoCssSanitizerWarning,
[NbConvertApp] Writing 20163080 bytes to TzCropNet2.html
In [ ]: