The final exam
Emersyn was running out of ideas to create new questions for the final exam.
In a last attempt to do something original, she copied an example code fragment from the Keras website and decided to ask a few questions about it:
from tensorflow import keras
from tensorflow.keras import layers

# A small convnet for 28x28x1 inputs with a 10-way softmax output.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=(3, 3), activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Conv2D(64, kernel_size=(3, 3), activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])
Based on the code above, which of the statements Emersyn wrote are correct?
The kernel size and pool size should have the same value; because they don't here, the model will likely fail to learn anything meaningful.
The default activation function in MaxPooling2D layers is ReLU, which is why the code doesn’t explicitly use it.
The Dropout layer right before the output layer will cut the number of learnable parameters from 1,600 to 800.
The softmax activation function in the last layer hints that this is a multi-class classification model.
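
A quick way to reason about statements like these is to trace the tensor shape through the fragment layer by layer, or to build the model and call model.summary(), which prints every layer's output shape and parameter count. Here is a minimal hand-trace sketch in plain Python, assuming the Keras defaults the fragment relies on ("valid" padding for Conv2D and MaxPooling2D, and pooling strides equal to the pool size):

# Trace the output shape produced by each layer of the fragment above.
h, w, c = 28, 28, 1                   # keras.Input(shape=(28, 28, 1))
h, w, c = h - 2, w - 2, 32            # Conv2D(32, 3x3, "valid"): 26 x 26 x 32
h, w = h // 2, w // 2                 # MaxPooling2D(2x2): 13 x 13 x 32
h, w, c = h - 2, w - 2, 64            # Conv2D(64, 3x3, "valid"): 11 x 11 x 64
h, w = h // 2, w // 2                 # MaxPooling2D(2x2): 5 x 5 x 64

flattened = h * w * c                 # Flatten: 5 * 5 * 64 = 1,600 units
# Dropout(0.5) has no trainable weights; it only zeroes activations at
# training time, so the 1,600-unit vector keeps its size.
dense_params = flattened * 10 + 10    # Dense(10): 16,000 weights + 10 biases

print(flattened, dense_params)        # 1600 16010

The same numbers show up in model.summary(), which is the quickest way to double-check each statement against the code.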