AlexNet

  • contains eight layers: the first five are convolutional layers, some of them followed by max-pooling layers, and the last three are fully connected layers. The network, except the last layer, is split into two copies, each run on one GPU. The entire structure can be written as shown below (see also the sketch after this list):

    • (CNN → RN → MP)² → (CNN³ → MP) → (FC → DO)² → Linear → Softmax

  • where:

    • CNN - convolutional layer (with ReLU)

    • RN - local response normalization

    • MP - max pooling

    • FC - fully connected layer (with ReLU)

    • Linear - fully connected layer (without activation)

    • DO - dropout
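
  • The layer sequence above can be expressed directly in code. Below is a minimal single-GPU sketch in PyTorch, assuming the kernel sizes, strides, channel counts, and dropout rate from the original paper; it does not reproduce the two-GPU split, and the final softmax is left to the loss function (e.g. cross-entropy), as is conventional.

```python
import torch
import torch.nn as nn


class AlexNet(nn.Module):
    """Sketch of (CNN → RN → MP)² → (CNN³ → MP) → (FC → DO)² → Linear."""

    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(
            # (CNN → RN → MP), block 1
            nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2), nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
            nn.MaxPool2d(kernel_size=3, stride=2),
            # (CNN → RN → MP), block 2
            nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
            nn.MaxPool2d(kernel_size=3, stride=2),
            # CNN³ → MP
            nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.classifier = nn.Sequential(
            # (FC → DO)²
            nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True), nn.Dropout(p=0.5),
            nn.Linear(4096, 4096), nn.ReLU(inplace=True), nn.Dropout(p=0.5),
            # Linear (softmax applied implicitly by the cross-entropy loss)
            nn.Linear(4096, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)          # e.g. 3×224×224 input → 256×6×6 feature map
        x = torch.flatten(x, 1)       # flatten to a 9216-dimensional vector
        return self.classifier(x)     # class logits


if __name__ == "__main__":
    model = AlexNet(num_classes=1000)
    logits = model(torch.randn(1, 3, 224, 224))
    print(logits.shape)  # torch.Size([1, 1000])
```

  • For a 224×224 RGB input, the convolutional stack produces a 256×6×6 feature map (9216 values), which is what the first fully connected layer consumes.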