A fully connected layer (also known as a dense layer) is a neural network layer in which each neuron is connected to every neuron in the previous layer. These layers typically appear in the final stages of a neural network, where they map the learned features to the final classification or regression output.
Each connection in a fully connected layer has an associated weight, and each neuron computes a weighted sum of its inputs plus a bias, followed by a non-linear activation function. Because every neuron sees every feature from the previous layer, the layer can learn patterns that depend on combinations of the entire input rather than on local regions.
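To make the computation concrete, here is a minimal NumPy sketch of a single fully connected layer's forward pass. The dimensions (4 inputs, 3 neurons), the random weights, and the choice of ReLU as the activation are illustrative assumptions, not values from any particular network.

```python
import numpy as np

# Minimal sketch of a fully connected layer's forward pass.
# Dimensions (4 inputs, 3 neurons) are arbitrary choices for illustration.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)          # input vector from the previous layer
W = rng.standard_normal((3, 4))     # one weight per (neuron, input) connection
b = np.zeros(3)                     # one bias per neuron

z = W @ x + b                       # weighted sum of inputs plus bias, per neuron
y = np.maximum(z, 0.0)              # non-linear activation (ReLU here)
print(y.shape)                      # (3,) -- one output per neuron
```

Each row of `W` holds the weights of one neuron, so the matrix-vector product computes all the weighted sums at once.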
Fully connected layers are the building blocks of multi-layer perceptrons (MLPs) and commonly form the final stages of CNNs, where flattened convolutional features are mapped to class scores or other decision outputs, as sketched below.
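The following sketch shows this pattern using PyTorch: a small convolutional stem produces feature maps, which are flattened and passed through fully connected (`nn.Linear`) layers to produce class logits. The layer sizes, the pooling step, and the 10-class output are assumptions chosen only to keep the example self-contained.

```python
import torch
import torch.nn as nn

# Sketch of a CNN classification head: convolutional features are flattened
# and passed through fully connected layers. All sizes are illustrative.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d((4, 4)),   # fix the spatial size so Flatten is predictable
    nn.Flatten(),                   # 16 * 4 * 4 = 256 features
    nn.Linear(256, 128),            # fully connected (dense) layer
    nn.ReLU(),
    nn.Linear(128, 10),             # final dense layer: one logit per class
)

logits = model(torch.randn(1, 3, 32, 32))
print(logits.shape)                 # torch.Size([1, 10])
```

Removing the convolutional stem and keeping only the `nn.Linear` and activation layers would give a plain MLP built from the same fully connected layers.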