Deep learning is a family of machine learning methods based on deep (artificial) neural networks. A deep neural network is a type of neural network that stacks many layers.
Rectified Linear Unit (ReLU)
ReLU is a piecewise linear function widely used as an activation function in deep neural networks. It is defined as f(x) = max(0, x).
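As an illustration (not part of the original text), a minimal NumPy sketch that applies ReLU element-wise to an array:

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied element-wise
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # -> [0.0, 0.0, 0.0, 1.5, 3.0]
```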
Discrete Convolution
A (discrete) convolution is a summation of products between two functions g and h (often given as 1D or 2D arrays); in 1D, (g * h)[n] = sum over m of g[m] h[n - m]. The convolution expresses the amount of overlap of the function g as it is shifted over the function h, and therefore “blends” one function with the other. A convolution is a linear operation.
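A minimal NumPy sketch (illustrative, not from the original text) of a 1D discrete convolution written out as a sum of products, checked against np.convolve:

```python
import numpy as np

def conv1d(g, h):
    # Full discrete convolution: (g * h)[n] = sum_m g[m] * h[n - m]
    n_out = len(g) + len(h) - 1
    out = np.zeros(n_out)
    for n in range(n_out):
        for m in range(len(g)):
            if 0 <= n - m < len(h):
                out[n] += g[m] * h[n - m]
    return out

g = np.array([1.0, 2.0, 3.0])
h = np.array([0.0, 1.0, 0.5])
print(conv1d(g, h))       # manual sum of products: [0.  1.  2.5 4.  1.5]
print(np.convolve(g, h))  # same result from NumPy
```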
Convolutional Neural Network
A neural network is called a convolutional neural network if at least one of its layers uses convolution as the linear operation.
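To make this concrete, the following is a minimal sketch of a convolutional neural network; it assumes PyTorch, which is not referenced in the text, and the layer sizes and class count are illustrative. The convolutional layer supplies the linear operation, followed by ReLU, max pooling, and a fully connected classifier.

```python
import torch
import torch.nn as nn

# A tiny CNN: one convolutional layer as the linear operation,
# then ReLU, max pooling, and a fully connected output layer.
model = nn.Sequential(
    nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),      # 28x28 -> 14x14
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),       # 10 output classes (illustrative)
)

x = torch.randn(1, 1, 28, 28)         # one 28x28 single-channel image
print(model(x).shape)                  # torch.Size([1, 10])
```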
Pooling
A pooling operation summarizes the features within each local window. For example, max pooling outputs the maximum response within each window.
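A short NumPy sketch (illustrative, not from the original text) of non-overlapping 2×2 max pooling on a small 2D array:

```python
import numpy as np

def max_pool2d(x, k=2):
    # Non-overlapping k x k max pooling on a 2D array
    h, w = x.shape
    x = x[:h - h % k, :w - w % k]  # drop rows/columns that do not fit a full window
    return x.reshape(h // k, k, w // k, k).max(axis=(1, 3))

x = np.array([[1, 3, 2, 0],
              [4, 2, 1, 5],
              [0, 1, 3, 2],
              [2, 6, 0, 1]], dtype=float)
print(max_pool2d(x))
# [[4. 5.]
#  [6. 3.]]
```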
Loss Function
A loss function compares the output of a neural network to a target label and produces a real number representing the “cost” of the mismatch between the model prediction and the target label. This cost is small when the model output matches the target label and large otherwise.
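As an illustrative sketch (not from the original text), a mean squared error loss in NumPy; the cost shrinks as the prediction approaches the target:

```python
import numpy as np

def mse_loss(prediction, target):
    # Mean squared error: average squared difference between
    # the model output and the target values
    return np.mean((prediction - target) ** 2)

target = np.array([1.0, 0.0, 0.0])
print(mse_loss(np.array([0.9, 0.1, 0.0]), target))  # small cost: prediction is close
print(mse_loss(np.array([0.1, 0.7, 0.2]), target))  # larger cost: prediction is far off
```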