Glossary of machine learning terms
A
accuracy
Percentage of correct predictions by a classification model.
It is defined as $$ \frac{TP + TN}{TP + FP + FN + TN}\,.$$
TP…true positive, TN…true negative, FP…false positive, FN…false negative
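The formula above can be computed directly from the four confusion-matrix counts; a minimal sketch (function name and the example counts are illustrative):

```python
def accuracy(tp, tn, fp, fn):
    """Fraction of correct predictions: (TP + TN) / (TP + TN + FP + FN)."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts: 40 true positives, 45 true negatives,
# 5 false positives, 10 false negatives → 85 correct out of 100.
print(accuracy(40, 45, 5, 10))  # 0.85
```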
activation function
A function that defines the output of a layer in a neural network given an input from the previous layer (e.g. ReLU).
active learning
An ML approach in which the algorithm chooses the data it learns from. Active learning is particularly useful when unlabeled data is plentiful but manual labeling is expensive. It often needs fewer labeled examples than blindly collecting a diverse set of labeled examples for normal supervised learning.
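One common way the algorithm chooses its data is uncertainty sampling (an assumption; the entry does not name a specific query strategy): ask for labels on the unlabeled examples the current model is least sure about. A minimal sketch with a toy probability model:

```python
def select_for_labeling(unlabeled, predict_proba, k):
    """Uncertainty sampling sketch: return the k unlabeled examples whose
    predicted probability of the positive class is closest to 0.5,
    i.e. where the current binary classifier is least certain."""
    return sorted(unlabeled, key=lambda x: abs(predict_proba(x) - 0.5))[:k]

# Toy model (illustrative): probability grows with the example's value.
proba = lambda x: x / 10.0
print(select_for_labeling([1, 4, 5, 9], proba, 2))  # [5, 4]
```

The selected examples would then be labeled manually and added to the training set before retraining.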
B
batch normalisation
A method that makes the training of a deep neural network faster and more stable. It consists of normalising the input and output of an activation function in a hidden layer.
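The normalisation step can be sketched for a batch of scalar activations (a simplification: real implementations normalise per feature channel, learn the scale and shift parameters, and track running statistics for inference):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalise a batch of scalar activations to zero mean and unit
    variance, then apply a scale (gamma) and shift (beta)."""
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

normalised = batch_norm([1.0, 2.0, 3.0, 4.0])
print([round(x, 2) for x in normalised])  # [-1.34, -0.45, 0.45, 1.34]
```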
C
classification
A task in which the prediction of a model is a category (class) rather than a continuous value.
clustering
Grouping of data, particularly during unsupervised learning. Many clustering algorithms exist.
convolutional neural network (CNN)
cross-validation
D
deep learning
deep neural network
A type of neural network containing multiple hidden layers.
E
early stopping
epoch
One full pass of the learning algorithm over the entire training data set; the number of epochs is how many times the algorithm sees the whole data set.
F
F1
false negative (FN)
false positive (FP)
false positive rate (FPR)
feature
An input variable for making predictions.
feature engineering
The process of converting data into useful features for training a model. Feature selection is a part of feature engineering.
feature selection
The process of selecting relevant features from a data set.
feature vector
A list of features passed into a model.
G
H
hidden layer
hierarchical agglomerative clustering
A clustering approach that creates a tree of clusters, specifically well-suited for hierarchically organised data. First, the algorithm assigns each example to its own cluster. Then it repeatedly merges the closest clusters to build a hierarchical tree.
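The assign-then-merge procedure can be sketched on one-dimensional points with single linkage (the linkage choice is an assumption; the entry does not specify how "closest" is measured):

```python
def agglomerative_cluster(points, n_clusters):
    """Hierarchical agglomerative clustering sketch: start with one
    cluster per example, then repeatedly merge the two clusters whose
    closest members are nearest (single linkage), until only
    n_clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        # Find the pair of clusters with the smallest inter-point distance.
        i, j = min(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: min(abs(x - y) for x in clusters[ab[0]] for y in clusters[ab[1]]),
        )
        clusters[i].extend(clusters.pop(j))
    return clusters

print(agglomerative_cluster([1.0, 1.2, 5.0, 5.1, 9.0], 3))
# [[1.0, 1.2], [5.0, 5.1], [9.0]]
```

Recording the order of merges, rather than stopping at a fixed cluster count, yields the full hierarchical tree (dendrogram).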
hyperparameters
I
J
K
k-fold validation
L
label
long short-term memory (LSTM)
loss
M
model
multi-class classification
N
neural network
P
precision
prediction
Output of a model.
predictor
R
recall
rectified linear unit (ReLU)
An activation function defined as follows:
- If the input is negative or zero, the output is zero.
- If the input is positive, the output is equal to the input.
Equivalently, $$\mathrm{ReLU}(x) = \max(0, x)\,.$$
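The two cases above amount to a one-line function:

```python
def relu(x):
    """Rectified linear unit: zero for non-positive input,
    identity for positive input, i.e. max(0, x)."""
    return max(0.0, x)

print([relu(v) for v in [-2.0, 0.0, 3.5]])  # [0.0, 0.0, 3.5]
```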