glossary
==== convolutional neural network (CNN) ====

A neural network in which at least one layer is a convolutional layer.
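As an illustration (not part of this glossary), the core operation of a convolutional layer can be sketched in plain Python: a small kernel is slid across the input and a dot product is taken at each position. This 1-D, "valid"-padding, no-bias version is a minimal sketch; real CNN layers use 2-D/3-D inputs, many kernels, and learned weights.

```python
def convolve1d(signal, kernel):
    """Slide `kernel` over `signal`; dot product at each position."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# A difference kernel responds to changes in the input.
print(convolve1d([1, 1, 2, 2, 3], [1, -1]))  # -> [0, -1, 0, -1]
```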
==== cross-validation ====

A method to estimate how well a model will generalise to new data. In cross-validation, the available data is repeatedly split into a training set and a validation set; the model is fitted on the training set and evaluated on the held-out validation set.
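The idea can be sketched in plain Python (the function name, the mean-predictor "model", and the mean-absolute-error score are illustrative assumptions, not from this glossary): data is repeatedly shuffled and split, the model is fitted on one part and scored on the held-out part, and the scores are averaged.

```python
import random

def validate(data, holdout_fraction=0.2, rounds=5, seed=0):
    """Average held-out error of a trivial mean-predictor model."""
    rng = random.Random(seed)
    errors = []
    for _ in range(rounds):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * (1 - holdout_fraction))
        train, val = shuffled[:cut], shuffled[cut:]
        prediction = sum(train) / len(train)          # "train" the model
        errors += [abs(v - prediction) for v in val]  # validate on held-out data
    return sum(errors) / len(errors)

print(validate([1.0, 2.0, 3.0, 4.0, 5.0]))
```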
===== D =====
==== feature engineering ====

The process of converting data into useful features for training a model.
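A common feature-engineering step is turning a categorical column into numeric features. This minimal sketch (illustrative, not from this glossary) one-hot encodes a list of category labels:

```python
def one_hot(values):
    """Encode each value as a 0/1 vector over the sorted set of categories."""
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values]

# Categories sorted alphabetically: ["green", "red"]
print(one_hot(["red", "green", "red"]))  # -> [[0, 1], [1, 0], [0, 1]]
```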
==== feature selection ====
===== K =====
==== k-fold cross-validation ====

The training set is split into k smaller subsets (folds). The model is trained on (k-1) of the folds and validated on the remaining fold. This is repeated for all k folds, so that each fold serves as the validation set exactly once. The performance measure reported by k-fold cross-validation is the average of the results over all k folds.
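The splitting scheme described above can be sketched in plain Python (the function name is an illustrative assumption): each sample index lands in exactly one validation fold, and the other (k-1) folds form the training set for that round.

```python
def kfold_indices(n, k):
    """Return k (train_indices, val_indices) splits of range(n)."""
    folds = [list(range(i, n, k)) for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]                                     # one fold validates
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]  # k-1 folds train
        splits.append((train, val))
    return splits

for train, val in kfold_indices(6, 3):
    print("train:", train, "val:", val)
```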
===== L =====
glossary.1662983310.txt.gz · Last modified: 2022/09/12 13:48 by admin