glossary
==== cross-validation ====
A method to estimate how well a model will generalise to new data. In cross-validation, the available data is repeatedly split into training and validation subsets; the model is fitted on the training part, evaluated on the validation part, and the results are averaged across the splits.
===== D =====
==== feature engineering ====
The process of converting data into useful features for a model.
==== feature selection ====
===== K =====
==== k-fold cross-validation ====
The training set is split into k smaller subsets (folds). The model is trained on k-1 of the folds and validated on the remaining fold; this is repeated so that each fold serves as the validation set exactly once. The performance measure reported by k-fold cross-validation is the average of the results over all k folds.
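The procedure above can be sketched in plain Python. The toy dataset and the "model" (which simply predicts the mean of its training values) are illustrative assumptions, not part of this glossary; real workflows would plug in an actual estimator.

```python
# Minimal sketch of k-fold cross-validation (standard library only).
# The toy model ("predict the training mean") is an assumption for
# illustration; any estimator with fit/predict semantics would do.

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k roughly equal, contiguous folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(data, k):
    """Average validation error over k folds.

    Each fold is held out once for validation while the toy model
    (the mean of the training values) is fit on the other k-1 folds.
    """
    folds = k_fold_indices(len(data), k)
    errors = []
    for i in range(k):
        val = [data[j] for j in folds[i]]
        train = [data[j] for f in folds[:i] + folds[i + 1:] for j in f]
        prediction = sum(train) / len(train)  # "fit" the toy model
        mse = sum((v - prediction) ** 2 for v in val) / len(val)
        errors.append(mse)
    return sum(errors) / k  # average over all k folds

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
print(cross_validate(data, k=3))  # → 6.25
```

Note that every data point is used for validation exactly once and for training k-1 times, which is what makes the averaged score a less noisy estimate than a single train/validation split.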
===== L =====
glossary.1662983742.txt.gz · Last modified: 2022/09/12 13:55 by admin