In machine learning, what happens if there is too little training data?
Answer:
With too little training data, a model can only form a poor approximation of the true underlying function. An over-constrained (too simple) model will underfit even the small training set, while an under-constrained (too flexible) model will likely overfit it, memorizing the noise in those few samples. Both cases result in poor performance on new, unseen data.
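As a hypothetical illustration (not part of the original answer), the sketch below fits two polynomial models to only five noisy samples using NumPy. A degree-4 polynomial (as many coefficients as data points) interpolates the training set almost exactly and overfits, while a degree-0 polynomial (a constant) underfits; the sine target and sample count are arbitrary choices for the demonstration.

```python
import numpy as np

# Only 5 noisy training samples drawn from y = sin(x) on [0, pi]
rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0.0, np.pi, 5))
y_train = np.sin(x_train) + rng.normal(0.0, 0.1, 5)

# A denser test set from the true (noise-free) function
x_test = np.linspace(0.0, np.pi, 100)
y_test = np.sin(x_test)

# Under-constrained model: degree-4 polynomial with 5 coefficients
# can pass (nearly) through every training point -> overfits
overfit = np.polynomial.Polynomial.fit(x_train, y_train, deg=4)

# Over-constrained model: degree-0 polynomial (a constant) -> underfits
underfit = np.polynomial.Polynomial.fit(x_train, y_train, deg=0)

def mse(model, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((model(x) - y) ** 2))

# The overfit model scores near zero on the training set but is
# unreliable on the test set; the underfit model is poor on both.
print("overfit  train MSE:", mse(overfit, x_train, y_train))
print("overfit  test  MSE:", mse(overfit, x_test, y_test))
print("underfit train MSE:", mse(underfit, x_train, y_train))
print("underfit test  MSE:", mse(underfit, x_test, y_test))
```

More data would constrain the flexible model and separate signal from noise; with only five points, no choice of model complexity avoids the trade-off.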