In machine learning, what happens if the training data sample is too small?
Answer:
Generally, too little training data results in a poor approximation of the underlying target function. An over-constrained (overly simple) model will underfit even the small training set, while an under-constrained (overly flexible) model will likely overfit it; both lead to poor generalization performance.
Explanation:
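A minimal sketch of the effect, assuming scikit-learn is available: the same model is trained on progressively larger slices of one synthetic dataset (the dataset, model, and sample sizes are illustrative, not from the question), and its accuracy on a held-out test set typically rises as the training sample grows.

```python
# Sketch (assumes scikit-learn is installed): compare test accuracy of the
# same model trained on a tiny vs. a larger sample of one dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary classification problem (illustrative choice).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

for n in (20, 100, 1000):  # training-set sizes to compare
    model = DecisionTreeClassifier(random_state=0)
    model.fit(X_train[:n], y_train[:n])  # fit on the first n training samples only
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:4d} samples -> test accuracy {acc:.3f}")
```

With only a handful of samples the tree memorizes the training slice (overfits) and scores noticeably worse on the test set; the larger slices give a better approximation of the data distribution and higher test accuracy.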