What do you mean by the word "health"?
Answers
Answer 1:
The word health refers to a state of complete emotional and physical well-being. Healthcare exists to help people maintain this optimal state of health.
Answer 2:
"Health is a state of complete physical, mental, and social well-being and not merely the absence of disease or infirmity.”