Biology, asked by corol, 11 months ago

What do you mean by the word "health"?

Answers

Answered by anirudh0502

Answer:

The word health refers to a state of complete emotional and physical well-being. Healthcare exists to help people maintain this optimal state of health.


Answered by kumar3940

Answer:

"Health is a state of complete physical, mental, and social well-being and not merely the absence of disease or infirmity."

