Social Sciences, asked by muskanshaikh9866, 4 months ago

What is known as health care?

Answers

Answered by aastha28275

Answer:

Healthcare refers to the organized provision of medical care to individuals and communities. By that definition, healthcare careers include far more than the doctors, nurses, and other frontline clinicians who usually come to mind first when people think of healthcare jobs.

Answered by prajaktahanamgaon

Answer:

Health care is very important in our lives

Explanation:

because our bodies depend entirely on good health.

Please mark me as brainliest.
