Social Sciences, asked by spidey71, 7 months ago

Define the term anatomy.

Answers

Answered by harikairuvuru

Answer:

Anatomy is the study of the structures of living things: it is the branch of science that describes what body parts such as your fingers, mouth, nose, heart, and lungs look like. The structure of a body part helps determine what it can do for you. Anatomy can also be defined as the branch of science concerned with the bodily structure of humans, animals, and other living organisms, especially as revealed by dissection and the separation of parts.