Social Sciences, asked by keerthanae24091998, 11 months ago

What is the definition of social science?

Answers

Answered by Anonymous

Answer:

Social science is the branch of science devoted to the study of human societies and the relationships among individuals within those societies. Positivist social scientists use methods resembling those of the natural sciences as tools for understanding society, and so define science in its stricter modern sense.

Answered by EthicalElite

The scientific study of human society and social relationships.

Also: a subject within the field of social science, such as economics or politics.

