Political Science, asked by AndyAndSweety, 1 year ago

what is degitimate government?

Answers

Answered by moon1237879
You likely mean a legitimate government.
Definition: A government generally acknowledged as being in control of a nation and deserving formal recognition, which is symbolized by the exchange of diplomats between that government and the governments of other countries.
Answered by ashu277
Legitimacy is commonly defined in political science and sociology as the belief that a rule, institution, or leader has the right to govern. It is a judgment by an individual about the rightfulness of the hierarchy between a rule or ruler and its subjects, and about the subordinates' obligations toward that rule or ruler.

ashu277: Please mark this as the Brainliest answer, friend.