what is degitimate government?
Answers
Answered by
It should be "legitimate government"!
Definition: A government generally acknowledged as being in control of a nation and deserving formal recognition, which is symbolized by the exchange of diplomats between that government and the governments of other countries.
Answered by
Legitimacy is commonly defined in political science and sociology as the belief that a rule, institution, or leader has the right to govern. It is a judgment by an individual about the rightfulness of a hierarchy between a rule or ruler and its subjects, and about the subordinates' obligations toward that rule or ruler.