What does the term "legitimate government" mean?
Answers
Answer 1
A legitimate government is one generally acknowledged as being in control of a nation and deserving recognition, which is symbolized by the exchange of diplomats (persons appointed to conduct diplomacy) between that government and the governments of other countries.
Hope it helps! :)
Anonymous: copied answer!!
Answer 2
In political science, legitimacy is the right and acceptance of an authority, usually a governing law or a régime. Whereas "authority" denotes a specific position in an established government, "legitimacy" denotes a system of government, wherein "government" denotes a sphere of influence.