Social Sciences, asked by seemagupta220531, 10 months ago

define the term Republic​

Answers

Answered by Anonymous

Answer:

A republic is a form of government in which the country is considered a "public matter" rather than the private concern or property of the rulers. The primary positions of power within a republic are attained through democracy, oligarchy, autocracy, or a mix thereof, rather than being unalterably occupied.

Answered by AnnieStar

Answer:

Republic

A republic is defined as a form of government in which the country is governed by the people and their elected representatives.
