Social Sciences, asked by ananyasuhina, 7 months ago

Define the word "government".

Answers

Answered by IƚȥCαɳԃყBʅυʂԋ

Answer:

The word government refers to a governing body that makes decisions and carries out work for the welfare of its citizens. The government provides legal support to its citizens against discrimination and injustice, and it maintains peace and order in society.

Hope it helps you.

Answered by paras4099

The group of people with the authority to govern a country or state; a particular ministry in office. Example: "the government's economic record".

Hope it helps you.

Please mark as brainliest.

And please follow me.
