History, asked by rmunda1962, 7 months ago

What do you mean by the term "colony"?

Answers

Answered by sudeepraul2005

Answer:

A colony is a country or area under the full or partial political control of another country and occupied by settlers from that country.


Answered by ruchimankodiya

Answer:

A colony is a group of people who leave their native country to form, in a new land, a settlement that is subject to or connected with the parent nation. The term can also refer to the country or district that is settled or colonized; for example, many Western nations are former European colonies.
