Social Sciences

What do you mean by the term "colony"?

Answers

Answered by Anonymous

Answer:


A colony is a group of people who settle in a new place but keep ties to their homeland. For example, the people who founded the United States first came to America as settlers in a British colony.
