What do we understand by the word "colonialism"?
Answers
Colonialism refers to the practice by which one country acquires full or partial political control over another territory. It is the system in which an external power establishes colonies in a state, imposes its rule on the local population, and typically exploits the territory economically.