Was the "settlement" of the American western frontier inevitable?
Answers
Explanation:
Manifest Destiny was the 1800s belief that Americans had the right to spread across the continent. Under this view, Americans' settlement of the West was inevitable (certain to happen) and a God-given right. The federal government helped it along: it financed railroads and offered land to settlers at little to no cost.