History, asked by Natsuki56789, 7 months ago

What did the Americans gain from the Treaty of Paris?

Answers

Answered by Anonymous

Answer:

In the Treaty of Paris (1783), the British Crown formally recognized American independence and ceded most of its territory east of the Mississippi River to the United States, roughly doubling the size of the new nation and paving the way for westward expansion.

