History, asked by dolon892759945, 5 months ago

What did England recognise by signing the Paris Pact, 1783?

Answers

Answered by madadharansai

Answer:

In the Treaty of Paris, the British Crown formally recognized American independence and ceded most of its territory east of the Mississippi River to the United States, doubling the size of the new nation and paving the way for westward expansion.


Answered by oishijaac

Answer:

In the Paris Pact of 1783, the British recognised America as an independent country and granted it western territories.
