English, asked by nadeemahmadafz00, 1 year ago

How did America become a land of

Answers

Answered by rejibala

Answer:

They won the Revolutionary War and founded a new country. They signed the Constitution in 1787 and ratified the Bill of Rights in 1791. General George Washington, who had led the Continental Army during the war, became the country's first president. During the 19th century, the United States gained much more land in the West and began to industrialize.

Hope this helps you.

Please mark me as the brainliest.