Sociology, asked by jatinthenua186, 10 months ago

what is history of America​

Answers

Answered by milkclouds22

Answer:

HEY MATE! HERE IS YOUR ANSWER.

The history of the United States, a country in North America, begins as a nation when its colonists declared independence from Britain in 1776. The Americans won the Revolutionary War and started a new country. They signed the Constitution in 1787 and ratified the Bill of Rights in 1791.

HOPE IT HELPS...
