What is the history of America?
Answer:
The history of the United States is the history of the country in North America now called the United States. ... The American colonists won the Revolutionary War and founded a new country. They signed the Constitution in 1787, and the Bill of Rights was ratified in 1791.