History, asked by Angelmurugan2134, 9 months ago

what is the history of USA? ​

Answers

Answered by adhende115

Answer:

The history of the United States covers the past of this country in North America. ... They won the Revolutionary War and started a new country. They signed the Constitution in 1787 and ratified the Bill of Rights in 1791.


Answered by kishankumar99

Explanation:

The history of the United States covers the past of this country in North America.

Native Americans lived in the Americas for thousands of years. In 1607, English settlers founded Jamestown, Virginia. Other European settlers followed to the colonies, mostly from England and later Great Britain; France, Spain, and the Netherlands also colonized North America. In 1775, war broke out between the thirteen colonies and Britain because the colonists were taxed by the British government but had no representation in Parliament and no say in how that money was spent.

Just after dawn on April 19, 1775, the British attempted to seize the Massachusetts militia's weapons at Concord, Massachusetts, beginning the war with the "shot heard round the world." On July 4, 1776, the Founding Fathers adopted the United States Declaration of Independence. The colonists won the Revolutionary War and started a new country. They signed the Constitution in 1787 and ratified the Bill of Rights in 1791. General George Washington, who had led the Continental Army, became the country's first president. During the 19th century, the United States gained much more land in the West and began to industrialize. In 1861, several Southern states attempted to leave the United States and form a new country, the Confederate States of America, which caused the American Civil War. After the war, immigration resumed. Some Americans became very rich in this Gilded Age, and the country developed one of the largest economies in the world.

In the early 20th century, the United States became a world power, fighting in World War I and World War II. Between the wars came an economic boom, the Roaring Twenties, when many people grew richer, and then a bust, the Great Depression, when most grew poorer. The Great Depression ended with World War II.

The United States and the Soviet Union entered the Cold War, which included wars in Korea and Vietnam. During this time, African-Americans, Chicanos, and women sought more rights. In the 1970s and 1980s, the United States started to make fewer things in factories, and the country went through the worst recession it had experienced since the Great Depression. In the late 1980s and early 1990s, the Cold War ended, helping the United States out of recession. The Middle East became more important in American foreign policy, especially after the September 11 attacks in 2001.
