History, asked by Jason999, 7 months ago

Write a short note on American History

Answers

Answered by SanskarNikam161107
1

Explanation:

The history of the United States covers the events that shaped the United States, a country in North America.

Native Americans lived in the Americas for thousands of years. In 1607, English settlers founded the place now called Jamestown, Virginia. Other European settlers went to the colonies, mostly from England and later Great Britain; France, Spain, and the Netherlands also colonized North America. In 1775, a war between the thirteen colonies and Britain began because the colonists were upset at being taxed by the British government while having no vote in British elections and no say in how that money was spent.

Just after dawn on April 19, 1775, the British attempted to disarm the Massachusetts militia at Concord, Massachusetts, beginning the war with the "shot heard round the world." On July 4, 1776, the Founding Fathers wrote the United States Declaration of Independence. They won the Revolutionary War and started a new country. The Constitution was signed in 1787, and the Bill of Rights was ratified in 1791. General George Washington, who had led the war, became the country's first president. During the 19th century, the United States gained much more land in the West and began to industrialize. In 1861, several states in the South attempted to leave the United States to start a new country called the Confederate States of America. This caused the American Civil War. After the war, immigration resumed. Some Americans became very rich in this Gilded Age, and the country developed one of the largest economies in the world.

In the early 20th century, the United States became a world power, fighting in World War I and World War II. Between the wars came an economic boom called the Roaring Twenties, when many people became richer, and then a bust called the Great Depression, when most became poorer. The Great Depression ended with World War II.

Answered by 2001roars
0
The history of the United States is what happened in the past in the United States, a country in North America. Native Americans lived in the Americas for thousands of years. ... On July 4, 1776, Founding Fathers wrote the United States Declaration of Independence. They won the Revolutionary War and started a new country.