History, asked by khushkhan5924, 1 year ago

Before it entered World War I, the United States had declared itself to be

Answers

Answered by topanswers

Before entering World War I, the United States had declared itself to be a neutral country.

The United States wanted to remain neutral during World War I but had close ties with Britain. Germany's unrestricted submarine warfare, which sank ships carrying American passengers and cargo, and Germany's proposal to turn Mexico against the United States (the Zimmermann Telegram) were the main reasons the United States entered the war. The fighting escalated significantly after U.S. involvement.

Answered by mccl041211

Answer: a neutral country
