History, asked by anikkamenor5818, 11 months ago

Examine the relations between Japan and the United States up to World War I.

Answers

Answered by GYMlover

American History: US-Japan Relations Before World War Two. ... The United States would ultimately enter World War Two following a surprise attack by Japan on the large American naval base at Pearl Harbor, Hawaii. Relations between the United States and Japan had grown steadily worse throughout the 1930s.
