Social Sciences, asked by ishanvihahan, 1 month ago

What do you mean by the term "world war"?

Answers

Answered by IIOyeloveyouII

Explanation:

The term "World War" (Weltkrieg) first appeared in Germany in 1914. The French and British referred to the war as "La Grande Guerre" or the "Great War", but also adopted the term "World War" later in the conflict.
