Social Sciences, asked by Sristi378, 1 year ago

When we say "Western countries", what does it mean?

Answers

Answered by Rakshitsaini
Generally, it means the countries of Europe, and more broadly other Western nations such as those of North America.