Explain the culture of America.
Answers
America was a relatively recently discovered continent, colonized by Europeans; as a result, American culture is largely Western culture...
YUVARAJ11: Thank you very much.