English, asked by Anonymous, 10 months ago

Why are countries launching wars, even though war doesn't bring peace?


Answers

Answered by Anonymous

Popular thought today seems to support the idea that wars must be fought to bring about lasting peace: if the "bad guys" are killed, then peace will reign. Carried to its extreme, this logic would mean Christians should kill everyone in the world who is not a Christian, after which peace would supposedly reign permanently, since everyone remaining would be a Christian. The thinking that drives the policies of many countries, both in the past and in the present day, sees peace as the end product of armed conflict.
