Views on science and technology development are leading the world towards destruction
Answers
Answer:
Development or advancement in fields such as mobile phones, the automobile industry, etc.
Explanation:
Because of the development of mobile phones and technology, our work has become easier and our precious time is saved. However, watching a mobile phone while eating and sitting in one place affects us both mentally and physically: our concentration gets disturbed and our eyesight weakens.
Using a vehicle every day to travel to college, the office, or work saves time, but it makes us physically unfit and body fat increases.
Nowadays many young people are constantly busy with their phones, and continuously looking down at a mobile screen can cause neck problems.
Therefore, the development of science and technology is leading the world towards destruction.
Answer:
Philosopher Nick Bostrom believes it's entirely possible that artificial intelligence (AI) could lead to the extinction of Homo sapiens. In his 2014 bestseller Superintelligence: Paths, Dangers, Strategies, Bostrom paints a dark scenario in which researchers create a machine capable of steadily improving itself. At some point, it learns to make money from online transactions and begins purchasing goods and services in the real world. Using mail-ordered DNA, it builds simple nanosystems that in turn create more complex systems, giving it ever more power to shape the world.
Now suppose the AI suspects that humans might interfere with its plans, writes Bostrom, who's at the University of Oxford in the United Kingdom. It could decide to build tiny weapons and distribute them around the world covertly. "At a pre-set time, nanofactories producing nerve gas or target-seeking mosquito-like robots might then burgeon forth simultaneously from every square meter of the globe."
For Bostrom and a number of other scientists and philosophers, such scenarios are more than science fiction. They're studying which technological advances pose "existential risks" that could wipe out humanity or at least end civilization as we know it—and what could be done to stop them. "Think of what we're trying to do as providing a scientific red team for the things that could threaten our species," says philosopher Huw Price, who heads the Centre for the Study of Existential Risk (CSER) here at the University of Cambridge.