Answer these questions:
1. What is the singularity and how will it affect me as an individual?
2. Will the singularity happen in my lifetime?
3. Will things be better or worse after the singularity?
Answer:
Please make my answer brilliant!
Explanation:
The technological singularity—also, simply, the singularity[1]—is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.[2][3] According to the most popular version of the singularity hypothesis, called intelligence explosion, an upgradable intelligent agent will eventually enter a "runaway reaction" of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an "explosion" in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.
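The "runaway reaction" can be pictured with a small toy model (a minimal sketch, not from the sources above; the function name, growth factor, and cycle lengths are all invented assumptions): suppose each self-improvement cycle multiplies capability by a fixed factor and shortens the next cycle by the same factor, so capability grows without bound while the total elapsed time stays finite.

# Toy model of an "intelligence explosion" (illustrative only; all numbers
# are invented assumptions, not predictions).
def intelligence_explosion(initial_capability=1.0,
                           improvement_factor=1.5,
                           first_cycle_years=10.0,
                           cycles=12):
    """Each cycle multiplies capability and shortens the next cycle."""
    capability = initial_capability
    elapsed = 0.0
    cycle_length = first_cycle_years
    for n in range(1, cycles + 1):
        elapsed += cycle_length
        capability *= improvement_factor      # the new generation is smarter...
        cycle_length /= improvement_factor    # ...and it improves itself faster
        print(f"cycle {n:2d}: year {elapsed:6.2f}, capability x{capability:8.1f}")

intelligence_explosion()

With these made-up numbers the cycle lengths form a geometric series that sums to about 30 years, so capability keeps multiplying while the calendar time needed approaches a fixed point; that fixed point plays the role of the "singularity" in this toy picture.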
The first to use the concept of a "singularity" in the technological context was John von Neumann.[4] Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".[5] Subsequent authors have echoed this viewpoint.[3][6]
I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.[7]
The concept and the term "singularity" were popularized by Vernor Vinge in his 1993 essay The Coming Technological Singularity, in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate. He wrote that he would be surprised if it occurred before 2005 or after 2030.[7]
Public figures such as Stephen Hawking and Elon Musk have expressed concern that full artificial intelligence (AI) could result in human extinction.[8][9] The consequences of the singularity and its potential benefit or harm to the human race have been intensely debated.
Four polls of AI researchers, conducted in 2012 and 2013 by Nick Bostrom and Vincent C. Müller, suggested a median probability estimate of 50% that artificial general intelligence (AGI) would be developed by 2040–2050.
Answer ✍️
2. Experts disagree a lot on when it is likely to be developed. If you believe the median of these experts, you'd expect roughly a 50/50 chance of reaching the singularity within 120 years or so. ... It is probable the singularity will happen, but neither you nor ordinary folks will have any access to it.
3. The Singularity assumes the speed of everything will keep increasing faster and faster, until it becomes too fast to understand. That will never happen with our human brains – we're simply not built for that kind of development.