Can Machines Learn Morality? CommonLit
Answers
Answer:
no
Explanation:
"Can Machines Learn Morality?" by Randy Rieland discusses what equipping machines with morality would look like and the effect it could have on the world.
1. PART A: Which of the following identifies the central idea of the text?
B. Equipping robots with a sense of morality would allow them to be more useful in combat, however, this would also allow them more independence than some people approve of.
2. PART B: Which detail from the text best supports the answer to Part A?
C. “’I believe that there is a potential for non-combatant casualties to be lessened by these intelligent robots, but we do have to be very careful about how they’re used and not just release them into the battlefield without appropriate concern.’” (Paragraph 17)
3. PART A: What is the effect of the author’s reference to The Terminator movies in paragraph 13?
D. It reinforces the idea that some robotic advances are only possible in fiction.
4. PART B: Which quote from the text best supports the answer to Part A?
C. “Maybe this will always be the stuff of science fiction.” (Paragraph 13)
5. How does the moral capacity of robots compare to humans as described in the text?
As machines grow smarter and more independent, do they know the difference between right and wrong? According to the text, it will not be long before they need to, and an essential part of that transition will be their ability to learn morality. Ronald Arkin, a computer science professor and robotics expert at Georgia Tech, believes they can. He has created software, known as an “ethical governor,” meant to make machines capable of deciding when it is appropriate to fire and when it is not.
Arkin recognizes that this may be decades away, but he believes robots may one day be superior to human soldiers both physically and ethically, since they would not be vulnerable to the emotional trauma of combat or the urge to take revenge. He does not envision an all-robot army, just one in which robots support people and carry out high-risk jobs that demand fast decisions, such as clearing buildings.
Arkin says that all of this is in the interest of creating machines that are not a threat to people but an advantage, particularly in the messy chaos of war. He believes it is important to start now to set guidelines for correct robot behavior, because this new capability is a Pandora's box that is already being opened. Further, he thinks these intelligent robots have the potential to lessen non-combatant casualties, but we have to be very careful about how they are used and not simply release them onto the battlefield without appropriate concern.