
Thread: Miscellaneous

  1. #11


    An Open Letter from Artificial Intelligence and Robotics Researchers: Professor Stuart Russell

    Published on Oct 15, 2015

  2. #12


    How can you stop killer robots | Toby Walsh | TEDxBerlin

    Published on Oct 8, 2015

    Toby Walsh on "How can you stop killer robots" at TEDxBerlin

    Toby Walsh is one of the world's leading researchers in Artificial Intelligence.

    He is currently working in Berlin thanks to a Humboldt Research Award. He is a Professor of Artificial Intelligence at the University of New South Wales in Sydney, Australia, and a Research Group Leader at NICTA, Australia's Centre of Excellence for ICT Research. He has been elected a Fellow of the Association for the Advancement of Artificial Intelligence for his contributions to AI research.

    Earlier this year, he was one of the initial signatories of an Open Letter calling for a ban on offensive autonomous weapons. The letter was also signed by Stephen Hawking, Elon Musk and Steve Wozniak. In total, the letter now has close to 20,000 signatures and has pushed this issue into the world's spotlight. The letter argues that we need to take action today to prevent an arms race in which lethal autonomous weapons fall into the hands of terrorists and rogue nations.

  3. #13


    Exploring the ethics of lethal autonomous weapons

    Uploaded on Nov 6, 2015

    AJung Moon, a Ph.D. candidate at The University of British Columbia and co-founder of the Open Roboethics initiative (ORi), discusses the results of a survey that explores the ethics of lethal autonomous weapons.

    Read the full story here:
    "Most people want fully autonomous weapons banned: UBC survey"

    November 9, 2015

  4. #14


    Published on Jan 23, 2016

    Remarkable advances in artificial intelligence may soon have implications for the future of warfare. What if autonomous weapon systems replace both soldiers and generals?

  5. #15


    Lethal autonomous weapons

    Published on Apr 8, 2016

    Biography:
    Stuart Russell received his B.A. with first-class honours in physics from Oxford University in 1982 and his Ph.D. in computer science from Stanford in 1986. He then joined the faculty of the University of California at Berkeley, where he is Professor (and formerly Chair) of Electrical Engineering and Computer Sciences and holder of the Smith-Zadeh Chair in Engineering. He is also an Adjunct Professor of Neurological Surgery at UC San Francisco and Vice-Chair of the World Economic Forum's Council on AI and Robotics. He has published over 150 papers on a wide range of topics in artificial intelligence including machine learning, probabilistic reasoning, knowledge representation, planning, real-time decision making, multitarget tracking, computer vision, computational physiology, and global seismic monitoring. His books include "The Use of Knowledge in Analogy and Induction", "Do the Right Thing: Studies in Limited Rationality" (with Eric Wefald), and "Artificial Intelligence: A Modern Approach" (with Peter Norvig).

    Abstract:
    The artificial intelligence (AI) and robotics communities face an important ethical decision: whether to support or oppose the development of lethal autonomous weapons systems (LAWS). Autonomous weapons systems select and engage targets without human intervention; they become lethal when those targets include humans. LAWS might include, for example, armed quadcopters that can search for and eliminate enemy combatants in a city, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions.
    The UN has held three major meetings in Geneva under the auspices of the Convention on Certain Conventional Weapons, or CCW, to discuss the possibility of a treaty banning autonomous weapons. There is at present broad agreement on the need for "meaningful human control" over selection of targets and decisions to apply deadly force. Much work remains to be done on refining the necessary definitions and identifying exactly what should or should not be included in any proposed treaty.

    Wednesday, April 6, 2016 from 12:00 PM to 1:00 PM (PDT)
    Sutardja Dai Hall - Banatao Auditorium
    University of California, Berkeley

  6. #16
    "Don't be evil?"
    A survey of the tech sector’s stance on lethal autonomous weapons

    August 19, 2019

  7. #17
    Article "Are AI-Powered Killer Robots Inevitable?"
    Military scholars warn of a “battlefield singularity,” a point at which humans can no longer keep up with the pace of conflict.

    by Paul Scharre
    May 19, 2020

  8. #18


    Killer robots & human security | Jody Williams | TEDxGatewaySalon

    Jun 5, 2020

    Jody Williams explains the move toward killer robots - the third revolution in warfare - and the threat these lethal autonomous weapons pose to both global security and human security. As she describes, killer robots, which on their own would be able to target and kill human beings, would cross a moral and ethical divide that should not be breached.

    Jody Williams received the Nobel Peace Prize in 1997 for her work as founding coordinator of the International Campaign to Ban Landmines, which shared the Peace Prize with her that year. She is an outspoken peace activist who struggles to reclaim the real meaning of peace, a concept that goes far beyond the absence of armed conflict and is defined by human security, not national security. Since January 2006, she has chaired the Nobel Women's Initiative, an organization that uses the prestige and influence of the six women Nobel Peace laureates who make up the Initiative to support and amplify the voices of women around the world working for sustainable peace with justice and equality.

  9. #19


    Oct 13, 2020

    "China Conducts Test Of Massive Suicide Drone Swarm Launched From A Box On A Truck"
    China shows off its ability to rapidly launch 48 weaponized drones from the back of a truck, as well as from helicopters.

    by Joseph Trevithick
    October 14, 2020

  10. #20
    Article "Ghost Robotics now makes a lethal robot dog"
    The machine, on display at an Army conference, packs a sniper-type rifle.

    by Kelsey D. Atherton
    October 13, 2021

    Article "Welp, Now We Have Robo-Dogs With Sniper Rifles"
    The "Special Purpose Unmanned Rifle" has materialized from your Black Mirror nightmares.

    by Kyle Mizokami
    October 15, 2021

