Page 3 of 6
Results 21 to 30 of 54

Thread: Miscellaneous

  1. #21

    Published on Feb 28, 2014

    Cybersecurity expert Peter W. Singer discusses the similarities between drones and computer viruses. Singer is the author of Cybersecurity and Cyberwar: What Everyone Needs to Know. You can learn more at

    Peter W. Singer: There's been an enormous amount of changing forces on warfare in the twenty-first century. And they range from new actors in war, like private contractors, the Blackwaters of the world, to the growth of warlord and child soldier groups, to technological shifts: the introduction of robotics and cyber. And one of the interesting things that ties these together is how not only the who of war is being expanded but also the where and the when. So one of the things that links, for example, drones and robotics with cyber weapons is that you're seeing a shift in the geographic location of the human role. Humans are still involved. We're not in the world of the Terminator. Humans are still involved, but there's been a geographic shift where the operation can be happening in Pakistan but the person flying the plane might be back in Nevada, 7,000 miles away.

    Or on the cyber side where the software might be hitting Iranian nuclear research centrifuges like what Stuxnet did but the people who designed it and decided to send it are, again, thousands of miles away. And in that case it was a combined U.S./Israeli operation. One of the next steps in this both with the physical side of robotics and the software side of cyber is a shift in that human role -- not just geographically but chronologically where the humans are still making decisions but they're sending the weapon out in the world to then make its own decisions as it plays out there. In robotics we think about this as autonomy. With Stuxnet it was a weapon. It was a weapon like anything else in history, you know, a stone, a drone -- it caused physical damage.

    But it was sent out in the world on a mission in a way no previous weapon has done: go out, find this one target, and cause harm to that target and nothing else. And so Stuxnet plays out over a period of time. It also is interesting because it's the first weapon that can be both here, there, everywhere and nowhere. Unlike a stone. Unlike a drone. It's not a thing, and so that software is hitting the target, those Iranian nuclear research facilities, but it also pops up in 25,000 other computers around the world. That's actually how we discovered it, how we know about it. The final thing that makes this interesting is it introduces a difficult ethical wrinkle.

    On one hand we can say this may have been the first ethical weapon ever developed. Again, whether we're talking about the robots or Stuxnet, they can be programmed to do things that we would describe as potentially ethical. So Stuxnet could only cause harm to its intended target. It popped up in 25,000 computers around the world, but it could only harm the ones with this particular setup, in this particular geographic location, doing nuclear research. In fact, even if you had nuclear centrifuges in your basement, it still wouldn't harm them. It could only hit those Iranian ones. Wow, that's great, but as the person who discovered it, so to speak, put it, "It's like opening Pandora's box." And not everyone is going to program it that way, with ethics in mind.

    Directed/Produced by Jonathan Fowler and Dillon Fitton

  2. #22

    Accepting Artificial Intelligence - A robot designed to decrease the fear of future robot technology
    February 6, 2014

    This video shows the result of my graduation project: a robot with face detection, designed to decrease the fear of future (robot) technology for people with technophobia.

    About the project: Accepting Artificial Intelligence
    Because technology is changing more rapidly every day, not everyone has a positive view of the future. A certain number of people fear and don't trust technology, especially technology of the future; this is called technophobia. And because there is a good chance that we can reach AI with human intelligence this century, which can give us many benefits, it is important for the people who fear this development to get a more positive view of the future. Therefore I developed a robot that will help take away the fear of people with technophobia.

    Video production:
    Robin de Bruin & Sara Dubbeldam

    Amon Tobin - Piece of Paper

  3. #23

  4. #24

    The Terrifying Promise of Robot Bugs

    Published on May 5, 2013

    Imitating nature to build a better (or possibly more terrifying) future. We've been trying to build flapping-wing robots for hundreds of years, and now ornithopters are finally being developed, though they may be used mostly for military purposes.

    Piezoelectrics make those little bugs possible, and also enhance the ability of robot arms to feel, in other news from the International Journal of Robotics.

  5. #25

  6. #26

  7. #27

  8. #28

    How to tell if your robot is obsessed with you

    Published on Oct 1, 2015

    Thomas Kuc became friends with a robot and got way more than he bargained for.

  9. #29

    Killer robots, the end of humanity, and all that: What should a good AI researcher do?

    Published on Aug 12, 2015

    Buenos Aires, July 29, 2015.

    Talk by Stuart Russell, Professor of Computer Science and Smith-Zadeh Professor in Engineering, University of California, Berkeley; Adjunct Professor of Neurological Surgery, University of California, San Francisco.

    Hear an update on the campaign to ban lethal autonomous weapons, as well as the fears that AI poses an existential threat to mankind.

  10. #30

