Experts Warn of 'Terminator'-Style Military-Robot Rebellion

foxnews.com/story/0,2933,496309,00.html

full story:
technology.timesonline.co.uk/tol/news/tech_and_web/article5741334.ece

… discussion of a Terminator-style scenario in which robots turn on their human masters – is issued in a hefty report funded by and prepared for the US Navy’s high-tech and secretive Office of Naval Research.

The report, the first serious work of its kind on military robot ethics, envisages a fast-approaching era where robots are smart enough to make battlefield decisions that are at present the preserve of humans. Eventually, it notes, robots could come to display significant cognitive advantages over Homo sapiens soldiers.

“There is a common misconception that robots will do only what we have programmed them to do,” Patrick Lin, the chief compiler of the report, said. “Unfortunately, such a belief is sorely outdated, harking back to a time when . . . programs could be written and understood by a single person.” The reality, Dr Lin said, was that modern programs included millions of lines of code and were written by teams of programmers, none of whom knew the entire program: accordingly, no individual could accurately predict how the various portions of large programs would interact without extensive testing in the field – an option that may either be unavailable or deliberately sidestepped by the designers of fighting robots.

full report:
ethics.calpoly.edu/ONR_report.pdf
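Dr Lin’s point about large programs written by separate teams is easy to see in miniature. Here is a toy Python sketch (entirely my own invention, not anything from the report; the module names and thresholds are made up) in which two control routines are each reasonable on their own, but the combination leaves the robot shuttling back and forth near its charger instead of finishing its patrol:

def power_module(state):
    # Team A's rule: if the battery is low, head back to the charger.
    if state["battery"] < 30:
        return "return_to_base"
    return None  # no opinion otherwise

def mission_module(state):
    # Team B's rule: otherwise keep patrolling away from base.
    return "patrol"

def step(state):
    # Integration layer: the power module overrides the mission module.
    command = power_module(state) or mission_module(state)
    if command == "return_to_base":
        state["position"] = max(0, state["position"] - 1)
        if state["position"] == 0:
            state["battery"] += 10   # charging at base
    else:
        state["position"] += 1
        state["battery"] -= 5        # patrolling drains the battery
    return command

state = {"battery": 35, "position": 0}
for tick in range(12):
    print(tick, step(state), state)

Neither routine contains a bug, but because nobody added any hysteresis between the two thresholds, the robot oscillates a step or two from its charger and never reaches its objective. Scale that up to millions of lines and dozens of teams and you get exactly the kind of unpredictability the report describes: no fault in any single module, and no single person able to foresee the whole.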

I guess I’m not totally surprised, which is why I would like to see strict limits on the level of AI given to robots.

I’ll start to believe it when they make a robot that can plow the snow out of my driveway and mow my lawn.

Until then, I think it’s a far-fetched notion.

Jim

I’d like to see strict limits on the level of funding for spurious reports.

I, for one, welcome our Robot overlords.

:rotfl: :rotfl: :rotfl:

I have heard many, many untruths about computers and programming in my time, but it is rare to come across such weapons-grade bollocks as this.

If a military robot ‘runs amok’ and starts barreling through its own lines, it will be because somebody programmed it to do that. It is technically possible for this sort of thing to happen inadvertently as a result of different controls interacting in untested ways, but the odds of an unchecked malfunction giving rise to the Terminator? We’re talking a few small steps above spontaneous combustion here.

This sounds like a case of life imitating art.

You’re in luck for the robot lawn mower: friendlyrobotics.com/robomow/

There might be a working prototype of a robot snow shovel: i-shovel.com/

I know nothing about computers beyond believing my personal computer is the most annoying and frustrating thing in existence, so if the problem of killer robots is so easily dismissed as highly unrealistic, why would military scientists express these concerns?

For the people knocking this: the report was sponsored by the “US Department of Navy, Office of Naval Research, under award # N00014-07-1-1152 and N0014-08-1-1209.” And the ONR has not only funded but also supported this research, and is continuing to fund and support it (apparently the report is preliminary).

ethics.calpoly.edu/
ethics.calpoly.edu/pr_020209.html

“The public is generally surprised when they hear how great a role robots are playing in the military,” explained Dr. Patrick Lin, director of the research group and co-author of the report. “But there hasn’t been much dialogue about the risks posed by these machines, especially as they are expected to be given more autonomy or a greater ability to make choices on their own, such as attack decisions. So we commend the ONR for their foresight in supporting our investigation.”

Supported by ONR award # N00014-07-1-1152 and N0014-08-1-1209, the other co-authors of the report are Dr. George Bekey and Keith Abney. Bekey is also an emeritus professor at University of Southern California, founder of its robotics lab, and author of Autonomous Robots (MIT Press, 2005). Colin Allen (Indiana Univ.), Peter Asaro (Rutgers), and Wendell Wallach (Yale Univ.) were retained as consultants on the project.

And as you can see, it was co-authored by bright minds from different universities.

There’s been controversy over a new stealth aircraft in development because of the possibility that it could be unmanned and capable of carrying nuclear weapons.

popsci.com/node/30794?page=1

It reminds me of the movie Stealth. The research here reminds me of Terminator; I, Robot; and Eagle Eye. The report covers more issues than the Terminator scenario.

BTW, you know that liquid metal stuff in Terminator? Science is working towards a similar goal right now! And they’ve had some preliminary success.

marty.com.au/science-fiction-to-fact/16-new-tech/88-self-assembling-plastic-chips-first-step-towards-liquid-metal-terminator-robots.html

That’s what I’ve been wondering about. This report seems to have been put together by actual experts and researchers in this field (I could, of course, be wrong about that), so I have to wonder why they’d express concerns and publish a report like this if the problem doesn’t really exist or is otherwise unlikely. It is not like the authors are Luddites or want to smash the evil computers.

The military has also funded research into pheromonal gay bombs, telepathy, and weaponized halitosis. The only thing surprising about their coming up with an apocalyptic robot fantasy to wank over is that they didn’t do it when killer robots first got big in the movies.

They’ve also funded research that has been productive and useful. And I would not be surprised to find out that nonmilitary and nongovernment sources have funded some seemingly silly stuff. That doesn’t really address the report or why apparently reputable scientists involved in this field would express concerns such as these if those concerns are nonexistent or unrealistic.

There were ‘reputable scientists’ involved in the studies I mentioned as well. That doesn’t make the ideas any less silly in retrospect.

Worrying about the rise of the machines is completely unrealistic until the exact day someone creates a sentient, learning AI and puts it in charge of an assembly line. Creating a truly autonomous artificial intelligence will require a quantum leap in our understanding of several fields: mathematics, computer science, psychology, materials science, neurology, perhaps even physics itself. Until then any seemingly malicious behavior on the part of machines is actually programmer error or malice.

I am not an AI specialist, but I am a programmer, and when one of my creations screws up it’s my fault (or that of the people who wrote the operating system), not the computer hobgoblins’.
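To put a concrete face on “programmer error, not hobgoblins,” here is a deliberately contrived little Python example (purely illustrative, not from any real system): a robot that seems to “refuse to stop” for obstacles, where the culprit is nothing more sinister than a unit mix-up.

SAFE_DISTANCE_M = 2.0  # stop if an obstacle is closer than 2 metres

def should_stop(sensor_reading_cm):
    # Bug: the rangefinder reports centimetres, but the threshold is in
    # metres, so 150 (cm) is compared against 2.0 (m) and the robot
    # keeps driving toward the obstacle.
    return sensor_reading_cm < SAFE_DISTANCE_M

print(should_stop(150))  # False: 1.5 m away, and it does not stop

def should_stop_fixed(sensor_reading_cm):
    # The fix: convert units before comparing.
    return (sensor_reading_cm / 100.0) < SAFE_DISTANCE_M

print(should_stop_fixed(150))  # True: now it stops in time

To an onlooker the robot appears to be ignoring its safety rule; to the programmer it is a one-line mistake, found and fixed in minutes of testing.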

And I am sure some things that seemed silly at first later turned out to be correct.

To me it seems good policy to discuss the possibility and how we might avoid it before we actually start creating the kind of robots that might one day pose a threat.

Of that I have no doubt.

From what I gather from the article and what little bits of the report I’ve read, it seems that these scientists are trying to avoid making mistakes with their creations.

If the money could be found to fund research on fleas on the backs of dogs used as spies, they would research it. Follow the money. :shrug:

I just want a robot butler, is that so much to ask for?

The technology already exists. The Japanese robot Asimo can walk, run and use stairs. New and better AIs are being developed. It would be easy to put a gun in the hand of an Asimo-style robot and give it the ability to shoot at anything that moves (a capability already deployed in other robotic units for border patrol). So I would take the ONR report seriously, because similar systems are being developed right now. And why send a man when a humanoid robot can walk into enemy territory and, at the very least, start shooting ahead of the human troops right behind it?

Peace,
Ed

As Edwest has pointed out, development of butler robots is currently underway. A quick Google search reveals a couple of articles that claim Gecko Systems has developed a personal care robot for the elderly, the sick, and the very young: geckosystems.com/. I have a robot acquired from Radio Shack that can pick up light objects for me, navigate its surroundings, and play games with me. It is a very small, very cheap, low-end robot, compared to what is out there. Today, I can get robots to mow my lawn (see link in my previous post), sweep my floor, mop my floor, clean my shop floor, clean my pool, and clean my gutters: store.irobot.com/shop/index.jsp?categoryId=2804605. Robots are going to get smarter over time, more sophisticated, and possibly more dangerous, if we are not cautious now.

Have you seen them? My husband works in robotics. He has helped put into space the ones that are working there now. He worked on the ones that can go into reactors and make repairs so humans don’t have to go in. The ones that are sent into coal mines and caves to search for things or people are also available now. Bomb squads can use robotics to keep their people safer.

BUT a human-like robot that can maintain its balance in the normal everyday situations that humans encounter is still nowhere near perfected. They require heavy battery packs that put them off balance, or they are hard-wired to a power source that limits their movements.

OTOH there have been great strides in “cyborg” medical devices. These can help people with physical disabilities do some things that would have been impossible in the past. The main problem is rejection of the integrated parts.

I don’t think anyone here, at least not me, has said there are going to be human-like robots marching across the battlefield in a few years. I think the whole point of the linked report is that they are a very real possibility within our lifetimes, and so we must start a serious discussion about them now. We are moving in the direction of more and more military robots, and those robots will become more and more advanced as time passes.
