Robots: Friend Or Foe?

Humans have long strived to create machines with heart, and those at the forefront of robotics are edging closer to such a breakthrough. Whether the relationships we form with robots in the coming years are one-way or eventually reciprocal, there is no doubt that these machines could become as integral to our lives as a beloved family pet or a trusted colleague. But will we ever be able to build authentic rapport?

Brought to you by Deakin University’s This. To uncover more unique stories on contemporary topics, visit this.deakin.edu.au

Friend

When Big Hero 6’s Baymax waddled his way onto screens in 2014, he did more than entertain children and adults alike; he helped reset our perception of what robot friends might look like in the future. It’s easy to assume that Disney used plenty of creative licence to conjure up Baymax, yet the fictional animation reflects very real research into soft robotics.

For many of us, cuddly characters don’t usually spring to mind when we’re asked to imagine the robots of our near future, but while conceiving Baymax, Disney director Don Hall was inspired by a real inflatable arm design created by researchers in Pittsburgh, Pennsylvania. While Baymax, with his artificial intelligence, is far more advanced than anything that currently exists in the field of robotics, he does provide a hint of what’s to come.

Appealing tactile qualities have always been a consideration in technology design, so it makes sense that researchers will look to merge machinery with materials that don’t make us recoil. Robotic limbs made of balloon-style fabrics are plausible in the coming years, but genuine companionship could be a lifetime away, according to Deakin University’s expert in mechatronics, Dr Ben Horan.

Dr Horan predicts we will see robots providing personal assistance, helping elderly people in and out of bed, for example. “Helping someone out of bed might not be the most complex task, but it does require physical strength that a robot can provide,” he says.

In Japan, such a robot could be coming to hospitals and aged care facilities soon. Robear was created by scientist Toshiharu Mukai as a tool to assist the country’s ageing population, and its cuteness was a deliberate consideration. “Patients, especially old people, don’t like mechanical appearance. Patients need to feel that robots are their friends,” he told The Verge in April 2015. Robear is still too expensive to be commercially available, but he could become a common sight in the coming decades.

Helpful robots are already stepping (and rolling!) into our everyday lives. Consider iRobot’s Roomba, which will vacuum your floors for you every day. It’ll set you back about US$600, but what price can you put on having your very own robotic vacuum cleaner gobbling the dirt and dust from your floors while you kick back and relax?

Baxter, a robot designed to work alongside humans in factories, is among the most advanced of his kind. When we meet Baxter, he’s playing a casual game of Connect Four at Deakin’s Geelong Waurn Ponds Campus. Although Baxter was created in the US, anyone can teach him a new trick by writing software. Dr Horan says there was a gap in the market for a robot that could work with humans, and his team is exploring the kinds of tasks they can assign him.
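
To give a sense of what “teaching Baxter a new trick by writing software” can look like in practice, here is a minimal sketch, assuming a workstation with ROS and the openly available Baxter Research SDK (its Python baxter_interface library); the joint offset and the wave routine are purely illustrative.

```python
# Minimal sketch: make Baxter wave one arm between two joint poses.
# Assumes ROS plus the Baxter Research SDK's baxter_interface package
# and a network connection to the robot.
import rospy
import baxter_interface

rospy.init_node('baxter_wave_demo')       # register this script with ROS
baxter_interface.RobotEnable().enable()   # power up Baxter's motors

limb = baxter_interface.Limb('right')     # handle to the right seven-joint arm
neutral = limb.joint_angles()             # record the arm's current pose
wave = dict(neutral)
wave['right_s0'] += 0.3                   # offset the shoulder joint by ~17 degrees

for _ in range(3):                        # rock between the two poses
    limb.move_to_joint_positions(wave)
    limb.move_to_joint_positions(neutral)
```

A few lines like these are enough to script a simple behaviour; the harder part, as the researchers note below, is anything resembling cognition.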

Baxter’s low inertia, plastic- and foam-covered exterior and smiling face make him a gentle ally, but is he a threat to the workforce? Dr Horan is quick to point out that despite looking like a replacement for human factory workers, Baxter simply shifts the responsibility. “We still need humans to manage the robots. And a robot can be in a situation where it might get crushed, but we don’t lose a life.” Just don’t mention that near Baxter.


Rachael Oates, marketing and communications manager for Sage Automation, the Australian distributor of Baxter, says Haigh’s Chocolates’ Australian manufacturing facility was the first to put Baxter to work on a production line.

At $35,000, Baxter costs about as much as a minimum-wage factory worker earns in a year, and he doesn’t demand sick days, holiday pay, breaks or better conditions. He also represents a shift in the way humans might work in the future: rather than being bogged down with repetitive tasks, people are freed up to spend more time planning and evolving their businesses.

Deakin University PhD candidate Michael Mortimer suggests that although movies like Chappie and Ex Machina make human-robot relationships seem within the realm of possibility, robots’ ability to be our friends in a cognitive sense may never come to fruition.

“It’s one thing to make a robot do something, it’s another thing to instil cognitive behaviour and human emotion in it,” he says. In the artificial intelligence space, he explains, what really gives robots their edge is their memory and their ability to search for answers faster than humans can. The Hollywood take on artificial intelligence, he says, creates an illusion.

“Robots and products with the capacity for artificial intelligence aren’t really as sophisticated as one might be led to believe. In many instances searching a database looking for an answer can surpass current capabilities of onboard artificial intelligence,” Mortimer adds.

While robots don’t have the capacity for a two-way empathetic relationship, that doesn’t mean we can’t feel empathy for them.

When Google-owned robotics firm Boston Dynamics posted a YouTube video of its robotic dog, Spot, being kicked, there was an outcry. You can’t kick a dog, even if it is a robot, people said.

Although the seemingly aggressive act was used to destabilise the dog purely to highlight its ability to find its feet again, it raised a wider ethical dilemma for technology: do we need to respect things that can’t respect us back? Whether we’ve taken the time to think about it or not, the fact that people gasp or cringe when this little robotic animal stumbles is a sign that we already have an unspoken respect for the technology that makes our lives easier.

At the Defence Advanced Research Projects Agency (DARPA) robotics challenge in June 2015, robots created by tech companies competed to shut down a mock nuclear reactor in Pomona, California. Novelty aside, the crowd was audibly moved by the robots’ falls and triumphs along the way.

The challenge organiser, Gill Pratt, said in a statement, “We heard groans of sympathy when those robots fell. And what did people do every time a robot scored a point? They cheered. It’s an extraordinary thing.”

Dr Horan concurs, adding: “It’s not good for us as a society if we start disrespecting the technologies that we rely on.”

An ethical code of conduct towards robots might not exist yet, but there’s every chance that we’ll come to give robots the same kind of quiet love we give our iPhones and computers, in return for their thankless commitment to making our days easier.

Foe

Could humans be responsible for manufacturing their own enemy? If Avengers: Age of Ultron is to be believed, the artificial intelligence we create in robots could in turn be humanity’s undoing. In the Hollywood blockbuster, Ultron, an artificial intelligence built to protect the world, concludes that he must eradicate humanity. It’s the kind of science fiction that’s best served with popcorn, but it does raise concerns about the potential for autonomous killer robots to exist one day. Are we really ready for robots to make their own decisions?

In 2015, Defence Force personnel are already trained to use armed drones, flying robots sophisticated enough to kill at the push of a button. Deakin University’s mechatronics expert Dr Ben Horan says you needn’t run screaming through the streets for fear of killer robots just yet, though. Uncontrollable Terminator-style technology won’t be contributing to war efforts any time soon.

However, robots are fast becoming far more capable than ever before. Google-owned Boston Dynamics has developed the humanoid robot Atlas, and university teams have built rivals such as ESCHER. With the US military pouring billions of dollars into the research and development of these eerily agile machines, it’s no wonder people hold a genuine fear that autonomous robots could one day turn on us. And when this sort of technology is owned by one of the world’s most ubiquitous companies, there’s no assured limit to how far it could be taken.

Indeed, there are already cases of robots causing fatal harm. In July 2015, a 22-year-old Volkswagen factory worker was killed after a robot crushed him. Although a spokesperson said human error, not the robot, was to blame, someone must still be held accountable; reports indicate that prosecutors are considering whether criminal charges should be laid.

The Campaign to Stop Killer Robots, an international coalition, is a pre-emptive attempt to have an international ban placed on autonomous weapons. In a statement on its website, the coalition argues, “Giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology. Human control over any combat robot is essential to ensuring both humanitarian protection and effective legal control.”

In 2014, the coalition was present at a United Nations meeting in Geneva, arguing that there were important legal and moral considerations to address moving forward and pushing for human control over weapons. Not everyone agrees with its agenda, though. NBC News technology writer Keith Wagstaff said, “hysteria over the robopocalypse could hold back technology that would save human lives.”

Around the world, navies and other military organisations have been using automated weapons for years. Michael Mortimer, a PhD student at Deakin University, says extensive advances in artificial intelligence would need to occur before we saw a true dawn of the killer robots. He argues that military robotics, like any technological advance, is intended to be used for good, and that a future of killer robots inflicting mass destruction is unlikely.

But he does admit that while most robotic research is conducted with the intention of assisting humans, any technology in the wrong hands can become dangerous. “Whatever tool you give people, how they use it comes down to human psyche.”

There’s no need for Hollywood to over-dramatise the rise of drones. Semi-autonomous military aircraft are becoming increasingly prevalent, taking to the skies and battling through conflicts on our behalf. At a time when robotic technology is fast becoming a war imperative, you’d be right to feel a bit uneasy about these things hovering above you. Among the most advanced is the US Air Force’s Global Hawk, which can fly and spy for up to 30 hours non-stop.

However, Mortimer points out that drones like the Global Hawk ultimately have a human responsible for pressing the buttons. In fact, “they utilise a large number of operating staff to get it going”. Dr Horan adds, “the processes surrounding the operation are far beyond what perhaps one fighter jet pilot would have. There are so many processes behind the scenes before any decisions are made.”

The Australian Army now operates unmanned aerial vehicles to provide intelligence and surveillance. Dr Horan says this is no conspiracy. “In defence they’re looking at how to reduce the danger for Australian military personnel,” he points out. Tools are also being developed to assist developing nations where, in some cases, people still probe for landmines with basic implements such as sticks. Dr Horan says this shows the military isn’t the enemy. “Robots can go into hazardous locations and deactivate mines.”

But even if military robots are intended to be used for good, there is little doubt that, where wars are concerned, robots could be put to ill-intended purposes.

DARPA’s ARGUS-IS, a 1.8-gigapixel surveillance camera designed to fly aboard drones, can watch targets across a 25-square-kilometre stretch from 20,000 feet in the air; that’s like monitoring much of Manhattan at once through one lens. In the wrong hands, it’s not hard to imagine some worrying situations.
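
To put those figures in perspective, here is a rough back-of-envelope calculation using only the two numbers quoted above, and assuming the pixels are spread evenly across the covered area; it suggests each pixel corresponds to roughly a dozen centimetres of ground.

```python
# Back-of-envelope: what ground detail do the quoted ARGUS-IS figures imply?
pixels = 1.8e9                        # 1.8 gigapixels, as quoted above
area_m2 = 25 * 1_000_000              # 25 square kilometres in square metres

pixels_per_m2 = pixels / area_m2                  # ~72 pixels for every square metre
ground_per_pixel_m = (1 / pixels_per_m2) ** 0.5   # ~0.12 m of ground per pixel side

print(f"{pixels_per_m2:.0f} pixels per square metre")
print(f"~{ground_per_pixel_m * 100:.0f} cm of ground per pixel")
```

Resolution that fine is enough to pick out individual cars and track people as they move, which is exactly why the privacy questions below matter.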

In Australia, there are laws that protect a person’s privacy to a point, but in many cases, the technology is developing faster than the legislation. Despite that, Dr Horan says paranoia is not warranted. “There have been satellites with cameras in them for quite a while. People were worried about being watched through Google glasses,” he says. Even cameras in mobile phones give civilians their own surveillance capabilities, if they choose to use them.

But that’s hardly comparable to the rise of mass surveillance. “We do need to be conscious of the ethical considerations surrounding the use of robots. It falls under the umbrella of ethical considerations we apply to all technologies,” Dr Horan cautions.

A better application for surveillance droids is in the field of search and rescue. In the future, you can expect to see robots that look like BB-8 from the upcoming Star Wars film rolling into dangerous situations. Their real-world purpose would be to use a floating head and a small spherical body to get into tight spaces and collect data for humans to interpret.

It’s hard to be completely sure of what large tech companies and military groups are working towards when developments are so closely guarded, so for now we’ll have to trust that there’s nothing to fear until we’re given a reason to think otherwise. A lot of what we’re seeing in science fiction still remains just that, so it might be best to stay alert rather than alarmed. As aspects of the imagined futures we see in films begin to become reality, let’s hope that any artificially intelligent robots that do come to exist are trained to suppress any appetite for destruction.


