3

From The Independent:

For Ibrahim, life was bleak.

Bedbound and paralysed as the result of a car accident more than half a year ago, the Saudi Arabian had not had a visitor from his family in months and was in desperate need of life-changing surgery that he could not afford.

Alone and isolated, Ibrahim found little joy in his life.

That was until he posted a tweet aimed at encouraging one of the few followers he had to come and visit as he lay paralysed in his bed.

With his tweet he hoped that at least one sympathetic person might hear his call and provide some much-craved company.

What he did not expect was the social media storm that it would create.

Within one day Ibrahim’s tweet became the most retweeted message in Saudi Arabia’s Twitter history.

The hashtag #VisitIbrahim circulated through social media circles and within just 24 hours it had been retweeted over 200,000 times.

Not only that, but Ibrahim got a lot more than he bargained for when hundreds of people from all over Saudi Arabia came to visit him.

Clutching flowers, pizza and other gifts, hundreds of people queued outside the King Khalid University Hospital in the country’s capital, Riyadh, all hoping to meet Saudi Arabia’s latest social media star.

The hospital became so busy that officials had to impose a temporary ban on visitors, as the number of people in the building was affecting the work of its staff.

Yet footfall and flowers were not the only benefits of the tweet: the Saudi Twitter community was so compelled by Ibrahim’s story that its members contributed financially towards the life-changing surgery he dreamed of.

The surgery, which would cost $130,000 and involve Ibrahim travelling to Germany, was quickly covered by generous donors from the oil-rich Middle Eastern country.

Pictures showing visitors posing with, talking to and even feeding the paralysed man were posted on Twitter.

It is now hoped a trip to Germany can be organised soon so that Ibrahim can get the surgery he so badly needs.

I am my own cancer.
—  // I control my own deadline. This cancer that’s in my mind is killing me bit by bit, so slowly, often in ways that go unnoticed.
4

When the Dominator is aimed at a target, it continuously reads the target’s psychological data and sends it to the Sibyl System for calculation. When the calculated value exceeds a certain level (indicating that the target is mentally unstable and therefore more likely to commit a violent crime), the gun is then able to fire. The gun has three modes: Non-Lethal Paralyser, Lethal Eliminator, and Destroy Decomposer.
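
The mechanism described above amounts to a threshold check that unlocks a firing mode. Here is a minimal sketch of that logic in Python; the cut-off values, the select_mode function and the non_human_threat flag are illustrative assumptions rather than anything stated above.

```python
from enum import Enum

class Mode(Enum):
    LOCKED = "trigger locked"
    NON_LETHAL_PARALYSER = "Non-Lethal Paralyser"
    LETHAL_ELIMINATOR = "Lethal Eliminator"
    DESTROY_DECOMPOSER = "Destroy Decomposer"

# Assumed cut-off values: the text only says the gun fires once the
# calculated value exceeds "a certain level".
PARALYSER_THRESHOLD = 100
ELIMINATOR_THRESHOLD = 300

def select_mode(calculated_value: float, non_human_threat: bool = False) -> Mode:
    """Map the value calculated from the target's psychological data to a firing mode."""
    if non_human_threat:                       # assumed trigger for the third mode
        return Mode.DESTROY_DECOMPOSER
    if calculated_value >= ELIMINATOR_THRESHOLD:
        return Mode.LETHAL_ELIMINATOR
    if calculated_value >= PARALYSER_THRESHOLD:
        return Mode.NON_LETHAL_PARALYSER
    return Mode.LOCKED                         # below the level: the gun cannot fire

if __name__ == "__main__":
    for reading in (42, 180, 340):
        print(reading, "->", select_mode(reading).value)
```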

Ethical trap: robot paralysed by choice of who to save

Can a robot learn right from wrong? Attempts to imbue robots, self-driving cars and military machines with a sense of ethics reveal just how hard this is

CAN we teach a robot to be good? Fascinated by the idea, roboticist Alan Winfield of Bristol Robotics Laboratory in the UK built an ethical trap for a robot – and was stunned by the machine’s response.

In an experiment, Winfield and his colleagues programmed a robot to prevent other automatons – acting as proxies for humans – from falling into a hole. This is a simplified version of Isaac Asimov’s fictional First Law of Robotics – a robot must not allow a human being to come to harm.

At first, the robot was successful in its task. As a human proxy moved towards the hole, the robot rushed in to push it out of the path of danger. But when the team added a second human proxy rolling toward the hole at the same time, the robot was forced to choose. Sometimes, it managed to save one human while letting the other perish; a few times it even managed to save both. But in 14 out of 33 trials, the robot wasted so much time fretting over its decision that both humans fell into the hole. The work was presented on 2 September at the Towards Autonomous Robotic Systems meeting in Birmingham, UK.
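
The article does not say how the robot decides whom to intercept. The toy sketch below shows one simplified way such a choice could be made: for each candidate action, crudely predict how many proxies would be lost and pick the least-bad option. This is not Winfield’s actual controller, and every name, position and speed in it is an assumption for illustration.

```python
from __future__ import annotations

from dataclasses import dataclass

HOLE_X = 0.0        # the hole sits at x = 0 on a 1-D track (assumed layout)
SAVE_RADIUS = 0.3   # close enough to push a proxy off the danger path
ROBOT_SPEED = 0.15  # assumed speeds, in arbitrary units per time step
PROXY_SPEED = 0.10

@dataclass
class Proxy:
    name: str
    position: float  # signed distance from the hole

def predict_losses(robot_pos: float, target: Proxy | None, proxies: list[Proxy]) -> int:
    """Crudely predict how many proxies fall in if the robot commits to `target`."""
    losses = 0
    for p in proxies:
        time_to_fall = abs(p.position - HOLE_X) / PROXY_SPEED
        if target is not None and p.name == target.name:
            # Crude assumption: robot and targeted proxy close on each other,
            # so their speeds are simply added (geometry is ignored).
            gap = max(abs(p.position - robot_pos) - SAVE_RADIUS, 0.0)
            time_to_reach = gap / (ROBOT_SPEED + PROXY_SPEED)
            if time_to_reach <= time_to_fall:
                continue         # predicted to be rescued in time
        losses += 1              # predicted to fall into the hole
    return losses

def choose_action(robot_pos: float, proxies: list[Proxy]) -> Proxy | None:
    """Pick the target whose predicted outcome loses the fewest proxies."""
    options: list[Proxy | None] = [None] + list(proxies)
    return min(options, key=lambda t: predict_losses(robot_pos, t, proxies))

if __name__ == "__main__":
    a, b = Proxy("A", +4.0), Proxy("B", -4.5)
    choice = choose_action(robot_pos=1.0, proxies=[a, b])
    print("intercept:", choice.name if choice else "no one")
```

When both single-target predictions come out equally bad (one proxy lost either way), a rule like this offers no principled way to prefer one target over the other; the min call above simply breaks the tie arbitrarily, which hints at why a real robot might hesitate or flip between options.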

Winfield describes his robot as an “ethical zombie” that has no choice but to behave as it does. Though it may save others according to a programmed code of conduct, it doesn’t understand the reasoning behind its actions. Winfield admits he once thought it was not possible for a robot to make ethical choices for itself. Today, he says, “my answer is: I have no idea”.

As robots integrate further into our everyday lives, this question will need to be answered. A self-driving car, for example, may one day have to weigh the safety of its passengers against the risk of harming other motorists or pedestrians. It may be very difficult to program robots with rules for such encounters.
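
As a rough illustration of the trade-off being described, the sketch below scores a handful of candidate manoeuvres by a weighted sum of estimated risks and picks the lowest. The manoeuvres, risk numbers and weights are all invented, and choosing those weights is precisely the hard part the paragraph points to.

```python
from __future__ import annotations

# Estimated probability of serious harm for each manoeuvre (invented numbers).
CANDIDATES = {
    #               (risk to passengers, risk to pedestrians / other motorists)
    "brake hard":   (0.10, 0.02),
    "swerve left":  (0.03, 0.20),
    "hold course":  (0.01, 0.40),
}

PASSENGER_WEIGHT = 1.0
OTHERS_WEIGHT = 1.0  # how to set these weights is exactly the hard question

def least_harmful(candidates: dict[str, tuple[float, float]]) -> str:
    """Return the manoeuvre with the lowest weighted expected harm."""
    return min(
        candidates,
        key=lambda m: PASSENGER_WEIGHT * candidates[m][0]
        + OTHERS_WEIGHT * candidates[m][1],
    )

if __name__ == "__main__":
    print("chosen manoeuvre:", least_harmful(CANDIDATES))
```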

But robots designed for military combat may offer the beginning of a solution. Ronald Arkin, a computer scientist at Georgia Institute of Technology in Atlanta, has built a set of algorithms for military robots – dubbed an “ethical governor” – which is meant to help them make smart decisions on the battlefield. He has already tested it in simulated combat, showing that drones with such programming can choose not to shoot, or try to minimise casualties during a battle near an area protected from combat according to the rules of war, like a school or hospital.
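
Arkin’s actual algorithms are not described here, so the sketch below only illustrates the general idea of a pre-fire check: an engagement is withheld if the target lies inside a zone protected by the rules of war, such as a school or hospital. The zone data, distances and function names are assumptions for illustration.

```python
from __future__ import annotations

import math
from dataclasses import dataclass

@dataclass
class ProtectedZone:
    name: str
    x: float
    y: float
    radius: float  # protected radius in metres (assumed)

# Hypothetical map data; a real system would draw this from verified sources.
PROTECTED_ZONES = [
    ProtectedZone("school", 120.0, 80.0, 200.0),
    ProtectedZone("hospital", -300.0, 40.0, 250.0),
]

def permission_to_engage(target_x: float, target_y: float) -> tuple[bool, str]:
    """Withhold an engagement if the target lies inside any protected zone."""
    for zone in PROTECTED_ZONES:
        if math.hypot(target_x - zone.x, target_y - zone.y) <= zone.radius:
            return False, f"withheld: target is inside the protected {zone.name} zone"
    return True, "permitted under the encoded constraints"

if __name__ == "__main__":
    for tx, ty in [(100.0, 60.0), (900.0, -500.0)]:
        _, reason = permission_to_engage(tx, ty)
        print((tx, ty), "->", reason)
```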

Arkin says that designing military robots to act more ethically may be low-hanging fruit, as these rules are well known. “The laws of war have been thought about for thousands of years and are encoded in treaties.” Unlike human fighters, who can be swayed by emotion and break these rules, automatons would not.

"When we’re talking about ethics, all of this is largely about robots that are developed to function in pretty prescribed spaces," says Wendell Wallach, author ofMoral Machines: Teaching robots right from wrong. Still, he says, experiments like Winfield’s hold promise in laying the foundations on which more complex ethical behaviour can be built. “If we can get them to function well in environments when we don’t know exactly all the circumstances they’ll encounter, that’s going to open up vast new applications for their use.”

This article appeared in print under the headline “The robot’s dilemma”
