Artificial Intelligence – Who’s in Charge? The Machines or Us?

At the Herrenhäuser Gespräch on February 14, 2019, experts debated how we should deal with artificial intelligence: a discussion about privacy, control and power at a time when the machines around us are becoming ever more intelligent.

Human and robot in conversation
On February 14, 2019, AI was at the center of the Herrenhäuser Gespräch "What Artificial Intelligence Means for Us Human Beings" organized by the Foundation. (Photo: Halfpoint - stock.adobe.com)

Artificial intelligence controls the music and light in our homes, checks our tax returns and helps doctors detect cancer. But it can also write novels, paint pictures or even complete unfinished pieces of music. Intelligent machines have become a challenge for modern humankind.

For latecomers, there was standing room only in the Xplanatorium at Herrenhausen Palace. The 54th Herrenhäuser Gespräch was devoted to the topic "Die Maschine denkt, die Maschine lenkt? Was Künstliche Intelligenz (KI) für uns Menschen bedeutet" ("The machine thinks, the machine steers? What artificial intelligence means for us human beings"). NDR anchorman Ulrich Kühn talked with author Holger Volland, founder of the cultural festival THE ARTS+, cultural scholar Dr. Nathalie Weidenfeld from LMU Munich, social psychologist Prof. Dr. Nicole Krämer from the University of Duisburg-Essen, and computer scientist Tobias Krafft from the Algorithm Accountability Lab at TU Kaiserslautern. It quickly became clear that there are no simple answers: AI harbors as many opportunities as potential threats.

The new listeners in our living rooms

On the podium (from left to right):
Anchor Ulrich Kühn in discussion with (from left to right) Holger Volland, founder of the cultural festival THE ARTS+, cultural scholar Dr. Nathalie Weidenfeld from LMU Munich, social psychologist Prof. Dr. Nicole Krämer from the University of Duisburg-Essen and computer scientist Tobias Krafft from the Algorithm Accountability Lab at TU Kaiserslautern. (Photo: Nico Herzog for Volkswagen Foundation)

Whether in the form of chatbots or virtual assistants, artificial intelligence has long since found its way into our living rooms. That's where we're most likely to encounter it directly. Social psychologist Nicole Krämer has been researching human interaction with these new roommates for several years now. She says: "It takes intelligence to participate in a dialogue. But we can see that machines are not yet that advanced in this respect."

However, Krämer's research at the University of Duisburg-Essen confirms that even imperfect machines can simulate social behavior and thus influence our reactions: "Where there are social cues, we react socially. It doesn't take many cues. When you're greeted nicely and told: 'That's a great dress you are wearing', it makes you feel good." Even knowing that you have a machine in front of you doesn't change that. A machine is quite capable of influencing our emotions.

On stage from left to right: Holger Volland, Nathalie Weidenfeld and Nicole Krämer.
Holger Volland (left), Nathalie Weidenfeld and Nicole Krämer (right) agree that the "power over data" should not lie exclusively in the hands of corporations. (Photo: Nico Herzog for Volkswagen Foundation)

Cultural scholar Nathalie Weidenfeld fears for our privacy when machines are constantly listening: "There is a loss of privacy. It goes hand in hand with our deep-seated desire to express ourselves and with our selfie culture. I plead very strongly for our right to privacy."

Together with her husband, the philosopher Julian Nida-Rümelin, Weidenfeld published the book "Digital Humanism: An Ethic for the Age of Artificial Intelligence". Both argue for a calm attitude towards artificial intelligence, a pragmatic approach rather than ideological condemnation in one direction or the other.

Be accountable, computers!

Tobias Krafft and Ulrich Kühn on stage.
Tobias Krafft (left) explains his research at the Algorithm Accountability Lab at TU Kaiserslautern to the audience and to host Ulrich Kühn. (Photo: Nico Herzog for Volkswagen Foundation)

The advance of "smart algorithms" into all areas of life can no longer be stopped. For computer scientist Tobias Krafft, this means that algorithms must be made accountable to society. He views the development with ambivalence, and his current research backs up that view. If a clever virtual assistant hears its owner coughing and then recommends cough syrup, that's positive. But how far can a machine's autonomy be allowed to go?

Krafft asks: "What if virtual assistants infer the mood between married couples from the flow of their conversation and then point them to a divorce lawyer? Is that still good?" To be in a position to judge that, he suggests turning the tables: the algorithms would have to account to us. Many things are still unclear; the algorithms remain rather like black boxes. This is the subject of his research, which is supported by the Volkswagen Foundation under the funding initiative "Artificial Intelligence and the Society of the Future".

Holger Volland on stage next to Nathalie Weidenfeld.
Holger Volland (left) addressed society's "data dilemma" and the urgent need to raise awareness among people who use AI technologies. (Photo: Nico Herzog for Volkswagen Foundation)

Holger Volland believes that modern humans are not yet ready for the machines. "We have no protective mechanism in our brains to warn us: 'Attention, this is a robot. Everything you tell it goes into a company's database'." He is confident, though, that such abilities will develop one day: "It will take two or three generations before we know intuitively: even if something pretends to be nice and personal, it is not necessarily so." Perhaps humankind just needs a few more steps up the evolutionary ladder before it can steer the smart machines?

When AI turns to painting and writing

There is disagreement among experts on how the products of artificial intelligence should be judged. Do the machines merely simulate human activity, or can they really keep up? In his book "The Creative Power of Machines", Holger Volland examines, among other things, how machines write novels – and what such works are worth. For decades, scientists have been feeding machines texts by Shakespeare, for example, so that they can recognize patterns and perhaps write equally ingenious works themselves. Volland says: "This mostly results in bullshit. You still need a human, though, to curate the bullshit before you get anything worth reading."

Prof. Dr. Nicole Krämer on stage.
The social psychologist Nicole Krämer warns that succumbing to the convenience of virtual assistants can lead to the disclosure of personal data. (Photo: Nico Herzog for Volkswagen Foundation)

Nathalie Weidenfeld has her doubts that a machine can ever become an artist. In her understanding, art presupposes an intention, a need to communicate. "This desire is foreign to a machine, even if it produces something beautiful," she says. Volland takes a more pragmatic view: "Honestly, I don't think that's a big deal. We can often be taken in by artificial intelligence."

Who does AI belong to?

A central question of the evening revolved around who holds power over the data produced. The temptation to switch off the light by voice command from the sofa brings a danger with it: "Because it's so convenient, one forgets to ask whether this data might also be used elsewhere," says Krämer. And it's the same whether you are switching off the light by voice command or using a medical app.

The auditorium at Herrenhausen Palace during the discussion.
Artificial intelligence is a hot topic – around 400 people came to the Herrenhäuser Gespräch. (Photo: Nico Herzog for Volkswagen Foundation)

Volland adds: "Society faces a dilemma when dealing with data. Artificial intelligence can only function well if it has masses of data available. But the more data it has at its disposal, the more it can deduce about the individual. We want both. And that won't work."

Better to steer than be steered

Just because a new technology comes along doesn't mean you have to use it. We should not allow ourselves to succumb too readily to technological temptations, says Weidenfeld. Her plea: "Take back control. It's not up to technology to decide. We have to decide." She admits, however, that this is not so easy when the technology is already invisibly present in so many applications.

Volland left the audience with a plea to inform themselves and do their own research: "It's stimulating to investigate what developments are possible. It's also stimulating to be frightened by them. Research it for yourselves!"

Author: Jakob Vicari