Freedom in a tech-driven world

Last week the student organizers (ISC) of the upcoming 50th St. Gallen Symposium came to Swiss Re for a day to present their findings and plans, and to meet with a number of Swiss Re leaders. Among them was Jeffrey Bohn, Swiss Re Institute's Chief Research & Innovation Officer.

The theme of this year's Symposium is "Freedom Revisited." When asked whether we're more free today than 50 years ago, when the St. Gallen University students first protested, the answers were varied. Yes, at first glance global society seems to be in a better place … but when you look below the surface, things get murky. Jeff Bohn put it this way: "Is there greater freedom compared to 50 years ago? That depends on the dimension, I'd say. We're certainly freer in terms of access, but not as much in terms of opportunities."

With regard to technology he put it bluntly, saying that "the jury's out" – whether technology will prove a blessing or a curse to freedom remains to be seen. Consider freedom in terms of social media: are you still free if trained bots trigger your dopamine levels without you even being aware of it? Are you still free if you've been manipulated into acting, often against your own best interests? Jeffrey Bohn expressed his belief that the tech innovators of today can create a freer world. However, he highlighted two things that need to happen for this to become our future reality – one is education, the other is regulation.

Big Data 2.0

Jeffrey Bohn wants to see the evolution to "Big Data 2.0", meaning curated data that is a) open and b) used by people educated to draw balanced conclusions. "There is inequality of wealth, but there is also inequality of data," he said. Today only a few understand the technology behind what is happening. "Right now, the gap is education. You don't need to become a programmer or software developer, but you need to understand the conceptual basis. And then, to some extent, you're inoculated to the results."

He made a very interesting suggestion: just as there is a "Like" button, he would like to see an "I've changed my mind" button. As people learn to understand technology and the basics of recognition and adaptation tools, they'll be able to counteract confirmation bias, see beyond the surface and make more educated decisions. One of the students asked the next obvious question: "Can't education be manipulated as well?" That certainly is a risk, Bohn agreed. There need to be guardrails against manipulation – and that's where regulation comes into play.

Tech is neutral

Technology is neutral – it becomes good or bad depending on how we use it. The challenge at this point is that technology moves at a pace too fast for regulators. Bohn shared an excellent example to illustrate how, in the past, we all managed to agree on the benefits of regulation to keep us safe: "Would you like to go back to the early 20th century with electricity, where you had no idea whether, if you plugged something in, it would blow up and catch on fire?"

This was indeed the world of inventors and consumers a hundred years ago – then regulation stepped in to mandate safety standards. We have granted institutions and governments the right to put regulations in place to protect us. Today we are in the same place with regard to high-tech inventions … yet no guardrails exist. "As with inventions back then, we need to do the same thing for data and new technologies. Software engineers and architects must be held to higher standards, same as other inventors."

There is no panacea

Bohn knows that neither education nor regulation will solve everything, but both are important steps. Better education leads to better understanding – and regulation can help put the right guardrails in place for a socially responsible future with machine intelligence. "Regulation won't be easy, and a lot of people argue that it will shut down innovation, but I think that much of that is just fear mongering." Today many people still think that the algorithms used in social media are just trying to reflect their preferences. While that was true at the beginning, algorithms have progressed. "What they're doing now is customizing your experience to your specific weaknesses. And that's a very different methodology that we should be worried about – because it basically means that you're getting designer drugs."

Bohn suggested the idea of a benevolent bot: one that, instead of manipulating you to buy something or maximizing dopamine triggers, would hang around in the background to put guardrails around behavior and make sure that you're not exposed, e.g., to fake news. That feels like something most people would want – and no doubt it would also need to be strongly regulated against potential manipulation.

This was such an interesting session. It seemed to flow effortlessly from the present to the future, and from technology to value systems. Technological advances are in a constant state of flux – and education and regulation must find ways to provide the required balance. As Jeff said: "These are the pieces we need to address if we really want to protect freedom."
