Three Laureates Walk on a Stage… and Start Debating Technology
At this point in human history, society and technology are intertwined more than ever before. Sure, you could argue the plow was an impactful piece of technology at its time, but the immense range of technologies we have access to nowadays is unprecedented. From the phones we always carry around to the smart cars we drive and the vaccines that can get us through the pandemic, technology has truly become an indispensable part of our day-to-day lives – and at the core of this technology are mathematics and computer science.
Three people at the core of important advances in these fields are Vint Cerf, Leslie Lamport and Joseph Sifakis – all three ACM A.M. Turing Award recipients. They were joined by Vicki Hanson, CEO of the ACM.
9th Heidelberg Laureate Forum. From left: Joseph Sifakis, Leslie Lamport, Vinton G. Cerf, Vicki Hanson. 19.09.2022. Photo: Bernhard Kreutzer for HLFF
Algorithms and the internet
Leslie Lamport was awarded the prize for fundamental contributions to the theory and practice of concurrent systems. His work helped define concepts such as causality, safety, and liveness in such systems. But Lamport says he takes a different approach from most of his peers: he doesn’t look at such algorithms as a mathematical challenge, but rather as a physical one.
“If you look at my work on algorithms, there’s a notion of a physicality in them,” Lamport mentioned.
Lamport is also the developer of a program used by students and researchers all around the world: LaTeX. LaTeX is a document preparation system built around a descriptive markup language: instead of wrestling with visual layout, writers describe what each part of the document is, and the system takes care of how it looks. Over the years, LaTeX has probably made the lives of millions of students and scholars easier.
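For readers who have never used it, a minimal document shows the descriptive-markup idea in action (a toy illustration, not from the panel): the author marks what things are, and LaTeX decides the fonts, numbering, and spacing.

```latex
\documentclass{article}
\begin{document}

\section{Concurrency}  % marked as a section; LaTeX handles numbering and style
The \emph{happened-before} relation orders events,
written \( a \rightarrow b \).  % inline math, typeset automatically

\end{document}
```

The author never specifies a font size or indentation; changing the document class restyles everything consistently, which is much of why the system has aged so well.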
“I wasn’t surprised people used it in the beginning, but I’m amazed that it’s still popular and no one has come up with anything better,” Lamport joked.
Cerf, who is widely celebrated as one of the “fathers” of the internet, co-designed the TCP/IP protocols and the architecture of the internet. Switching the internet on was a big challenge, Cerf says, but keeping it going in realistic situations was even more challenging.
The laureate mentioned 1983 as the “birth date” of the internet. Work on the internet’s precursor, ARPANET, started back in the late 1960s, but it was only in January 1983 that version four of TCP/IP was installed on ARPANET.
The US Department of Defense loved TCP/IP, making it the standard for all military computer networking in 1980. But for Cerf and his colleagues, this success also brought a problem.
ARPANET computers were designed to work in controlled environments, connected with cables and housed in air-conditioned rooms. Suddenly, the military wanted to use them on submarines and planes, Cerf recalls, which was a whole new ballgame. Thankfully, he and his colleagues were able to find robust technical solutions that enabled the proto-internet to take off, paving the way for the protocols and technologies the internet runs on today.
Meanwhile, Joseph Sifakis worked on model checking, helping develop it into an effective verification technology that is now widespread in the hardware and software industries. There’s no praise like the praise of your peers, and Lamport took the time to praise Sifakis for his contributions.
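The core idea of model checking is surprisingly simple to sketch: exhaustively explore every reachable state of a finite model of a system and check a property in each one. The toy example below (my own illustration, nowhere near the sophistication of the industrial tools Sifakis’s work led to) searches the states of a naive two-process lock and finds the race that breaks mutual exclusion.

```python
from collections import deque

def check_invariant(init, next_states, invariant):
    """Explicit-state model checking in miniature: breadth-first search
    over all reachable states, testing a safety property in each one."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s  # counterexample: a reachable bad state
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None  # the invariant holds in every reachable state

# Toy system: two processes that check a shared lock, then take it.
# State: (pc0, pc1, lock); pc 0 = idle, 1 = saw lock free, 2 = in critical section.
def next_states(state):
    pc0, pc1, lock = state
    succ = []
    for who in (0, 1):
        pc = (pc0, pc1)[who]
        if pc == 0 and lock == 0:
            new_pc, new_lock = 1, lock      # read the lock as free
        elif pc == 1:
            new_pc, new_lock = 2, 1         # take the lock, enter critical section
        elif pc == 2:
            new_pc, new_lock = 0, 0         # leave, release the lock
        else:
            continue
        succ.append((new_pc, pc1, new_lock) if who == 0 else (pc0, new_pc, new_lock))
    return succ

# Safety property: the two processes are never both in the critical section.
mutual_exclusion = lambda s: not (s[0] == 2 and s[1] == 2)
bad = check_invariant((0, 0, 0), next_states, mutual_exclusion)
```

Here the checker returns a reachable state with both processes in the critical section, because each can observe the lock as free before either takes it. Real model checkers fight the combinatorial explosion of states with far cleverer techniques, but the exhaustive, counterexample-producing spirit is the same.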
“When Joseph and the others were developing model checking, I wasn’t interested in it, I was interested in proofs. But in the year 2000, a colleague built a model checker for a language that I developed, and I started using it and I was blown away. It’s marvelous, it’s a wonderful idea, I never had the idea of how far you can go with this. So Joseph, a very belated thank you.”
Nowadays, Sifakis has moved on to systems design and autonomous systems, focusing especially on self-driving cars.
Security and bricks
But every coin has two sides.
Technology brought us the internet, but it also brought us security challenges we’ve never faced before. It brought us fast cars and planes, but also climate change. Technology is a tool, but our use of the tool is perhaps not always ideal. At the panel, the discussion inevitably flowed to security.
Why was the internet so unsafe, Hanson asked the panel – and a debate was quickly ignited.
Cerf argued that security challenges arise from the very moment you start your browser and it begins interacting with third parties.
“What’s the first thing a browser does? It goes out to some computer and picks up an HTML or XML file and then it interprets it. Great, we just got some piece of software from another place we don’t know and now we’re running it on our computer, holy moly!”
But Lamport argued this is not an innate problem of the internet; it’s a design decision made out of practicality, not for any technical reason.
“Well, why does it open it on your computer? Because it’s a cheap and dirty way of doing things, because there were strong commercial interests to get functionality there without doing an impractical amount of effort,” Lamport mentioned.
Sifakis also believes that a completely secure internet may be theoretically possible but is impossible in practice, because the models required would be unmanageably complex. He added that even the theory behind such a hypothetical, completely secure internet is debatable.
The discussion seemed to echo some of the debates we’ve had during the pandemic, revolving around the idea of acceptable risk. If we want the internet to be functional and fast and cheaply available, we need to make a security compromise – just like the pandemic forced us to make some compromises to maintain a functional society.
Of course, the debate centers on where exactly this acceptable risk lies, and there is no definitive answer on this one. But one thing’s for sure: good practice can make the internet more secure, but it cannot make it completely secure.
“Well, if you want something really secure, I can give you a brick. It won’t do anything, but it will be secure,” Cerf quipped.
The laureates also debated some of the ethical and security risks associated with novel technologies and algorithms, most notably artificial intelligence and machine learning. But Cerf argued that, while it’s important to pay attention to those, non-AI algorithms also pose a great deal of risk:
“There’s been an enormous amount of attention paid to AI and machine learning and great concerns about ethics, but my big worry, frankly, is not so much about machine learning and our dependence on it,” Cerf mentioned.
We rely on things like smartphones and algorithms too uncritically, Cerf continued, and that’s potentially hazardous – particularly when we leave all the responsibility to the private sector. Even with something as simple as a camera, users often have no control over the software that runs on it. There are no software updates, the companies that make these products are often bought out or disappear from the market, and users are left running software with potential security vulnerabilities. We should probably pay more attention to these non-AI security hazards, Cerf emphasized.
Sifakis argued that states and national or international agencies should work to regulate this, but that it is a gargantuan challenge on both the technological and the policy side.
Meanwhile, Lamport drew parallels between technology design and algorithm design. Too often, there’s not sufficient intelligent design in algorithms, and instead, there’s a lot of evolution – imperfect things go out in the world and then they’re tweaked and improved, which can have important, cascading repercussions.
At the end of the discussion, there were more open questions than at the beginning, but perhaps this is the key aspect: that we have conversations about both the benefits and the challenges that technology brings. Being able to attend a panel of laureates doing just that is a real treat.