The heroes in the war on computer viruses
by Karl D. Stephan | January 22, 2018
Most computer viruses and bugs target particular operating systems, Windows above all, because it runs on the majority of PCs. So Mac users, although they occasionally suffer attacks of their own, usually breathe a sigh of relief every time a major PC-only virus hits the news.
But recently, you may have heard about a pair of bugs called Meltdown and Spectre that go for hardware, not software. In particular, Meltdown is a vulnerability associated with Intel processors made since 1995, and the dominance of Intel means Macs, PCs, and most you-name-it computers are potential targets. Spectre is reportedly even worse. But the key word here is "potential."
In an announcement, Apple claimed that no known malicious hacks have actually been committed using either of these bugs. And by the time the general public learned about them, the major computer and software makers were already well on their way to devising fixes, although the fixes may have their own drawbacks.
The reason no bad guys have apparently used these bugs is that they were discovered independently by computer researchers in Austria and the United States. And following a policy called "responsible disclosure," the researchers notified Intel that its chips were vulnerable to these bugs. So until now, apparently the criminal elements of the computer world either didn't know of the bugs or didn't use them.
I am not a computer scientist, but the technical details of how Meltdown happens are interesting enough to try to summarize. Apparently, some years back chip designers started using certain speculative tricks to speed up access to what is called "kernel memory."
If you think of the kernel as a little homunculus guy (call him the Kernel) sitting in the control room doing the computer math, the trick they were playing with the Kernel's memory amounts to having other homunculus-people in the room guess at what the Kernel's going to want to do next, and bring stuff out of memory so it can be waiting for him when he needs it. And all this stuff has to be secure from outside spying, so there are even security checks done way inside the control room there.
But Meltdown evidently exploits a small timing gap between the moment the contents of memory get there and the moment they are certified as secure. It's like some spy taking a picture of the secret document during the few seconds between its arrival in the room and the moment it's put into the "Top Secret" box. I'm sure some computer scientists are having a good laugh at my pitiful attempt to describe this thing, but that's the impression I got, anyway.
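For readers who want something slightly more concrete than homunculi, here is a toy Python sketch of the timing-gap idea. Every name in it is invented for illustration, and the "cache" and "timing" are simulated in plain Python; a real Meltdown attack measures actual cache latencies on real hardware, which this does not attempt.

```python
# Toy simulation of the cache-timing side channel behind Meltdown.
# NOT real exploit code: the cache, the secret, and the "timing"
# measurement are all simulated to illustrate the idea only.

SECRET = 42  # a kernel-memory byte the attacker should never see

def speculative_access(cache):
    """Simulate the CPU speculatively reading the secret and touching
    a probe-array slot indexed by its value, before the permission
    check cancels the read. The read's result is discarded, but the
    cache state (our side channel) keeps its footprint."""
    cache.add(SECRET)   # the slot probe[SECRET] becomes "cached"
    return None         # the forbidden read itself returns nothing

def recover_secret(probe, cache):
    """The attacker times access to every probe slot; the one slot
    that is 'fast' (cached) reveals the secret value, even though
    the secret was never read directly."""
    for i in range(len(probe)):
        fast = i in cache   # stand-in for measuring access latency
        if fast:
            return i
    return None

probe = [0] * 256           # one probe slot per possible byte value
cache = set()               # simulated cache: which slots are "hot"
speculative_access(cache)
leaked = recover_secret(probe, cache)
print(leaked)               # prints 42: leaked via timing alone
```

The point of the sketch is the last two lines: the attacker never touches the secret itself, only the pattern of what is fast or slow to access afterward, which is exactly the "spy photographing the document before it reaches the box" gap described above.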
So there are two ways to fix it: redesign the hardware or write a software patch and put it in upgrades. Obviously, if you're running older hardware, you're not going to rip out your Intel processor and put in the new one once they've designed the flaw out of it. So the only practical thing right now is installing software fixes, which evidently will be included in standard operating system upgrades for PCs and Macs.
Realistically, though, it appears that actually using these bugs to steal data is very tricky, and that is probably why nobody has discovered evidence that they've ever been used maliciously. But even if they haven't, everybody knows about them now, and so theoretically a non-upgraded Mac could be spied on without a trace. I'll put upgrading my OS on my to-do list for the new year, anyway.
This whole episode highlights the question of what computer researchers should do when they discover flaws that no one else had suspected. We can be grateful that Daniel Gruss and his colleagues at Austria's Graz University of Technology, and Jann Horn at Google's Project Zero, who discovered the bugs independently, did the responsible thing and informed Intel and company of the problems as soon as they found they could be exploited.
But it's not that hard to imagine what might have happened if some criminal group, or worse, a state bent on cyber-warfare, had discovered these flaws first. There are countries where highly advanced computer science research goes on, and where researchers would be encouraged not to notify the manufacturers in the US but to inform their government's military of such discoveries for use in future cyberattacks.
It's a little like imagining what World War II would have been like if Hitler hadn't chased away most of Germany's leading nuclear physicists, and had gotten hold of nuclear weapons before the Allies did.
Recently I saw "Darkest Hour," the film about Winston Churchill during the crucial days in May of 1940, as Hitler's armies were overwhelming continental Europe and Churchill accepted the post of Prime Minister of Great Britain. Things looked really bad at the time, and many powerful people advised him to give up the fight as hopeless and settle with Hitler before all was lost. But needless to say, Churchill made the right decision and rallied Parliament with his famous speech in which he declared "We shall never surrender."
It's easy to get all nostalgic over times when issues were more clear-cut, and the only kinds of military threats were physical things like guns, airplanes, and bombs. Not that World War II was a picnic—it was the worst self-inflicted cataclysm humanity has devised so far. And tragic times make heroes, as World War II made a hero of Churchill and millions of otherwise ordinary people who lived through that extraordinary time.
But we have similar heroes working among us even today. For every researcher and scientist who worked on nuclear weapons, radar, or other advanced military technologies back then, we have people like Gruss and Horn now who discover potential threats to the world's infrastructure and turn them over to those who will mitigate them, not exploit them for evil ends.
So here is a verbal bouquet of thanks to both them and other computer wonks who use their discoveries for good and not evil. May their tribe increase, and may we never have cause to watch a future reality-based movie about how some nasty computer virus killed thousands before the good guys figured out how to stop it.
Karl D. Stephan is a professor of electrical engineering at Texas State University in San Marcos, Texas. This article has been republished, with permission, from his blog Engineering Ethics which is a MercatorNet partner site. His ebook Ethical and Otherwise: Engineering In the Headlines is available in Kindle format and also in the iTunes store.