Cory Doctorow: Demon-Haunted World

Cheating is a given.

Inspectors certify that gas-station pumps are pumping unadulterated fuel and accurately reporting the count, and they put tamper-evident seals on the pumps that will alert them to attempts by station owners to fiddle the pumps in their favor. Same for voting machines, cash registers, and the scales at your grocery store.

The basic theory of cheating is to assume that the cheater is ‘‘rational’’ and won’t spend more to cheat than they could make from the scam: the cost of cheating is the risk of getting caught, multiplied by the cost of the punishment (fines, reputational damage), added to the technical expense associated with breaking the anti-cheat mechanisms.
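Put as arithmetic, the cheater’s calculus looks something like the sketch below (the figures are invented purely for illustration):

```python
# Back-of-the-envelope model of a "rational" cheater's decision.
# All numbers are made up for illustration.
p_caught = 0.05              # estimated chance an inspector catches the scam
punishment = 1_000_000       # fines plus reputational damage, in dollars
anti_cheat_expense = 50_000  # cost of defeating the seals and other protections
expected_gain = 250_000      # what the scam is expected to bring in

cost_of_cheating = p_caught * punishment + anti_cheat_expense
print("cheat" if expected_gain > cost_of_cheating else "stay honest")
```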

Software changes the theory. Software – whose basic underlying mechanism is ‘‘If this happens, then do this, otherwise do that’’ – allows cheaters to be a lot more subtle, and thus harder to catch. Software can say, ‘‘If there’s a chance I’m undergoing inspection, then be totally honest – but cheat the rest of the time.’’
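The whole trick fits in a single branch. Here’s a minimal sketch of the pattern; the detection heuristic is hypothetical, standing in for whatever tell a given cheat relies on:

```python
def probably_being_inspected():
    # Stand-in for whatever tell the cheat relies on: a GPS fence around a
    # testing lab, a steering pattern that looks like a dyno run, and so on.
    return False

def report_measurement(true_value):
    # Behave when someone might be watching; shave a little off otherwise.
    if probably_being_inspected():
        return true_value        # totally honest under scrutiny
    return true_value * 0.98     # quiet penny-shaving the rest of the time
```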

This presents profound challenges to our current regulatory model: Vegas slot machines could detect their location and, if they believe they are anywhere near the Nevada Gaming Commission’s testing labs, run an honest payout. The rest of the time, they could get up to all sorts of penny-shaving shenanigans that add up to millions at scale for the casino owners or the slot-machine vendors (or both).

Even when these systems don’t overtly cheat, software lets them tilt the balance away from humans and towards corporations. The Nevada Gaming Commission sets the payout schedule for slot machines, but it doesn’t regulate the losses. This allows slot-machine vendors to tune their machines so that a losing spin is much more likely to look like a ‘‘near miss’’ (a lemon and two cherries, paying zero; three cherries pays a jackpot). The machine looks like it’s doing the same thing with a win or a loss, but losses are actually fine-tuned performances of a near-win, designed to confound your intuition about how close victory might be.
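Here is a minimal sketch of that kind of tuning, with an invented bias figure; the payout (a loss) has already been decided by the schedule, and the code only chooses what the loss looks like:

```python
import random

NEAR_MISS_BIAS = 0.7  # invented figure: how often a loss is dressed up as a near-miss

def display_for_loss():
    # The payout (zero) is already fixed; this only picks the performance
    # shown to the player.
    if random.random() < NEAR_MISS_BIAS:
        return ["cherry", "cherry", "lemon"]   # tantalizingly close, pays nothing
    return ["lemon", "bell", "seven"]          # an ordinary, unambiguous loss
```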

Software makes for a much more dangerous set of cheats, though. It’s one thing to be cheated by a merchant’s equipment: there are only so many merchants, and to operate a business, they have to submit themselves to spot inspections and undercover audits by secret shoppers.

But what happens when the things you own start to cheat you? The most famous version of this is Volkswagen’s Dieselgate scandal, which has cost the company billions (and counting): Volkswagen engineered several models of its diesel vehicles to detect when the engine was undergoing emissions testing and to tilt the engines’ performance in favor of low emissions (which also meant more fuel consumption). The rest of the time, the engines defaulted to a much more polluting mode that also yielded better gas mileage. Thus the cars were able to be certified as low-emissions by regulators and as high efficiency by reviewers and owners – having their cake and eating it too.

Dieselgate killed people, but the cheating in the Dieselgate scandal was still aimed at government inspectors. The next level of cheating comes when systems try to fool independent researchers.

A celebrated recent example of this came with the Wannacry ransomware epidemic. Wannacry is a piece of malicious software that uses a variety of vectors to find and infect vulnerable hosts; once it takes root, Wannacry encrypts all its victims’ files and demands a Bitcoin ransom in exchange for the decryption key. In May 2017, Wannacry had a resurgence after it was harnessed to a leaked NSA cyberweapon that made it much more virulent.

But within days of that resurgence, Wannacry was stopped dead in its tracks, thanks to the discovery and deployment of a ‘‘killswitch’’ built into the software. When Wannacry took over a new computer, the first thing it did was check to see whether it could get an answer when it tried to open a web-session to <iuqerfsodp9ifjaposdfjhgosurijfaewrwergwea.com>. If there was a web-server at that address, Wannacry ceased all operations. By registering this domain and standing up a web-server that answered to it, a security researcher was able to turn off Wannacry, everywhere in the world, all at once.

A casual observer may be puzzled by this killswitch. Why would a criminal put such a thing in their software? The answer is: to cheat.

The greatest risk to a program like Wannacry is that a security researcher will be able to trick it into infecting a computer under the researcher’s control, a ‘‘honey pot’’ system that is actually a virtual machine – a computer program pretending to be a computer. Virtual machines are under their owners’ perfect control: everything the malicious software does within them can be inspected. Researchers use virtual machines like cyberpunk villains use VR: to trap their prey in a virtual world that is subjectively indistinguishable from objective reality, an Inception-style ruse that puts the malware under the researcher’s omnipotent microscope.

These head-in-a-jar virtual machines are often configured to pretend to be the entire internet as well. When the malware caught within them tries to reach a distant web-server, the researcher answers on that server’s behalf, to see if they can trick the malware into attempting to communicate with its master and so reveal its secrets.

Wannacry’s author tried to give their software the ability to distinguish a honey-pot from the real world. If the software’s attempt to contact the nonexistent domain was successful, then the software knew that it was trapped in a researcher’s lab where all queries were duly answered in an attempt to draw it out. If Wannacry got an answer from <iuqerfsodp9ifjaposdfjhgosurijfaewrwergwea.com>, it folded into a protective, encrypted foetal position and refused to uncurl. Registering the domain and standing up a web-server there tricked every new Wannacry infection in the world into thinking that it was running on a honey-pot system, so they all stopped working.
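The whole killswitch amounts to a single pre-flight probe. Here’s a minimal sketch of the decision; it models the logic only, not the real malware’s Windows networking internals:

```python
import urllib.request

KILLSWITCH_URL = "http://iuqerfsodp9ifjaposdfjhgosurijfaewrwergwea.com"

def should_run():
    # If the "nonsense" domain answers, assume we are inside a researcher's
    # honey-pot (or, as it turned out, that someone registered the domain)
    # and stand down; only proceed if the request fails.
    try:
        urllib.request.urlopen(KILLSWITCH_URL, timeout=5)
        return False   # got an answer: cease all operations
    except OSError:
        return True    # no answer: looks like the real internet
```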

Wannacry was a precursor to a new kind of cheating: cheating the independent investigator, rather than the government. Imagine that the next Dieselgate doesn’t attempt to trick the almighty pollution regulator (who has the power to visit billions in fines upon the cheater): instead, it tries to trick the reviewers, attempting to determine if it’s landed on a Car and Driver test-lot, and then switching into a high-pollution, high-fuel-efficiency mode. The rest of the time, it switches back to its default state: polluting less, burning more diesel.

This is already happening. MSI and Asus – two prominent vendors of computer graphics cards – have repeatedly been caught shipping reviewers hardware whose software had been sped way, way up (‘‘overclocked’’), past the safe operating speed. These cards will run blazingly fast for the duration of the review process and a little while longer, before burning out and being rendered useless – but that will be long after the reviewers have returned them to the manufacturer. The reviewers advise their readers that these cards are much faster than competing cards, and readers shell out top dollar and wonder why they can’t match the performance they’ve read about in the reviews.

The cheating can be closer to home than that.

You’ve probably heard stories of inkjet cartridges that under-report their fill-levels, demanding that you throw them away and replace them while there’s still plenty of (precious and overpriced) ink inside of them. But that’s just for starters. In 2016, HP pushed a fake security update to millions of Officejet owners, which showed up as a routine ‘‘You must update your software’’ notification on their printers’ screens. Running that update installed a new, secret feature in your printer, with a long fuse. After six months’ wait, the infected printers all checked to see whether their ink cartridges had been refilled or manufactured by third parties, and refused to print with any ink that HP hadn’t given its corporate blessing to.
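In spirit, the behavior described above boils down to a time bomb plus an authenticity check. The sketch below is hypothetical; the field names are invented, and the six-month fuse is taken from the description above, not from HP’s actual (proprietary) firmware:

```python
import datetime

def allow_printing(cartridge, update_installed_on, today):
    # Hypothetical model of the behaviour described above: lie low for six
    # months, then start vetting cartridges.
    fuse_burned = today >= update_installed_on + datetime.timedelta(days=180)
    if not fuse_burned:
        return True                    # behave normally; nothing looks amiss
    # After the fuse: refuse refilled or third-party ink.
    return cartridge["made_by_hp"] and not cartridge["refilled"]
```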

HP is an egregious cheater, and this kind of cheating is in the DNA of any company that makes its living selling consumables or service at extremely high markups – they do their business at war with their customers. The better the deal their customers get, the worse the deal is for the manufacturer, and so these products treat customers as enemies, untrusted parties who must be tricked or coerced into installing new versions of the manufacturer’s software (like the iTunes and Kindle ‘‘updates’’ that have removed features the products were sold with) and using only the manufacturer’s chosen consumables.

The mobile phone industry has long been at war with its customers. When phones were controlled primarily by carriers, they were designed to prevent customers from changing networks without buying a new phone, raising the cost of taking your business elsewhere. Apple wrested control back to itself, producing a phone that was locked primarily to its app store, so that the only way to sell software to an iPhone user was to give up 30% of the lifetime revenue that customer generated through the app. Carriers adopted custom versions of Android that locked customers to their networks, with shovelware apps that couldn’t be removed from the home-screen and app-store lock-in that forced customers to buy apps through their phone company.

What began with printers and spread to phones is coming to everything: this kind of technology has proliferated to smart thermostats (no apps that let you turn your AC cooler when the power company dials it up a couple degrees), tractors (no buying your parts from third-party companies), cars (no taking your GM to an independent mechanic), and many categories besides.

All these forms of cheating treat the owner of the device as an enemy of the company that made or sold it, to be thwarted, tricked, or forced into conducting their affairs in the best interest of the company’s shareholders. To do this, these devices run programs and processes that attempt to hide themselves and their nature from their owners, and from their owners’ proxies (like reviewers and researchers).

Increasingly, cheating devices behave differently depending on who is looking at them. When they believe themselves to be under close scrutiny, their behavior reverts to a more respectable, less egregious standard.

This is a shocking and ghastly turn of affairs, one that takes us back to the dark ages. Before the Enlightenment, before the scientific method and its peer review, science was done by alchemists, who worked in secret.

Alchemists – like all humans – are mediocre lab-technicians. Without peer reviewers around to point out the flaws in their experiments, alchemists compounded their human frailty with bad experimental design. As a result, an alchemist might find that the same experiment would produce a ‘‘different outcome’’ every time.

In reality, the experiments lacked sufficient controls. But again, in the absence of a peer reviewer, alchemists were doomed to think up their own explanations for this mysterious variability in the natural world, and doomed again to have the self-serving logic of hubris infect these explanations.

That’s how alchemists came to believe that the world was haunted, that God, or the Devil, didn’t want them to understand the world. That the world actually rearranged itself when they weren’t looking to hide its workings from them. Angels punished them for trying to fly to the Sun. Devils tricked them when they tried to know the glory of God – indeed, Marcelo Rinesi from The Institute for Ethics and Emerging Technologies called modern computer science ‘‘applied demonology.’’

In the 21st century, we have come full circle. Non-human life forms – limited liability corporations – are infecting the underpinnings of our ‘‘smart’’ homes and cities with devices that obey a different physics depending on who is using them and what they believe to be true about their surroundings.

What’s worse, 20th century law puts its thumb on the scales for these 21st century demons. The Computer Fraud and Abuse Act (1986) makes it a crime, with jail-time, to violate a company’s terms of service. Logging into a website under a fake ID to see if it behaves differently depending on who it is talking to is thus a potential felony, provided that doing so is banned in the small-print clickthrough agreement when you sign up.

Then there’s section 1201 of the Digital Millennium Copyright Act (1998), which makes it a felony to bypass the software that controls access to a copyrighted work. Since all software is copyrightable, and since every smart gadget contains software, this allows manufacturers to threaten jail-terms for anyone who modifies their tractors to accept third-party carburetors (just add a software-based check to ensure that the part came from John Deere and not a rival), or changes their phone to accept an independent app store, or downloads some code to let them choose generic insulin for their implanted insulin pump.

The software in gadgets makes it very tempting indeed to fill them with pernicious demons, but these laws criminalize trying to exorcise those demons.

There’s some movement on this. A suit brought by the ACLU attempts to carve some legal exemptions for researchers out of the Computer Fraud and Abuse Act. Another suit brought by the Electronic Frontier Foundation seeks to invalidate Section 1201 of the Digital Millennium Copyright Act.

Getting rid of these laws is the first step towards restoring the order in which things you own treat you as their master, but it’s just the start. There must be anti-trust enforcement with the death penalty – corporate dissolution – for companies that are caught cheating. When the risk of getting caught is low, stiffer penalties are the best hedge against bad action. The alternative is toasters that won’t accept third-party bread and dishwashers that won’t wash unauthorized dishes.

Making better computers won’t solve the world’s problems, but none of the world’s problems are ours to solve for so long as the computers we rely on are sneaking around behind our backs, treating us as their enemies.


Cory Doctorow is the author of Walkaway, Little Brother, and Information Doesn’t Want to Be Free (among many others); he is the co-owner of Boing Boing, a special consultant to the Electronic Frontier Foundation, a visiting professor of Computer Science at the Open University and an MIT Media Lab Research Affiliate.


This article and more like it in the September 2017 issue of Locus.


One thought on “Cory Doctorow: Demon-Haunted World”

  September 3, 2017 at 2:28 am

    This is an excellent article which demonstrates the real danger of Dieselgate. VAG’s (and others’) cheating is potentially far more damaging and widespread than would appear at first sight. If unchecked, it undermines centuries of the scientific method.

    One minor point: the Dieselgate cheat was there to protect the engine from damage during normal driving. The vehicle operates in low-NOx mode for the short duration of the test; once on the road, the system reverts to high-NOx mode, improving the reliability and durability of components and reducing warranty claims.

