Friday, 25 September 2015

VW’s cheating proves we must open up the Internet of Things

It’s been a rough year for the Internet of Things. Security researchers uncovered terrifying vulnerabilities in products ranging from cars to garage doors to skateboards. Outages at smart home services Wink and Google’s Nest rendered customers’ gadgets temporarily useless. And the Volkswagen emissions scandal, though not precisely an Internet of Things issue, has exposed yet another problem with “smart” physical goods: the possibility of manufacturers embedding software in their products designed to skirt regulations.

And those are only the most immediate concerns. The Internet of Things brings with it privacy concerns and compatibility headaches. There’s also the potential for the companies that make this stuff to go belly-up at any moment—as Wink’s parent company Quirky just did. In the worst case scenario, customers could be left with a house full of expensive, not-so-smart gadgets.

It’s enough to make you wonder whether it’s time to scrap the whole idea of smart things and get back to basics. After all, having to get out of bed to turn the heat down or switch off the lights is the ultimate First World problem. That could be part of why consumer interest in smart home products has been sinking, at least according to one report.

But the Internet of Things also holds tremendous potential to improve our health; make our cars safer and more efficient; and conserve both water and energy. The Internet of Things doesn’t have to be a nightmare of deceit, outages, and self-interested black boxes. To protect consumers and realize its true promise, the Internet of Things must go the way of the software and hardware that supports the Internet itself: it must open up.
The Safety of Objects

Today, the vast majority of smart home gadgets, connected cars, wearable devices, and other Internet of Things inhabitants are profoundly closed. Independent researchers can’t inspect the code that makes them run. You can’t wipe the factory-loaded software and load alternative software instead. In many cases you can’t even connect them to other devices unless the manufacturers of each product have worked out a deal with each other.

Ostensibly, this is for your own protection. If you can’t load your own software, you’re less likely to infect your car, burglar alarm, or heart monitor with a virus. But this opacity is also what helped Volkswagen get away with hiding the software it used to subvert emissions tests. It makes it harder to trust that your thermostat isn’t selling your personal info to door-to-door salesmen or handing it out to the National Security Agency.

One of the biggest ironies of the Volkswagen case is that the Environmental Protection Agency actually fought rules that could have made it easier for independent researchers to catch the company’s cheating. The EPA reasoned that making it easier for the public to experiment with the software that runs emissions systems would make it easier for consumers to circumvent pollution controls. Clearly that approach backfired. We can’t know for sure that researchers would have found the Volkswagen defeat device earlier if the software had been more open, but it surely wouldn’t have hurt.
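Volkswagen’s defeat device reportedly inferred when a car was on a test rig from signals such as steering position and wheel speed, and switched the engine into a cleaner calibration only then. A toy Python sketch, with entirely hypothetical names and thresholds, illustrates why this kind of condition-dependent behavior is so hard to catch without inspecting the code or testing under real-world driving:

```python
# Hypothetical illustration only: a dynamometer drives the wheels while
# the steering wheel stays centered, a combination rare on real roads.
# (Function names and thresholds here are invented for illustration.)
def looks_like_emissions_test(wheel_speed_kmh, steering_angle_deg):
    """Heuristic: wheels turning but steering dead-center suggests a test rig."""
    return wheel_speed_kmh > 10 and abs(steering_angle_deg) < 1.0

def engine_calibration(wheel_speed_kmh, steering_angle_deg):
    # Use the clean, low-NOx calibration only when a test is suspected;
    # revert to the higher-performance, higher-emission map otherwise.
    if looks_like_emissions_test(wheel_speed_kmh, steering_angle_deg):
        return "low-NOx test calibration"
    return "normal road calibration"
```

Because the software behaves perfectly whenever the test conditions hold, a lab-based audit sees only the clean branch; the dirty branch runs exclusively on the road.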

Critics could point to long-standing bugs in open source software as evidence that open source software can be less secure than the proprietary kind. After all, the Shellshock bug in Bash, a standard part of Linux and other open source operating systems, went undiscovered for 22 years. The problem, however, isn’t inherent in the openness; it’s that in many cases open source software has received less scrutiny from researchers because the financial incentive to do so just wasn’t there. No one was posting big bug bounties for Bash. By the same token, merely putting code out in the open doesn’t make it more secure. It needs to be examined by people who know what they’re doing. As the Volkswagen case shows, openness and vigilance go hand-in-hand.

Companies argue that publishing the code that powers their products will hurt their competitiveness in the marketplace. Competitors, they say, will be able to copy what they do. But there are differing degrees of openness. Writing about the Volkswagen fiasco, sociologist and New York Times contributor Zeynep Tufekci recently pointed out that while the code that powers casino slot machines isn’t open to the public, it is audited by regulators. The real trick, as Tufekci points out, will be in getting regulators more involved in auditing devices under real-world conditions, as opposed to labs. While open source zealots would argue that providing the software that cars run on only to regulators and the researchers they hire isn’t enough, it would certainly be more open than what’s going on today.

“It’s a pity that casinos have better scrutiny of their software than the code running our voting machines, cars, and many other vital objects, including medical devices and even our infrastructure,” Tufekci wrote.

(Wired)
