Allow me to apologize in advance for today’s off-topic post, which has nothing to do with crypto. Consider it a reflection on large organizations’ ability to manage and protect sensitive data without cryptography. Report card: not so good.
Some backstory. You probably remember that last year sometime Toyota Motors had a small amount of trouble with their automobiles. A few of them, it was said, seemed prone to sudden bouts of acceleration. Recalls were issued, malfunctioning throttle cables were held aloft, the CEO even apologized. That’s the part most people have heard about.
What you probably didn’t hear too much about (except maybe in passing) was that NASA and NHTSA spent nearly a year poring over the Toyota engine control module code to figure out if software could be at fault. Their report, issued in February of this year, basically said the software was OK. Or maybe it didn’t. It’s not really clear what it said, because major portions (particularly of the software evaluation section) were redacted.
Now, like every major redaction of the digital age, these redactions were done thoughtfully, by carefully chopping out the sensitive portions using sophisticated digital redaction software, designed to ensure that the original meaning could never leak through.
Seriously, as is par for the course in these things, NHTSA just drew big black squares over the parts they wanted to erase.
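To see why drawing squares doesn’t work, here’s a toy model (not real PDF syntax, just a sketch of the idea): in a PDF, the black rectangle and the text it covers are separate objects in the page’s content stream. Covering text visually does nothing to remove it from the file, so any text extractor — including plain copy-and-paste — recovers it.

```python
# Toy model of "black box" redaction. The dictionaries below are
# illustrative stand-ins for PDF content-stream objects, not real
# PDF structures.

page_content = [
    {"type": "text", "value": "the sensitive finding"},
    # The "redaction": a filled rectangle drawn on top of the text.
    {"type": "rect", "fill": "black", "covers": (0, 0, 400, 20)},
]

def extract_text(content):
    # A text extractor walks the content stream and collects text
    # objects; it never consults the rectangles drawn over them.
    return " ".join(obj["value"] for obj in content if obj["type"] == "text")

print(extract_text(page_content))  # prints: the sensitive finding
```

Proper redaction tools delete the text objects themselves (or rasterize the page) before publishing; drawing an opaque shape only changes what’s rendered, not what’s stored.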
And this is where we get to the sophistication of organizations when it comes to managing sensitive data. You see, NHTSA released these reports online in February 2011, as a response to a FOIA request. They were up there for anybody to see, until about April — when somebody noticed that Google was magically unredacting these documents. Whoops. Time to put up some better documents!
Naturally NHTSA also remembered to contact Archive.org and ask that the old reports be pulled off of the Wayback Machine. No, really, I’m just kidding about that, too.
Of course, they’re all cached there for all to see, in their full unredacted glory. All it takes is a copy and paste. For example, take this portion:
Where the redacted part decodes to:
The duty % is converted into three boolean flags, a flag describing the sign of the duty, a flag if the absolute value of the duty is greater than or equal to 88%, and a flag if the absolute value of the duty is less than 1%. The 64 combinations of these flags and their previous values are divided into ten cases. Of the ten cases, five will open the throttle, two of the five will make the throttle more open than currently but not wide open, two will provide 100% duty instantaneously, and one will perpetually open the throttle. Any duty command from the PID controller greater than or equal to 88% will perpetually open the throttle and lead to WOT [wide open throttle]. This also means that any duty greater than 88% will be interpreted by the hardware as a 100% duty command.
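The flag logic in that passage is concrete enough to sketch. Here’s a minimal, hypothetical rendering of just the first step — the conversion of a duty percentage into three boolean flags — with names and the 88%/1% thresholds taken from the quoted text; everything else (the function, its signature) is my own illustration, not the actual ECM code:

```python
def duty_flags(duty_pct):
    """Convert a signed duty-cycle percentage into the three boolean
    flags described in the report: the sign of the duty, whether
    |duty| >= 88%, and whether |duty| < 1%."""
    negative = duty_pct < 0
    at_least_88 = abs(duty_pct) >= 88.0
    below_1 = abs(duty_pct) < 1.0
    return (negative, at_least_88, below_1)

# Three current flags plus their three previous values give
# 2 ** 6 = 64 combinations, matching the report's "64 combinations
# of these flags and their previous values".
print(duty_flags(90))    # (False, True, False) -- this is the WOT-triggering case
print(duty_flags(-0.5))  # (True, False, True)
```

The troubling part the redaction hid is then plain: once the `at_least_88` flag is set, the case analysis perpetually opens the throttle, so a PID output of 88% is treated the same as 100%.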
So what’s the computer security lesson from this? Once data’s out on the wire, it’s out there for good. People need to be more careful with these kinds of things. On the bright side, this was just information, possibly even information that might be useful to the public. It’s not like it was sensitive source code, which I’ve also seen find its way onto Google.