Thursday, August 14, 2008

Same as it ever was

Interesting story today in Technology Review, "How (Not) to Fix a Flaw." MIT students found security flaws in the Boston subway payment system, and they did what appears to be the honorable thing: rather than exploit them, they documented their discovery and attempted to bring it to the attention of others. The transit authority preferred to keep the flaws quiet while it tried to fix them, so it moved to silence the students. So, the desire for control confronts the threat of disclosure.

Wasn't it ever so? Every couple of days I see a message on my machine saying it's found 'updates' it considers essential. If I ask why I should install them, I get a fuzzy explanation that amounts to: don't worry your little brain about this; we know what's best. How different it would be if the message said instead, "We've found a bug we created in the software you're running. An uninitialized variable causes the application to freeze, requiring you to close and restart it. This patch contains the fix." I'd love the honesty, and strangely, I'd also give the software company more credibility just because it risked owning up to its mistakes. Even if they knew about the problem but didn't have a fix (as with the subway payment system), wouldn't it make sense to let others in on it and get more minds working on the solution?

We've gotta assume there are no secrets once a bug exists. Just because you don't acknowledge it, you think no one will notice? The people who earn a living exploiting that vanity can only be grateful.
