[2001-07-11]

The Breaking Revisited: Why I did it

"Gather the faithful and propose a toast to the epoch of indifferance.
An all too ordinary story
with an aftertaste so bitter, so bitter."
  -- "Ordinary Story", In Flames

Introduction

On the 11th of March, 2000, an essay called "The Breaking of Cyber Patrol(R) 4" was published on my then homepage. It was the result of weeks of joint work between me and fellow netizen Matthew Skala, and it described in technical detail the inner workings of the Internet filtering software Cyber Patrol.

What happened next was interesting, but that part of the story is best told by Matthew's Cyber Patrol break FAQ. What hasn't been told, though, is my explanation of why I did it. Matthew published his reasons shortly after the whole debacle, but I decided to hold back. Part laziness, part a naïve hope that people would get it without someone having to explain it in so many words.

Over a year has passed since then, and most people still don't get it.

Why I did it

I knew from the first feedback we got that people apparently couldn't understand why we did it. Why would we do something which we knew would expose innocent kids to the incredible horrors of the Internet? Let me first answer this fallacy with my tongue firmly in cheek: because it was, and still is, the right thing to do.

One of the things I've learned from all of this is that people have a great deal to say on matters of which they have little or no understanding. Oh, I know what you're thinking: "Hey, watch that pompous ass of yours..." -- but the fact of the matter is that many people -- most, even -- who have been in contact with me on this topic make their complaints and arguments seemingly without having read the source material, namely the essay. It's very hard to have a constructive dialogue with someone who isn't in possession of the relevant base facts.

So, why did I do it? I did it primarily to expose the product for the shabby piece of work it was, and the threat of a lawsuit that ensued is a testament to the quality of our work. We exposed the developers as incompetent, and that -- coupled with what they saw as a chance to get their brand some exposure and some good PR -- is why, my friend, the lawyers came to be involved.

The first fallacy: Obscurity works

One common objection goes something like "If parents want to install filtering software then it is up to them. To post ways of getting around filtering software is highly irresponsible and totally unjustifiable!".

Anyone who read, or at least browsed, the essay should see that what we did, we did not do against parents, but for them. In fact, I've always thought of it as consumer enlightenment. The purpose is to put the facts on the table so that parents, teachers, librarians and everyone else can make informed decisions about the product. This is anything but an attack on parents; this is about your rights.

To understand why this is so you really need to know a tiny bit about computer security, because at its core the essay is a technical review, and our full disclosure of how the password system worked is what many people seem to get so worked up about. Why did we do that?

Know this: in the realm of security there is a golden rule of sorts, and it goes like so: "Security through obscurity isn't."

A simple analogy follows. Postulate that a manufacturer of cars made claims about the safety of their product, claims which someone then proved were false. Say someone by lawful means gained access to such a car and had experts examine it, experts who discovered that the car's locking mechanism is so bad that anyone with a flat piece of iron can open it in seconds. Now, this is where we stand:

The manufacturer of course doesn't want this to come out; it would be bad press. They'd much rather it be kept a secret, with promises of improving things for the next series. They'll argue that releasing this information will set off a wave of car theft. Clearly, that would be undesirable. They might also opt to go on the offensive, misrepresenting the work of the experts and maybe even trying to convince the public that what the experts did when they analyzed the car was illegal, or at least ought to be.

The people who discovered the flaw, on the other hand, will argue, and rightly so, that security through obscurity isn't. It's simply a fact that we've learned over the years in the field of security. The argument goes that not telling the world means that consumers won't get the information they need to protect themselves by taking the appropriate action: changing the locks, purchasing add-on security, replacing the car, suing the manufacturer and forcing them to pay for fixing all the faulty cars, or whatever it may be. This choice, this consumer right and freedom, should not be restricted.

Not only that, but there is no guarantee that those who would want to misuse the information to steal cars wouldn't learn of it anyway. It is, after all, their business to know that sort of thing. Then where would we be?

How many cars would have to be broken into before someone would be in the "moral right" to speak up about the flaws? 10? 100? 100,000?

When the essay was released there were already known ways of bypassing Cyber Patrol out there, in the wild. Yes, it's true. Your kid could bypass Cyber Patrol long before our essay came along, and you wouldn't be any the wiser.

Didn't know that, did you? Still a believer in security through obscurity?

No, if you take the stance that what we did was wrong because it allowed people to bypass the filter, then you really should go to the producer of the filter and ask them "How come your product wasn't secured against these attacks in the first place?".

If they answer, let me know.

So again, I did it because the information would benefit consumers.

The second fallacy: If you're against X, then you're for Y.

I hold beliefs and opinions that most users of filtering software find, I am certain, absurd. For instance, my informed opinion is that filtering software vendors are akin to the old snake-oil peddlers of ages past, who went from town to town selling magic elixirs that would cure anything and everything. Of course, it only "worked" on the true believers.

As I've stated in my essays, I give no credence whatsoever to the hypothesis that children should be protected from knowing the facts of the world, lest they become horribly immoral and disturbed people. Quite the contrary, I believe strongly that the act of restriction in itself is what is immoral and unjust.

This leads me to the next complaint that I've seen, and that is the false conclusion that if you're against filtering, then you're for exposing children to porn/hate speech/evil.

I'm for bringing up kids to be enlightened individuals, equipped with a critical mind, a desire to learn and a healthy set of ethics to set them on their way. Hiding the nature of the world from them isn't going to make them enlightened, and leaving the filtering software to do your parenting for you probably won't foster much critical thinking either. And if you try to hide everything you find questionable, when and where will you have your discussions on ethics?

I believe in a middle way. You don't hide things, but you don't flaunt them either. You just explain them as they come up. Can't explain something? That's okay too. Just say that then.

But it ain't quite so simple...

No it isn't. My way is the hard way. It takes effort, thought and courage to raise children to be informed individuals, whereas installing filtering software only takes a chunk of money and a whole lot of belief in advertising.

So I did it because I don't believe in filters, and exposing them is the only way I know to spread the word. And again, let me remind you: there wouldn't be anything for people like me to expose if these products were developed using sound software-engineering practices to begin with.

The technocrat

I have a hackeresque personality. The number of points of contact between me and a typical hacker is quite astonishing. Understanding hackers would give you one reason as to why I did it: because there's a technical challenge in it. A puzzle. A problem to be broken into pieces and conquered. A chance to pit my brain against those of the developers, to understand what they did by reverse-engineering, and then, at a higher level, to try and understand why they did what they did -- something which may involve fields far from computer science.

So I did it for the challenge, and for the fun of it.

I did it because I could.

From there and beyond

Some comments. I had planned to release my part of the software under the GPL. I had never done anything like that, and even though I'd used Free Software for quite some time I wasn't "in the know" on just how one would release something using the license. I made a mental note about the licensing issue, to be discussed with Matthew prior to release.

Of course, as the final hours approached I was growing weary; we both were. I really wanted to get that thing out the door so that I could focus on other things (the irony, eh?). Some things were cut. I wanted to comment on the scheme used to register Cyber Patrol, but I had decided not to publish anything that could be used to pirate the program (after all, I don't want people to use these programs!), so that line of work was the first to be cut. A pity really, because had we gone that way we might have discovered the poor protection of credit-card information that was identified and published much later (see the FAQ for more info). The whole story could have been completely different with that info in the essay.

Anyway, I forgot to talk to Matthew about the licensing thing. I actually thought that my software contribution was under the GPL; that was my intention, and at that time I thought that just stating as much in the source somewhere would be enough. Yeah, I can be pretty naïve.

In the best of worlds Matthew and I would have discussed the licensing, and then clearly expressed our desire to put everything in the public domain. It is most unfortunate that this didn't happen.

Why didn't I stand up for my rights? Well, there was little to be gained, not much more than good personal PR. As I saw it, we'd already done what we'd set out to do. For the small price of seeing a stupid corporation gloat in their false victory, I got to move on to other things without risking running into deep financial trouble. See, at that time I had no money to defend myself with (not that I think it would have come to that, but the risk was there). To all those complaining about us caving in: why don't you replicate our work, then? You could have your very own circus, no?

Wanna hear the kicker? My lawyer told me that after a meeting with their reps in Sweden, they reportedly said something along the lines of "Tell Eddy he can contact us if he has any suggestions on how we can improve our product". Yeah, look what that got me into the first time! Really, when I heard this I was left speechless. Then I walked away in disgust.

So, to close things up, let me again emphasize the main reason why I did it: I wanted to expose the block-list. Read the essay, check the software; plainly, that is the topic on which they spend the most space. Passwords? A nice bonus. Block-lists? Essential.

Parting words

Regrets? Some. Would I do it again? Yeah. Would I do it differently? Oh, yes.

Signed,
Eddy L O Jansson.

"To save the world requires faith and courage: faith in reason,
and courage to proclaim what reason shows to be true.
" -- Bertrand Russell

©2001 Eddy L O Jansson. All rights reserved. All trademarks acknowledged.