Yesterday's post caused a bit of a furor in the comments thread. A large number of people leaving comments (and others) didn't understand why the OS division has a "no Easter Eggs" policy.
If you think about it, it's not really that surprising. One of the aspects of Trustworthy Computing is that you can trust what's on your computer. Part of that means that there's absolutely NOTHING on your computer that isn't planned. If the manufacturer of the software that's on every desktop in your company can't stop their developers from sneaking undocumented features into the product (even features as relatively benign as an Easter Egg), how can you be sure that they've not snuck some other undocumented feature into the code?
Even mandating that you have access to the entire source code to the product doesn't guarantee that - first off, nobody in their right mind would audit all 10 million+ lines of code in the product before deployment, and even if you DID audit the source code, that wouldn't prove anything - Ken Thompson made that quite clear in his Turing Award lecture, "Reflections on Trusting Trust". Once you've lost the trust of your customers, they're gone - they're going to find somewhere else to take their business.
And there are LOTS of businesses and governments that have the sources to Microsoft products. Imagine how they'd react when they discovered the Easter Egg code - especially after being told that it was a "Special Surprise" for our users. Their only reaction would be to wonder what other "Special Surprises" were in the code.
It's even more than that. What happens when the Easter Egg has a security bug in it? It's not that implausible - the NT 3.1 Easter Egg had a bug in it. The Easter Egg was designed to be triggered when someone typed in I LOVE NT, but apparently it could also be triggered by any anagram of "I LOVE NT" - as a result, "NOT EVIL" was also a trigger.
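To see how a trigger check like that can go wrong, here's a minimal sketch. The actual NT implementation isn't public, so this is purely an assumed illustration: a check that compares the multiset of letters rather than the exact string accepts every anagram of the secret, not just the intended phrase.

```python
def is_trigger(typed: str, secret: str = "I LOVE NT") -> bool:
    """Buggy trigger check (hypothetical reconstruction).

    It compares the sorted letters of the input (ignoring spaces)
    against the sorted letters of the secret, so ANY anagram of
    the secret phrase fires the Easter Egg.
    """
    def normalize(s: str) -> list[str]:
        return sorted(s.replace(" ", "").upper())
    return normalize(typed) == normalize(secret)

print(is_trigger("I LOVE NT"))  # the intended trigger fires
print(is_trigger("NOT EVIL"))   # an unintended anagram fires too
print(is_trigger("I LOVE XP"))  # different letters, no trigger
```

The fix, of course, is a plain exact-string comparison - but the broader point stands: unreviewed vanity code gets exactly this kind of sloppy logic.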
Going still further, Easter Eggs are perceived as a symptom of bloat, and lots of people get upset when they find them. From Adequacy.org:
Now if you followed the link above and read the article you may be thinking to yourself...
- Is this what MS developers do when they should be concentrating on security?
- How often do they audit their code?
- What's to stop someone from inserting malicious code?
- Is this why I pay so much for Windows and MS Office?
- I know other non-MS software contains EEs but this is ridiculous.
- One more reason why peer review is better as EEs and malicious code can be removed quickly.
- Is this why security patches take so long to be released?
- This is TrustWorthy Computing!?!
From technofile:
Even more disturbing is the vision of Microsoft as the purveyor of foolishness. Already, the cloying "Easter eggs" that Microsoft hides away in its software -- surprise messages, sounds or images that show off the skill of the programmers but have no usefulness otherwise -- are forcing many users to question the seriousness of Microsoft's management.
A company whose engineers can spend dozens or even hundreds of hours placing nonsensical "Easter eggs" in various programs would seem to have no excuse for releasing Windows with any bugs at all. Microsoft's priorities are upside down if "Easter egg" frills and other non-essential features are more important than getting the basic software to work right.
From Agathering.net:
"and some users might like to know exactly why the company allows such huge eggs to bloat already big applications even further"
I've been involved in Easter Eggs in the past - the Exchange 5.0 POP3 and NNTP servers had easter eggs in them. In our case, we actually followed the rules - we filed a bug in the database ("Exchange POP3 server doesn't have an Easter Egg"), we had the PM write up a spec for it, the test lead developed test cases for it. We even contacted the legal department to determine how we should reference the contingent staff that were included in the Easter Egg.
But it didn't matter - we still shouldn't have done it. Why? Because it was utterly irresponsible. We didn't tell the customers about it, and that was unforgivable, ESPECIALLY in a network server. What would have happened if there had been a buffer overflow or other security bug in the Easter Egg code? How could we POSSIBLY explain to our customers that the reason we allowed a worm to propagate on the internet was because of the vanity of our developers? Why on EARTH would they trust us in the future?
Not to mention that we messed up. Just like the NT 3.1 Easter Egg, we had a bug in our Easter Egg, and we would send the Easter Egg out in response to protocol elements other than the intended ones. When I was discussing this topic with Raymond Chen, he pointed out that his real-world IMAP client hit this bug - and he was more than slightly upset at us for it.
It's about trust. It's about being professional. Yeah, it's cool seeing your name up there in lights. It's cool when developers get a chance to let loose and show their creativity. But it's not cool when doing it costs us the trust of our customers.
Thanks to Raymond, KC and Betsy for their spirited email discussion that inspired this post, and especially to Raymond for the awesome links (and the dirt on my broken Easter Egg).
Edit: Fixed some HTML weirdness.
Edit2: s/anacronym/anagram/