posted Jun 16, 2020
in Programming
For the last 20 years I've been programming almost exclusively in languages with either duck typing or structural typing. These are languages where objects can conform to an informal interface without that conformance ever being declared (dynamically-typed languages), or where conformance is simply a matter of matching the interface's shape (Go). This is as opposed to languages where conformance must be explicitly declared somehow, either with a formal subclass relationship (the class-based portion of C++) or by explicitly declaring conformance to an interface (Java).
Not needing to formally declare conformance is powerful because you can take some pre-existing object, declare a new interface that it conforms to, and then use it with your interface. This reduces the endless paperwork around re-wrapping things to make existing objects conform to new interfaces.
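For concreteness, here is a minimal Go sketch of what that looks like (Stringish and describe are names invented for this example): a standard-library type written long before our interface existed satisfies it with no wrapping and no declaration.

package main

import (
    "bytes"
    "fmt"
)

// Stringish is a brand-new interface we just invented; bytes.Buffer was
// written long before it existed, yet *bytes.Buffer conforms simply by
// having a matching String() method.
type Stringish interface {
    String() string
}

func describe(s Stringish) {
    fmt.Println("value:", s.String())
}

func main() {
    var b bytes.Buffer
    b.WriteString("hello")
    describe(&b) // no wrapper, no "implements" declaration needed
}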
I have seen a lot of programmers over the years express a concern: what if something accidentally conforms to an interface and causes some horrible bug? What if I have a Shoot() interface for a video game and some object wanders in that Shoot()s a real gun by accident? Isn't it safer to formally declare conformance?
In my 23 years of programming, I have never experienced this bug. I've also never heard of anyone else experiencing this. I've even said this in fora where I should have had someone pop up to correct me; a corollary to Cunningham's Law is that if you say something that should be controversial but nobody pops up to correct you, it must not be controversial after all.
My intuition also says this should be dangerous. But I've thought about it, and I can explain why it is much less dangerous than it appears.
I will use Go interfaces in this post for concreteness and simplicity. The argument holds for dynamic scripting languages too, but stating it precisely for them requires verbal gymnastics that would just clutter the discussion.
Consider the example I gave above, which is typical of the sort of concern I see. If we have an interface:
type Shooter interface { Shoot() }
the concern is, "What if we have code that expects Shoot() to be something harmless, but a truly harmful implementation accidentally gets in instead?"
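To make the concern concrete, here is a minimal sketch of the feared scenario (all type names are hypothetical): nothing in the language prevents an unrelated type with a matching Shoot() method from satisfying the interface.

package example

// The interface from above, repeated for completeness.
type Shooter interface {
    Shoot()
}

// PlayerCharacter comes from the game code; Shoot() is purely virtual.
type PlayerCharacter struct{}

func (p PlayerCharacter) Shoot() { /* render a muzzle flash */ }

// Turret comes from an entirely different codebase and drives real hardware.
// It satisfies Shooter purely because the method name and signature match.
type Turret struct{}

func (t Turret) Shoot() { /* fire a real gun */ }

func fireAll(shooters []Shooter) {
    for _, s := range shooters {
        s.Shoot() // the compiler will not object to a Turret ending up here
    }
}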
What is necessary for this result?
- The interface must be simple enough to accidentally implement. People raising this concern commonly reach for zero-parameter methods whose names happen to be homonyms. But in practice, most methods aren't that simple. Consider:
package somegame

type Shooter interface {
    Shoot(target game.Entity)
}
and
package military_weapon

type Shooter interface {
    Shoot(target sensor_ai.IdentifiedTarget)
}
These interfaces are almost certainly still missing arguments in both cases, but overlap is now impossible in Go, and in dynamic languages it's almost certain that the two kinds of targets will be so incompatible that the code will crash. It would take a highly contrived example for the two target types to overlap enough that the code calling Shoot(...) could function with either.
- The objects have to get mixed up. Even if we stipulate a program in which we have both a video game Shoot() and a military-grade Shoot() (while this specific example may sound silly, less silly instances of it are more feasible), they still have to cross paths. Even in the sloppiest, oldest, cruftiest dynamic-language programs you can find, with the weirdest and silliest metaprogramming you can find, data still tends to have a lane that it stays in, because, in general, computer programs can't deal with undifferentiated "stuff"; they need structure to operate on, and even in dynamic languages, if the wrong sorts of structures get to code that expects other sorts of structures, things crash. This level of confusion requires more than an accidental typo; it requires serious structural mistakes in the program.
That is, you can't just typo your way into accidentally passing the wrong Shoot() around, because it's already weird that you'd have a function that has one of each in it in the first place. That itself requires more errors or weird circumstances. Code that is dealing with TCP socket protocol details isn't going to have random StreetAddress instances wander in, even if they both can Send. Code dealing with harmonizing the three computed answers from the subprocessors to decide how to control the RocketEngine won't have random Rapier objects wandering in, even if they can both Thrust.
- Something bad has to actually happen. Another common concern is built on the ever-popular animal examples: what if I declare things that can Quack and do something duck-y with them, and somebody accidentally implements Quack on something else?
Well, for this to be some sort of bug, you have to have something undesirable happen as a result. In this particular situation, it's likely that whatever is happening is desirable, not an error. It's even more likely that it is desirable if the interface in question is Quack(output animalsound.Emitter, duration animalsound.Duration, volume animalsound.Volume). If the "accidental" implementation of that method does in fact exist, it's constrained by the interface to the point that it's probably a fine implementation anyhow.
Even a simple interface like Go's io.Writer, which is just about simple enough to implement by accident, is probably not a problem if it is "accidentally implemented". If you "accidentally" implement Write([]byte) (int, error), odds are it will still do something at least semi-sensible when used as an io.Writer.
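As a sketch of that point (CountingSink is a made-up type): suppose someone writes a byte counter with no thought of io.Writer at all. It satisfies the interface anyway, and using it as an io.Writer still does something perfectly reasonable.

package main

import (
    "fmt"
    "io"
)

// CountingSink was (hypothetically) written just to tally bytes fed to it,
// with no intention of being an io.Writer. It satisfies the interface anyway.
type CountingSink struct {
    Total int
}

func (c *CountingSink) Write(p []byte) (int, error) {
    c.Total += len(p)
    return len(p), nil
}

func main() {
    sink := &CountingSink{}
    // Used "accidentally" as an io.Writer, it still behaves sensibly.
    var w io.Writer = sink
    fmt.Fprintf(w, "hello, %s\n", "world")
    fmt.Println("bytes written:", sink.Total)
}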
Now, follow me carefully here: All three of these are on the edge of plausibility. Since programming is a large space, rare events still occur with some regularity, and I'm sure that many people can cite instances of these three things occurring in their personal experience.
However, the combination of all three of them is exceedingly rare. I've never seen it. I've never heard of anyone experiencing it. I suspect someone, somewhere probably has seen it, and just hasn't participated in the conversations I've been in.
I'm not claiming the combination of these three things is impossible. It isn't. My point is that it is exceedingly rare, and it should be treated as such. It is not something one should spend precious, precious design budget on avoiding in a language's design. The benefits of decoupling interface declarations from their conformance are substantial and pay back on a daily basis; giving that up to avoid substantially less than one bug per average programming career, while also forcing a lot of extra design work into the programs written in the language, is as awful a trade as you can get in the programming world.