Long time no post

Time to waste a little while airing my thoughts to a non-existent audience.

First, promises I made. The fanfic isn't advancing at all, I'm afraid; I've been busy with taxes and lots of other stuff. Xenosaga II is advancing. I'm in the sidequest part, and I dislike doing those (I'll just play a second time with a lot fewer annoying things to do), so it can be a bit tedious. Though a lot of them are no worse than those in FF X-2, and probably no worse than the stuff in FF VII (I suspect that although I did those ten times, I'd balk at doing it again at my age). The beard is still off, for practical reasons and because I got used to it. And there's no way I can screw up shaving in this state.

Reminds me of the best way to keep things clean: own as few things as you can manage. Works OK so far.

Other things... got my federal tax return back, so working on taxes instead of the fanfic paid off, I guess. In an élan of non-geekiness, I bought a $30 D-Link router instead of hacking one together from leftover computer parts. The amount of power eaten by all those moving parts in a regular computer made me balk. The D-Link's not really hackable, but it gets major good points for having a plain HTML interface. I don't know how solid the device is, but it's not being used for anything critical (my laptop isn't connected to it all the time), so even if it dies, it's no big deal.

Been following the Hacker's Diet's exercise program once more, reached rung 20 without too much trouble yesterday. I feel overall more alert. I recommend this program; its main attraction, to me, is that I cannot make excuses to skip an 11-minute routine, and since it's supposed to be daily, I can't think stuff like "I'll do it tomorrow..."

In my current conceited state of mind, I decided to commit to the electronic medium my thoughts on the rules every programmer should know. OK, so they're the rules I know, and they may not apply to every programmer; but I've been toying with the idea of writing some stuff on this someday, and, well, now's as good a time as any.

So, here it is: BGE's Killer Programming Rules... Of Justice.

If it ain't tested, it probably doesn't work
This can be seen as a corollary of the second law of thermodynamics. Code, left alone, "rots". Of course, this sounds silly; the code doesn't change, so how can it rot? Well, everything around it is changing: a new OS, new libraries, new runtimes; even the code around it may make that code malfunction. There are only two ways to ensure the code stays OK: active maintenance, which is not always practical (codebases tend to become really big!), or periodic testing. And "testing" does not mean sending it to a client and hoping it works. That's exactly when it won't (and that part's a corollary of Murphy's law).
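The cheapest form of "periodic testing" is a check that re-runs on every build. A minimal sketch, with a hypothetical `parse_positive` helper standing in for real code whose behaviour you want pinned down:

```cpp
#include <cassert>

// Hypothetical helper whose behaviour we want pinned down. Code
// without a check like this can silently break when the compiler,
// libraries, or surrounding code changes.
int parse_positive(const char* s) {
    int value = 0;
    for (; *s; ++s) {
        if (*s < '0' || *s > '9') return -1;  // reject non-digits
        value = value * 10 + (*s - '0');
    }
    return value;
}

// A tiny, re-runnable test: run it on every build so "rot" shows up
// as a failed assertion here instead of a client bug report.
void test_parse_positive() {
    assert(parse_positive("42") == 42);
    assert(parse_positive("4x2") == -1);
}
```

The point is not the framework (there isn't one here); it's that the test runs without a human deciding to run it.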
Just say no to protected data members
Protected data is t3h 3v1l. There is no telling what derived classes will do to it. Unless you want to be condemned to leave your base classes with the same implementation until the end of time (and that's not really feasible, because of the first principle, above), you should never use protected data. That's right, never. I'm usually not that drastic, but I've never seen protected data being used in a sane manner. And this is coming from the guy who thinks Multiple Inheritance can be very useful at times. If you think of the implications, protected data makes your base class less reusable and unable to evolve, which kind of goes against the grain of object-oriented programming principles.
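A small sketch of the alternative (the `Counter` class is hypothetical): keep the data private and give derived classes a narrow protected operation instead, so the base class stays free to change its representation later.

```cpp
// Hypothetical counter base class. The count itself is private, so
// derived classes cannot corrupt it (e.g. set it negative) or come
// to depend on how it is stored.
class Counter {
public:
    virtual ~Counter() = default;
    int value() const { return count_; }

protected:
    // Derived classes get a controlled operation, not the data.
    void increment() { ++count_; }

private:
    int count_ = 0;   // private, NOT protected
};

class EvenCounter : public Counter {
public:
    void add_pair() { increment(); increment(); }
};
```

If `Counter` later switches to, say, a logged or saturating count, `EvenCounter` doesn't even recompile differently.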
Overriding concerns should not be abstracted away
Overriding concerns are things like transactional semantics and resource clean-up. Somebody, somewhere, is going to have to make a decision on the transaction boundaries or on the clean-up time. It very likely should be code that has a wide enough view of the problem to make an intelligent decision, usually some top-level or near-top-level method. I've ranted about this before.
Keep resource ownership sane. Don't transfer it implicitly.
This means several things, namely:
  • DON'T allocate a resource in a method to clean it up in another function at the same level;
  • DON'T have objects allocate resources and expect the caller to clean them up if something bad happens (note that if your object has a "close" function of some sort, it's really the object cleaning up; what I mean here is, don't expect the caller to call a method to get said resource and then clean it up by hand);
  • DON'T program as if exceptions cannot occur in any block, and DON'T try to catch every exception to force cleanup in catch handlers. This is extremely brittle. See Herb Sutter's Exceptional C++ for more details;
  • DON'T transfer resource ownership if you can help it;
  • DO try to give any limited resource a finite scope in a single method, if possible;
  • DO wrap the resource in an object with a close() function (or a destructor in C++) if the lifetime of the resource cannot be determined by the code allocating it.
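The last point is essentially RAII in C++. A minimal sketch wrapping a C `FILE*` (the `File` class name is my own; any handle-like resource works the same way):

```cpp
#include <cstdio>

// Minimal RAII wrapper: the resource is owned by an object whose
// destructor releases it, so clean-up happens at scope exit even if
// an exception is thrown in between.
class File {
public:
    File(const char* path, const char* mode)
        : handle_(std::fopen(path, mode)) {}
    ~File() { if (handle_) std::fclose(handle_); }

    // Forbid copying so ownership is never silently duplicated --
    // this is the "don't transfer it implicitly" part.
    File(const File&) = delete;
    File& operator=(const File&) = delete;

    bool is_open() const { return handle_ != nullptr; }
    std::FILE* get() const { return handle_; }

private:
    std::FILE* handle_;
};
```

Usage is just a scope: `{ File f("data.txt", "r"); /* use f.get() */ }` and the file is closed when `f` goes out of scope, no matter how the block exits.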
Respect the computer and the OS; it's more often right than wrong.
Surprisingly to many who know me, I apply this piece of advice to Windows-family OSes as well. In my experience, 98% of the time when I thought the compiler/OS/computer was being stupid, it turned out to be a coding error (not always mine, but always within the programming group). There are some exceptions to this: kernel panics and such should not happen through a programming error, period. But I'm talking about more subtle cases, where you wonder why the heck the code stopped working, why the OS is returning an error there, etc. Don't just throw your hands up at the stupidity of the OS, even if, yes, it is stupid sometimes. You'll probably find with time that even if it's not entirely your fault, it's at least partly your fault, because what you're doing is a bad idea, and that's why the OS is being difficult. Of course, OS programmers are human, and so are computer designers; but several hundred thousand programmers bang on their code every day, and that doesn't count users doing all sorts of nasty things to their nice piece of software, so it is very widely tested. OSes are pretty mature these days, and except for, say, exploits and such, if your code acts weird, it's pretty much certain that it's your fault.
Try to listen to what the machine is trying to tell you.
This is related to the previous point. If you have to do something really cumbersome, or scary, or brittle to get things to work, it's likely that somebody is trying to tell you something. Namely, that your semantics are muddled, that you're using the wrong approach, or that you're trying to do something that's not really allowed. Virus writers want to do the latter, but industrial programmers don't; it always causes huge problems in the long run, when it stops working with compiler XYZ and OS Gamma. Another nice way to test whether it's a boneheaded idea is to try to explain it to somebody. If you feel silly explaining it, or you can't explain it clearly, it's probably because you're a bit confused about what you're supposed to do or how you can achieve it. Try to take a different tack.
Sometimes, it pays to trust your intuition.
I've been known to really dazzle co-workers by looking at some code, pointing at a line, and saying it's not a good idea to do that--and, of course, it ends up being that line causing the problem they're trying to fix. Now, if you can't actually prove that this is the problem, intuition is worthless. But it's easy enough to throw test data at said method/class and check.
The brilliant lone programmer is a myth.
Well, I can't claim to be a perfect authority on the subject. But I've known programmers who started out as loners. Sure, some of them are more brilliant than their more social peers. But they always reached their full potential only after becoming more social. Programming is about ideas; if you don't exchange them, you can miss something really obvious, paint yourself into a nice little corner, or constrain your mind to a nice little box of your own making. It is unfortunate that the myth of the lone, incredibly brilliant scientist is so pervasive in our culture; I guess it's because everyone likes heroes. But, as Isaac Newton allegedly said, "If I have seen further, it is by standing on the shoulders of giants." (Aside: it's interesting that Newton, of all people, said that, as there are rumours that he was not really the most cooperative of scientists, nor one who shared his results very often.) Now, this doesn't mean I think one should ignore more introverted programmers; rather, one should try to make them feel comfortable in the team, so they start sharing all those insights. Being an introvert myself, I know it's not easy to bring one out, but it's not completely impossible either.
Keep commented-out code out of your source files.
Yeah, yeah, I know, maybe you'll need it someday. Just like protected member variables, right? :-) Seriously, you should use source control, and if you use source control, there's no reason to keep this cruft around. If you're really worried that you may need it, apply a source control label to the tree before removing it. The reason? Dead code breaks the flow of the code around it, makes plain-text searches turn up false positives (I know good IDEs make plain-text searches less frequent, but they still happen), and by the time you need it again, it probably won't be any good anymore; it'll have suffered bit rot. If you really must keep some code commented out in the source file, at least be polite and move it to the end of the file with a comment hinting at where it came from. That way, people looking at it will figure out immediately that it wasn't left commented out by mistake. But I really think it should be ditched rather than commented out; the latter is a lousy compromise at best. Just use source control. Comments are for explanations, not for executable statements.
Avoid doing things that disgust you, especially if the rest of the code already does.
Your objective, when touching a piece of code, should be to improve it, not worsen it. Adding a feature can be seen as an improvement, but it's not necessarily an improvement to the code's quality; it's just a feature. Going through the code and resisting the temptation to copy-paste a segment is an improvement. Nobody will pay you for such things, and you're always short of time; I know, I've been there. But when I stopped making excuses for myself, I realized that in many cases I could find a solution that, if it didn't improve the code much, at least didn't make it worse, implemented the feature properly, took less time to debug because it was easier to understand, and (this is the part that surprised me quite a bit) didn't take more time, overall, than the quick-and-dirty solution would have. In fact, every time I was forced to take the quick-and-dirty route and redid it properly later, I was always really annoyed that somebody had forced me, because the proper way wouldn't have taken any longer and wasn't really riskier. Code will worsen on its own; you should always strive to lessen its entropy, not add to it.
Avoid doing things "just in case."
This is the infamous YAGNI (You Ain't Gonna Need It) principle from Extreme Programming. By all means, think of a design that will accommodate the "just in case" (just don't take that too seriously or waste too much time on it). But don't bother implementing it unless you have an immediate use for it. You'll just be adding to the complexity, with no benefit, and you'll have added code that will eventually rot from disuse and lack of testing.
Having classes doesn't make it object-oriented.
I've seen many cases where people created a bunch of classes, each implementing a bunch of interfaces... and, in the end, produced a spaghetti-like mess. OOP does not mean you should forget structured programming; it's still there, within your classes. Replacing your globals with singletons doesn't mean your project has no globals. And implementing an interface doesn't make your class swappable, especially if all the code ends up using the exact concrete class because of stuff missing from the interface. I could go on, but I think you get the point.
Be really, really careful when designing and changing persistent data formats.
Those are always a bitch to upgrade. Your internal code can change somewhat more easily, so internal code organisation is only important for your next maintenance release; if it's not ideal, you have the option of fixing it later. But broken persistent formats can be a real pain to fix. Even more so if you're signing the data or encrypting it in some way; you may not be able to re-sign or re-encrypt it after you've converted it. Bummer. Changing formats should be done carefully as well, because there's always trouble with conversions, or with people who want to use old versions with new data. And no, using XML doesn't make you safe from any of those problems. You can have problems with XML-based formats, too; you only avoid (some) parsing problems, not semantic problems.
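One concrete habit that helps: give every persistent record an explicit version number from day one, so a reader can convert known old data and refuse unknown data instead of silently misinterpreting it. A sketch, with a hypothetical `SavedSettings` record:

```cpp
#include <string>

// Hypothetical persistent record carrying an explicit version number.
struct SavedSettings {
    int version;           // bump on every incompatible change
    std::string payload;
};

constexpr int kCurrentVersion = 2;

// Returns true if the record is usable, upgrading it in place when a
// known older version is seen; rejects anything it doesn't recognise
// (e.g. data written by a newer release).
bool load(SavedSettings& s) {
    if (s.version == kCurrentVersion) return true;
    if (s.version == 1) {                  // known old format: convert
        s.payload = "migrated:" + s.payload;
        s.version = kCurrentVersion;
        return true;
    }
    return false;                          // unknown format: refuse
}
```

The version field costs a few bytes; retrofitting it after version 1 has shipped costs a migration nightmare.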
Don't go ape with design patterns.
I sometimes think software design teachers encourage students to look for ways to apply patterns when they teach their courses. This is exactly the wrong way to teach patterns. A pattern should be a solution to an existing problem: you have a piece of code that needs to do such-and-such a thing, so such-and-such a pattern will help. You should never, ever be given a pattern and asked to apply it in your program. The reason I think this is what's happening is that I've seen a trend in code written more recently: it's filled with pattern mush, often in places where it doesn't make sense. Extra factories where a simple function would do; composites where the composite's properties are not being used; and so on. This is extremely annoying because it can make the code hard to follow, especially if the person applying the pattern didn't really understand it. Also, I find that a few patterns in the GoF book aren't all that useful: Flyweight is rarely applied correctly, and I've always found Visitor a bit cumbersome (yes, I know what it's supposed to solve, but I've usually found other ways to solve that particular problem, and they tend to be easier to read for maintenance programmers).
Make objects minimal-state.
That is, objects should maintain the bare minimum of state they need. Never add a member variable just to avoid passing a parameter between two member functions; it may look like a good idea at first, but it really isn't--you're adding extra state to the object for no good reason. Would you add a global to your module just to avoid passing parameters in structured programming? Adding a member for that reason is like adding a semi-global, and it's not a good idea.
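To make the "semi-global" point concrete, here are two versions of the same (invented) computation; the first keeps a member only to ferry a value between two member functions, the second just passes it:

```cpp
// The semi-global: `scale_` exists only to carry a value from apply()
// to the rest of the computation, so the object now has state it must
// keep consistent (and that isn't thread-safe, either).
class ScalerWithExtraState {
public:
    int apply(int x) { scale_ = lookup_scale(); return x * scale_; }
private:
    int lookup_scale() const { return 3; }  // stand-in for real logic
    int scale_ = 0;  // member added just to avoid a parameter
};

// Minimal-state version: the value travels as a parameter, and the
// object carries no state at all.
class Scaler {
public:
    int apply(int x) const { return multiply(x, lookup_scale()); }
private:
    int lookup_scale() const { return 3; }
    static int multiply(int x, int scale) { return x * scale; }
};
```

Both compute the same thing; only one can be left in an inconsistent state between calls.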
Avoid the construct-and-call-setters anti-pattern.
I've seen this very often, and it makes me feel a bit sick every time. You have an object with an empty constructor, which must have its setters called before you call any method. IoC is bringing back this way of doing things (so is Struts, to some degree), and it really distresses me. What if you forget to call a setter? What if you call them in the wrong order somehow? What if somebody calls a setter after calling a method, thus violating some invariant? By turning an atomic operation (object construction) into a non-atomic one, you're asking for trouble. Strive to keep your objects in a sane state, always. It should not be possible to put an object in an insane state through its public interface, especially not by forgetting to call a method... Note that you can always prevent such problems by adding manual checks, but that's brittle; it's better to make it impossible to put the object in an incorrect state in the first place.
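A sketch of the alternative, with a hypothetical `Connection` class: the constructor takes everything the invariant needs, so there is never a publicly reachable "half-built" state.

```cpp
#include <stdexcept>
#include <string>

// The anti-pattern would look like:
//   Connection c; c.setHost("db"); c.setPort(5432); c.open();
// where forgetting a setter leaves c invalid. Instead, construction
// is atomic: you either get a fully-formed object or an exception.
class Connection {
public:
    Connection(std::string host, int port)
        : host_(std::move(host)), port_(port) {
        if (host_.empty() || port_ <= 0)
            throw std::invalid_argument("bad connection parameters");
    }

    const std::string& host() const { return host_; }
    int port() const { return port_; }

private:
    std::string host_;  // no setters: the invariant can't be broken later
    int port_;
};
```

For objects with many optional parameters, the same guarantee can be kept with a builder whose final step produces the fully-formed object in one shot.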

Well, that ran a bit longer than I thought, and it contains some stuff that's a bit lower-level than what I initially wanted to write. But there it is; I hope it was somewhat useful to you, at least. It's by no means complete, but it's a start. I may complete this list from time to time when things come to my attention. I may also strike out some items or modify them, as I've been known to revise my ideas on some things.
