Ruby is one of the coolest languages I've encountered in twenty-six years of learning new languages. I try to learn every new language I meet, and that's why I got into coding in the first place: to develop languages and understand how a few statements that looked roughly like English could make a machine that only understands 1s and 0s perform interesting tasks. I know that language geekery in the linguistic sense is a minority interest amongst developers, most of whom only want derivatives of C (C++/Java/C#, etc.), but that's what sucked me in aged eleven, and that's what keeps me here today.
So when I say I love Ruby, it's not because of the hype or because of any particular implementation. It's because linguistically it's a beautiful and powerful language: the marriage of Lisp's elegance with Basic's simplicity. It's the language I wish I'd had when I first started out; with it, by now I'd actually be as good a Lisp hacker as I've always aspired to be but never quite become.
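A tiny sketch of what I mean by that marriage (the names here are mine, purely illustrative): first-class blocks and lambdas give you Lisp-style higher-order functions, dressed in syntax a Basic programmer could read at a glance.

```ruby
# Higher-order functions, Lisp-style, in Basic-friendly clothing.
squares = (1..5).map { |n| n * n }   # => [1, 4, 9, 16, 25]

# Lambdas are first-class values you can pass around and compose.
double  = ->(n) { n * 2 }
add_one = ->(n) { n + 1 }
compose = ->(f, g) { ->(x) { f.call(g.call(x)) } }

double_then_add = compose.call(add_one, double)
puts double_then_add.call(10)        # prints 21
```

Nothing exotic there: just closures and functions-as-values, the Lisp inheritance hiding in plain sight.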
But Ruby has a problem. Well, a couple of problems really. Current implementations still lag behind where I'd like performance-wise - kudos to both Charlie and Ezra for their respective implementations, but most of us are still using Matz's, and that's pretty sluggish. Those kinds of problems get solved with time though, so it's no biggie. Far more troubling is the cult-like following that camps on the language's fringe.
You see, in the tradition of Greek tragedy, Ruby is cursed by the worst of all offspring: that supposed killer web development framework, Rails. Rails itself is interesting technology, and certain of its components (like ActiveRecord) are beautifully conceived and implemented. DHH had a moment of revelation, and that's turned out to be so powerful that it's made even Sun and Microsoft quake. However, Rails has become the proverbial tail wagging the Ruby dog - or more to the point, the Rails community, which in general has no particular affiliation with Ruby itself, has become the public embodiment of Ruby. So much so that my colleague spikyblackcat and I have to talk at Rails conferences if we want to demonstrate the cool stuff that can be done with Ruby and reach anything like a decent-sized audience.
The thing that strikes us every time we do this is the complete incomprehension our audiences display for what are essentially simple little coding tricks - tricks that anyone who spent more than five minutes learning the core language would spot. Neither of us is in Why's league, but we still blow minds, so apparently Rails developers don't learn Ruby, they learn Rails. That's like a mechanic setting up in business fixing cars when they've only studied the sales brochure for a Citroën Xsara - the moment they're outside that comfort zone they're screwed.
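By way of a hypothetical example of the kind of trick I mean - nothing Rails-specific, just core Ruby that five minutes with the language itself would demystify (the `Recorder` class is invented for illustration):

```ruby
# Open classes: any class, even core ones, can be reopened and extended.
class String
  def shout
    upcase + "!"
  end
end

# method_missing: intercept calls to methods that don't exist.
class Recorder
  def initialize
    @calls = []
  end

  attr_reader :calls

  def method_missing(name, *args)
    @calls << name   # record the message instead of raising NoMethodError
    self             # return self so calls can be chained
  end
end

puts "hello".shout               # prints HELLO!
r = Recorder.new.foo.bar.baz
p r.calls                        # prints [:foo, :bar, :baz]
```

Open classes and `method_missing` are exactly the machinery Rails leans on for its magic, which is why they look like sorcery to anyone who only learned the framework.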
And the Rails world is screwed.
It's screwed by the myth of test-driven development, which in general means that instead of writing well-thought-out code, developers hack some stuff together to satisfy some tests, and if it passes those tests they consider it fit for purpose. In fact, I've been told by a rather well-known proponent of Agile that this is the very best way to write code.
Now, until I meet someone who can show that they've done something significant by my personal yardstick - full end-to-end design, implementation and certification of a cockpit navigation and control system - I will remain the best-qualified person in the room on the subject of code quality and fitness for purpose, because, as I routinely harp on about when I'm in a ranting mood, I have done that, and in Visual Basic too. It was achieved the old-fashioned way: by figuring out the requirements and implementing them, with lots of ad hoc testing during development based on execution-path analysis, black- and white-box testing and so on. Those kinds of tests are generally disposable, because code mutates quickly when you're looking for an optimal implementation, so I rarely keep test suites.
The counter-argument to the way I work is based on the notion of continuous integration. Supposedly that's only practical if a full set of tests can be run after every addition to the codebase to make sure things aren't broken, but I find that a very strange requirement to place on any development process. Why does every version of the codebase always have to be deployable? That's ideological nonsense. I agree that broken code shouldn't be deployed, except as an alpha or beta for client/customer approval, but that doesn't mean you can't have broken code in your development branch for the days or weeks it takes to solve a particular problem. But I guess if ideology is what drives you, you're going to need an excessive test suite.
Test-driven development may help continuous integration, but it won't get you any closer to valid requirements and optimal code, because that's not its purpose. TDD is about letting you write code that tracks back to a test, which tracks back to an ad hoc requirement, telling you nothing about the context of that requirement or whether it's a good one. Don't chain yourself to TDD; do some proper analysis work instead, because overall it's less effort and gives a much bigger payoff in code quality.
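To make that complaint concrete, here's a deliberately contrived sketch (the requirement ticket and figures are invented): the tests pass, so by TDD lights the code is fit for purpose, yet they say nothing about whether the requirement behind them was any good.

```ruby
# Hypothetical requirement ticket: "10% discount on orders over £100".
# Prices kept in whole pounds so the arithmetic stays exact.
def discount(total)
  total > 100 ? total / 10 : 0
end

# The 'test suite' - both assertions pass, green bar, job done.
raise "test failed" unless discount(150) == 15
raise "test failed" unless discount(99)  == 0

# Nothing above tells you whether 10% was the right figure, whether the
# £100 threshold should be inclusive, or how the discount interacts with
# tax. Those questions belong to analysis; no green bar will surface them.
```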
Aside from TDD, there's also this notion of agile coding that I recently ranted about. Agile is a good thing if you stick to the core aspirations of the manifesto, keep your teams as small as physically possible, are clear about what you want to develop, and don't let process or technological prejudice stand in the way of cleverness and exploration of the problem domain. Unfortunately software development is rife with shoddy hacks who only got into it to make money, and for whom the ability to really grok a problem is so alien that they would rather spend millions of pounds on other shoddy developers with a well-known process than fifty thousand pounds on one good developer who'd do the same project in the same - or a shorter - timeframe with a more pragmatic approach. I get so hot and bothered by this attitude that I'm going to spend a while throwing stones at the more popular defences business people use for it.
Conformity === Quality
Well, in the sense that once everything conforms to a certain standard you can at least talk about uniform quality, I guess. The funny thing about software is that the small number of people who are really good at it are rarely conformist by nature, yet the code they write is of considerably higher quality than the norm. They also tend to code for the pleasure of it, make the mistake of taking jobs on risky but satisfying projects, earn much less than they could wearing a suit and tie and working conventional hours, display poor punctuality and a lack of concern for the corporate hierarchy, etc. They are mavericks, and that's what makes them both so damn frustrating to work with and so damn good at what they do.
Show me a coding shop without any mavericks - the people we used to affectionately call Boffins when plucky little Britain still cared about technical talent and creativity - and I'll show you a hell-hole of crap code and poorly researched requirements. Ever run into "stories", those stupid little functional fragments that get passed off as requirements in the XP world? Then you'll know exactly what I mean.
Add to this that monocultures are intrinsically fragile, lacking the diversity needed to survive changing environmental conditions, and the idea that conformity is a good thing looks like a classic example of maladaptation at work.
It's all about the Budget
The "I know the price of everything, but the value of nothing" defence. Many people put this down to accountants, but the truth of the matter is that business people value money much more than they do the intellectual capital that generates wealth, and they set their budgets accordingly. Unfortunately intellectual capital is very expensive to replace once you've lost it, because generally it's locked in the heads of employees and can take years to recreate. Sometimes it's not even possible, and when a key employee leaves, the business suffers irreparable damage. But that doesn't stop many supposedly competent managers from driving expertise out of their companies.
Good Programmers are hard to spot
Call me naive (many people have), but I believe that one good developer can spot another good developer based on about five minutes of chatting over a beer, because the good developers know their subject in depth and will have exciting personal projects to talk about. The real test, though, is the variety of bullshit they engage in as the beer count increases: generally it's the creative kind that makes you want to find out what they're trying not to talk about (that's probably covered by NDAs), and time just flies by.
Bad developers on the other hand will bore you rigid with waffle about this role they had at such-and-such big name company, or how much value they added on some boring-as-hell enterprise system, or why some trivial failing of a particular platform made it impossible to do something. Let's face it, good developers don't let platforms get in the way - if they have to write dodgy, unstable, hideous hacks to get something to work that's what they'll do and then spend the rest of their careers telling all and sundry about it because they've earnt the right to brag.
However the corporate attitude seems to be that coding tests are the way to spot good programmers. That means corporates will mostly hire programmers who produce safe, textbook solutions to generally dull and tedious problems. I guess that's one definition of a good programmer - one that you can rely upon to fit in neatly and interchangeably - but that's not going to spot real talent.
Still, I guess you have to attract that first good developer before you have a chance to learn the difference, and if you insist on a culture that enforces conformity and bureaucracy, I'm at a loss to understand why a good developer would want to work for you in the first place. Business people know this is a problem, which is probably why I keep hearing them talk about adopting open source practices, and it always leaves me wanting to say the same thing: "it's not the practices that make open source work, you morons, it's the fact that they're interesting projects that attract good developers who want to work on them for free because it's fun."
Big teams are good
Not that it's ever phrased that way. Big teams are universally accepted as a bad thing, but ever since Agile became the dominant meme we've been suffering the pair programming obsession. It's a bit like that scene in Mad Max Beyond Thunderdome: two go in, one comes out. By which I mean, two programmers get sat down together at one workstation and, supposedly as a result of synergistic interactions, produce much better code than they would individually. I've ranted about this before so I won't repeat myself. I wouldn't stop anyone doing it if that's their idea of fun, but the idea of doubling my team size just to suit a religious preference strikes me as lunacy. I want my teams to be three people or fewer, because experience tells me that's the sweet spot for producing good quality code on a tight budget.
Risk is Bad
One of the dirty truths of software development is that the vast majority of projects will fail to meet their objectives, and they will do so for purely human reasons. To start with, most software pretty much writes itself once you get into it, with a few problem areas along the way where the machine refuses to see things the way you want; most code is the mechanical fluff that any trained multicellular organism with an understanding of Boolean logic should be able to write. Unfortunately a sizeable minority of programmers fall short of that mark, so you've got technical problems built in from the start, and as time goes on those will sufficiently disturb management types that they'll start looking for voodoo hoodoo to compensate.
The whole Java/J2EE/OO/Design Patterns cult is one such voodoo, and for some people it works really well. There are people in the City building incredible data analysis tools with this technology, and I respect that. But the fact that it works for them doesn't mean it works for everybody, every time. That's the nature of risk: it's no respecter of fashion or perceived wisdom, and you can only get a handle on it by studying a particular problem domain in reasonable depth. It's also pervasive. Contrary to popular opinion amongst business folks, though, risk is a good thing. All the most interesting projects are risky, both financially and technologically, but when they pay off they do so out of all proportion to the obstacles. In fact it's the key thing I look for when taking on new projects, because it guarantees I'll be using all of my skills to the fullest and will get to solve significant problems.
The trick with risk isn't so much in minimising it by playing safe as in doing your homework properly and making contingency plans for when a particular course of action doesn't pay off. Backtracking decisions and staying flexible is the name of the game, hence the idea of being agile and using small teams. What this requires, though, is strategic thinking, and apparently that's difficult if highly paid business consultants are to be believed. Personally I suggest spending a few weekends playing computer games like Europa Universalis; you'll soon get the hang of on-the-fly strategic thinking. But as no one believes that a good education can cost as little as £20, let's have a look at cost...
Expensive === Less Dumb
I hate consultants. Not the freelance contractors, mind, whom you turn to when you need a given skill, but the large corporates who present themselves as experts with only the best on their payroll. Given that many of these consultancies place a high value on recruiting employees with excellent academic qualifications, it's easy for them to present themselves as la crème de la crème, and people in management rarely seem to look at the substance behind academic backgrounds. Got an MBA from MIT? That'll do nicely.
Well, here's another clue for anyone in search of a genuine edge: with rare exceptions, the straight-A students were too busy studying to hack much code in their spare time, and they're therefore orders of magnitude less likely to grok technology than those C-grade, scruffy, couldn't-care-less-about-your-idea-of-office-hours-and-decorum hackers that you consider so risky.
Large consulting firms follow the latest 'hot' technologies because that's where the money is, but strangely they rarely seem to hire the experts in those fields. That means you're paying premium rates for second-rate advice cobbled together on the fly by someone with minimal experience. To add insult to injury, they'll often offer training courses to help their clients get up to speed in these technologies, and if what I've seen of them in recent months is anything to go by, your in-house developers and business analysts would be better served by a two-week, all-expenses-paid holiday than by one of these brainwashing shams. Not that I care - like most hardcore hackers I'm immune to these sorts of seminars, because I can analyse them and see what's going on. So if other people want to pour good money into the pockets of parasites, I consider that their business. Literally. Just don't be surprised when you discover that expensive just means you're being milked for all you're worth.
How do I come to that conclusion? Well, a genuinely competent freelance developer will put maybe a 50% or 100% markup on their work, because they have overheads and slack periods, tax bills to save for and equipment to buy. Under some circumstances this can work out as a reasonable cost saving compared to hiring a full-time employee, and the chances are that the freelancer takes pride in doing their job swiftly and efficiently, so the return on investment should be high. Unfortunately the average developer from one of the large consulting firms is likely to have a 300+% markup on their salary, and that's a far less understandable expense. By way of comparison:
| mode of employment | 1 developer per annum | 5 developers per annum |
| --- | --- | --- |
| in house | £35k + employment costs | £175k + employment costs |
| consultancy | £160k | £800k |
So where you can hire five decent developers for a year at a headline cost of £175k, the same developers on hire from a consultancy firm could easily cost £800k. Sure, you'll have employment costs on the five developers if they're in-house, but it's unlikely those costs will add up to £625k!
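The arithmetic behind that comparison, sketched in Ruby (the salary, markup and consultancy figures are the illustrative ones used above, not market data):

```ruby
# Illustrative figures from the comparison above, not market data.
salary           = 35_000    # decent developer, headline salary per annum
freelance_markup = 1.0       # 100% markup, the high end for a freelancer
consultancy_rate = 160_000   # per consultancy developer per annum (300+% markup)
team_size        = 5

in_house    = salary * team_size                                   # 175_000
freelance   = (salary * (1 + freelance_markup) * team_size).to_i   # 350_000
consultancy = consultancy_rate * team_size                         # 800_000

puts consultancy - in_house  # prints 625000 - the gap mentioned above
```

Even the fully marked-up freelance team comes in at less than half the consultancy figure, which is rather the point.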
Alas, expensive always sells. After all, who wants to be known for hiring the cheapest consultants in town? No one. And because most people who get software developed this way have no real yardstick against which to measure the quality of the product or the productivity of the people producing it, there's no good counter-argument to going down that route. Well, not unless you're willing to resort to the same tactics as the consultancy firms: over-cost all viable alternatives; misrepresent what constitutes productivity; claim to have skills you demonstrably lack; and cut back all of those pesky requirements that were key to the project but represent genuine technical challenges.
By now you're probably wondering if this latest diatribe has a point, and I guess it doesn't beyond getting things off my chest. I'm definitely not the first programmer to rant against the big corporates and their practices, or to suggest that the hype around Rails is a bad thing. I'll probably continue using Rails on future projects when I think it's appropriate, because behind the hype and the flim-flam there's a pretty decent little framework that showcases some real Ruby neatness. I just wish more of us saw it that way...