In typical curmudgeonly fashion, I've recently found myself straining against the dogma around today's development best practices. While I am somewhat drawn to the ideas of agile development and "getting real," I have been finding in my projects that agile techniques, and the frameworks associated with them like Ruby on Rails, are good for only certain kinds of projects, and, perhaps more importantly, that agile projects (like RoR projects) tend to yield similar results.
Now if you're happy with big fonts and a silly little application that lets users create and share data of schema X, then great. Start your agile app with your Rails framework and build a single page, and take it from there. But you're going to end up with an application that's a lot like every other Rails app. It seems like everyone is taking Rails (or one of the many equivalents) and applying it to the vertical of their choice. It's Basecamp for cat enthusiasts! It's meta-Twitter!
Without even looking at the telltale URL, you can spot these apps from a mile away. They still have the smell of the original auto-generated scaffolding, but more importantly, it's pretty clear that they were developed piecemeal. I go back and forth on the general design of standard HTML applications. From one point of view, they have the advantage of being highly modular. They can be represented by tools. And, perhaps most importantly, they can be encoded using a Model2 framework. But they have serious limitations. They have trouble preserving state. They are ridiculously modal. They enforce inflexible data schemas.
It seems pretty clear that if you were to go and design an optimal user experience from first principles, you wouldn't start by saying: "the interface will be presented in a series of discontinuous page transitions." In fact, if you look at how the movies represent the UI of the future, it's almost always the opposite: extremely fluid interfaces that are themselves more like a movie than a book. If you step back a bit, it's pretty easy to see the whole Web 2.0 thing as a giant circlejerk of technology creators making it easier for mediocre developers to ship the next marginally useful application, while they struggle to implement a user interface that has the quality of a typical '80s PowerBuilder app.
There are a couple of arguments I want to make here. The first is simply about time. Anyone who's ever done a major renovation project knows the cost of trying to apply the finishing touches before the project is done. You insist that the cabinets get painted before the dishwasher goes in. Then the dishwasher installer comes and scratches the wall and breaks the cabinet door. There's no doubt that having shippable iterations with polished details is net more work than building a set of complete features and then driving the interface to completion once the big pieces are in place.
Now, if time and cost are not considerations, then this first point isn't germane. But time and cost are always considerations.
Anyway, my second point is potentially more important: piecemeal approaches yield piecemeal solutions. There's a hidden coupling between conventional web-based UI and agile processes. The way you add features is primarily by adding pages. So while agile process proponents argue that bracketing all but the nearest-at-hand design issues yields more flexibility, the fact is that just by choosing an agile process, you're limiting yourself to the extremely restrictive set of conventions that govern Model2 apps in general. Once you implement that first partially-thought-out wireframe, you're starting down a road of page-after-page features, and you've left behind the whole world of novel interface ideas and effects like zooming, animation, and space-sharing.
In fact, many of the agile best practices reinforce short-sightedness and lock-in, even as they claim to emphasize nimbleness. Test-first gives you battle-hardened code, but makes it much more painful to throw stuff away, or to dramatically rethink core assumptions. There's a place for test-free, exploratory programming. Continuous integration — or, more specifically, "zero new bugs in each build" — means that, sure, you address issues as they arise, but you don't get an opportunity to let issues collect around a given feature and then take a thoughtful, higher-level approach that addresses the root cause. Instead, you end up with a bunch of ad hoc fixes that increase code complexity and mask bad underlying assumptions.
And most importantly, iterative design yields sub-optimal products. Ask any halfway-decent designer about agile, and I'm sure they'll tell you that they think it sucks. The whole point of design is to incorporate the big picture in the little details, but when you're pursuing a "let's try it and see how it goes" approach, there is no big picture; there's just a bunch of stories. Agile is remarkably intolerant of bottom-up design, the kind where a conversation between a designer and a developer yields a new way of (gasp) revisiting an already-completed story.
I guess there's a place for all these agile RoR apps, but it seems like that place is the enterprise. Sure: someone needs to replace all those crappy client data processing apps with crappy web-based ones. But on the consumer web, there's no need for the data to be so siloed, and there's no reason why every application should be so similar. This whole set of applications could be replaced with something more flexible and powerful that lets users do more of what developers do in a typical RoR app.
But to make an application like that, you can't start with Rails. You can't use agile process. You need to sit and think a while. Then you need to design. Really carefully. Then you need to build a bunch of features. Maybe as a prototype, that you may even throw away. And then you can start to think about integration. And of course as you do this, your design will change. You need to do big upfront design and you need to expect the requirements to change. Sorry, but that's what it takes to build great stuff. Agile process is great for solving yesterday's problem, but when you set out to do something new, there aren't clear guideposts.
When I talk about RoR and agile, I tend to think of its most fervent proponents as kids, regardless of their age. I've been thinking about this over the past couple of months in a very different context. When I talk to my friends about running Elastic Process, I've tried to describe how I've had to get comfortable with a fairly high degree of uncertainty. I've tried to describe this as different orders of uncertainty. The first order is something we all know well: Will this job come in? Will this person sign on? How should I balance my priorities? But the second order is a little scarier: Will I want this person to sign on? How do I balance my competing sets of priorities?
I've found that the key is to try to just accept the uncertainty, and not try to resolve it prematurely. In my experience, time is on my side. Sure, there's the occasional case where I didn't move quickly enough and missed an opportunity, but for every one of those, there are nine times when I jumped the gun, when additional information would have made the correct choice clear, when the big rush turned out to be not so big and not such a rush.
I think you can roughly characterize the habit of delaying action until the last possible moment as "patience." And I think it's something that comes with some age and wisdom. The older you get, the easier it is to see a week, or a month, or even a year as a short time — as a time in which a problem or challenge can take shape, rather than fester.
And so it is with development. Patience. Patience to fully understand the problem before diving in to the solution. Patience to build solid underpinnings before cranking out "screens." Patience to let a problem breathe and mutate before trying to solve it. Patience to let bugs collect before going through and rooting them out. Patience, and then, when the time is right, decisive action.