Friday, May 30, 2008

Eureka!

A recent story in the New York Times about subway system mechanical failures ('$1 Billion Later, New York’s Subway Elevators Still Fail,' William Neuman, May 19, 2008) reads like a software implementation project gone bad: insufficient training of service techs, resulting in poor maintenance and frequent downtime; managers pushing for deliverables before they are ready for use; basic design flaws contributing to system instability. It reminded me of a question Scott Rosenberg poses in his fascinating study of software development, Dreaming in Code: “Why can’t we build software the way we build bridges?” (And yes, the book acknowledges that unfortunately we sometimes do build bridges the way we build software.)

With a career-long involvement in software development and implementation, I am fascinated by the question of why ‘Software is hard.’ I suspect that only software engineers think their discipline is uniquely difficult, but any software user (that’s everyone) probably does wonder why good software is so rare.

Creating innovative software usually starts with a Eureka! moment: a single person has a vision of creating something that is, in a word, transcendent. All of the complexity is stripped away as the pure, essential solution reveals itself to the visionary. Impelled by the epiphanic euphoria of the moment, the designer crafts a high-level design that retains as much of the purity of the Eureka vision as possible, while acknowledging some of the messiness of actual data and user requirements. Through design and coding, more of the messiness of reality has to be dealt with: programmers have to bring both a creative mindset (any problem can be solved programmatically in an infinite number of ways) and the discipline to create within logical structures.

The logic of even apparently simple software can be extremely dense. From a programmer’s perspective, the technical skills (programming language, operating system, database, etc.) differentiate competency far less than the cognitive skills (logical and critical thinking). Reading another person’s code gives you an intimate insight into how that person thinks (although not what that mind thinks about). This is true of one’s own code as well. It’s not uncommon to revisit code you wrote even recently and wonder at how much you knew then that you’ve now forgotten: as you re-enter the code, all of the variables have to be re-assigned in your own mind, as it were, as you immerse yourself in a mental construct unique to the time you spent embedded in the project.

I imagine this is no different in kind from other creative endeavors, such as architecture or orchestral composition. But we don’t expect architects or composers to be commonplace. A modest software company may easily have 30–50 people responsible for creating software – that is, actively engaged in the creative process. Just assembling that number of capable people is a huge undertaking. Managing the collaborative process is even more daunting. Translating that to other disciplines is unimaginable: can you imagine finding five Beethovens, much less 30, and then expecting the group of them to create Symphony No. 5? Yet software is ubiquitous and we depend on it absolutely. Production levels must stay high, since requirements change constantly and demand new solutions – new software. We expect software to be continually new and to delight us with capabilities we had never imagined – as users we want to consume that Eureka moment and experience the thrill of a truly elegant and innovative solution.

With few exceptions, quality suffers as creative production rises. We lower our expectations. We’re willing to make trade-offs, and the software industry understands that. By the time it is finally delivered, the software has been re-imagined by the programmers, test engineers, release managers, users, and consultants who shape the final product. The inspired Archimedes shakes his head at what became of his vision.

Wednesday, May 14, 2008

Choice in a world of scarcity

I wonder how often the word ‘priority’ is spoken by managers each week – undoubtedly more often than we realize. The concept is embedded in our thought structure: individuals have prioritized task lists, as do teams, departments, companies and business partners. We incorporate priorities and hierarchies in our processes and business rules. Supposedly, priorities focus any person or group on the highest-gain activities at any given time, and help us make the best choices in how to use the resources we have.

While priorities shine a light on the preferred activities, they simultaneously obscure those activities that are now left behind ‘to do later’ – if later ever comes. The reality is that we set priorities only because there is more need than resource to meet it. When we set a priority, we accept a view that scarcity (of labor, knowledge, equipment, time, cash or materials) is inevitable. And, given the pervasiveness of this practice, we accept that scarcity in any and all areas is inevitable.

Surely this is self-fulfilling: if we never challenge the scarcity, it never goes away. In fact, priorities accommodate the scarcity so well that we guarantee its persistence, because we never make addressing the scarcity itself a priority.

What if we declared it unacceptable to set priorities? Tasks are either worth doing or not worth doing, and if they’re worth doing, they’re worth doing now. Just this challenge would force a business to confront the obstacles it currently faces, and to acknowledge that it can choose to eliminate them. Imagine: employees would never feel the burden of tasks left undone, or have to explain away a customer’s disappointment while feeling the shame of knowing that there just wasn’t enough resource to meet the customer’s very valid requirements. With the time saved by not deciding which work ‘really’ has to be done, managers and teams could instead invest that time in challenging the fact of scarcity and the assumptions about actual capacity.

So, I’m challenging myself to bring this awareness every time I hear someone utter the word ‘priority.’ This should be interesting.

Thursday, May 8, 2008

The 80%

In a process improvement workshop this week, we were talking about the Pareto principle (the 80-20 rule) and the importance of finding those elusive 20% of problems that contribute 80% of the pain. By extension, 20% of the ideas to improve a process will provide 80% of the potential gains. As I spoke, I was struck by the inverse: 80% of the ideas we generate without analysis will likely provide only 20% of the improvement. That’s sobering stuff.
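For the analytically minded, here is a minimal sketch (in Python) of what that kind of ranking looks like in practice. The problem categories and counts below are invented purely for illustration; the point is simply that sorting contributions makes the ‘vital few’ visible before we start generating ideas.

```python
# A minimal sketch of a Pareto ranking, using invented problem categories
# and counts. The idea: sort contributions from largest to smallest and
# see how few categories account for most of the total "pain."
problem_counts = {
    "approval bottlenecks": 520,
    "handoff delays": 290,
    "data entry errors": 80,
    "rework from unclear specs": 60,
    "equipment downtime": 30,
    "missing documentation": 20,
}

total = sum(problem_counts.values())
ranked = sorted(problem_counts.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
vital_few = []
for category, count in ranked:
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.80:
        break  # the remaining categories are the "trivial many"

print(f"{len(vital_few)} of {len(ranked)} categories account for "
      f"{cumulative / total:.0%} of all recorded problems: {vital_few}")
```

Run against real data rather than invented numbers, an exercise like this at least tells you where an improvement idea has a chance of moving the needle.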

And yet, it feels accurate. The process at hand has a capacity that has been basically unchanged for a year. The team that uses this process has been measuring its productivity, and because it’s important to them, they’ve brought a lot of energy and creativity to improving it. I don’t think there was anything wrong with any of the ideas that were applied – we have some very good minds working on the problem. But if we fix a problem that contributes only incrementally to the overall output, then we can only be disappointed.

It also caused me to think about how working in a management role entices you to believe that, because of your role, your ideas are great – you have the experience and technical knowledge to solve problems quickly. Decisive problem-solving is seen as a hallmark of a good manager, so we reward and respect this behavior. And yet – how many of these decisions concern the 80% of problems that contribute only 20% of the gain? That perspective is certainly humbling for any manager.

Monday, May 5, 2008

Beta, not better

My husband recently acceded to an upgrade of a music recording application that reached an all-too-common nadir: in providing an array of new secondary features, it derailed one of the primary features of the application (the ability to sync tracks). He asked a very pertinent question: how can software engineers ignore the needs of the people who use the product?

Having worked in the software industry for some time, I understand well how modifications can unmoor features that were functional and stable in the prior release. I also know that when engineering and users are more than a degree of separation apart, it’s all too easy for engineering to make assumptions about how and why users engage with their application – resulting in irrevocable decisions that seem (and probably are) cavalier, despite good intentions. Back in the day of proprietary software development, responsible software companies held engineering accountable to client services. Having to front up to a client who is angry about an upgrade that removed functionality actually helps software engineers understand their purpose: to create software that enhances the users’ abilities or quality of life.

Coincidentally, I read today a blog entry from Mitchell Baker, President of the Mozilla Foundation: “It’s hard to find someone who understands both open source software and the consumer space.” I trust her on this, but wonder how it could be so. Wasn’t open source supposed to bridge the chasm between software development and users? No longer at the mercy of monolithic software corporations, open source promised to integrate the user into the development process. And when the user is a software developer, this probably does hold true. But rather than becoming more democratic, too often development has become solipsistic instead, and accountable to no one.

Andrew Keen’s analysis (The Cult of the Amateur) is spot-on. Good software is the result of well-defined processes managed by professionals. Proprietary vs. open source is a false choice here: ownership and licensing are irrelevant, and whether engineers work together in person or virtually is also not the issue. But development must be treated as a defined process, with a bright-line difference between a test release and the real thing. As consumers see ‘Beta’ at the top of their web-based applications all too often, they are becoming accustomed to working with test-release software, and we continue to lower the bar. Is this the brave new world?

Thursday, May 1, 2008

Opening up possibility

In a Supply Chain Symposium this week with a number of our key vendors, we learned so much from all participants. I especially enjoyed the differing perspectives on common problems. Perhaps most fascinating, though, was a recurring commentary on supply relationships within the industry. Suppliers expressed surprise at our sincere interest in their needs. As a retailer, we were delighted to hear that most suppliers could respond to needs we had thought could not be met. The symposium was a great first step, and the surveys at the end of it offered many ideas (and resources) for realizing the opportunities we discussed.

Well, that sounds like pretty basic stuff, doesn’t it? Isn’t this the promise of every symposium or convention? Yes, but how many times are you actually able to cash in on that promise? It seems to me the key difference with this event was in framing it not operationally (new concepts, technology, processes), but within a social context. We seek to redefine the supply relationships in order to redefine the processes that link us. If we first achieve the intangibles (improved understanding, appreciation, trust, connectedness), I have to believe the tangibles (removing waste and cost from the supply chain) will follow. What a departure from the sometimes brutal supply relationship models practiced over the past couple of decades, which often resulted in breakdowns because of the win/lose polarity embedded within them.

I’m reminded that Rosamund and Benjamin Zander refer to this as ‘the downward spiral,’ which they represent so memorably in The Art of Possibility. If one assumes a world of scarcity (finite resources), then there is no possibility for expansion. I’d like to believe in the possibility that there’s more for all, if we allow problem-solving to explode the confines of past experience while we explore a social framework that defines possibility for all parties.