Friday, October 3, 2008

Agile or fragile?

We have a small project right now that is proving frustrating to the business and to IT, as we continue to rework code to meet the user's changing expectations. We are (today) hopeful we're almost there, but no one is happy with the experience. Neither party is finding fault with the other (at least out loud). Intentions are good, but the outcomes just aren't what either department expected.

Coincidentally, I attended a session this week on Agile development -- a philosophy of software development built on an iterative process. Bill Nazzaro of IconATG delivered an entertaining and insightful presentation on Agility ("Some answers can't be found until you make some mistakes" - I love that). His description of the process reminded me of Lean Production: the Agile movement invokes the small batch sizes introduced by Lean. But it flies in the face of the Lean concept of perfection. Which brings me back to the recent IT project: what would really be satisfying is a development process in which we get to change our minds about requirements all through the project, but the build at any time is satisfyingly 'perfect' -- that is, without defect.

Possible? Might be, but the user expectation would have to be aligned -- that is, the user must be invested in an iterative process. Otherwise, each iteration will be perceived as rework (waste), rather than an evolutionary process in which some parts of the (genetic) code endure and some are trashed. If the user sees each phase as rework, he starts to lose confidence in the ability of the programmer to meet his requirements -- and naturally the programmer picks up on that, and the project goes into a downward spiral of mutual unhappiness.

I think the concept of perfection -- whether we articulate it or not -- is the derailer. Everyone's hope hangs in the balance of whether something comes out right. (The problem, of course, is that without definitive requirements, there is no right - it's perpetually subjective.) To align with an iterative process, maybe we have to agree that the output is right if it simply helps us visualize where we want to go next. Hundreds of small successes may not feel as satisfying as delivering to a huge milestone -- but maybe that's an emotional expectation that we've programmed in ourselves. Could be time for an upgrade.
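What that would look like in practice is close to what test-driven iterative development promises: each requirement is captured as a small executable check, and after every change the whole suite must pass, so the build stays 'perfect' even while requirements evolve. A toy sketch of the idea -- the function, the pricing rules, and every number here are invented for illustration:

```python
# Toy sketch of an iterative, always-"perfect" build: requirements are
# expressed as small executable checks, and on each iteration the code
# is reworked until every check passes again. All names are invented.

def price_with_discount(price, rate):
    """Iteration 1 of a (made-up) pricing function."""
    return round(price * (1 - rate), 2)

# Iteration 1 requirements, captured as checks:
requirements = [
    lambda: price_with_discount(100.0, 0.10) == 90.0,
    lambda: price_with_discount(59.99, 0.0) == 59.99,
]

# Iteration 2: the user changes their mind -- a discount must never
# drop the price below a floor. The new check is added first, then the
# code is reworked until the *whole* suite passes, so the build is
# defect-free (by the current definition of "right") at every step.
def price_with_discount(price, rate, floor=0.0):
    return round(max(price * (1 - rate), floor), 2)

requirements.append(
    lambda: price_with_discount(10.0, 0.95, floor=1.0) == 1.0
)

# The build is "perfect" again, against the requirements as they
# stand today:
assert all(check() for check in requirements)
```

The point of the sketch is that "perfect" is redefined at every iteration by the current set of checks, which is exactly the alignment the post asks for: the user changes the definition of right, and rework is simply the cost of passing the new definition.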

Wednesday, September 17, 2008

Risky business

I followed a spirited discussion on the user group forum hosted by our ERP vendor. Users were clamoring for development to embrace iPhone integration, which led to a general call for the vendor to broaden its scope beyond Windows client applications. ERP company representatives explained (ever so patiently) why they aren't going there, and why the users should see this reluctance as a good thing. With every posting by the software company, you could see the frustration level rise amongst the users. Since I've sat in both chairs, I know the frustration from each perspective. User: Why can't you offer more flexibility for my business needs? Vendor: Have you any idea how much this would cost? It has no quantifiable commercial value - how do I pay for this??

Ultimately, if the market eventually demands the new thing, the vendor will develop it -- but it will be a reactive strategy. In the 80s and early 90s, ERP companies could get away with that. The market accepted long lead times for development, and highly valued stability and risk avoidance. The market's in another place now, and users have much different expectations. They look to their technology vendors to bring the future to them. Sometimes (and only sometimes) the new new thing delivers perceivable value and it takes off like a rocket. But, the failure rate is high. 

Someone has to bear the risk, and maybe it's time for ERP vendors to think differently about the value their clients expect from them. Rethinking the relationship may lead to rethinking the contracts that underpin it, which fund development. Sometimes it's possible to be safe and sorry.

Friday, September 12, 2008

Reflections on learning

A teacher at my daughter's new high school shared a pearl this week: when kids are trying to learn something, he said, "Don't steal their struggle." As a parent who too frequently acts the kleptomaniac when her kid hesitates with an answer, this struck me to the core. My urge to help is so unhelpful, and this comment really brought it home to me.

So this was top of mind in my dialogues with staff today. Ever ready to weigh in (... well, I do keep a blog...), I realize that acting on that impulse is probably none too helpful when someone is working to understand something. Another person really can't help you understand. Wrapping your mind around new data requires something like a dialectical struggle. It's not comfortable and sometimes it feels almost physically painful - the hope is that you learn something because of the conflict between what you thought you knew and the new information.

I think this is the crucial difference between memorization and learning. We commit lots of new information to memory every day, and it's not confronting in the least -- where you left your car keys; how to do a new task; the name of the person who sits next to you at an event. This is no different in kind than memorizing the periodic table, or the Prologue to The Canterbury Tales. Memorization doesn't require that you change how you think about anything - it's just a mechanism to stuff more data into the matrix of your brain.

Too much workplace training is just memorization. Too much of what we 'learn' in our daily lives amounts to no more than this: we just add data to the existing constructs. As an undergraduate years ago, I was naively amazed at the drop-out rate in my freshman philosophy course. For some students, challenging the way they thought (about anything) was just too much -- what was enticing to me, others found repulsive. Based on what I see, there's probably a fairly large subset of our population that refuses to challenge their preconceptions -- they'll take on new data, but anything that doesn't fit existing constructs is just lost on them. However, we all have our limits: there's a point where my attempt to conceptualize something like string theory just causes a system freeze.

Thankfully, most of it isn't that challenging. Many of the questions I hear daily go beyond memorization, but fall short of string theory. Why do we have so much inventory on hold? How can we enhance the customer's experience? How am I supposed to work with {name of least favorite coworker}? Although the person asking the question may think he's asking for information that can be acted upon, these are all learning questions. They are questions that should cause internal conflict and a change in thinking. The least helpful thing is to steal the struggle by responding with a pat answer. There's so much to be learned in challenging what you think that answer should be, and the discomfort is eventually replaced by the joy of having learned something new.



Monday, September 8, 2008

You can take it with you

This time of rapid technological development is most satisfying when you see your own visions made real by innovators around the world. For some time I've imagined a future in which we carry a single device that is a personal extension of ourselves -- a mobile daemon that is a virtual self. We would use this device to entertain ourselves, read, learn, engage with others, transact commerce. It would function as our multiple devices do today: mp3 player, laptop, PDA, mobile phone -- but it would fulfill all of those needs while being absolutely portable to the extent we need it to be (that is, small enough to slip into a jeans pocket when necessary, but large enough for reading screens of text easily).

A very clever company called Modu has made this a reality using modular design (the very small modu phone slips into jackets -- called modu mates -- to enable the physical interface required for various tasks). This is a glimpse into the future. Based on their representation of their product (they've not yet entered the US market), it appears that the company has designed an extraordinary solution. Design is a key term here - the modu and its mates are very design-conscious, striving for the ultra-cool stratosphere that is Apple's domain. How the software enables a well-integrated user experience is also critical -- and even more, how well server-based applications enable a completely mobile experience.

In the meantime, I'm still imagining where this can go - and enjoying the speed of development.

Friday, August 29, 2008

Lean in any language

I'm struck by something I read in Lean Retail (Simon G. Fauser, 2007). Describing the difference between Lean Management and Kaizen (Japanese continuous improvement), the author explains that the differences reflect differences in social organization in Western and Japanese cultures. The West values individual contribution; in Japan the individual is sublimated to the group.

In the early 1990's, I attended a week-long Kaizen event at a manufacturing plant in East Texas. The event was hosted by a US consulting firm, and they had invited former Toyota Production managers to lead Kaizen teams. It was a fascinating experience, not least because I was able to witness the explosive culture clash between the East Texas plant managers and the Japanese consultants. (By Thursday, the plant manager was bravely attempting to defuse the situation and prevent a mass walk-out by line managers and supervisors.) On the sidelines, it appeared to me that the plant's real issue was the wholesale dismantling and replacement of their processes by outsiders. They experienced huge changes (and equally impressive gains, by the way) with no appreciation by these outsiders that what was ripped out represented the cumulative contribution of those managers and supervisors over long periods of time. The fact that some of the outsiders were from a different culture became the touchpoint. No doubt the consulting style of the Japanese was quite different from that of the US consultants -- who were much less direct, and more considerate communicators. But xenophobia played no small part in the scapegoating; it was convenient for externalizing the frustration and hurt feelings that arose out of the project. As we left the site at the end of the week, the plant staff were threatening darkly that they intended to undo all of the Kaizen work, which would have resulted in significant financial loss to the company.

However, I couldn't see that the process methodology, or how we went about performing Kaizen, or the decisions made, were influenced by culture. In fact, that's what I liked about the process: it was data-driven and completely logical. An experiment conducted in Tokyo looks the same if it's replicated in Tucson. I found this refreshing, since most business management practices are completely culture-dependent. The challenge in any business process is how to make it work with multiple people (it's not a process if everyone does his own thing). It has to be quantifiable; it has to be replicable regardless of the individuals who perform it. That's the crux: you must create a mechanistic process that is manifested only within a social, human context. Values and social norms inform only one aspect of Lean or Kaizen: gaining the buy-in necessary for a successful implementation.

For engineers and analysts, that's always the rub. I like my chances of getting a machine to run a new sequence smoothly better than getting a team to do the same.

Friday, August 22, 2008

Chicken Feed

I’ve been thinking about motivation recently. An article this week in the NY Times (Mixed Results on Paying City Students to Pass Tests, 8/19/08) reports that efforts to pay students to score well on Advanced Placement tests resulted in more test takers, but few who passed the test. An article in DC Velocity (The Secret to Going “Lean”, Pat Kelley and Ron Hounsell) argues for motivating the workforce by paying for increased performance. (Rather alarmingly, the article suggests the best way is to reward individual performance – which makes me wonder what kind of process improvement that is supposed to encourage.) But overall, their argument is of a piece with the received wisdom that money motivates. Businesses believe this absolutely, evidenced by their executive pay structures.

But does money really motivate? Certainly pay that is perceived to be unfair de-motivates, but the inverse isn’t necessarily true. The Brafmans in their book Sway lay out a compelling argument that money indeed doesn’t motivate people to do what you want them to do, and can produce quite the opposite (and seemingly irrational) response. The NYC high school results prove their point: the promise of a cash reward motivated more students to try (that is, take the test), but was ineffective at motivating the behaviors necessary to succeed on the test. So in the workplace: if we want to inspire teams to achieve breakthrough performance, perhaps we need to think outside the received wisdom.

In fact, I witnessed this just yesterday in a meeting with the DC staff. Speaking to the point of personal and professional growth, I mentioned off-hand my expectation that work was more than a paycheck – my hope is that everyone has an opportunity to grow professionally, benefit personally, and make a tangible contribution. I was unprepared for the enthusiastic response from the group to the ‘more than a paycheck’ comment; it resonated with them more than I could have expected. Now, I have no doubt that everyone in the room wants more money from his job, and I too want to see them make more money as a result of growing the business and professional growth. But motivation is more complex than feeding the chicken more pellets, although strangely we like to think of ourselves that way.

Thursday, August 14, 2008

Same as it ever was

Interesting story today in Technology Review, "How (Not) to Fix a Flaw." MIT students found security flaws in the Boston subway payment system, and they did what appears to be the honorable thing: rather than exploit it, they documented their discovery and attempted to bring it to the attention of others. The transit authority would prefer to keep it quiet while they try to fix the problem, so they moved to censor the students. So, desire for control confronts the threat of disclosure.

Wasn't it ever so? Every couple days I see a message on my machine that it's looking for 'updates' it thinks are essential. If I ask for more information about why I should install the updates it found, I get a fuzzy explanation that amounts to: Don't worry your little brain about this; we know what's best. How different it would be if instead, the message said "We've found a bug we created in the software you're running. An unassigned variable causes the application to freeze, requiring you to close and restart the application. This patch contains the fix for it." I'd love the honesty, and I'd also, strangely, give the software company more credibility just because they risked owning up to their mistakes. Even if they didn't have a fix, but knew about the problem (as with the subway payment system), wouldn't it make sense to get more minds working on the problem by letting others in on it?

We've gotta assume there are no secrets when a bug exists. Just because you don't acknowledge it, you think no one will notice? People who earn a living exploiting this vanity can only be grateful.

Monday, August 11, 2008

Uncommon sense

A business unit manager and I were talking last week about a process change his team had put in place. The team is convinced their change has improved productivity. Unfortunately, no metric supports that -- in fact, the data say that the process change is much less productive than the previous standard. So where's the disconnect?

The team leader is convinced that his 'common sense' approach will yield improvements. I'm reading the book Sway: The Irresistible Pull of Irrational Behavior and it's given me some insight into the dynamic at play. Regardless of education or profession, people are more influenced by behavior and perceived value than they are by quantifiable, objective data when making decisions. This is so obvious and, simultaneously, stunning.

I think about the HR assessment tools that profile how a person makes a decision: fact-based, feelings-based, balanced mix? It's dawning on me that we're probably kidding ourselves, when safety experts, scientists and medical doctors (fact-based jobs if ever there were) have a clear track record of throwing facts to the wind in life-or-death decisions. This is obviously so hard-wired in the human brain that it feels quixotic to tilt against it.

Yet, acknowledging the sway of 'common sense' is necessary to understand what is required to instill the uncommon sense of fact-based decisions. Uncommon sense is artificial rather than natural, but we can still value it over the chaotic natural order of things. Rare things are often more valuable, after all.

Saturday, August 2, 2008

What were you thinking?

The subject of whether (or how) our use of technology is changing the way we think is being richly discussed. Since my last post, the New York Times has run several stories that develop the theme more fully.

An article on Internet reading covers the debate on how online reading (particularly by students) stacks up against reading books. I find it striking to hear academics defending online reading’s value: these skills, they argue, will help make the next generation more employable and further, reading books is inefficient: it takes a long time to read a 400-page book and much less to scan summaries or pre-digested opinion about the book. And of course, children themselves prefer it – and, as exasperated parents see it – reading online is preferable to not reading at all.

I wonder what the word ‘reading’ means in this context. Are we simply talking about the ability to interpret written words? We’re certainly not talking about the complex thinking skills required to construct internally the argument or narrative of that 400-page book – skills that this employer, for one, is keenly interested in, as should be every employer of knowledge workers. How is it that we no longer expect our children to read Pride and Prejudice or To Kill a Mockingbird – which are certainly as accessible and relevant today as a generation ago, when teens and young adults were expected to read them? (For this I have hard evidence: my 14-year-old has not only read these books but savored them.)


This article on reading, and one that appeared in today’s Times, point to another dynamic that I think is worth noting: an increasing expectation of controlling the narrative in one’s own real or creative life. Teens interviewed in the reading article said that they prefer reading online because they can control the narrative (if reading fiction) or the information they receive in non-fiction articles and blogs. An interactive short story site allows readers to change plot points that they don’t like. (Hey – in your version, maybe Romeo and Juliet don’t die after all!) In today’s article, I learned of a new technology tool that allows you to go immediately to voicemail: the recipient thinks you called, but you intentionally go direct to voicemail so that you don’t have to interact with the other person. This is called, I learned, ‘indirect communication,’ which “may be turning some people into digital-era solipsists more interested in broadcasting information than in real time give-and-take.” Interacting with other people opens the possibility of being challenged: maybe the other person has a different viewpoint, heaven forbid! Back in the day, one took that as an opportunity to learn from others, or at least to more finely tune one's own argument. I now realize how hopelessly out of date that concept is.

I can only imagine that the logical conclusion to this is a future world much like a wonderful film I saw this week (at the always-challenging Traverse City Film Festival), Sleep Dealer, in which people willingly plug their central nervous systems into a corporate network that uses their brains to direct the production of a robotized workforce in other countries. If you’re not interested in using your brain, I suppose it’s a resource that can be commoditized like anything else.




Monday, July 21, 2008

Slow thinking

I always admire people who think quickly, and that's a fairly commonly held value, I think. Living with knowledge only a search away has created the opportunity for more people to be more informed in their thinking. This is all good, right? A recent article is making me reconsider the relative value of slow thinking (or contemplative thought, to use a more fancy-pants name).

Nicholas Carr, writing in The Atlantic.com, eloquently addresses the disturbing and far-reaching changes wrought by our increasing dependence on the Internet for information and entertainment. In his 4,000-word article, Carr develops his argument using Kubrick, Nietzsche, Socrates, Frederick Winslow Taylor, the Gutenberg printing press, as well as contemporary writers and players. Halfway through, I felt the need to print it out so that I could read it in a way that allowed me to absorb what he was saying and develop my own thoughts as I read. Actually, that just made Carr's argument: the Internet is the antithesis of 'leisurely reading or slow, concentrated thought.'

I found particularly disturbing the comments from those who had noticed they were losing the ability to read deeply, or the interest in doing so -- and these are people who had previously invested themselves in reading and thinking deeply. They are surely a small minority of the greater populace, which has for decades used television, movies and Cliff Notes as primary sources of information and entertainment.

And yet to meet the challenges of today, more than ever we need fresh and complex thinking. More is at stake: decisions in one place can affect people globally. Economic, environmental, political risks are higher, and change is happening at a pace unheard of in human history. Yet our thinking is more mechanistic, and we are more likely to accept simple solutions based on invalid logic than complex solutions that are actually more likely to succeed. If the solution requires more explanation than a sound bite, it won't be heard. But Carr's point is even more distressing: even if people could stay tuned for the entire explanation, they wouldn't be able to perform the complex thinking required to interpret and understand the content because their wetware has been reprogrammed by today's technologies.

The implications -- for business, technology, politics, our world -- are tremendous. Critical thinking, in the classic sense, is required for problem-solving. What passes for critical thinking (the bombastic criticism of talk radio and cable opinion shows) in the popular mind has assumed primacy in forming opinion, and the popular mind is grateful that it doesn't have to engage in the rigor required of true critical thinking. What a feedback loop!

Friday, July 4, 2008

2D is here

Audrey Chait wrote to let me know that StoreXperience has brought 2D shopping to life here in the US. I just spent a few minutes on their website and downloading their app, and it's definitely interesting. The technology will need concentrated support from early adopters to gain traction, but the potential is huge. Hats off to StoreXperience - I'm hopeful this kind of technology will create a new frontier for cross-channel retail and a new dimension in the customer experience.

Thursday, June 26, 2008

Research while you shop

This week in a meeting amongst stores and buyers, the topic was a very stylish and well-constructed kid’s bed. The buyer had recently reduced the price (again!) from $648 to $598, and was asking the stores: what’s happening with this bed? It’s fun, has lots of storage and functionality, and the price is less than half of retail. The stores said it gets a lot of looks, but the above-$600 price tag for a kid’s bed is off-putting. So, we’re trying it at another price level, and I hope it will work.

Today I did a quick Internet search to see which retailers were selling it and at what price: every quoted price I saw was four figures. Our retail is so far below the market value it’s not funny.

Which just made me think (again) how powerful it would be to enable the customer to combine the benefits of Internet search (comparison pricing, reviews etc) with her in-store experience. Sure, we could allow the customers to use our showroom PCs to comp shop while in the store, but it would be so much more powerful if the customer could take a picture of the item (or its 2D barcode) with her own phone to search for and display comparative prices, specs, and reviews. Although our price tags declare the comparative retail price, how much more credible that price would be if the customer could verify the actual retail from other retailers’ websites.
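The lookup step could be very simple once the phone has resolved a photo or 2D barcode to a product identifier. A minimal sketch of that comparison logic -- the barcode, retailer names, and prices below are all invented, and real comparison data would of course come from retailers' websites rather than a hard-coded table:

```python
# Toy sketch of in-store comparison shopping: given a scanned product
# identifier, look up what other retailers charge and compute the
# customer's savings. All data here is hypothetical.

def compare_prices(barcode, our_price, market_prices):
    """Return competitor quotes for a scanned item, the lowest one,
    and our savings versus that lowest competing price.
    Returns None if no competitor carries the item."""
    quotes = market_prices.get(barcode, {})
    if not quotes:
        return None
    lowest = min(quotes.values())
    return {
        "competitor_quotes": quotes,
        "lowest_competitor": lowest,
        "savings_vs_lowest": lowest - our_price,
    }

# Hypothetical data: a kid's bed priced at $598 in our store,
# quoted in four figures elsewhere (as in the post).
market = {"012345678905": {"RetailerA": 1249.00, "RetailerB": 1099.00}}
result = compare_prices("012345678905", 598.00, market)
print(result["lowest_competitor"])   # 1099.0
print(result["savings_vs_lowest"])   # 501.0
```

Displaying that result next to the price tag is the whole pitch: the customer verifies our comparative retail claim herself, on her own phone, from other retailers' quotes.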

Since research tells us that most people research online before shopping, and since this is a business that capitalizes on urgency (and the thrill of the treasure hunt), supporting the research in-store seems a natural. And besides, how many people are organized enough to do their homework before they go shopping? This is an app for the homework-impaired who love to shop in the moment, and also love to get an unbeatable deal.

Thursday, June 12, 2008

Raising the bar

From the Internet Retailers Conference & Exhibition:
I had an interesting experience today when I engaged in an SMS promotion one of the speakers mentioned in his presentation. The SMS experience was fast and satisfying -- I had to type maybe a total of 15 characters over 3 fast-paced text messages that resulted in an order to be shipped to my home address (which has more than 15 characters in one address line alone). Impressive, and a powerful example. Until... I read the last message: "Your order will be shipped in 6-8 weeks." OK, this is a free sample, so the order lead time is consistent with that service. But still.

The fulfillment industry remains as challenged as ever, while marketing can move much more quickly -- and will set customer expectations that only the most dedicated suppliers will be able to meet (Amazon and Dell can be exemplary, but they have worked at it from the get-go). Thing is, if I can place an order in under 30 seconds, why should I have to wait even a week, when ground service is usually 3-5 days in the US?

Truth is, fulfillment is dull stuff. And yet - it overwhelmingly determines the final impression left on our customer.

Going mobile

Great content in the m-commerce sessions today. I found it interesting to hear designers stress (over and over again) the constraints posed by the small display area of the mobile device. This isn't new -- for decades, business applications have been tailored for mobile devices (not as sexy, though: RF scanners and clunky PDAs), albeit for a more committed user (who can't opt out and remain an employee). Mobile marketers and app designers are confronting how much the PC screen size has allowed us to be lazy and largely ignorant of our users' needs, wants and desires. You don't have to know your user to design a PC-based screen display: you can throw everything s/he might need onto the screen and the user does the work of figuring out how to get what s/he wants. We even push the burden of maintaining screen resolution onto the user -- as well as loading whatever software we require for our content. And the anonymous PC user has been pretty obliging, if s/he thought the content might be compelling enough.

But with mobiles, we're finally acknowledging a responsibility to provide content that is absolutely customized for the user. I doubt this is harder (requires more effort), but it does require a fundamental change in thinking that has to persist through the entire application life cycle. Change is usually harder than work.

The more things change...

Despite the touted diversity of the attendees by business size and location, and despite this being a conference about 'new' technology, the composition of the conference is surprising to me: mostly white males in their late 30s-to-40s; the Dockers crowd. Boomer music plays overhead in between sessions. The tech kids line the hallways and the walls of the conference rooms, sitting cross-legged and interacting with their gear. I wonder if they're getting what they wanted from the conference.

Bigfoot

The conference has kindly provided charter buses to shuttle us to/from the convention center. This morning I am the only rider on the bus. My carbon footprint is about a size 14.

I take heart, though. I notice that the hotel offers a 50% discount on valet parking rates if you drive a hybrid. Nice.

Monday, June 9, 2008

All about the front end

This week I'm attending the Internet Retailer Conference and Exhibition (IRCE). The attendance has been noteworthy - over 5,000 people, from 49 states and many countries. There must be hundreds of speakers - every day's agenda is packed tight. Yet, as one speaker noted, there are almost no sessions on fulfillment and delivery, and few vendors; the agenda is overwhelmingly devoted to marketing and the front end. In fact, the goal seems to be a business model that doesn't require inventory at all!

Friday, May 30, 2008

Eureka!

A recent story in the New York Times about subway system mechanical failures ('$1 Billion Later, New York’s Subway Elevators Still Fail,' William Neuman, May 19, 2008) reads like a software implementation project gone bad: insufficient training of service techs resulting in poor maintenance and frequent downtime; managers pushing for deliverables before they are ready for use; basic design flaws contributing to system instability. It reminded me of a question Scott Rosenberg posits in his fascinating study of software development, Dreaming in Code: “Why can’t we build software the way we build bridges?” (And yes, the book acknowledges that unfortunately we sometimes do build bridges the way we build software.)

With a career-long involvement in software development and implementation, I am fascinated by the question of why ‘Software is hard.’ I suspect that only software engineers think their discipline is uniquely difficult, but any software user (that’s everyone) probably does wonder why good software is so rare.

Creating innovative software usually starts with a Eureka! moment: a single person has a vision of creating something that is, in a word, transcendent. All of the complexity is stripped away as the pure, essential solution reveals itself to the visionary. Impelled by the epiphanic euphoria of the moment, the designer crafts a high-level design that retains as much of the purity of the Eureka vision as possible, while acknowledging some of the messiness of actual data and user requirements. Through design and coding, more of the messiness of reality has to be dealt with: programmers have to bring both a creative mindset (any problem can be solved an infinite number of ways programmatically) and the discipline to create within logical structures.

The logic of even apparently simple software can be extremely dense. From a programmer’s perspective, the technical skills (programming language, operating system, database, etc.) differentiate competency far less than the cognitive skills (logical and critical thinking). Reading another person’s code gives you an intimate insight into how that person thinks (although not what that mind thinks about). This is true for one’s self as well. It’s not uncommon to revisit one’s own code written even recently and wonder at how much you knew then that you’ve now forgotten: as you re-enter the code, all of the variables have to be re-assigned in your own mind, as it were, as you immerse yourself into a mental construct that is unique to the time you spend embedded in a project.

I imagine this is no different in kind from other creative endeavors, such as architecture or orchestral composition. But we don’t expect architects or composers to be commonplace. A modest software company may easily have 30 – 50 people responsible for creating software – that is, actively engaged in the creative process. Just assembling that number of capable people is a huge undertaking. Managing the collaborative process is even more daunting. Translating that to other disciplines is unimaginable: can you imagine finding five Beethovens, much less 30, and then expecting the group of them to create Symphony No. 5? Yet, software is ubiquitous and we depend on it absolutely. Production levels must be high, since requirements change constantly and demand new solutions (software). We expect software to be continually new, and delight us with capabilities we had never imagined – as users we want to consume that Eureka moment and experience the thrill of a truly elegant and innovative solution.

With few exceptions, quality suffers as creative production rises. We lower our expectations. We’re willing to make trade-offs, and the software industry understands that. Finally delivered, the software has been re-imagined by programmers, test engineers, release managers, user input, and consultants who shape the final product. The inspired Archimedes shakes his head at what became of his vision.

Wednesday, May 14, 2008

Choice in a world of scarcity

I wonder how often the word ‘priority’ is spoken by managers each week – undoubtedly more than we are aware. The concept is embedded in our thought structure: individuals have prioritized task lists, as do teams, departments, companies and business partners. We incorporate priorities and hierarchies in our processes and business rules. Supposedly, priorities focus any person or group on the highest-gain activities at any time, and help us make the best choices in how to use the resources we have.

While priorities shine a light on the preferred activities, they simultaneously obscure those activities that are now left behind ‘to do later’ – if later ever comes. The reality is that we set priorities only because there is more need than resource to meet it. When we set a priority, we accept a view that scarcity (of labor, knowledge, equipment, time, cash or materials) is inevitable. And, given the pervasiveness of this practice, we accept that scarcity in any and all areas is inevitable.

Surely this is self-fulfilling: if we never challenge the scarcity, it never goes away – in fact, priorities accommodate the scarcity so well that we ensure it never goes away, since in fact we never prioritize the scarcity itself.

What if we declared it unacceptable to set priorities? Tasks are either worth doing or not worth doing, and if they’re worth doing, they’re worth doing now. Just this challenge would force the business to confront the obstacles it currently faces, and acknowledge that it can choose to eliminate those obstacles. Imagine: employees would never feel the burden of tasks undone, or have to explain away a customer’s disappointment while feeling the shame of knowing that there just wasn’t enough resource to meet the customer’s very valid requirements. With the time saved in not deciding what work ‘really’ has to be done, managers and teams could invest in challenging the fact of scarcity and actual capacity.

So, I’m challenging myself to bring this awareness every time I hear someone utter the word ‘priority.’ This should be interesting.

Thursday, May 8, 2008

The 80%

In a process improvement workshop this week, we were talking about the Pareto principle (80-20 rule) and the importance of finding those elusive 20% of all problems that contribute 80% of the pain. By extension, 20% of the ideas to improve a process will provide 80% of the potential gains. As I spoke, I was struck by the inverse: 80% of the ideas we generate without analysis will likely provide only 20% of the improvement. That’s sobering stuff.
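By way of illustration, the arithmetic is easy to sketch. The incident counts below are invented, not from our workshop:

```python
# Illustrative Pareto analysis: rank problem causes by incident count
# and measure how much of the total the top 20% of causes explain.
# All counts below are invented for illustration.
incidents = {
    "damaged in transit": 500, "mislabeled carton": 300, "short shipment": 50,
    "wrong SKU picked": 40, "late PO": 30, "data entry error": 25,
    "bin mismatch": 20, "vendor substitution": 15, "torn packaging": 12,
    "other": 8,
}

ranked = sorted(incidents.items(), key=lambda kv: kv[1], reverse=True)
total = sum(incidents.values())

top_n = max(1, len(ranked) // 5)        # the "vital few": top 20% of causes
top_share = sum(count for _, count in ranked[:top_n]) / total

print(f"Top {top_n} of {len(ranked)} causes account for {top_share:.0%} of incidents")
# → Top 2 of 10 causes account for 80% of incidents
```

The flip side is exactly the sobering part: the eight remaining causes – 80% of the list – account for only the other 20% of the incidents, so ideas aimed at them can never move the needle far.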

And yet, it feels accurate. The process at hand has a capacity that has been basically unchanged for a year. The team that uses this process has been measuring its productivity, and because it’s important to them, they’ve brought a lot of energy and creativity to improving the process. I don’t think there was anything wrong with any of the ideas that were applied – we have some very good minds working on the problem. But if we fix a problem that contributes only incrementally to the overall output, then we can only be disappointed.

It also caused me to think about how working in a management role entices you to believe that because of your role, your ideas are great – you have the experience and technical knowledge to solve problems quickly. Decisive problem-solving is seen as a hallmark of a good manager, so we reward and respect this behavior. And yet – how many of these decisions concern that 80% that contribute 20% gain? That perspective is certainly humbling for any manager.

Monday, May 5, 2008

Beta, not better

My husband recently acceded to an upgrade of a music recording application that achieved an unfortunately too-common nadir: in providing an array of new secondary features, it derailed one of the primary features of the application (the ability to sync tracks). He asked a very pertinent question: how can software engineers ignore the needs of the people who use the product?

Having worked in the software industry for some time, I understand well how modifications can unmoor features that were functional and stable in the prior release. I also know that when engineering and users are separated by more than a degree, it’s all too easy for engineering to make assumptions about how and why users engage with their application – resulting in irrevocable decisions that seem (and probably are) cavalier, despite good intentions. Back in the day of proprietary software development, responsible software companies held engineering accountable to client services. Having to face a client who is angry about an upgrade that removed functionality actually helps software engineers understand their purpose: to create software that enhances the users’ abilities or quality of life.

Coincidentally, I read today a blog entry from Mitchell Baker, President of Mozilla Foundation: “It’s hard to find someone who understands both open source software and the consumer space.” I trust her on this, but wonder how this could be? Wasn’t open source supposed to bridge the chasm between software development and users? No longer at the mercy of the monolithic software corporations, open source’s promise was to integrate the user with the development process. And when the user is a software developer, this probably does hold true. But rather than become more democratic, too often development has become solipsistic instead, and accountable to no one.

Andrew Keen’s analysis (The Cult of the Amateur) is spot-on. Good software is the result of well-defined processes managed by professionals. Proprietary vs. open source is a false choice on this: ownership and licensing are irrelevant; whether engineers work together actually or virtually is also not an issue. But development must be seen as a defined process, with a bright-line difference between a test release and the real thing. As consumers see ‘Beta’ at the top of their web-based applications more and more often, they are becoming accustomed to working with test-release software, and we continue to lower the bar. Is this the brave new world?

Thursday, May 1, 2008

Opening up possibility

In a Supply Chain Symposium this week with a number of our key vendors, we learned so much from all participants. I especially enjoyed the differing perspectives on common problems. Perhaps most fascinating, though, was a recurring commentary on supply relationships within the industry. Suppliers expressed surprise at our sincere interest in their needs. As a retailer, we were delighted to hear that most suppliers could respond to needs we thought could not be met. The symposium was a great first step, and the surveys at the end of the symposium offered many ideas (and resources) for realizing the opportunities we discussed.

Well, that sounds like pretty basic stuff, doesn’t it? Isn’t this the promise of every symposium or convention? Yes, but…. How many times are you able to cash in on that promise? It seems to me the key difference with this event was in framing it not operationally (new concepts – technology – processes), but within a social context. We seek to redefine the supply relationships in order to redefine the processes that link us. If we first achieve the intangibles (improved understanding, appreciation, trust, connectedness), I have to believe the tangibles (removing waste/cost from the supply chain) will be realized. What a departure from the sometimes brutal supply relationship models practiced in the past couple of decades, which often resulted in breakdowns due to the win/lose polarity embedded within.

I’m reminded that Rosamund and Benjamin Zander refer to this as ‘the downward spiral,’ which they represent so memorably in The Art of Possibility. If one assumes a world of scarcity (finite resources), then there is no possibility for expansion. I’d like to believe in the possibility that there’s more for all, if we allow problem-solving to explode the confines of past experience while we explore a social framework that defines possibility for all parties.

Monday, April 28, 2008

Taking my own advice

I recently wrote a program that calculates lead time, filtering out unreliable data statistically. My objective is to generate a reliable lead time, not a perfect predictor but sound enough on which to base forecasting and replenishment. Since it’s not perfect, people who use the results will need to understand how it works. But – and this is the catch – they don’t really want to know how (that is, they don’t want to have to think about standard deviations), while they do need to be able to interpret credible results (which are sometimes not intuitive), and distinguish them from non-credible results.
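A sketch of that kind of filter follows. The function name, the 2-sigma cutoff, and the sample history are my assumptions for illustration, not the actual program:

```python
from statistics import mean, stdev

def reliable_lead_time(observed_days, k=2.0):
    """Estimate lead time after discarding statistical outliers.

    A sketch of the approach described above: observations more than
    k standard deviations from the mean are treated as unreliable
    (expedites, lost shipments, data errors) and excluded before
    averaging. The 2-sigma default is an assumption.
    """
    if len(observed_days) < 3:
        return mean(observed_days)      # too few points to filter
    mu, sigma = mean(observed_days), stdev(observed_days)
    if sigma == 0:
        return mu                       # perfectly consistent history
    kept = [d for d in observed_days if abs(d - mu) <= k * sigma]
    return mean(kept) if kept else mu

# One badly delayed shipment shouldn't swing the estimate:
history = [14, 15, 13, 14, 16, 15, 14, 45]
print(round(reliable_lead_time(history), 1))   # → 14.4
```

This is also exactly where the interpretation problem shows up: the filtered answer (about 14 days) is lower than the raw average (about 18 days), which is correct but not intuitive to someone who remembers that 45-day shipment vividly.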

Just a few days ago, I wrote here that the challenge is to embed process change within the social context we inhabit. How do I create a narrative or even entertain (!!), while implementing this tool? Just thinking about it takes me down a completely different path than my old-school teaching mind imagines (drawing equations with Greek symbols on a whiteboard… what was I thinking?).

Now I’m thinking about it from the learner’s perspective: why would I need this information? Why would I want this information? How will my life be better if I know this? How could I explain this to someone else in my department? The task now is to discover the human story in the solution, and tell it memorably.

This is much harder than writing the code.

Thursday, April 24, 2008

Stock-outs

I started looking into stock-out occurrences yesterday. The high incidence rate on A items alone just stunned me – wow, what an opportunity! If a SKU performs well enough to drive 80% of our business in a category despite the fact it’s not always in stock, we don’t even know its potential.

I used to work in the fastener industry, where you knew the potential of your top sellers: there’s no mystery surrounding the demand for a ¼” flat washer. The comparative reliability of supply in that industry supports a stability in demand that is all too absent in the furniture industry. The entire supply chain – from customer to manufacturer – seems to accept the unpredictability of supply. A customer shopping for an end table may be disappointed to leave a store empty-handed, but she probably entered with low expectations. If her local grocery failed to supply her need for a household staple, she would be outraged to leave empty-handed – largely because the US food industry has raised our expectations of its ability to meet our demands. Why shouldn’t our customers have the same expectations?

Sunday, April 20, 2008

A simple idea

I recently read yet another manifesto that claimed to have found a simple way of mathematically organizing the supply chain to achieve the holy grail of high turns, exponential sales growth and lean inventory. On paper, this stuff always looks promising, but it’s like a diet fad – not terribly long after you implement the system, you’re searching for the new algorithm that will take you to the promised land. And yet, I’m one of those people who design those systems. Why is what we’re doing so elusive?

Maybe the objective – to organize the supply chain along a serialized set of transactions informed by mathematical projections – contains the seed of its own failure. The supply world is organic and unpredictable. Engineers see this as the challenge: let’s impose order and predictability! This works, to a degree – defined by the constraints you imposed when you fit the messy world into a statistical model. And then you become frustrated by the limitations you imposed.

I’m thinking now that a social model is more informative than a mechanical model. At a very basic level, we’re talking about people, not systems. People buy stuff – even when they follow departmental guidelines, they’re people: emotional, intuitive, cognitive, yet likely to make mistakes. In the supply world, there are individuals and groups of people interacting sometimes physically, sometimes virtually. All of these people are following some idea of a process – whether a customer or a truck driver or a production line supervisor – while they act in a personal and very individual way. Attempting to mechanize their actions and decisions is futile. Success for any of them is simply that the outcome of what they did was good: the customer found what she was looking for (at an acceptable price); the driver arrives on time without mishap; the supervisor’s line production and employee morale are high. So how can all of them be more successful?

You don’t have to over-think this one. We’re social animals. We naturally create organizing principles for living with each other. We communicate, tell stories, teach, learn, entertain. We share (sometimes) and take (sometimes). We create ideas about what we experience, and then change those ideas when experience changes our thinking. We make tools. When the tools break, we make new ones – and you have to count on the tool breaking; it will.

So how do I … (forecast demand… plan resources… manage an assortment… fill in your own need)? Let’s not over-think it, or try to find the one super concept that will solve the problem for everyone, forever. Before we make a tool, let’s use those social attributes (communicate, tell stories, teach, learn, entertain) and tap into the combined experience of the people in our social group. When we make the tool, we’ll accept that it’s just one way of solving the problem. We also have to accept that not everyone in the group will be able to use the tool expertly every time, so the tool’s design has to cater for that. Then, be prepared to keep making it anew.

I think it’s that simple. And that complex.

Friday, April 18, 2008

Checkpoint or Checkmate?

At a meeting yesterday in our DC, I heard strong interest in establishing a system of quality inspection checkpoints so that we can better understand where quality defects occur. I was pleased and yet a bit surprised: quality control is classic give-and-take. Yeah, it’s great to have it, but it requires real effort to make it happen. It also takes a huge leap of faith to believe that the extra steps you are taking up-front will result in a productivity savings down the road. Typically, we all become so involved in what we have to accomplish right now that it’s hard to believe it’s worthwhile to produce somewhat fewer transactions so that we can spend more time making sure that each transaction is of good quality.

It comes down to what we value: quantity or quality? Is this a case of win/lose (checkmate), or is it possible to achieve both? What would a work environment look like in which we did achieve both?

Wednesday, April 16, 2008

Hands-free scanning

We handle large, often unwieldy cartons and SKUs in our warehousing operations. Capturing the receipt of an item requires at least 6 separate scans (item code, PO, package quantity, scan to cart, scan putaway of item to location) – which in our business means taking your hands off the product, picking up the RF scanner, and locating and then scanning the appropriate bar code. You do this many thousands of times, and inevitably you drop the gun or leave it sitting where it shouldn’t be – and thus the guns are on a constant repair cycle.

I’ve been thinking lately about this problem – how could we reduce or (dare we dream?) eliminate scanning while retaining tight control of our inventory? What if we moved away from the data-limiting zebra-stripe barcodes to the more data-rich 2D (QR) codes? An entire ASN line could be encoded in this expanded data symbology, and with one scan we could verify and receive an item.
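A minimal sketch of what that single-scan payload might look like. The field names and the “|” delimiter are my own invention for illustration, not any EDI or GS1 standard:

```python
# Sketch of packing one ASN line into a single 2D-code payload, so one
# scan replaces several separate barcode scans. Field names and the
# "|" delimiter are hypothetical, not a standard.
ASN_FIELDS = ("asn_id", "po_number", "item_code", "pack_qty", "carton_id")

def encode_asn_line(line: dict) -> str:
    """Serialize one ASN line into the string a QR code would carry."""
    return "|".join(str(line[f]) for f in ASN_FIELDS)

def decode_scan(payload: str) -> dict:
    """Recover the ASN line from a single scan of the 2D code."""
    values = payload.split("|")
    if len(values) != len(ASN_FIELDS):
        raise ValueError("unrecognized payload")
    return dict(zip(ASN_FIELDS, values))

line = {"asn_id": "A1001", "po_number": "PO-553", "item_code": "SKU-778",
        "pack_qty": 4, "carton_id": "C-09"}
payload = encode_asn_line(line)   # one symbol carries the whole ASN line
received = decode_scan(payload)   # item, PO, quantity and carton in one scan
```

The point isn’t the serialization format – it’s that the receiving system gets everything it needs to verify and receive the item from a single read.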

Recent breakthroughs in voice technology (see Vangard) make me wonder about receiving by voice – completely hands-free! The receiving label could display an ID code for the ASN line: using voice tags and simple commands, the operator could receive the item, instruct the system what to do with discrepancies, and put away the item into the bin location. No juggling RF guns and product, no equipment to be damaged – and potentially we can increase accuracy, since the operator’s mind is focused on the item and its location, not on interpreting data on a label or a tiny screen.

A similar solution for picking isn’t as obvious (yet) – but just the thought of streamlining the receiving and putaway process while increasing accuracy is exciting.

Tuesday, April 15, 2008

Why is it so slow? Why is it so expensive?

I’m working with operations and IT staff on an EDI project. Trawling through the standards, I had a flashback to the late 80’s and my first EDI implementation. Back then, IT projects moved at a snail’s pace by comparison, and I remember being grateful for a standards body that produced a standardized set of transactions all industries could use. My expectations are much different now – development should be fast; it should be collaborative; existing standards and protocols should be readily available at no cost to encourage everyone to use them. It’s such a disappointment to see that the only changes have been for the worse.

And yet, I see more of a need for greater speed and collaboration and open standards as technology moves to the mobile platform. Mobile technology development isn’t just about the latest cool phone app… thinking differently about the way we develop collaborative interfaces is absolutely necessary for supply chains to reinvent themselves for the global market of the 21st century.