Friday, October 3, 2008
Agile or fragile?
Coincidentally, I attended a session this week on Agile development -- a philosophy of software development that uses an iterative process. Bill Nazzaro of IconATG delivered an entertaining and insightful presentation on Agility ("Some answers can't be found until you make some mistakes" - I love that). His description of the process reminded me of Lean Production: the Agile movement invokes the small-batch production introduced by Lean. But it flies in the face of the Lean concept of perfection. Which makes me think of our recent IT project: what would really be satisfying is a development process in which we get to change our minds about requirements all through the project, but the build at any time is satisfyingly 'perfect' -- that is, without defect.
Possible? Might be, but the user expectation would have to be aligned -- that is, the user must be invested in an iterative process. Otherwise, each iteration will be perceived as rework (waste), rather than an evolutionary process in which some parts of the (genetic) code endure and some are trashed. If the user sees each phase as rework, he starts to lose confidence in the ability of the programmer to meet his requirements -- and naturally the programmer picks up on that, and the project goes into a downward spiral of mutual unhappiness.
I think the concept of perfection -- whether we articulate it or not -- is the derailer. Everyone's hope hangs on whether something comes out right. (The problem, of course, is that without definitive requirements, there is no right -- it's perpetually subjective.) To align with an iterative process, maybe we have to agree that the output is right if it simply helps us visualize where we want to go next. Hundreds of small successes may not feel as satisfying as delivering to a huge milestone -- but maybe that's an emotional expectation that we've programmed in ourselves. Could be time for an upgrade.
Wednesday, September 17, 2008
Risky business
Friday, September 12, 2008
Reflections on learning
I think this is the crucial difference between memorization and learning. We commit lots of new information to memory every day, and it's not confronting in the least -- where you left your car keys; how to do a new task; the name of the person who sits next to you at an event. This is no different in kind than memorizing the periodic table, or the Prologue to The Canterbury Tales. Memorization doesn't require that you change how you think about anything - it's just a mechanism to stuff more data into the matrix of your brain.
Too much workplace training is just memorization. Too much of what we 'learn' in our daily lives amounts to no more than this: we just add data to the existing constructs. As an undergraduate years ago, I was naively amazed at the drop-out rate in my freshman philosophy course. For some students, challenging the way they thought (about anything) was just too much -- what I found enticing, they found repulsive. Based on what I see, there's probably a fairly large subset of our population that refuses to challenge their preconceptions -- they'll take on new data, but anything that doesn't fit existing constructs is just lost on them. However, we all have our limits: there's a point where my attempt to conceptualize something like string theory just causes a system freeze.
Thankfully, most learning isn't that challenging. Many of the questions I hear daily go beyond memorization, but fall short of string theory. Why do we have so much inventory on hold? How can we enhance the customer's experience? How am I supposed to work with {name of least favorite coworker}? Although the person asking the question may think he's asking for information that can be acted upon, these are all learning questions. They are questions that should cause internal conflict and a change in thinking. The least helpful thing is to steal the struggle by responding with a pat answer. There's so much to be learned in challenging what you think that answer should be, and the discomfort eventually is replaced by the joy of having learned something new.
Monday, September 8, 2008
You can take it with you
Friday, August 29, 2008
Lean in any language
In the early 1990s, I attended a week-long Kaizen event at a manufacturing plant in East Texas. The event was hosted by a US consulting firm, and they had invited former Toyota Production managers to lead Kaizen teams. It was a fascinating experience, not least because I was able to witness the explosive culture clash between the East Texas plant managers and the Japanese consultants. (By Thursday, the plant manager was bravely attempting to defuse the situation and prevent a mass walk-out by line managers and supervisors.) From the sidelines, it appeared to me that the plant's real issue was the wholesale dismantling and replacement of their processes by outsiders. They experienced huge changes (and equally impressive gains, by the way) with no appreciation by these outsiders that what was ripped out represented the cumulative contribution of those managers and supervisors over long periods of time. The fact that some of the outsiders were from a different culture became the flashpoint. No doubt the consulting style of the Japanese was quite different from that of the US consultants -- who were much less direct, and more considerate communicators. But xenophobia played no small part in the scapegoating; it was convenient for externalizing the frustration and hurt feelings that arose out of the project. As we left the site at the end of the week, the plant staff were threatening darkly that they intended to undo all of the Kaizen work, which would have resulted in significant financial loss to the company.
However, I couldn't see that the process methodology, or how we went about performing Kaizen, or the decisions made, were influenced by culture. In fact, that's what I liked about the process: it was data-driven and completely logical. An experiment conducted in Tokyo looks the same if it's replicated in Tucson. I found this refreshing, since most business management practices are completely culture-dependent. The challenge in any business process is how to make it work with multiple people (it's not a process if everyone does his own thing). It has to be quantifiable; it has to be replicable regardless of the individuals who perform it. That's the crux: you must create a mechanistic process that is manifested only within a social, human context. Values and social norms inform only one aspect of Lean or Kaizen: gaining the buy-in necessary for a successful implementation.
For engineers and analysts, that's always the rub. I like my chances of getting a machine to run a new sequence smoothly better than getting a team to do the same.
Friday, August 22, 2008
Chicken Feed
But does money really motivate? Certainly pay that is perceived to be unfair de-motivates, but the inverse isn’t necessarily true. The Brafmans in their book Sway lay out a compelling argument that money indeed doesn’t motivate people to do what you want them to do, and can produce quite the opposite (and seemingly irrational) response. The NYC high school results prove their point: the promise of a cash reward motivated more students to try (that is, take the test), but was ineffective at motivating the behaviors necessary to succeed on the test. So in the workplace: if we want to inspire teams to achieve breakthrough performance, perhaps we need to think outside the received wisdom.
In fact, I witnessed this just yesterday in a meeting with the DC staff. Speaking to the point of personal and professional growth, I mentioned off-hand my expectation that work was more than a paycheck – my hope is that everyone has an opportunity to grow professionally, benefit personally, and make a tangible contribution. I was unprepared for the enthusiastic response from the group to the ‘more than a paycheck’ comment; it resonated with them more than I could have expected. Now, I have no doubt that everyone in the room wants more money from his job, and I too want to see them make more money as a result of growing the business and growing professionally. But motivation is more complex than feeding the chicken more pellets, although strangely we like to think of ourselves that way.
Thursday, August 14, 2008
Same as it ever was
Wasn't it ever so? Every couple days I see a message on my machine that it's looking for 'updates' it thinks are essential. If I ask for more information about why I should install the updates it found, I get a fuzzy explanation that amounts to: Don't worry your little brain about this; we know what's best. How different it would be if instead, the message said "We've found a bug we created in the software you're running. An unassigned variable causes the application to freeze, requiring you to close and restart the application. This patch contains the fix for it." I'd love the honesty, and I'd also, strangely, give the software company more credibility just because they risked owning up to their mistakes. Even if they didn't have a fix, but knew about the problem (as with the subway payment system), wouldn't it make sense to get more minds working on the problem by letting others in on it?
We've gotta assume there are no secrets when a bug exists. Just because you don't acknowledge it, you think no one will notice? People who earn a living exploiting this vanity can only be grateful.
Monday, August 11, 2008
Uncommon sense
The team leader is convinced that his 'common sense' approach will yield improvements. I'm reading the book Sway: The Irresistible Pull of Irrational Behavior and it's given me some insight into the dynamic at play. Regardless of education or profession, people are more influenced by behavior and perceived value than they are by quantifiable, objective data when making decisions. This is so obvious and, simultaneously, stunning.
I think about the HR assessment tools that profile how a person makes a decision: fact-based, feelings-based, balanced mix? It's dawning on me that we're probably kidding ourselves, when safety experts, scientists and medical doctors (fact-based jobs if ever there were) have a clear track record of throwing facts to the wind in life-or-death decisions. This is obviously so hard-wired in the human brain that it feels quixotic to tilt against it.
Yet, acknowledging the sway of 'common sense' is necessary to understand what is required to instill the uncommon sense of fact-based decisions. Uncommon sense is artificial rather than natural, but we can still value it over the chaotic natural order of things. Rare things are often more valuable, after all.
Saturday, August 2, 2008
What were you thinking?
An article on Internet reading covers the debate on how online reading (particularly by students) stacks up against reading books. I find it striking to hear academics defending online reading’s value: these skills, they argue, will help make the next generation more employable and further, reading books is inefficient: it takes a long time to read a 400-page book and much less to scan summaries or pre-digested opinion about the book. And of course, children themselves prefer it – and, as exasperated parents see it – reading online is preferable to not reading at all.
I wonder what the word ‘reading’ means in this context. Are we simply talking about the ability to interpret written words? We’re certainly not talking about the complex thinking skills required to construct internally the argument or narrative of that 400-page book – skills that any employer of knowledge workers should be keenly interested in. How is it that we no longer expect our children to read Pride and Prejudice or To Kill a Mockingbird – which are certainly as accessible and relevant today as a generation ago when teens and young adults were expected to read them? (For this I have hard evidence: my 14-year-old has not only read these books but savored them.)
This article on reading, and one that appeared in today’s Times, point to another dynamic that I think is worth noting: an increasing expectation of controlling the narrative in one’s own real or creative life. Teens interviewed in the reading article said that they prefer reading online because they can control the narrative (if reading fiction) or the information they receive in non-fiction articles and blogs. An interactive short story site allows readers to change plot points that they don’t like. (Hey – in your version, maybe Romeo and Juliet don’t die after all!) In today’s article, I learned of a new technology tool that allows you to go immediately to voicemail: the recipient thinks you called, but you intentionally go direct to voicemail so that you don’t have to interact with the other person. This is called, I learned, ‘indirect communication,’ which “may be turning some people into digital-era solipsists more interested in broadcasting information than in real time give-and-take.” Interacting with other people opens the possibility of being challenged: maybe the other person has a different viewpoint, heaven forbid! Back in the day, one took that as an opportunity to learn from others, or at least more finely tune one's own argument. I now realize how hopelessly out of date that concept is.
I can only imagine that the logical conclusion to this is a future world much like a wonderful film I saw this week (at the always-challenging Traverse City Film Festival), Sleep Dealer, in which people willingly plug their central nervous system into a corporate network that uses their brains to direct the production of a robotized workforce in other countries. If you’re not interested in using your brain, I suppose it’s a resource that can be commoditized like anything else.
Monday, July 21, 2008
Slow thinking
Nicholas Carr, writing in The Atlantic, eloquently addresses the disturbing and far-reaching changes wrought by our increasing dependence on the Internet for information and entertainment. In his 4,000-word article, Carr develops his argument using Kubrick, Nietzsche, Socrates, Frederick Winslow Taylor, the Gutenberg printing press, as well as contemporary writers and players. Halfway through, I felt the need to print it out so that I could read it in a way that allowed me to absorb what he was saying and develop my own thoughts as I read. Actually, that just made Carr's argument: the Internet is the antithesis of 'leisurely reading or slow, concentrated thought.'
I found particularly disturbing the comments from those who had noticed they were losing the ability to read deeply, or the interest in doing so -- and these are people who had previously invested themselves in reading and thinking deeply. They are surely a small minority of the greater populace, which has for decades used television, movies and CliffsNotes as primary sources of information and entertainment.
And yet to meet the challenges of today, more than ever we need fresh and complex thinking. More is at stake: decisions in one place can affect people globally. Economic, environmental, political risks are higher, and change is happening at a pace unheard of in human history. Yet our thinking is more mechanistic, and we are more likely to accept simple solutions based on invalid logic than complex solutions that are actually more likely to succeed. If the solution requires more explanation than a sound bite, it won't be heard. But Carr's point is even more distressing: even if people could stay tuned for the entire explanation, they wouldn't be able to perform the complex thinking required to interpret and understand the content because their wetware has been reprogrammed by today's technologies.
The implications -- for business, technology, politics, our world -- are tremendous. Critical thinking, in the classic sense, is required for problem-solving. What passes for critical thinking (the bombastic criticism of talk radio and cable opinion shows) in the popular mind has assumed primacy in forming opinion, and the popular mind is grateful that it doesn't have to engage in the rigor required of true critical thinking. What a feedback loop!
Friday, July 4, 2008
2D is here
Thursday, June 26, 2008
Research while you shop
This week in a meeting amongst stores and buyers, the topic was a very stylish and well-constructed kid’s bed. The buyer had recently reduced the price (again!) from $648 to $598, and was asking the stores: what’s happening with this bed? It’s fun, has lots of storage and functionality, and the price is less than half of retail. The stores said it gets a lot of looks, but the above-$600 price tag for a kid’s bed is off-putting. So, we’re trying it at another price level, and I hope it will work.
Today I did a quick Internet search to see which retailers were selling it and at what price: every quoted price I saw was four figures. Our retail is so far below the market value it’s not funny.
Which just made me think (again) how powerful it would be to enable the customer to combine the benefits of Internet search (comparison pricing, reviews, etc.) with her in-store experience. Sure, we could allow customers to use our showroom PCs to comp shop while in the store, but it would be so much more powerful if the customer could take a picture of the item (or its 2D barcode) with her own phone to search for and display comparative prices, specs, and reviews. Although our price tags declare the comparative retail price, how much more credible that price would be if the customer could verify the actual retail from other retailers’ websites.
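To make the idea concrete, here's a minimal sketch in Python of what the phone's lookup might do once a scanned barcode resolves to an item ID. The item ID, retailer names, and prices are all invented for illustration; a real version would query retailers' sites or a price-aggregation service rather than a hard-coded table.

```python
# Hypothetical sketch: a scanned 2D barcode resolves to an item ID, which we
# look up against other retailers' quoted prices to show the shopper how our
# tag price compares. All names and numbers below are invented.

COMP_PRICES = {
    "KIDBED-4417": {"RetailerA": 1299.00, "RetailerB": 1149.00, "RetailerC": 1399.00},
}

def comparison_report(item_id: str, our_price: float) -> str:
    """Build a short comparison display for an in-store shopper's phone."""
    quotes = COMP_PRICES.get(item_id)
    if not quotes:
        return f"No comparative prices found for {item_id}"
    lowest = min(quotes.values())
    savings = lowest - our_price
    lines = [f"{store}: ${price:,.2f}" for store, price in sorted(quotes.items())]
    lines.append(f"Our price: ${our_price:,.2f} (${savings:,.2f} below the lowest quote)")
    return "\n".join(lines)

print(comparison_report("KIDBED-4417", 598.00))
```

The point of the sketch is that the credibility comes from showing the other retailers' live quotes next to ours, not from asserting a 'comparative retail' on the tag.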
Thursday, June 12, 2008
Raising the bar
I had an interesting experience today when I engaged with an SMS promotion that one of the speakers mentioned in his presentation. The SMS experience was fast and satisfying -- I had to type maybe 15 characters total over 3 fast-paced text messages that resulted in an order to be shipped to my home address (which has more than 15 characters in one address line alone). Impressive, and a powerful example. Until... I read the last message: "Your order will be shipped in 6-8 weeks." OK, this is a free sample, so the order lead time is consistent with that service. But still.
The fulfillment industry remains as challenged as ever, while marketing can move much quicker -- and will set customer expectations that only the most dedicated suppliers will be able to meet (Amazon and Dell can be exemplary, but they have worked at it from the get-go). Thing is, if I can place an order in under 30 seconds, why should I have to wait even a week, when ground service is usually 3-5 days in the US?
Truth is, fulfillment is dull stuff. And yet - it overwhelmingly determines the final impression left on our customer.
Going mobile
But with mobiles, we're finally acknowledging a responsibility to provide content that is absolutely customized for the user. I doubt this is harder (requires more effort), but it does require a fundamental change in thinking that has to persist through the entire application life cycle. Change is usually harder than work.
The more things change...
Bigfoot
I take heart, though. I notice that the hotel offers a 50% discount on valet parking rates if you drive a hybrid. Nice.
Monday, June 9, 2008
All about the front end
Friday, May 30, 2008
Eureka!
With a career-long involvement in software development and implementation, I am fascinated by the question of why ‘Software is hard.’ I suspect that only software engineers think their discipline is uniquely difficult, but any software user (that’s everyone) probably does wonder why good software is so rare.
Creating innovative software usually starts with a Eureka! moment: a single person has a vision of creating something that is, in a word, transcendent. All of the complexity is stripped away as the pure, essential solution reveals itself to the visionary. Impelled by the epiphanic euphoria of the moment, the designer crafts a high-level design that retains as much of the purity of the Eureka vision as possible, while acknowledging some of the messiness of actual data and user requirements. Through design and coding, more of the messiness of reality has to be dealt with: programmers have to bring both a creative mindset (any problem can be solved an infinite number of ways programmatically) and the discipline to create within logical structures.
The logic of even apparently simple software can be extremely dense. From a programmer’s perspective, the technical skills (programming language, operating system, database, etc.) differentiate competency far less than the cognitive skills (logical and critical thinking). Reading another person’s code gives you an intimate insight into how that person thinks (although not what that mind thinks about). This is true for one’s self as well. It’s not uncommon to revisit one’s own code written even recently and wonder at how much you knew then that you’ve now forgotten: as you re-enter the code, all of the variables have to be re-assigned in your own mind, as it were, as you immerse yourself into a mental construct that is unique to the time you spend embedded in a project.
I imagine this is no different in kind from other creative endeavors, such as architecture or orchestral composition. But we don’t expect architects or composers to be commonplace. A modest software company may easily have 30 – 50 people responsible for creating software – that is, actively engaged in the creative process. Just assembling that number of capable people is a huge undertaking. Managing the collaborative process is even more daunting. Translating that to other disciplines is unimaginable: can you imagine finding five Beethovens, much less 30, and then expecting the group of them to create Symphony No. 5? Yet, software is ubiquitous and we depend on it absolutely. Production levels must be high, since requirements change constantly and demand new solutions (software). We expect software to be continually new, and delight us with capabilities we had never imagined – as users we want to consume that Eureka moment and experience the thrill of a truly elegant and innovative solution.
With few exceptions, quality suffers as creative production rises. We lower our expectations. We’re willing to make trade-offs, and the software industry understands that. Finally delivered, the software has been re-imagined by programmers, test engineers, release managers, user input, and consultants who shape the final product. The inspired Archimedes shakes his head at what became of his vision.
Wednesday, May 14, 2008
Choice in a world of scarcity
While priorities shine a light on the preferred activities, they simultaneously obscure those activities that are now left behind ‘to do later’ – if later ever comes. The reality is that we set priorities only because there is more need than resource to meet it. When we set a priority, we accept a view that scarcity (of labor, knowledge, equipment, time, cash or materials) is inevitable. And, given the pervasiveness of this practice, we accept that scarcity in any and all areas is inevitable.
Surely this is self-fulfilling: if we never challenge the scarcity, it never goes away – in fact, priorities accommodate the scarcity so well that we ensure it never goes away, since in fact we never prioritize the scarcity itself.
What if we declared it unacceptable to set priorities: tasks are either worth doing or not worth doing, and if they’re worth doing, they’re worth doing now. Just this challenge would force the business to confront the obstacles they currently face, and acknowledge that they can choose to eliminate the obstacles. Imagine: employees would never feel the burden of tasks undone, or having to explain away a customer’s disappointment while feeling the shame of knowing that there just wasn’t enough resource to meet the customer’s very valid requirements. With time saved in not deciding what work ‘really’ has to be done, managers and teams could invest the time in challenging the fact of scarcity and actual capacity.
So, I’m challenging myself to bring this awareness every time I hear someone utter the word ‘priority.’ This should be interesting.
Thursday, May 8, 2008
The 80%
And yet, it feels accurate. The process at hand has a capacity that has remained basically unchanged for a year. The team who uses this process has been measuring productivity out of this process, and because it’s important to them, they’ve brought a lot of energy and creativity to improving the process. I don’t think there was anything wrong with any ideas that were applied – we have some very good minds working on the problem. But if we fix a problem that contributes only incrementally to the overall output, then we can only be disappointed.
It also caused me to think about how working in a management role entices you to believe that because of your role, your ideas are great – you have the experience and technical knowledge to solve problems quickly. Decisive problem-solving is seen as a hallmark of a good manager, so we reward and respect this behavior. And yet – how many of these decisions concern that 80% that contribute 20% gain? That perspective is certainly humbling for any manager.
Monday, May 5, 2008
Beta, not better
Having worked in the software industry for some time, I understand well how modifications can unmoor features that were functional and stable in the prior release. I also know that when engineering and users are more than a degree of separation, it’s all too easy for engineering to make assumptions about how and why users engage with their application – resulting in irrevocable decisions that seem (and probably are) cavalier, despite good intentions. Back in the day of proprietary software development, responsible software companies held engineering accountable to client services. Having to front up a client, angry about an upgrade that removed functionality, is actually helpful for software engineers to understand their purpose: to create software that enhances the users’ abilities or quality of life.
Coincidentally, I read today a blog entry from Mitchell Baker, President of Mozilla Foundation: “It’s hard to find someone who understands both open source software and the consumer space.” I trust her on this, but wonder how this could be? Wasn’t open source supposed to bridge the chasm between software development and users? No longer at the mercy of the monolithic software corporations, open source’s promise was to integrate the user with the development process. And when the user is a software developer, this probably does hold true. But rather than become more democratic, too often development has become solipsistic instead, and accountable to no one.
Andrew Keen’s analysis (The Cult of the Amateur) is spot-on. Good software is the result of well-defined processes managed by professionals. Proprietary vs. open source is a false choice on this: ownership and licensing are irrelevant; whether engineers work together actually or virtually is also not an issue. But development must be seen as a defined process, with a bright-line difference between a test release and the real thing. As consumers too often see ‘Beta’ at the top of their web-based applications, they are becoming accustomed to working with test-release software, and we continue to lower the bar. Is this the brave new world?
Thursday, May 1, 2008
Opening up possibility
Well, that sounds like pretty basic stuff, doesn’t it? Isn’t this the promise of every symposium or convention? Yes, but…. How many times are you able to cash in on that promise? It seems to me the key difference with this event was in framing it not operationally (new concepts – technology – processes), but within a social context. We seek to redefine the supply relationships in order to redefine the processes that link us. If we can first achieve the intangibles (improved understanding, appreciation, trust, connectedness), I have to believe the tangibles (removing waste / cost out of the supply chain) will be realized. What a departure from the sometimes brutal supply relationship models practiced in the past couple decades, which often resulted in breakdowns due to the win/lose polarity embedded within.
I’m reminded that Rosamund and Benjamin Zander refer to this as ‘the downward spiral,’ which they represent so memorably in The Art of Possibility. If one assumes a world of scarcity (finite resources), then there is no possibility for expansion. I’d like to believe in the possibility that there’s more for all, if we allow problem-solving to explode the confines of past experience while we explore a social framework that defines possibility for all parties.
Monday, April 28, 2008
Taking my own advice
Just a few days ago, I wrote here that the challenge is to embed process change within the social context we inhabit. How do I create a narrative, or even entertain (!!), while implementing this tool? Just thinking about it takes me down a completely different path than the one my old-school teaching mind imagines (drawing equations with Greek symbols on a whiteboard…what was I thinking?).
Now I’m thinking about it from the learner’s perspective: why would I need this information? Why would I want this information? How will my life be better if I know this? How could I explain this to someone else in my department? The task now is to discover the human story in the solution, and tell it memorably.
This is much harder than writing the code.
Thursday, April 24, 2008
Stock-outs
I used to work in the fastener industry, where you knew the potential of your top sellers: there’s no mystery surrounding the demand for a ¼” flat washer. The comparative reliability of supply in that industry supports a stability in demand that is all too absent in the furniture industry. The entire supply chain – from customer to manufacturer – seems to accept the unpredictability of supply. A customer shopping for an end table may be disappointed to leave a store empty-handed, but she probably entered with low expectations. If her local grocery failed to supply her need for a household staple, she would be outraged to leave empty-handed – largely because the US food industry has raised our expectations of its ability to meet our demands. Why shouldn’t our customers have the same expectations?
Sunday, April 20, 2008
A simple idea
Maybe the objective – to organize the supply chain along a serialized set of transactions informed by mathematical projections – contains the seed of its own failure. The supply world is organic and unpredictable. Engineers see this as the challenge: let’s impose order and predictability! This works, to a degree – defined by the constraints you imposed when you fit the messy world into a statistical model. And then you become frustrated by the limitations you imposed.
I’m thinking now that a social model is more informative than a mechanical model. At a real basic level, we’re talking about people, not systems. People buy stuff – even when they follow departmental guidelines, they’re people: emotional, intuitive, cognitive, yet likely to make mistakes. In the supply world, there are individuals and groups of people interacting sometimes physically, sometimes virtually. All of these people are following some idea of a process – whether a customer or a truck driver or a production line supervisor – while they act in a personal and very individual way. Attempting to mechanize their actions and decisions is futile. Success for any of them is simply that the outcome of what they did was good: the customer found what she was looking for (at an acceptable price); the driver arrives on time without mishap; the supervisor’s line production and employee morale are high. So how can all of them be more successful?
You don’t have to over-think this one. We’re social animals. We naturally create organizing principles for living with each other. We communicate, tell stories, teach, learn, entertain. We share (sometimes) and take (sometimes). We create ideas about what we experience, and then change those ideas when experience changes our thinking. We make tools. When the tools break, we make new ones – and you have to count on the tool breaking; it will.
So how do I … (forecast demand… plan resources… manage an assortment… fill in your own need…)? Let’s not over-think it, or try to find the one super concept that will solve the problem for everyone, forever. Before we make a tool, let’s use those social attributes (communicate, tell stories, teach, learn, entertain) and tap into the combined experience of the people in our social group. When we make the tool, we’ll accept that it’s just one way of solving the problem. We also have to accept that not everyone in the group will be able to use the tool expertly every time, so the tool’s design has to cater for that. Then, be prepared to keep making it anew.
I think it’s that simple. And that complex.
Friday, April 18, 2008
Checkpoint or Checkmate?
It comes down to what we value: quantity or quality? Is this a case of win/lose (checkmate), or is it possible to achieve both? What would a work environment look like in which we did achieve both?
Wednesday, April 16, 2008
Hands-free scanning
I’ve been thinking lately about this problem – how could we reduce or (dare we dream?) eliminate scanning while retaining tight control of our inventory? What if we moved away from the data-limiting zebra-stripe barcodes, to the more data-rich 2D (QR) codes? An entire ASN line could be encoded in this expanded data symbology, and with one scan we could verify and receive an item.
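As a rough sketch of the 'entire ASN line in one symbol' idea: the fields could be serialized into a compact delimited payload, carried in the QR code, and parsed back in a single scan at receiving. The field layout below is my own invention for illustration, not any EDI or ASN standard.

```python
# Hypothetical sketch: pack an ASN line into a single delimited payload that
# could be carried in a 2D (QR) code, and parse it back on scan.
# The field layout is invented for illustration, not an EDI/ASN standard.

ASN_FIELDS = ("asn_id", "po_number", "sku", "qty", "bin")

def encode_asn_line(line: dict) -> str:
    """Serialize an ASN line to a compact pipe-delimited payload."""
    return "|".join(str(line[f]) for f in ASN_FIELDS)

def decode_asn_line(payload: str) -> dict:
    """Parse a scanned payload back into ASN line fields."""
    values = payload.split("|")
    if len(values) != len(ASN_FIELDS):
        raise ValueError(f"Malformed ASN payload: {payload!r}")
    line = dict(zip(ASN_FIELDS, values))
    line["qty"] = int(line["qty"])
    return line

payload = encode_asn_line(
    {"asn_id": "A1001", "po_number": "PO-553", "sku": "TBL-0042", "qty": 12, "bin": "A-14-3"}
)
print(payload)                          # A1001|PO-553|TBL-0042|12|A-14-3
print(decode_asn_line(payload)["qty"])  # 12
```

One scan would then hand the receiving system everything it needs to verify and receive the line, instead of the operator keying or cross-checking each field.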
Recent breakthroughs in voice technology (see Vangard) make me wonder about receiving by voice – completely hands-free! The receiving label could display an ID code for the ASN line: using voice tags and simple codes, the operator could receive the item, instruct the system what to do with discrepancies using specific commands, and put the item away in its bin location. No juggling RF guns and product, no equipment to be damaged, and potentially we can increase accuracy since the operator’s mind is focused on the item and its location – not on interpreting data on a label or on a tiny screen.
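A minimal sketch of what the spoken-command grammar might look like on the system side. The commands and codes here are invented for illustration; real voice middleware would handle the speech recognition and confirmation prompts in front of something like this.

```python
# Hypothetical sketch of a tiny spoken-command grammar for hands-free
# receiving. Commands and codes are invented for illustration; a real voice
# system (speech recognition, read-back confirmation) sits in front of this.

def parse_voice_command(utterance: str) -> dict:
    """Map a recognized utterance to a structured receiving action."""
    words = utterance.lower().split()
    if not words:
        raise ValueError("empty command")
    verb = words[0]
    if verb == "receive" and len(words) == 2:
        return {"action": "receive", "qty": int(words[1])}
    if verb == "short" and len(words) == 2:
        return {"action": "discrepancy", "type": "short", "qty": int(words[1])}
    if verb == "putaway" and len(words) == 2:
        return {"action": "putaway", "bin": words[1].upper()}
    raise ValueError(f"unrecognized command: {utterance!r}")

print(parse_voice_command("receive 12"))      # {'action': 'receive', 'qty': 12}
print(parse_voice_command("putaway a-14-3"))  # {'action': 'putaway', 'bin': 'A-14-3'}
```

Keeping the vocabulary this small is deliberate: a handful of unambiguous verbs is easier for both the recognizer and the operator than free-form speech.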
A similar solution for picking isn’t as obvious (yet) – but just the thought of streamlining the receiving and putaway process while increasing accuracy is exciting.
Tuesday, April 15, 2008
Why is it so slow? Why is it so expensive?
And yet, I see more of a need for greater speed and collaboration and open standards as technology moves to the mobile platform. Mobile technology development isn’t just about the latest cool phone app… thinking differently about the way we develop collaborative interfaces is absolutely necessary for supply chains to reinvent themselves for the global market of the 21st century.