Sunday, January 25, 2009
New year, new administration, global and domestic outlook not so nice. I've been reading lots of blog entries recently, and everyone I'm reading is trying to find an upside. Umair Haque's new rules for creating value in a changed world are an example ("Innovating from Constraint in the Developing World"; "Detroit's 6 Mistakes and How Not to Make Them"). If a business can no longer succeed using old paradigms in established markets, it looks to new markets and (one hopes) reshapes its paradigm to exploit them. Haque is not alone in his advice, just a cogent voice worth heeding. And yet I wonder what business leadership makes of it all.
I'm reminded of previous market downturns, and the businesses I was working in at the time. Invariably, I found myself in a boardroom in which the company's leadership worked diligently to locate itself on the life cycle curve and redefine what its 'real' business was (as in: "We're not in the buggy whip business; we're in the business of motivating horses!"), the better to frame its marketing. None of those businesses actually reinvented itself, although each survived the specific downturn -- and later, it all seemed rather foolish. The new mission statements and value propositions didn't necessarily lift the business out of the general economic muddle, but coming up with them did focus the leadership on something positive -- so there was no doubt some benefit to the employees in hearing a message of hope and sustainability.
But what's actually needed now is disruptive leadership: breaks with the past and its assumptions. As I read thought leaders such as Haque, I'm reminded (again) of the Zander exhortation to choose to live in a world of abundance rather than scarcity. As we look forward to what can only be an extended period of economic recovery, it is too easy to feel the world has contracted, and one's immediate world diminished. And yet that viewpoint is simply one choice among many. Shift your perspective 45 degrees, and you can just as easily see a world of abundance. It does force you to put the world you're used to viewing on pause, and step outside the frame. What's required is innovation that leverages constraint to create abundance. In recent history, businesses have routinely outsourced their 'innovation' to consultants, as if it were something that could be purchased. What's needed is not CliffsNotes; you have to do the work yourself. This is an exercise that does not reward years of industry experience -- not knowing that you can't do something should command a premium, it seems to me.
Leaders need to think outside the boardroom. Get into your teenage mind, when you were so much smarter than the dead men in your history books, feeling absolutely bulletproof and so much more confident than the person you became ten years later. You weren't thinking about mortgages and daycare and surviving the annual review. Somewhere along the way you accepted the world of scarcity. What if you let go of that illusion and re-entered the world of abundance you once abandoned?
This week I've been listening to Beethoven's 5th and 7th Symphonies, and found my heart leaping at the beauty of this very familiar music, made completely new for me. Conducted by Benjamin Zander, the orchestra created a world in a different time: same notes, different tempi. This seemingly subtle change recreated the work; it was as if I were hearing the music for the first time. Change your personal time signature; explore the new associations created by disruption and unlikely juxtaposition; change the world.
Wednesday, January 14, 2009
Through the looking glass
Reading this morning's newspaper was an odd, Alice-like experience. First, alarmingly, a story that carries on the madness of using drugs for cosmetic enhancement: the pharmaceutical company that brought you Botox is now marketing its glaucoma drops for growing longer eyelashes. It sometimes seems that I've stepped into the darkest, most dystopian science fiction novel of my youth. I'm appalled by what seems our essential stupidity and superficiality.
But then, a story that took my breath away, in awe of the courage of Afghan schoolgirls (and their parents and teachers) determined to continue their education despite being terrorized and disfigured by acid attacks a month ago.
“My parents told me to keep coming to school even if I am killed,” said Shamsia, 17, in a moment after class. Shamsia’s mother, like nearly all of the adult women in the area, is unable to read or write. “The people who did this to me don’t want women to be educated. They want us to be stupid things.”
How transformative it would be to refocus our scientific endeavors from the trivial and cosmetic, benefiting few in insignificant ways, to solving problems that would benefit all of humanity and leave the world better for it. But that would require us not to 'be stupid things.' That may be a tall order.
Monday, January 5, 2009
Mother's Little Helper
A Commentary ("Towards responsible use of cognitive-enhancing drugs by the healthy", Nature.com, 12/7/2008) in Nature's online magazine has generated a lot of discussion off- and online in the past month. The latest I read was an opinion piece by Judith Warner in The New York Times, which is a thoughtful response and yet fairly typical in addressing the Commentary's argument that this use is 'morally equivalent' to other cognitive-enhancing behaviors such as drinking coffee, getting enough sleep, exercising etc. I'm not particularly interested in the moral reading of using drugs to enhance mental performance. I'm much more interested in the nature of these enhancements.
According to the authors, the drugs (such as Ritalin, Adderall, and Provigil) act as stimulants that improve a healthy person's alertness, focus and memory use. People who need a short-term brain-function 'enhancement' -- such as students taking a final exam or a physician on night call -- would, for the sake of argument, benefit. And so, in the popular mind, these are now 'smart pills': if use of these drugs helps a student perform better on a test, why shouldn't we all take them to be the best we can be? The authors even state that "...many different kinds of employee may benefit from enhancement and want access to it...".
So, apparently, did Arthur Conan Doyle believe that cocaine enhanced his fictional detective's already considerable mental faculties. In the 1980s in the US, this drug's popular use in creative fields led to heartbreaking losses and - more mundanely - some truly awful creative work. The syllogism is thus: John Belushi was brilliant; he did drugs; doing drugs makes you brilliant. May I suggest that anyone who fails to see the fault in that logic will definitely NOT benefit from taking cognitive-enhancing drugs?
My worry is not that this is the beginning of profligate use of drugs by healthy people, or that the bar for being 'smart' will become unattainable for those who can't afford the drugs. I worry that this argument will result in over-stimulated, under-disciplined brains that over-value their own brilliance.
The kinds of problems we rely on employees to solve in most workplaces do not require the cognitive feats of a student taking a final exam. The ability to access memory banks doesn't command a premium in the workplace, where data is a few clicks away on the office file server or the Internet -- you really don't have to remember it all, and you don't get extra points if you do. The problems in business cannot be solved by a textbook: how do we increase throughput while decreasing costs? What will it take to capture more market share? How can we increase engagement and retention of staff? Businesses struggle with problems daily, and rely on the brainpower of their people to solve them. When a business does more struggling than solving, it's usually not for lack of caffeine, or because there aren't enough geniuses on the payroll. The question is how the people it employs use their native intelligence.
At least a generation of workers has been nurtured by an educational system that has taught to the test: memorization has had primacy. During this time, the obsolescence of technical knowledge has occurred with unprecedented speed. We've rewarded those students who were best able to commit to memory data with a half-life of maybe a couple of years. Teaching students how to synthesize ideas, how to make their thinking more plastic, how to apply rigor to critical thinking -- these efforts cannot so easily be measured, and therefore are not valued. And yet, aren't the problems we need to solve in industry and in the world just the types that require more than memorization? They require flexible, inventive minds that work in a structured manner. They require the ability to work on problems over lengthy periods of time, much longer than the duration of a drug dose.
Of course, this type of cognitive enhancement is not on offer from the pharma industry. Unfortunately it's not regularly on offer from our educational system either. As long as we're happy to accept our lot as consumers of other nations' inventions, we can take our pills and feel OK about that.
Friday, October 3, 2008
Agile or fragile?
We have a small project right now that is proving frustrating to the business and to IT, as we continue to rework code to meet the user's changing expectations. We are (today) hopeful we're almost there, but no one is happy with the experience. Neither party is finding fault with the other (at least out loud). Intentions are good, but the outcomes just aren't what either department expected.
Coincidentally, I attended a session this week on Agile development -- a philosophy of software development that uses an iterative process. Bill Nazzaro of IconATG delivered an entertaining and insightful presentation on Agility ("Some answers can't be found until you make some mistakes" -- I love that). His description of the process reminded me of Lean Production: the Agile movement invokes the small-batch production introduced by Lean. But it flies in the face of the Lean concept of perfection. Which brings me back to our IT project: what would really be satisfying is a development process in which we get to change our minds about requirements all through the project, but the build at any time is satisfyingly 'perfect' -- that is, without defect.
Possible? Might be, but the user expectation would have to be aligned -- that is, the user must be invested in an iterative process. Otherwise, each iteration will be perceived as rework (waste), rather than an evolutionary process in which some parts of the (genetic) code endure and some are trashed. If the user sees each phase as rework, he starts to lose confidence in the ability of the programmer to meet his requirements -- and naturally the programmer picks up on that, and the project goes into a downward spiral of mutual unhappiness.
I think the concept of perfection -- whether we articulate it or not -- is the derailer. Everyone's hope hangs on whether something comes out right. (The problem, of course, is that without definitive requirements, there is no 'right' -- it's perpetually subjective.) To align with an iterative process, maybe we have to agree that the output is right if it simply helps us visualize where we want to go next. Hundreds of small successes may not feel as satisfying as delivering to a huge milestone -- but maybe that's an emotional expectation we've programmed into ourselves. Could be time for an upgrade.
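To make that concrete, here's a minimal sketch in Python -- mine, not anything from Nazzaro's presentation -- of what 'small batches that stay green' might look like. Each iteration applies one change paired with its test; a changed requirement replaces its old test (some of the 'genetic' code endures, some is trashed); and a change is accepted only if the whole suite still passes, so the build is releasable at every step. All names here are hypothetical.

    def all_green(tests, build):
        """True only if the whole current test suite passes against the build."""
        return all(check(build) for check in tests.values())

    def iterate(build, tests, feature, change, check):
        """One small batch: apply a single change, pair it with its test,
        and keep it only if the whole suite stays green."""
        candidate = dict(build)
        candidate[feature] = change
        new_tests = dict(tests)
        new_tests[feature] = check      # a changed requirement replaces its old test
        if all_green(new_tests, candidate):
            return candidate, new_tests # accept: the build is still defect-free
        return build, tests             # reject: the last good build stands

    # Usage: the user changes their mind mid-project; the accepted build
    # never leaves a "green" (all tests passing) state.
    build, tests = {}, {}
    build, tests = iterate(build, tests, "greeting", "hello",
                           lambda b: b["greeting"] == "hello")
    build, tests = iterate(build, tests, "greeting", "hi",   # requirement changed
                           lambda b: b["greeting"] == "hi")
    print(build)  # -> {'greeting': 'hi'}

Toy mechanics aside, the point is that iteration stops feeling like rework once every accepted change leaves a defect-free build behind.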
Wednesday, September 17, 2008
Risky business
I followed a spirited discussion on the user group forum hosted by our ERP vendor. Users were clamoring for development to embrace iPhone integration, which led to a general call for the vendor to broaden its scope beyond Windows client applications. ERP company representatives explained (ever so patiently) why they aren't going there, and why the users should see this reluctance as a good thing. With every posting by the software company, you could see the frustration level rise amongst the users. Since I've sat in both chairs, I know the frustration from each perspective. User: Why can't you offer more flexibility for my business needs? Vendor: Have you any idea how much this would cost? It has no quantifiable commercial value -- how do I pay for this?
If the market ultimately demands the new thing, the vendor will develop it -- but that is a reactive strategy. In the '80s and early '90s, ERP companies could get away with that. The market accepted long lead times for development, and highly valued stability and risk avoidance. The market's in another place now, and users have much different expectations. They look to their technology vendors to bring the future to them. Sometimes (and only sometimes) the new new thing delivers perceivable value and takes off like a rocket. But the failure rate is high.
Someone has to bear the risk, and maybe it's time for ERP vendors to think differently about the value their clients expect from them. Rethinking the relationship may lead to rethinking the contracts that underpin it, which fund development. Sometimes it's possible to be safe and sorry.
Friday, September 12, 2008
Reflections on learning
A teacher at my daughter's new high school shared a pearl this week: when kids are trying to learn something, he said, "Don't steal their struggle." As a parent who too frequently acts the kleptomaniac when her kid hesitates with an answer, this struck me to the core. My urge to help is so unhelpful, and this comment really brought it home to me.
So this was top of mind in my dialogues with staff today. I'm ever ready to weigh in (... well, I do keep a blog...), but acting on that impulse is probably none too helpful when someone is working to understand something. Another person really can't help you understand. Wrapping your mind around new data requires something like a dialectical struggle. It's not comfortable, and sometimes it feels almost physically painful -- the hope is that you learn something from the conflict between what you thought you knew and the new information.
I think this is the crucial difference between memorization and learning. We commit lots of new information to memory every day, and it's not confronting in the least -- where you left your car keys; how to do a new task; the name of the person who sits next to you at an event. This is no different in kind from memorizing the periodic table, or the Prologue to The Canterbury Tales. Memorization doesn't require that you change how you think about anything -- it's just a mechanism for stuffing more data into the matrix of your brain.
Too much workplace training is just memorization. Too much of what we 'learn' in our daily lives amounts to no more than this: we just add data to existing constructs. As an undergraduate years ago, I was naively amazed at the drop-out rate in my freshman philosophy course. For some students, challenging the way they thought (about anything) was just too much -- what I found enticing, others found repulsive. Based on what I see, there's probably a fairly large subset of our population that refuses to challenge their preconceptions -- they'll take on new data, but anything that doesn't fit existing constructs is simply lost on them. We all have our limits, though: there's a point where trying to conceptualize something like string theory just causes a system freeze.
Thankfully, most of it isn't that challenging. Many of the questions I hear daily go beyond memorization but fall short of string theory. Why do we have so much inventory on hold? How can we enhance the customer's experience? How am I supposed to work with {name of least favorite coworker}? Although the person asking may think he's asking for information that can be acted upon, these are all learning questions -- questions that should cause internal conflict and a change in thinking. The least helpful thing is to steal the struggle by responding with a pat answer. There's so much to be learned in challenging what you think that answer should be, and the discomfort eventually is replaced by the joy of having learned something new.
Monday, September 8, 2008
You can take it with you
This time of rapid technological development is most satisfying when you see your own visions made real by innovators around the world. For some time I've imagined a future in which we carry a single device that is a personal extension of ourselves -- a mobile daemon that is a virtual self. We would use this device to entertain ourselves, read, learn, engage with others, and transact commerce. It would do what our multiple devices do today -- MP3 player, laptop, PDA, mobile phone -- while being as portable as we need it to be (small enough to slip into a jeans pocket when necessary, but large enough to read screens of text easily).
A very clever company called Modu has made this a reality using modular design: the very small modu phone slips into jackets -- called modu mates -- that provide the physical interface required for various tasks. This is a glimpse into the future. Based on the company's representation of its product (it hasn't yet entered the US market), Modu appears to have designed an extraordinary solution. Design is a key term here -- the modu and its mates are very design-conscious, striving for the ultra-cool stratosphere that is Apple's domain. How the software enables a well-integrated user experience is also critical -- and even more, how well server-based applications enable a completely mobile experience.
In the meantime, I'm still imagining where this can go - and enjoying the speed of development.