The Lesson of “Don’t forget all the parts move”
Today's WSJ has a book excerpt about the demise of RIM/Blackberry. It is a fascinating story, but it also carries a core lesson for product managers (myself included): the lesson of "don't forget all the parts move".
While hindsight is always 20/20, when you are faced with a potentially disruptive situation you have to take a step back and revisit nearly all of your assumptions, foundational or peripheral, because whether you see it or not, they are all going to face intense reinvention.
In disruption theory we always talk about the core concept that disruptive products are better at some things but worse at many of the things (tasks, use cases, features) currently served by the incumbent product. This is the basis of the disruption itself. In reading the excerpt it is clear that, out of the gate, this is how the RIM executives chose to view the iPhone at its introduction: as targeting a different market segment or different use cases:
If the iPhone gained traction, RIM’s senior executives believed, it would be with consumers who cared more about YouTube and other Internet escapes than efficiency and security. RIM’s core business customers valued BlackBerry’s secure and efficient communication systems. Offering mobile access to broader Internet content, says Mr. Conlee, “was not a space where we parked our business.”
There’s a natural business reaction to want to see a new entrant through the lens of a subset of your existing market. Once you can do that you get more comfortable doing battle in a small way rather than head-on. You feel your market size will trump a “niche” player.
The problem is that such a perspective assumes a static view of the market. You're assuming that all the other attributes of your implementation will remain advantaged and that the new competitor will fail to translate that single advantage into a broader attack.
What happens, almost all the time, in technology is that disruptive entrants gain ecosystem momentum. There's finite bandwidth among the best people (engineers, partners, channel) to improve, integrate, and promote products. Once the new product appears compelling in some way, there's a race to gain a perceived first-mover advantage. Or said another way, the leaders of the old world were already established, so a new platform yields a new chance to be a leader. There's a mad dash to execute whether you're building leather cases, integrating line-of-business systems, or selling the product.
When I read that first quote, I thought how crazy it was to think that the rest of the internet, which includes email and messaging, would not race to try to establish new leadership in the space. The assumption that everyone is sitting still is flawed. Or just as likely, many of those incumbents will choose to assume their small part of the Blackberry world will move ahead unscathed.
In a platform transition, everything is up for grabs. If you’re the platform you have to change everything and not just a few little things. First, no matter what you do the change is still going to happen. It means that you don’t have the option of doing nothing. Once a new platform gains momentum and you start losing your partners (of all kinds) or can no longer attract the top talent to the platform you have seen the warning sign and so has everyone else.
As Blackberry learned, you can't just change a few things and hope that grafting what you perceive as the one missing piece onto your platform will make the competitor go away. You can see how this played out in the introduction of the Storm device, which aimed to add a bigger screen while maintaining the Blackberry keyboard feel. In other words, the perception was that the screen was the one thing that differentiated the device.
The browser was painfully slow, the clickable screen didn’t respond well in the corners and the device often froze and reset. Like most tech companies launching a glitchy product, RIM played for time. Verizon stoked sales with heavy subsidies, while RIM’s engineers raced to introduce software upgrades to eliminate Storm’s many bugs. “It was the best-selling initial product we ever had,” says Mr. Lazaridis, with 1 million devices sold in the first two months. “We couldn’t meet demand.”
Storm’s success was fleeting. By the time Mr. Balsillie was summoned to Verizon’s Basking Ridge, N.J., headquarters in the spring of 2009 to review the carrier’s sales data, RIM’s senior executives knew Storm was a wipeout. Virtually every one of the 1 million Storm phones shipped in 2008 needed replacing, Verizon’s chief marketing officer, John Stratton, told Mr. Balsillie. Many of the replacements were being returned as well. Storm was a complete failure, and Mr. Stratton wanted RIM to pay.
Of course we know now that there were many more elements of the iPhone that changed and it was no single feature or attribute. Every platform shift involves two steps:
- Introduction of a new platform that does some new things but does many existing things in a suboptimal way.
- Evolution of the new platform to achieve all those old scenarios but in new ways that often look like “hey we had that back then”. For example, consider the rise of secure messaging, mobile device management, and new implementations of email. All of these could be viewed as “Blackberry features” just done in a totally different way.
That’s why all the parts are moving, because everything you ever did will get revisited in a new context with a new implementation even if it (a) means the use case goes unanswered for a while and (b) the execution ends up being slightly different.
On a personal note, I was a Blackberry user from the earliest days (because our team made Outlook and the initial Blackberry was a client-side integration). When I saw the iPhone I was one of those people fixated on the keyboard. I was certain it would fail because I couldn’t peck out emails as fast as I could on Blackberry. In fact, I even remember talking about how Windows phones at the time had touchscreens so if that became popular we would have that as well. That summer, I waited on line to pick up my iPhone and was convinced of the future in just a few minutes.
You would have thought I would have been prepared. Previously, I had experienced a similar lesson. I had yet to be convinced of the utility of the internet on a phone, which the iPhone also solved. Of course my lens was clouded by the execution of the phones I used most (Blackberry and Windows) and the fact that the internet didn't want to work on small screens and without Flash. I would visit Japan several times a year and see the DoCoMo i-mode phones and was a big skeptic—my friends from Japan still make fun of me for not seeing the future (by the way, at that time SMS had yet to even gain traction in the US and friends from Europe found that mysterious). What I failed to recognize was that in the i-mode implementation a full ecosystem solved the problem by moving all the parts around. Of course i-mode got disrupted when the whole of the internet moved to mobile. So perhaps it wasn't just me. No matter what happens, someone always said it would. But saying it would happen and acting are very different things. Though I do recall many exchanges with Blackberry execs trying to convince them to have a great browser once I used the iPhone.
The lesson always comes back to underestimating the power of ecosystem momentum and the desire and ability of new players to do new things on a new platform.
A while back I made a list of all the moving parts of the Blackberry collapse. You can read it here, Disruption and woulda, coulda, shoulda.
Patience, IoT Is the New “Electronic”
The “Internet of Things” or IoT is cool. I know this because everyone tells everyone else how cool it is. Ask anyone and they will give you their own definition of what IoT means and why it is cool. That’s proof we are using a buzzword or are in a hype-cycle.
Much is at stake to benefit from, contribute to, or even control this next, next-generation of computing. If a company benefitted from 300 million PCs a year, that’s quite cool. If another company benefitted from 1 billion smartphones a year, then that’s pretty cool.
You know what is really cool? Benefitting from 75 billion devices. That certainly explains the enthusiasm for the catchphrase.
Missing out on this wave is uncool. Just take a look at the CNBC screen shot to the left. That’s what we talked about in the Digital Innovation class at HBS last week and what motivated this post.
In an effort to quantify the opportunity, claim leadership, or just be included amongst those who "get it", we are all collectively missing the fact that we really don't know how this will play out in any micro sense. It is safe to say everything will be connected to the internet. That's about it. As Benedict Evans says, counting connected devices is a lot like counting how many electric motors are in your home. In the early days that was cool. Today, it seems silly. Benedict's excellent post also goes into detail, asking many good questions about what being connected might mean, and here I build on our in-class discussion.
One way to view the history of “devices” is through two generations in the 20th century. For the first 50 years we had “analog motor” devices that replaced manual mechanical devices. This was the age of convenience brought by motors of all kinds from giant gas motors that produced electricity to tiny DC motors that powered household gadgets and everything in between. People very quickly learned the benefits of using motors to enhance manual effort. Though if you don’t think it was a generational shift, consider the reactions to the first labor saving home appliances (see Disney’s Carousel of Progress).
The next 50 years were about "digital electronics", which began with the diode, then the transistor, and then the microprocessor. What is amazing about this transition is how many decades passed before the full transformation took place. Early on, electronics replaced analog variants. Often these were viewed as luxuries at best, or inferior "gadgets" at worst. I recall my father debating with a car dealer the merits of "electronic fuel injection". Many of us reading this certainly recall (or still believe) the debate over the quality of digital music relative to analog LPs and cassettes. Interestingly, the benefits we all experience today of size, weight, power consumption, portability, and more took years to gain acceptance. We used to think about "repairing" a VCR and how awful it was that you could not repair a DVD player. Go figure. The key innovation insight is that the benefits of electronics took decades to play out and were not readily apparent to all at the start.
We find ourselves at the start of a generation of invention where everything is connected. We are at the early stages where we are connecting things that we can connect, just like we added motors to replace the human turning the crank on a knitting loom. Some inventions have the magic of the portable radio—freedom and portability. Some seem as gimmicky as that blender.
Many things we all know and love today have already been transformed by "first generation" connectivity.
For the next few years, thousands of innovators will embark on the idea maze (Chris Dixon summarizes Balaji Srinivasan’s lecture). This is not just about product-market fit, but about much more basic questions. Every generational change in technology introduces a phase of crazy inventing, and that is where we are today with IoT.
This means that for the next couple of years most every product or invention, at first glance, might seem super cool (to some) and crazy to most everyone else. Then after a little use or analysis, more sober minds will prevail. The journey through the idea maze and engineering realities will continue.
This also means that every "thing" introduced will be met with skepticism from the broader, less tech-enthused market (like our diverse classroom). Every introduction will seem more expensive, more complex, and more superfluous than what is currently in use. In fact, it is likely that even the ancillary benefits of being connected will be lost on most everyone.
That almost reads like the definition of the innovator's dilemma. Nothing sums this up more than how people talk about smart "watches", connected thermostats, or robots. One either immediately sees the utility of strapping a sub-optimal smartphone you have to charge midday to your wrist, or asks why you can't just look at your phone's lock screen for the time. One looks at a Nest thermostat and asks why paying 10X for the luxury of having a professional HVAC installer get stumped, or having to "train" something you used to set and forget, is such a good idea.
We find ourselves in the midst of a generational change in the technology base upon which everything is built. It used to be that owning an "electric" or "electronic" thing sounded modern and cool simply because such things were so unique. That's why adding "connected" or "smart" to a product is going to sound about as silly as saying "transistor radio" or "electronic oven".
Every thing will be connected. The thing is, we collectively have neither mastered connecting a thing without some downside (cost, weight, complexity) nor even figured out what we would do once something is connected. What are the equivalents of size, weight, reliability, ease of manufacturing, and more when it comes to connectivity? Today we do the "obvious", such as using the cloud for remote relay, access, and storage. We write an app to control something over WiFi rather than build in a physical user interface. We collect and analyze data to inform usage or future products. There is more to come. How will devices be connected to each other? How will third parties improve the usage of things and just make them better? Where do we put the "smarts" in a thing when we have thousands of things? How might we find we are safer, healthier, faster, and even just happier?
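To make that first-generation "obvious" pattern concrete, here is a minimal sketch of controlling a thing through a cloud relay: a phone app publishes a command to a broker, and the device, sitting on home WiFi, subscribes and acts on it. This is only a sketch under assumptions: it presumes an MQTT broker and the paho-mqtt client library (1.x-style constructor shown), and the hostname, topic, and payload below are hypothetical placeholders rather than any particular product's API.

```python
# A minimal sketch (not any vendor's API) of the first-generation pattern:
# the app publishes a command to a cloud broker; the "thing" on home WiFi
# subscribes and applies it. Assumes an MQTT broker at BROKER_HOST and the
# paho-mqtt library (1.x-style Client()); all names are placeholders.
import json

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"     # hypothetical cloud relay
TOPIC = "home/living-room/lamp/set"    # hypothetical per-device topic


def publish_command(brightness: int) -> None:
    """What the phone app does: send a command via the cloud relay."""
    app = mqtt.Client()
    app.connect(BROKER_HOST, 1883)
    app.publish(TOPIC, json.dumps({"on": True, "brightness": brightness}))
    app.disconnect()


def run_device() -> None:
    """What the 'thing' does: listen for commands and apply them."""
    def on_message(client, userdata, msg):
        command = json.loads(msg.payload)
        # A real device would drive hardware here; the sketch just logs it.
        print("applying command:", command)

    device = mqtt.Client()
    device.on_message = on_message
    device.connect(BROKER_HOST, 1883)
    device.subscribe(TOPIC)
    device.loop_forever()
```

Even in this tiny sketch, most of the open decisions (where the broker lives, who defines the topics and payloads, how third parties plug in) sit outside the device itself, which is exactly the unresolved part of the questions above.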
We just don’t know yet. What we do know is that a lot of entrepreneurs and innovators across companies are going to try things out and incrementally get us to a new connected world, which in a few years will just be the world.
The Internet of Things is not about the things or even the platform the same way we thought about motors or microprocessors. The big winners in IoT will be thinking about an entirely different future, not just connecting to things we already use today in ways we already use them.
Startups aren’t features (of products or companies)
Companies often pay very close attention to new products from startups as they launch and ponder their impact on the company's at-scale, mainstream work. Almost all of the time the competitive risk is deemed minimal. Then one day the impact is significant.
In fact, up until such a point most pundits and observers likely said that the startup would get overrun or crushed by a big company in the adjacent space. By then it is often too late for the incumbent, and what was a product challenge now looks like an opportunity to take on the challenges of venture integration.
Why is this dynamic so often repeated? Why does the advantage tilt to startups when it comes to innovation, particularly innovation that disrupts the traditional category definition or go to market of a product?
Much of the challenge described here is rooted in how we discuss technology disruption. Incumbents are faced with "disruption" on a daily basis and from all constituencies. To a great degree, as an incumbent the sky is always falling. For every product that truly disrupts there are likely hundreds of products, technologies, marketing campaigns, pricing strategies, and more that some were certain would be the last straw for an incumbent.
Because statistically new ideas are not likely to disrupt and new companies are likely to fail, incumbents become experts at defining away the challenges and risks posed by a new entrant into the market. Incumbents view wild swings in strategy or execution as a much higher risk than the 1-in-100 odds of a new technology upending the near-term business. Factor in any reasonable timeline and the incumbent has every incentive to side with the statistics.
To answer “why startups aren’t features” this post looks at the three elements of a startup that competes with an incumbent: incumbent’s reaction, challenges faced by the incumbent, and the advantages of the startup.
Reaction
When a startup enters a space thought (by the incumbent or conventional wisdom) to be occupied by an incumbent, there is a series of reasonably predictable reactions. The more entrenched the incumbent, the more reasoned and bulletproof the logic appears to be. Remember, most technologies fail to take hold and most startups don't grow into significant competitors. I've personally reacted to this situation as both the startup and the incumbent.
Doesn’t solve a problem customers have. The first reaction is to just declare a product as not solving a customer problem. This is sort of the ultimate “in the bubble” reaction because the reality is that the incumbent’s existing customers almost certainly don’t have the specific problem being solved because they too live in the very same context. In a world where enterprises were comfortable sending PPT/PDFs over dedicated lines to replicated file servers, web technologies didn’t solve a problem anyone had (this is a real example I experienced in evangelizing web technology).
Just a feature. The first reaction to most startups is that whatever is being done is a feature of an existing product. Perhaps the most famous of all of these was Steve Jobs declaring Dropbox to be "a feature not a product". Across the spectrum from enterprise to consumer this reaction is routine. Every major communication service, for example, enabled the exchange of photos (AIM, Messenger, MMS, Facebook, and more). Yet, from Instagram to Snapchat, some incredibly innovative and valuable startups have been created that to some do nothing more than offer slight variations on sharing photos. In collaboration, email, app development, storage, and more, enterprise startups continue to innovate in ways that solve problems in uniquely valuable ways, all while incumbents feel like they "already do that". So while something might be a feature of an existing product, it is almost certainly not a feature exactly like one in an existing product, nor likely to become one.
Only a month's work. One asset incumbents have is an existing engineering infrastructure and user experience. So when a new "feature" becomes interesting in the marketplace and discussions turn to "getting something done", the conclusion is usually that the work is about a month. Often this is based on an estimate of how much effort the startup put into the work. However, the incumbent has all sorts of constraints that turn that month into many months: globalization, code reviews, security audits, training customer support, developing marketing plans, enterprise customer roadmaps, not to mention all the coordination and scheduling adjustments. On top of all of that, we all know that it is far easier to add a new feature to a new code base than to a large and complex one. So rarely is something a month's work in reality.
Challenges
One thing worth doing as a startup (or as a customer of an incumbent) is considering why the challenges continue even if the incumbent spins up an effort to compete.
Just one feature. If you take at face value that the startup is doing just a feature then it is almost certainly the case that it will be packaged and communicated as such. The feature will get implemented as an add-on, an extra click or checkbox, and communicated to customers as part of the existing materials. In other words, the feature is an objection handler.
Takes a long time to integrate. At the enterprise level, the most critical part of any new feature or innovation is how it integrates with existing efforts. In that regard, the early feedback about the execution will always push for more integration with existing solutions. This will slow down the release of the efforts and tend to pile on more and more engineering work that is outside the domain of what the competitor is doing.
Doesn't fit with broad value proposition. The other side of "just one feature" is that the go to market execution sees the new feature as somehow conflicting with the existing value proposition. This means that while people seem to see great value in a solution, the very existence of that solution runs counter to the core value proposition of the existing products. If you think about all those photo sharing applications, the whole idea was to collect all your photos and enable you to later share them or order prints or mugs. Along comes disappearing photos, and that doesn't fit at all with what you do. At the enterprise level, consider how the enterprise world was all about compliance and containing information when faced with file sharing that is all about going beyond the firewall. Faced with reconciling these positioning elements, the incumbent will choose to sell against the startup's scenario rather than embrace it.
Advantages
Startups also have some advantages in this dynamic that are readily exploitable. Most of the time when a new idea is taking hold one can see how the startup is maximizing the value they bring along one of these dimensions.
Depth versus breadth. Because the incumbent often views something new as a feature of an existing product, the startup has an opportunity to innovate much more deeply in the space. Once a scenario becomes interesting, the flywheel of innovation that comes from usage creates many opportunities to improve it. So while the early days might look like a feature, a startup is committed to the full depth of a scenario and only that scenario. It doesn't have any pressure to maintain something that already exists or spend energy elsewhere. In a world where customers want the app to offer a full-stack solution, or expect a tool to complete the scenario without integrating something else, this turns out to be a huge advantage.
Single release effort. The startup is focused on one line of development. There’s no coordination, no schedules to align, no longer term marketing plans to reconcile and so on. Incumbents will often try to change plans but more often than not the reactions are in whitepapers (for enterprise) or beta releases (for consumer). While it might seem obvious, this is where the clarity, focus, and scale of the startup can be most advantageous.
Clear and recognizable value proposition/identity. The biggest challenge incumbents face when adding a new capability to their product or product line is where to put it so it will get noticed. There's already enormous surface area in the product, the marketing, and the business/pricing. Even the basics of telling customers that you've done something new are difficult, and a specific new feature often ends up as a supporting point on the third pillar. Ironically, those arguing to compete more directly are often faced with internal pressures that amount to "don't validate the competitor that much". This means even if the feature exists in the incumbent's product, it is probably really difficult to know that and equally difficult to find. From the startup's perspective, the company comes to stand for the entire end-to-end scenario, and over time, when customers' needs turn to that feature or scenario, there is total clarity about where to get the app or service.
Even with all of these dynamics in plain view, the pattern continues: incumbents initially dismiss startup products, later attempt to build what the startups do, and in general have difficulty reacting to the inherent advantages of a startup. One needs to look long and hard for a story where an incumbent organically competed with and beat a startup in a category or feature area.
Secret Weapon
More often than not the new categories of products come about because there is a change in the computing landscape at a fundamental level. This change can be the business model, for example the change to software as a service. It could also be the architecture, such as a move to cloud. There could also be a discontinuity in the core computing platform, such as the switch to graphical interface, the web, or mobile.
There's a more subtle change, which is when an underlying technology change is simply too difficult for incumbents to absorb in an additive fashion. The best way to think about this is an incumbent with products in many spaces facing a new product that contains a little bit of two of the incumbent's products. In order to compete effectively, the incumbent first must go through a process of deciding which team takes the lead. Then it must address innovator's dilemma challenges and allocate resources to the new area. Then it must execute both the technology plans and the go to market plans. While all of this is happening, the startup, unburdened by any of this, races ahead creating a more robust and full-featured solution.
At first this might seem a bit crazy. As you think about it though, modern software is almost always a combination of widely reused elements: messaging, communicating, editing, rendering, photos, identity, storage, API / customization, payments, markets, and so on. Most new products represent bundles or mash-ups of these ingredients. The secret sauce is the precise choice of elements and of course the execution. Few startups choose to compete head-on with existing products. As we know, the next big thing is not a reimplementation of the current big thing.
The secret weapon in startups competing with large scale incumbents is to create a product that spans the engineering organization, takes a counter-intuitive architectural approach, or lands in the middle of the different elements of a go to market strategy. While it might sound like a master plan to do this on purpose, it is amazing how often entrepreneurs simply see the need for new products as a blending of existing solutions, a revisiting of legacy architectural assumptions, and/or emphasis on different parts of the solution.
—Steven Sinofsky (@stevesi)
#codecon and reflecting on generational changes
Attending the <code/conference> (#codecon) this past week turned out to be a remarkable experience, even more remarkable than I expected. The generational shift in our computing experience from desktop to mobile, from software to services, and from hundreds of millions to trillions was on display through the interviews with a dozen industry CEOs.
This post will explore this generational change through the speakers at the conference. Before diving into the details of each session, we will explore this change and the implicit context.
Generational Change
Reflecting on the interviews and demonstrations as well as the “lobby chatter” is a key part of learning by attending. I’ve always viewed this conference and predecessor D Conference as the most relevant conferences for learning about the strategic drivers of our industry. You can read my report from last year here. Writing these reports is part of the learning for me and reading the old reports lets me checkpoint on my own learning and journey.
If you move beyond the insights from any single speaker or the announcements at the event (all widely reported by re/code and others and, new this year, by re/code partner CNBC), one theme just keeps coming back to me—the vast difference in tone and content between the incumbents and the challengers, between legacy and disruptors, between the old guard and the new, or whatever labels you want to use. We talk all the time about the transition of our industry from one era to another (and don't forget the term "post-PC" was first used in this very forum), and the conference provides a microcosm of these transitions, expressed through the leaders living them.
There is a vast difference in tone and content between the incumbents and the challengers, between legacy and disruptors, between the old guard and the new.
The transition is in full force. This does not mean by definition that all existing companies will lose and only new companies will win. Quite the contrary, the fact that these changes are now visible to all makes the creation, purchase, and use of new products and technologies evidence of the transition, as well as opportunity to create new plans and adjust. The mobile internet is causing the transition but also making the communication of that very transition much more transparent, which is unlike the progressive unveiling that characterized the mainframe to mini to PC transition.
Are the new companies doing enough to transition customers as well as their own business to new paradigms? How much should new companies bridge from existing solutions or should they expect a wholesale change from customers? Is there an understanding of the existing complexities of the real world?
Are the incumbents changing enough to build new products and business that reflect the new generation? Are they trying too much to “thread the needle” and incrementally step to a new context by maintaining status quo or “repotting the plants”? Is there an understanding of the complexities of existing solutions?
The conference puts this "generational" change out there for us to experience through the always challenging, yet consistently even-handed, questioning (interrogation) from Walt and Kara (and a great addition this year were the interviews featuring seasoned members of the re/code team).
Context (is everything in business)
The attendees (in the audience) are people who have worked in the industry often times since the earliest days. The interviewers are professionals who cover deeply the industry and the subjects. It is hard to imagine creating a more informed or tougher environment. That’s the challenge.
Yet, industry leaders both line up and are obliged to appear (for the most part). Because the environment is so challenging and widely covered, leaders gain a great deal of credibility by standing up to the challenge.
Leaders gain a great deal of credibility by standing up to the challenge of appearing.
The conference takes place the same time every year, whether a company has something to announce or not. For example, last year attendees were frustrated because Apple’s Tim Cook did not announce anything. This is an unfair way to look at the “performance” of a participant. This conference has an amazing audience, but it is also an “uncontrolled” environment so announcing a new product is not without risk and not without huge upside (Disclaimer: I’ve been part of several product announcements/interviews at this forum). Apple, along with many companies, has a tried and true approach to announcing new things as we will see next week.
What is most interesting about the forum, however, is that the format and depth of the dialog allows for a strong “how did we get here” or “how are you wrestling with challenges” discussion. This is not a one-way speech or a forum where talking points go unchallenged. That is in a sense what separates the men from the boys so to speak.
When speakers prepare for the interview, especially at larger companies, folks in communications prepare talking points, responses to tough questions, anecdotes, and even jokes. This is a forum where this can take on "Presidential debate" levels of preparation. The challenge is that everyone in the audience, and certainly the interviewers, are well-versed in these techniques. For the presenters, all of that over-preparation cycles through your mind during the tough and unpredictable questions from the audience. This is a tough environment.
When speakers choose not to say anything of depth or the answer is clearly a prepared message, you can almost feel the energy in the room drain. There is a collective sense of a missed opportunity to learn more among attendees.
When speakers choose not to say anything of depth or the answer is clearly a prepared message, you can almost feel the energy in the room drain.
Too many people focus on CEOs evading questions about the next big deal or the features/availability of the next product. I don't think that is the way to evaluate speakers, and in almost all cases the interviewers ask such a question once, often make a joke, and move on.
Reporters have an obligation to ask or they look like they are not doing their job. Speakers have an obligation to acknowledge such a forward-looking, material statement and move on. There's a big caveat to this, and it is where I wanted to share my own learning, my own journey. I believe that when it comes to challenges and strategy, CEOs specifically and companies in general can and should do more to inform the dialog. The way I would say this is that if there is something out there that everyone knows to be a fact, and the speaker knows to be a fact, and everyone knows everyone knows, then talk about it. By not talking about it, the conventional wisdom becomes the reality, and the conventional wisdom is often wrong and always incomplete.
I have personally experienced this in the transition from Windows Vista to Windows 7. “Everyone” knew something was up with Vista and certainly Microsoft knew, but no one was saying anything. The result was a strong desire to know the next features of Windows, which was the only thing that folks knew to ask. It served no one to talk about the features of the next product but it also served no one to pretend everything was going well. I missed a big opportunity and looked foolish in a very early interview I did with a (now) re/code reporter. I followed the tried and true approach of the incumbent which is to say nothing, redirect, and so on. See several thousand words without saying anything appear here, from 6 years ago this week.
It turns out that in a world of global instant communication, transparency, open source, platform shifts, and so on, the story about the products, the strategy, and more can come to define efforts more than folks think. This isn't always the case because business is a social science, but by and large what distinguishes the way the PC era evolved from the way the mobile era is evolving is a vast difference in the flow of information and pace of change. Corporate communications and the leadership approach need to adapt to this era. Recognizing this, one thing we did in the Windows transition above was start blogging about the "why" of the product long before the release, which to this day was a unique level of transparency (and also a huge challenge).
The generational change taking place now is challenging large companies more than ever before. Technology companies are seeing their investments and assets have faster lifecycles and shorter lifespans. They should address head on the challenges of these timescales and commitments. Business approaches are also being challenged and everyone knows this on all sides, but not talking about the challenges means everyone just assumes how things will evolve, and collectively everyone can’t be right.
These changes are also pushing and pulling customers more than ever before. As individual consumers we invest a little bit in a new phone or tablet and maybe a gadget and services here and there. Some of these pan out and some don't. But large companies looking to define themselves in a new era of mobility, bring-your-own devices, cross-organizational boundaries, and cloud need much more information and a clearer understanding of what has transpired and why. Discussing the rationale behind choices provides much more context for customers making bets and allows a much more open dialog to compare and contrast choices. This goes way beyond features and gets to the strategy, learning from the past, and direction for the future–it is a fine line.
It is too easy to fall back to wanting to know the next products and features. Companies still have secrets. That's what defines a company relative to the competition. As Jeff Bezos commented recently, "sure, I'd like to know Apple's product roadmap". To interpret the need for openness as a public roadmap or feature list misses the point—what was missing from the incumbent perspective was a view of what has transpired over the past 5 years and, with that understanding, a view of how investments are moving forward.
The real question is whether incumbents are going to change enough, fast enough, and in a sense disrupt themselves, and do so with a clear understanding of what has transpired in the past few years. Or will they take on all the characteristics of the "Innovator's Dilemma" and operate hoping that incremental change will dampen the effect of big transitions, allowing them to weather the storm and return to normal.
To see how significant this transition is, I think it is best to start with Mary Meeker's always informative "Internet Trends 2014". The complete report is available and so is the video. There were many interesting data points—the rise of China, the conversion from feature phones to smartphones, the move of OS platforms to Silicon Valley companies, messaging, and more. One slide that sums up the transition along with the challenge showed the growth of tablets relative to PCs with the title "Tablet Units = Growing Faster Than PCs Ever Did…+52%, 2013".
Because business is a social science and because there are many ways to look at data, no doubt some will challenge this data or conclusions. In fact, IDC just revised their tablet numbers down. Some feel that Tablets are reverting to their role as “media consumption” or lightweight computing devices. That I’m writing this on a tablet (yes one with a keyboard, but one with LTE, 10 hour battery life, weighs nothing, B5 size, etc.) provides my own anecdote about where things are heading.
This growth will change. It might sputter and then increase. There's no doubt tablets are overtaking notebooks in terms of unit volumes. They are definitely not taking over all notebook workloads. But dismissing tablets on that basis would be like saying the growth of email was irrelevant to word processing; it ignores the growth of the pie and the shift in total volume to the new technology. As Steve Jobs said on stage at this conference, the software will catch up. This is happening. Despite what people might think, large numbers of attendees had their tablets at the conference and they were being "productive".
Just as mainframe companies attempted to point out the shortcomings of PCs as servers, pointing out the shortcomings of tablets is not helpful, especially as tablets continue to gain more and more features of laptops while maintaining their unique characteristics (lightweight, fanless, quality over time, connectivity, reliability, security, apps, etc.)
One more slide from Mary sets the context that dominated the divergence of incumbents and disruptors and that was the view of the market size of each generation of computing, “Each New Computing Cycle = >10x > Installed Base Than Previous Cycle.”
“More than just phones” might lump too many devices into the last data point for some wishing to make the point that things are not changing so much. Let’s be clear—many mainframes still run the most critical systems of the world (I was in a briefing with an insurance company last week that wanted to hire me because I happened to know PL/1!). Today’s laptops have massive utility that isn’t being replaced overnight and probably won’t ever be “replaced”. That’s the Innovator’s Dilemma argument that does not equip either product developers or customers to innovate and prosper during these cycle changes.
Once you get beyond the specifics of what is coming next, which no one should be obliged to answer at #codecon, the dialog that gets to the heart of what is going on is worth having. What was missed? What was learned? What was tried? What did you think of what was tried? What is being done differently? How are big technology changes being thought of in isolation? Relative to existing investments? What point of view does a company have? What led the new company to be formed? What is different about investments being made? How do customers cope with change?
These questions and how they were answered made for quite a contrast between incumbents and disruptors. If you’re interested in per-speaker reports or the full interviews for any of them, please see the re/code site. My intent is not to summarize the sessions but to reflect on the sessions through this lens of forward leaning versus backward looking.
Incumbents
The incumbents, Microsoft, Intel, Comcast, and Wal-Mart, had a common theme: each faces significant challenges to the technology platforms and business models that made them wildly successful. At the same time, each in my view missed an opportunity to say how they intend to change. In a sense, each asked us to leap to a future with them in leadership, but without the detail to support that assertion.
It is incredibly important for an industry to have large and healthy players operating at scale. In many ways, the startups we love serve as disruptive R&D for larger players, and a healthy M&A pipeline is critical for all, as evidenced by some of the recent mega-deals and dozens of smaller ones, all aimed at the long-term evolution of core products.
It is incredibly important for an industry to have large and healthy players operating at scale
Yet, many investments, particularly in hardware and manufacturing, require billions of dollars that can only be made by large companies. Incremental improvements we come to take for granted such as doubling of capacity, improved batteries, thinner devices, more pixels, massive data centers, and so on can only come from huge scale and well-functioning large companies.
At the same time, one look at Meeker’s slide above and one can’t help but notice that these large companies come to define the cycles she represents. Is that a convenient way we recall changes or were strategic changes part of a causal relationship? Don’t be so quick to judge. There’s a significant amount of subtlety and nuance.
Let’s look at some of the specific speakers.
Microsoft's Satya Nadella and Intel's Brian Krzanich both sat in the hot seat (the red chairs that define the #codecon set) facing the same question, so it is worth considering them together—what happened with respect to mobile and tablets. Satya talked about wishing to have taken the bet to build hardware all the way, sooner. Intel talked about the challenges in manufacturing at 14nm, not having the right product relative to power, and the need to do better at 10nm. Mossberg kicked off Brian's interview with the observation that he's using a laptop half as frequently and using ARM-based products a great deal. In a moment of candor, Brian talked about how many at Intel wished that the march towards mobility had stopped at Ultrabooks, and that Intel lacked the right parts to do tablets, in part because many at Intel did not think tablets would break out beyond consumption. I felt Brian's comments showed a good acknowledgement of why things didn't happen. At the same time, collectively, a view of the strategy in the near to medium term didn't come through. In eerily similar approaches, both Intel and Microsoft looked to a future beyond phones and tablets, to an internet of things or more personal computing, as where they will see greater success. I left both of these sessions feeling there was more to be told about where things are right now and what will happen over the next year or two (again not the features but the strategy—Microsoft and tablets small and large, Intel and mobile or even Chrome and Android). It isn't that nothing was said; it is that everyone knows where things are today, the speakers know everyone knows, and the upside to keeping things close to the vest seems minimal and equates to "go with the disruptors" at some level.
One must admit that the challenge faced by Wal-Mart's Doug McMillon is even greater in this audience, which has few Walmart regulars (note, I shop at Walmart). In particular, many in this crowd are on the leading edge of home delivery and uber-for-everything, and so visiting stores is already a thing of the past. That said, so much of what was said about online commerce felt too much like an expected incumbent response. For example, the idea that the lines are blurring between ecommerce and retail, or that it is really hard to measure ecommerce if a person looked up an item on their mobile device before coming to the store (I wondered if there really was a metric that tried to give internal credit to the ecommerce division when someone did that). Ultimately, Doug said "physical still matters and digital makes it more valuable". Maybe, except that on the last morning of the show I ordered a wall mount for the Sonos speaker we received at the show (yes, elite gifts are part of the elite show) and it beat me home. Yes, that is a luxury good and more, but putting forward the notion that ecommerce is still an add-on to physical stores seemed a stretch to me.
Comcast's Brian Roberts not only faces the challenge of the cord cutters represented in the audience and the prospect of questions on net neutrality, but also the fact that a lot of people have a lot of less-than-positive feelings about the products and services Comcast offers. When you look at Comcast as an incumbent and consider things like Netflix, Hulu, cord cutting, and more as the disruptive force, it is very tough to see the dialog Brian led as satisfying. My feeling was that there is a strong desire to keep everything as it is, while putting forward the notion that things are improving. There was a long demonstration of the X1 cable box. Yet in the same session, when questioned about net neutrality, Brian said that it is too bad, but that Netflix paying is a cost of doing business, just as he has to pay for cable boxes. I think they love the cable box (evidence: it seems to be an incredible headache to get cablecards, it is very costly to switch to TiVo, and the rent for cable boxes is pretty high). The fact that they spent 10 minutes doing a demo of the new platform seemed to indicate that—yet the platform has none of the elements of a modern platform relative to apps or openness, as an attendee asked about. The responses to questions about net neutrality seemed to show a strong desire to avoid change while not acknowledging a changing world and the changing needs around connectivity. The overall dialog around Netflix seemed harsh to me, and it failed to consider just how much more pleasant (and modern) Netflix is as a consumer experience than the X1 experience shown. Disclaimer: I have had really significant problems with Comcast in our new place, having never used them before; this is my first time as a customer. As I have no choice for video or broadband, one could say it is challenging for me to be totally objective.
Each also stuck to revealing little, defending the status quo, and offering a view of the future that is the same but better.
Each of these CEOs and companies has an enormously challenging job and situation. Having shareholders demanding consistent quarter-by-quarter results, customers who do not really want change from these service providers but seek change elsewhere, and massive organizations to change all make for the potential of no-win interviews. Yet, each also stuck to revealing little, defending the status quo, and offering a view of the future that is the same but better. My own experience and learning would offer that when facing massive disruptive challenges, engaging in the dialog serves all parties better, even though the normal school of thought for the incumbent is to double down, stick to talking points, and only reveal challenges through the lens of opportunity.
Disruptors
Several CEOs represented the leading edge of disruption. It is super easy to be a fan of disruption and to look at all that is going well with these leaders, just as it is easy to look at all the challenges the incumbents face. At the same time, these disruptors also represent a new level of frankness and openness about what they face or have faced.
More than the great work these leaders represent, I think it is important to look at how each is communicating and participating in a dialog. One might suggest that when these leaders are under pressure or face challenges of being disrupted they will start to take on the characteristics demonstrated above. I don’t think that is the case, simply because several of these leaders have already faced (or are facing) these challenges in their business. While clearly disruptors have less to lose, it is important not to lose sight of the fact that some of these represent large public companies (not mega cap, but large) and all represent very large customer bases from consumer to enterprise.
It was exciting to see these leaders head to the future, demonstrate a unique point of view, and engage in a two-way dialog about where things are going
For me, it was exciting to watch these interviews and how these leaders took on their own challenges. It was also exciting to see these leaders head to the future, demonstrate a unique point of view, and engage in a two-way dialog about where things are going.
Let’s look at some of these speakers.
Uber's Travis Kalanick arguably runs the most used and mission-critical service for the attendees. The love for the service runs deep. Equally deep is the love for how Uber is taking on the government over the regulation of taxis and ride sharing (along with Lyft, an a16z portfolio company). At the same time, Travis faces a lot of questions about his aggressive style and reputation. He didn't hold back, characterizing the task ahead at Uber as "a political campaign, and the candidate is Uber and the opponent is an asshole named Taxi." OK, probably a bit colorful. What I loved was how he embraced even the disruption of his own business. After seeing a truly autonomous car from Google the night before, we heard the CEO of Uber telling us that self-driving cars, not drivers, are the future. Considering that Uber is a marketplace for drivers, this embrace of one's own disruption is great to see.
Most people expected a characteristically polite interview from Softbank's Masayoshi Son-san, but were treated to candor and aggressiveness, though delivered in a very polite way. This would be consistent with the remarkable success Softbank and Yahoo BB had in Japan ten-plus years ago, bringing amazing broadband and low prices to a market easily dominated by goliaths like NTT (the most visible building from the Shinjuku train station is the DoCoMo tower). Son-san told the story of starting Yahoo BB and how they had "No experience, No technology, No capital. Just anger." This was a true disruptor story, much like Uber's story of realigning city government, only at a national scale. While it was not so challenging to be candid about WiMax, Son-san was super clear about the failed technological approach. He was clear about his intention to go after broadband in the US with the same zeal he went after it in Japan.
Salesforce and Workday (Marc Benioff / Aneel Bhusri) together offered an incredibly clear view of disruption at the enterprise software level. If there's one interview to watch, I would suggest this one, because it has so much relevance to how software is made and brought to market, coming from two CEOs who made and brought to market software in a previous generation. These are CEOs learning from their experience who have also engaged the marketplace differently as disruptors. There were many statements that are starting to seem less and less "bold" but nevertheless remain monumentally disruptive: "in a few years no one will run business software on premises", "I run the company from a smartphone", "if you're going to build a cloud app you need to start from a clean sheet of paper—there's no way around it", "incumbents are holding on to the past and basically trying to monetize it", "90% of the company can do all of HR on a smartphone" and so on. There were many profound elements of the dialog that revealed the depth of the strategic and technological shift these leaders are both creating and have experienced. For example, there was a description of competing with an incumbent like SAP, which would go to a customer, negotiate a $40M deal to "upgrade", and then have the customer wait two years to get the latest features, versus a SaaS model where the new features just show up. Yes, there's a ton of complexity in there and yes, it is horribly disruptive to how businesses operate, but so was the introduction of the PC, client/server (upon which that $40M upgrade was based), and more. Finally, the discussion about being in a "post-server" world resonated with me, as I just don't see it as viable for companies to keep building out their own data centers, and this session provided a lot of evidence as to what these vendors are doing to make that a realistic assertion. From a format perspective I loved the adjacency of these two and wish a couple of the incumbents had been paired together.
Dropbox's Drew Houston brought innovation, competition, and regulatory oversight into focus with his interview. This is another service that many people in the room not only use but rely on, which brings with it a degree of comfort and also a challenge in that the audience knows a lot about the services represented. Not content to simply reiterate what was previously known and said about the company, Drew talked about the genuine frustration he feels as a cloud provider learning about the revelation that the NSA tapped into cloud-based services. It would have been easy to lay low, but instead he quipped that the "NSA doesn't send a muffin basket and say welcome".
Netflix's Reed Hastings represents learning from disruption incredibly well, and that learning can be chronicled through his own appearances in the hot seat. Sometimes we forget that Netflix has been a public company for 12 years, to the day of this interview! For many of us it seems like ancient history that we used to get plastic discs in the mail and then return them Monday morning. Netflix is famously known for having disrupted itself, and not with grace, while on the path to streaming and today's Orange is the New Black. I found the discussion looking backwards at missed opportunities and disruption absolutely fascinating. Reed talked about how the team would discuss "managing to the point of feeling like your skin crawled" and making decisions that were unbelievably difficult. While the success right now perhaps makes it less difficult to look backwards at the challenges faced and mistakes made, it was amazing to hear this level of candor. Reed was even candid about something he said just a short time ago about the high price of Netflix stock, which he said at the time was too high and represented a euphoria. In contrast to Comcast, Reed was much clearer about how the net neutrality issues are playing out—he used a great example of Comcast trying to charge at both ends (both the consumer and the internet service) by talking about the flow of money through the system. He offered an operational view of "strong net neutrality". Putting aside the specifics of the issue, the tone of looking forward, candor about the past, expression of a clear point of view, and a view of delivering new products and services along with the inherent risks and challenges comes across as modern and consistent with a new style of leadership.
What comes next?
It might be too easy to read this and conclude big companies are legacy and being disrupted and new companies disrupt, but that would ignore two things.
First, this is a moment in time. While some would say disruption is akin to physics and must happen, there are dominant companies that reinvent themselves. Few even recall that IBM was close to bankruptcy when it reinvented itself from one dominant company to another, albeit in a very different way. And that reinvention progressed through nearly 20 years and returned 7X the broad stock market overall during that time.
Second, companies that disrupt are themselves prone to disruption down the road. We haven’t seen this dynamic play out yet for the companies here (though Netflix might be one). There is also a great deal of learning about how to reinvent and avoid the risk of being locked into a strategy and execution. Google doing the unthinkable of shutting down services or Facebook acquiring very large scale indirect competitors or technology complements are examples of a new generation of leaders acting differently relative to the potential disruption of core businesses.
Nothing is quite inevitable in business, but the potential to fall into familiar patterns is high.
Nothing is quite inevitable in business, but the potential to fall into familiar patterns is high. This past week at #codecon demonstrated the challenges and approaches to the core risk of the technology industry. In technology, the only thing you really do is monetize the work of the past and deliver innovation to the future. How leaders approach this reality is an evolving skill and #codecon allows us all to witness this evolution firsthand.
–Steven (@stevesi)
Tablets v. the World
Every time the topic of tablets versus laptops (and or smartphones) comes up, we end up in another endless debate about scenarios, consumption, productivity, keyboards, mice, screen size, multitasking, and more. In every case the debate centers around the core uses of “PCs” today—and PC is in quotes because the PC itself is a remarkably flexible device that has morphed over the years into many form factors. People study run-rates and trends and try to predict the demise of one over another and so on.
It isn’t so simple. But it also isn’t so binary.
For more on this dialog, you can also catch a couple of podcasts from Benedict Evans and me (see a16z Podcast: Engineering a Revolution at Work and a16z Podcast: When Your PC Expires).
Disruption
Every disruptive innovation shares (at least) two characteristics. First, the newly introduced technology is more often than not inferior in some key dimensions, while superior in some dimensions that in the current context seem to matter more. Second, despite much consternation, the technology being disrupted is almost certainly going to remain a vital part of the landscape in some form or another for quite some time—either simply because of the long tail of legacy or because it serves a function that is not replicated at all.
What changes, however, is where the emphasis takes place around an ecosystem and with a usually broader set of customers. The ecosystem is not a static world and it too plays a vital role in the transition. Where the ecosystem is investing is always a leading indicator of where the transition is heading.
We can look at transitions such as entertainment (theater, radio, film, TV, video, streaming) or transportation (horses, boats, trains, cars, planes) or even storage (removable, hard drives, USB, flash) as examples of where these traits are demonstrated. Computer user-interface moving from characters to GUI to touch shows these traits as well.
The introduction of the iPad, and the modern mobile OS (and smartphones) in general, shows many of these characteristics. The modern OS in combination with new hardware has many characteristics that separate it from the PC era including sealed case (non-extensible hardware), ultra-low power consumption, rich embedded graphics, touch user interface, app store, exclusively wireless connectivity, and more. This is the new platform which is where so much innovation in apps is taking place.
Here is where the debate starts—some of those features are either not valued or true limitations when compared to the vastly more capable PC model. There’s no doubt about that. It is just a fact. Not only does the PC have a wider range and more “powerful” hardware options, but it also benefits from 20 years of software that drives a vast array of processes, devices, workflows, and more. Tablet hardware is still immature relative to “PC standards” and apps do not seem to cover so many of the existing PC scenarios (even if they cover scenarios not even dreamed of or possible on PCs).
Hardware and Software
Two things are still rapidly changing that will account for a much broader transition from the dichotomy of tablet OR laptop today to a world where tablets with modern operating systems begin (or have begun) to replace many scenarios occupied by laptops.
We will soon start to see more innovation in tablets.
First, the hardware in tablets will benefit enormously from Moore’s law. While the pace of changes in smartphones (screen size, cpu, gpu, specs) has been faster than we have seen in tablets, my guess is we will soon start to see more innovation in tablets. In terms of both form factor and specs, tablets have been reasonably static since introduction. There are give or take two screen sizes and fairly modest spec bumps. My guess is that since the same vendors make both smartphones and tablets, the vast amount of energy has been focused on smartphones for now (just as when the PC industry shifted innovation from desktops to laptops and then swung back again to focus on all-in-ones). I suspect we will start to see more screen sizes for tablets and more innovation in peripherals and capabilities, along with specs that benefit from the rapid progress in Moore’s law.
Second, all the hardware innovation in the world isn’t enough to drive new scenarios or even more dramatic replacement scenarios. The amazing innovation in software on smartphones shows what can take place when developers of the world see potential and tap into the power of a new platform.
Two Examples
I wanted to offer two examples of where the transition to tablets has been surprisingly “behind the scenes” and really out of sight, but very interesting from a technical perspective.
Many of us find ourselves in the AT&T store all too often because we’re adding a line, replacing a phone, getting a new SIM or whatever. Over the past year or so, AT&T has aggressively rolled out iPads to replace the in-store PCs that were used for customer service. This is a massive software challenge. The in-store PCs had point-of-sale capability, bar code readers (for SIMs), and a large array of apps that drove the entire customer engagement (some of these apps ran OpenStep on Windows, believe it or not).
He kept telling me how frustrating it was to deal with the lack of capabilities of the new tools.
If you happened to visit the store during the early stages of the transition, you would have been able to sense the frustration of the account managers. There were many unfamiliar elements to the new apps on the iPads and, worse, there seemed to be many things that the desktop tools could do that the iPad apps could not. For example, I got caught trying to merge two accounts and the rep was forced to call the regional call center to do the work; while on hold he kept telling me how frustrating it was to deal with the lack of capabilities of the new tools. At the same time, the iPad had cool integration with portable bar code readers, the reps could easily show you what is on the screen to verify information (like picking a new phone number), and so on.
The transition is well underway now and I don’t think folks notice any more.
Today I spent a few hours with my friendly Comcast technician while he diagnosed something faulty with our cable signal. While he has a fancy signal meter, most of the work he does is actually adjusting things via a remote app on an iPad. Comcast technicians (as I learned, the ones in vans but not “bucket trucks”) were recently issued iPads. Sure enough during the visit he was on the phone to a central office and was saying “I have an iPad now and so without my PC I’m not able to get that measurement”.
The tech said, “I have an iPad now and so without my PC I’m not able to get that measurement”.
I was having flashbacks to the frustrated AT&T reps. Turns out this technician used to have a PC and ran the same software as the tech at the other end of the phone (and in the bucket trucks). They are moving techs to iPads because the tablets do not require carrying chargers, are more resilient when dropped, and have integrated Verizon connectivity, all of which makes for a far more convenient service tool. Plus, things like entering the MAC address become much easier with the bar code reader and the much more agile form factor, as one example.
The conversation I had with a tech (always the anthropologist) was fun. He said they have a whole tracking and feedback process that helps them to prioritize what features the software folks need to add to the apps being used in the field. Turns out, I’m guessing, they built some pretty elaborate desktop software that did just about everything since it was used on the ground and in the data center, but they likely had little understanding of just what was used and how often. The creation of new apps will drive a new level of customer service and technician capabilities, even if there are some hiccups along the way.
Broader Implications
These two examples are hard-core line-of-business tools. We’re seeing the same thing in the line-of-business tools used by folks at all sorts of companies, big and small. The new generation of mobile-first SaaS tools makes it far easier to create “documents” for sharing and collaboration, access business information, or participate in business services from CRM to accounting to benefits. The tools these are supplanting were developed over a decade and have tons of features and optimizations, but lack the mobility and internet access that are so highly valued in a modern workplace. The transition will have some hiccups but is happening.
Along with these tools, so many of the tools for creation and production that are PC-based are being reimagined and recast for modern work. We can see this revolution in Adobe’s work on photography for professionals with tablets, Paper and Pencil from FiftyThree, and of course the long list of productivity tools we talk about often on this blog. These tools do less, but they also do more. When combined with tablets and smartphones on modern platforms they enable a new view on the work and scenarios.
The characterization of tablets as “neither here nor there” or “in between tablet and a laptop” misses the reality that the modern nature of tablet platforms—both hardware and software—will drive innovation and a subsequent transition of many, many scenarios from traditional laptop platforms to tablet platforms. We’re in the middle period where this is happening—just as when people said cars were too expensive for the masses and would not be mainstream, or when the GUI lacked the hardware horsepower and “keystroke productivity” to replace character-based tools.
New hardware and new software will surface new capabilities and scenarios not previously possible (or imagined).
The traditional laptop will power hundreds of millions of endpoints for a very long time. But as the two examples here show, even in the most hardcore worlds where device integration meets custom software, there is a transformation and transition taking place. New hardware and new software will surface new capabilities and scenarios not previously possible (or imagined). It won’t be smooth and it won’t please everyone immediately, but it is happening–just as both of those same scenarios transitioned from character to GUI.
It really is about the software. That change is happening all around us.
–Steven (@stevesi)
If at first you don’t succeed: disrupting incumbents in the enterprise
I was talking with a founder/CEO of an enterprise startup about what it is like to disrupt a sizable incumbent. In the case we were discussing, the disrupting technology was losing traction and the incumbent was regaining control of the situation, getting back off its heels, and generally feeling like it had fended off the “attack” on a core business. This causes a lot of consternation at the disrupting startup as deals aren’t won, reviews and analyst reports swing the wrong way, and folks start to question the direction. If there really is product/market fit, then hold on and persevere, because almost always the disruption is still going to happen. Let’s look at why.
Incumbent Reacting
The most important thing to realize about a large successful company reacting to a disruptive market entry is that every element of the company just wants to return to “normal” as quickly as possible. It is that simple.
Every action about being disrupted is dictated by a desire to avoid changing things and to maintain the status quo.
If the disruption is a product feature, the motion is figuring out how to tell customers the feature isn’t that important (best case) or how to quickly add something along the lines of the feature and move on (worst case). If the disruption is a pricing change then every effort is about how to “manage customers” without actually changing the price. If the disruption is a new and seemingly important adjacent product, then the actions focus on how to point out that such a product isn’t really necessary. Across the spectrum of potential activities, it is why the early competitive responses are often dismissive or outwardly ignore the challenger. Aside from the normal desire to avoid validating a new market entry by commenting, it takes a lot of time for a large enterprise to go through the work to formulate a response and gain consensus. Therefore an articulate way of changing very little has a lot of appeal.
Status quo is the ultimate goal of the incumbent.
Once a disruptive product gains enough traction that a more robust response is required, the course of action is almost always one designed to reduce changes to plans, minimize overall effort, and do just enough to “tie”. Why is that? Because when it is a big company “versus” a small company, enterprise customers tend to see “a tie as a win for the incumbent”. Customers have similar views about having their infrastructure disrupted and wish to minimize change, so goals are aligned. The idea of being able to check off that a given scenario is handled by what you already own makes things much easier.
Keep in mind that in any organization, large or small, everyone is at or beyond capacity. There’s no bench, no free cycles. So any change in immediate work necessarily means something isn’t going to get done. In a large organization these challenges are multiplied by scale. People worry about their performance reviews; managers worry about the commitments to other groups; sales people worry about quarterly quotas. All of these worries are extremely difficult to mitigate because they cross layers of managers and functions.
As much as a large team or leader would like to “focus” or “wave a wand” to get folks to see the importance of a crisis, the reality of doing so is itself a massive change effort that takes a lot of time.
This means that the actions taken often follow a known pattern:
- Campaign. The first thing that takes place is a campaign of words and positioning. The checklist of features, the benefits of the existing product, the breadth of features of the incumbent compared to the new product, and so on. If the new product is cheaper, then the focus turns to value. Almost always the campaign emphasizes the depth, breadth, reliability, and comfort of the incumbent’s offer. A campaign might also be quite negative and focus on a fear, a compatibility issue with existing infrastructure, or a conventional-wisdom weakness of the disruptor, or it might introduce a pretty big leap of repositioning of the incumbent product. A good example of this is how on-premises server products have competed with SaaS by highlighting the lack of flexibility or potential security issues around the cloud. This approach is quick to wind up and easy to wind down. Once it starts to work you roll it out all over the world and execute. Once the deals are won back, the small tiger team that created the campaign goes back to articulating the product as originally intended, aka normal.
- Partnership. Quite often there can be a competitive response of best practices or a third-party tool/add-on that appears to provide some similar functionality. The basic idea is to use someone else to offer the benefit articulated by a disruptive product. Early in the SaaS competition, the on-premises companies were somewhat quick to partner with “hosting” companies who would simply build out a dedicated rack of servers and run the traditional software “as a service”. This repotting-the-plants approach to SaaS has the benefit that once the immediate crisis is mitigated, either the need to actually offer and support the partnership ends or the company just becomes committed to this new sales channel for existing products. Again, everything else continues as it was.
- Special effort. Every once in a while the pressure is so great internally to compete that the engineering team signs up for a “one off” product change or special feature. Because the engineering team was already booked, a special effort is often something carefully negotiated and minimized in scope and effort. Engineering minimizes it internally to avoid messing up dependencies and other features. Sales will be specific in what they expect the result to do because, while the commitment is being made, they will likely begin to articulate it to red-hot customer situations. At the extreme, it is not uncommon for the engineering team to suggest to the sales organization that a consultant or third party can use some form of extensibility in the product to implement something that looks like the missing work. The implication of doing enterprise work in a way that minimizes impact is that, well, the impact is minimized. Without the proper architecture or an implementation at the right level in the stack, the effort ultimately looks incomplete or like a one-off. Almost all the on-premises products attempting to morph into cloud products exhibit this in the form of features that used to be there simply not being available in the “SaaS version”. With enough wins, it is quite likely that the special-effort feature doesn’t ever get used. Again, the customer is just as likely to be happy with the status quo.
All of these typical responses have the attribute that they can be ignored by the vast majority of resources on a business. Almost no one has to change what they are doing while the business is responding to a disruptive force. Large incumbents love when they can fend off competitors with minimal change.
Large incumbents love when they can fend off competitors with minimal change.
Once the initial wave of competitive wins settles in and the disruptive products lose, there is much rejoicing. The teams just get back to what they were doing and declare victory. Since most of the team didn’t change anything, folks just assume that this was just another competitor with inferior products, technology, approaches that their superior product fended off. Existing customers are happy. All is good.
Or is it?
Disruptor Persevering
This is exactly where the biggest opportunity exists for a disruptive market entry. The level of complacency that settles into an incumbent after the first round of victories is astounding. There’s essentially a reinforcing feedback loop because there was little or no dip in revenue (in fact if revenue was growing before then it still is), product usage is still there, customers go back to asking for features the same as they were before, sales people are making quota, and so on. Things went back to normal for the incumbent.
In fact, just about every disruption happens this way–the first round or first approaches don’t quite take hold.
Why is this?
- Product readiness can improve. Obviously the most common reason is that the disruptive product simply isn’t ready. The feature set, scale, enterprise controls, or other attributes are deficient. A well-run new product effort will have done extensive early customer work, knowing what is missing, and will balance launching with these deficiencies against the ability to continue to develop the product. In a startup environment, a single company rarely gets a second shot with customers, so calibrating readiness is critical. Relative to the broader category of disruption, the harsh reality is that if the disruptor’s idea or approach is the right one but the entry into the market was premature, the learning will apply to the next entry. That’s why the opportunity for disruption is still there. It is why time to market is not always an advantage and why being able to apply learning from failures (your own or another entry’s) can be so valuable.
- Missing ingredient gets added. Often a disruptive product makes a forward-looking bet on some level of enterprise infrastructure or capability as a requirement for the new product to take hold. The incumbent latches on to this missing ingredient and uses it to paint an overall picture of a lack of readiness. If there’s one thing that disruptors know, it is not to bet against Moore’s law. If your product takes more compute, more storage, or more bandwidth, these are most definitely short-term issues (see the short sketch after this list). Obviously there’s no room for sloppy work, but by and large time is on your side. So much of the disruption around mobile computing was slowed down by the enterprise issues around managing budgets and allocation of “mobile phones”. Companies did not see it as likely that even better phones would become essential for life outside of work and overwhelm the managed phone process. Similarly, the lack of high-speed mobile networks was seen as a barrier, but all the while the telcos were spending billions to build them out.
- Conventional wisdom will change. One of the most fragile elements of change is the mindset of those who need to change. This is even more true in enterprise computing. In a world where the average tenure of a CIO is constantly under pressure, where budgets are always viewed with skepticism, and where the immediate needs far exceed resources and time, making the wrong choice can be very costly. Thus the conventional wisdom plays an important part in the timeline for a disruption taking hold. From the PC to the GUI to client/server, to the web, to the cloud, to acceptance of open source, each of these went through a period where conventional wisdom held that it was inappropriate for the enterprise. Then one day we all wake up to a world where the approach is required for the enterprise. The new products that are forward-looking and weather the negativity of those wishing to maintain the status quo get richly rewarded when the conventional wisdom changes.
- Legacy products can’t change. Ultimately the best reason to persevere is because the technology products you’re disrupting simply aren’t going to be suited to the new world (new approach, new scenarios, new technologies). When you re-imagine how something should be, you have an inherent advantage. The very foundation of technology disruption continues to point out that incumbents with the most to lose have the biggest challenges leading through generational changes. Many say the enterprise software world, broadly speaking, is testing these challenges today.
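To put a rough number on the “don’t bet against Moore’s law” point above, here is a minimal back-of-the-envelope sketch (in Python; not from the original post). It simply counts how many doublings close a resource gap, assuming an illustrative 24-month doubling period; the function name, the doubling period, and the example multiples are assumptions for illustration only, not claims about any particular product or technology.

```python
import math

# Minimal sketch: if a disruptive product needs N times more of some resource
# (compute, storage, bandwidth) than is economical today, estimate how long
# the "missing ingredient" stays missing under exponential improvement.
# The 24-month doubling period is an illustrative assumption, not a measured value.

def years_until_gap_closes(resource_multiple_needed: float,
                           doubling_period_years: float = 2.0) -> float:
    """Years until capability improves by the needed multiple, assuming
    capability doubles every `doubling_period_years`."""
    if resource_multiple_needed <= 1:
        return 0.0
    doublings = math.log2(resource_multiple_needed)
    return doublings * doubling_period_years

if __name__ == "__main__":
    for multiple in (2, 4, 8, 16):
        print(f"{multiple}x gap closes in ~{years_until_gap_closes(multiple):.0f} years")
```

Under that assumption, even a 16x shortfall is roughly four doublings, or on the order of eight years, which is often shorter than the time it takes conventional wisdom to change.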
All of these are why disruption has the characteristic of seeming to take a much longer time to take hold than expected, but when it does take hold it happens very rapidly. One day a product is ready for primetime. One day a missing ingredient is ubiquitous. One day conventional wisdom just changes. And legacy products really struggle to change enough (sometimes in business or sometimes in technology) to be “all in” players in the new world.
Of course all of this hinges on a disruptive idea plus the execution of that idea. All the academic theory and role-playing in the world cannot offer wisdom on knowing if you’re on to something. That’s where the team’s and entrepreneur’s intuition, perseverance, and adaptability to new data are the most valuable assets.
The opportunity and ability to disrupt the enterprise take patience and, more often than not, several attempts by one or more players learning and adjusting the overall approach. The intrinsic strengths of the incumbent mean that new products can usually be defended against for a short time. At the same time, the organization and operation of a large and successful company also mean that there is near certainty that a subsequent wave of disruption will be stronger, better, and more likely to take hold, simply because of the desire of the incumbent to get back to “normal”.
–Steven Sinofsky (@stevesi)
The Four Stages of Disruption
Innovation and disruption are the hallmarks of the technology world, and hardly a moment passes when we are not thinking, doing, or talking about these topics. While I was speaking with some entrepreneurs recently on the topic, the question kept coming up: “If we’re so aware of disruption, then why do successful products (or companies) keep getting disrupted?”
Good question, and here’s how I think about answering it.
As far back as 1962, Everett Rogers began his groundbreaking work defining the process and diffusion of innovation. Rogers defined the spread of innovation in the stages of knowledge, persuasion, decision, implementation and confirmation.
Those powerful concepts, however, do not fully describe disruptive technologies and products, and the impact on the existing technology base or companies that built it. Disruption is a critical element of the evolution of technology — from the positive and negative aspects of disruption a typical pattern emerges, as new technologies come to market and subsequently take hold.
A central question to disruption is whether it is inevitable or preventable. History would tend toward inevitable, but an engineer’s optimism might describe the disruption that a new technology can bring more as a problem to be solved.
Four Stages of Disruption
For incumbents, the stages of innovation for a technology product that ultimately disrupts follow a pattern that is fairly well known. While that doesn’t grant us the predictive powers to know whether an innovation will ultimately disrupt, we can use a model to understand what design choices to prioritize, and when. In other words, the pattern is likely necessary, but not sufficient to fend off disruption. Value exists in identifying the response and emotions surrounding each stage of the innovation pattern, because, as with disruption itself, the actions/reactions of incumbents and innovators play important roles in how parties progress through innovation. In some ways, the response and emotions to undergoing disruption are analogous to the classic stages of grieving.
Rather than the five stages of grief, we can describe four stages that comprise the innovation pattern for technology products: disruption of incumbent; rapid and linear evolution; appealing convergence; and complete reimagination. Any product line or technology can be placed in this sequence at a given time.
The pattern of disruption can be thought of as follows, keeping in mind that at any given time for any given category, different products and companies are likely at different stages relative to some local “end point” of innovation.
Stage One: Disruption of Incumbent
A moment of disruption is where the conversation about disruption often begins, even though determining that moment is entirely hindsight. (For example, when did BlackBerry get disrupted by the iPhone, film by digital imaging or bookstores by Amazon?) A new technology, product or service is available, and it seems to some to be a limited, but different, replacement for some existing, widely used and satisfactory solution. Most everyone is familiar with this stage of innovation. In fact, it could be argued that most are so familiar with this aspect that collectively our industry cries “disruption” far more often than is actually the case.
From a product development perspective, choosing whether a technology is disruptive at a potential moment is key. If you are making a new product, then you’re “betting the business” on a new technology — and doing so will be counterintuitive to many around you. If you have already built a business around a successful existing product, then your “bet the business” choice is whether or not to redirect efforts to a new technology. While difficult to prove, most would generally assert that new technologies that are subsequently disruptive are bet on by new companies first. The very nature of disruption is such that existing enterprises see more downside risk in betting the company than they see upside return in a new technology. This is the innovator’s dilemma.
The incumbent’s reactions to potential technology disruptions are practically cliche. New technologies are inferior. New products do not do all the things existing products do, or are inefficient. New services fail to address existing needs as well as what is already in place. Disruption can seem more expensive because the technologies have not yet scaled, or can seem cheaper because they simply do less. Of course, the new products are usually viewed as minimalist or as toys, and often unrelated to the core business. Additionally, business-model disruption has similar analogues relative to margins, channels, partners, revenue approaches and more.
The primary incumbent reaction during this stage is to essentially ignore the product or technology — not every individual in an organization, but the organization as a whole often enters this state of denial. One of the historical realities of disruption is uncovering the “told you so” evidence, which is always there, because no matter what happens, someone always said it would. The larger the organization, the more individuals probably sent mail or had potential early-stage work that could have avoided disruption, at least in their views (see “Disruption and Woulda, Coulda, Shoulda” and the case of BlackBerry). One of the key roles of a company is to make choices, and the choice between changing to a riskier course and defending the current approaches is exactly the kind of choice that hamstrings an organization.
There are dozens of examples of disruptive technologies and products. And the reactions (or inactions) of incumbents are legendary. One example that illustrates this point would be the introduction of the “PC as a server.” This has all of the hallmarks of disruption. The first customers to begin to use PCs as servers — for application workloads such as file sharing, or early client/server development — ran into incredible challenges relative to the mini/mainframe computing model. While new PCs were far more flexible and less expensive, they lacked the reliability, horsepower and tooling to supplant existing models. Those in the mini/mainframe world could remain comfortable observing the lack of those traits, almost dismissing PC servers as not “real servers,” while they continued on their path further distancing themselves from the capabilities of PC servers, refining their products and businesses for a growing base of customers. PCs as servers were simply toys.
At the same time, PC servers began to evolve and demonstrate richer models for application development (rich client front-ends), lower cost and scalable databases, and better economics for new application development. With the rapidly increasing demand for computing solutions to business problems, this wave of PC servers fit the bill. Soon the number of new applications written in this new way began to dwarf development on “real servers,” and the once-important servers became legacy relative to PC-based servers for those making the bet or shift. PC servers would soon begin to transition from disruption to broad adoption, but first the value proposition needed to be completed.
Stage Two: Rapid Linear Evolution
Once an innovative product or technology begins rapid adoption, the focus becomes “filling out” the product. In this stage, the product creators are still disruptors, innovating along the trajectory they set for themselves, with a strong focus on early-adopter customers, themselves disruptors. The disruptors are following their vision. The incumbents continue along their existing and successful trajectory, unknowingly sealing their fate.
This stage is critically important to understand from a product-development perspective. As a disruptive force, new products bring to the table a new way of looking at things — a counterculture, a revolution, an insurgency. The heroic efforts to bring a product or service to market (and the associated business models) leave a lot of room to improve, often referred to as “low-hanging fruit.” The path from where one is today to the next six, 12, 18 months is well understood. You draw from the cutting-room floor of ideas that got you to where you are. Moving forward might even mean fixing or redoing some of the earlier decisions made with less information, or out of urgency.
Generally, your business approach follows the initial plan, as well, and has analogous elements of insurgency. Innovation proceeds rapidly at this point. Your focus is on the adopters of your product — your fellow disruptors (disrupting within their context). You are adding features critical to completing the scenario you set out to develop.
To the incumbent leaders, you look like you are digging in your heels for a losing battle. In their view, your vector points in the wrong direction, and you’re throwing good money after bad. This only further reinforces the view of disruptors that they are heading in the right direction. The previous generals are fighting the last war, and the disruptors have opened up a new front. And yet, the traction in the disruptor camp becomes undeniable. The incumbent begins to mount a response. That response is somewhere between dismissive and negative, and focuses on comparing the products by using the existing criteria established by the incumbent. The net effect of this effort is to validate the insurgency.
Stage Three: Appealing Convergence
As the market redefinition proceeds, the category of a new product starts to undergo a subtle redefinition. No longer is it enough to do new things well; the market begins to call for the replacement of the incumbent technology with the new technology. In this stage, the entire market begins to “wake up” to the capabilities of the new product.
As the disruptive product rapidly evolves, the initial vision becomes relatively complete (realizing that nothing is ever finished, but the scenarios overall start to fill in). The treadmill of rapidly evolving features begins to feel somewhat incremental, and relatively known to the team. The business starts to feel saturated. Overall, the early adopters are now a maturing group, and a sense of stability develops.
Looking broadly at the landscape, it is clear that the next battleground is to go after the incumbent customers who have not made the move. In other words, once you’ve conquered the greenfield you created, you check your rearview mirror and look to pick up the broad base of customers who did not see your product as market-ready or scenario-complete. To accomplish this, you look differently at your own product and see what is missing relative to the competition you just left in the dust. You begin to realize that all those things your competitor had that you don’t may not be such bad ideas after all. Maybe those folks you disrupted knew something, and had some insights that your market category could benefit from putting to work.
In looking at many disruptive technologies and disruptors, the pattern of looking back to move forward is typical. One can almost think of this as a natural maturing; you promise never to do some of the things your parents did, until one day you find yourself thinking, “Oh my, I’ve become my parents.” The reason that products are destined to converge along these lines is simply practical engineering. Even when technologies are disrupted, the older technologies evolved for a reason, and those reasons are often still valid. The disruptors have the advantage of looking at those problems and solving them in their newly defined context, which can often lead to improved solutions (easier to deploy, cheaper, etc.) At the same time, there is also a risk of second-system syndrome that must be carefully monitored. It is not uncommon for the renegade disruptors, fresh off the success they have been seeing, to come to believe in broader theories of unification or architecture and simply try to get too much done, or to lose the elegance of the newly defined solution.
Stage Four: Complete Reimagination
The last stage of technology disruption is when a category or technology is reimagined from the ground up. While one can consider this just another disruption, it is a unique stage in this taxonomy because of the responses from both the legacy incumbent and the disruptor.
Reimagining a technology or product is a return to first principles. It is about looking at the underlying assumptions and essentially rethinking all of them at once. What does it mean to capture an image, provide transportation, share computation, search the Web, and more? The reimagined technology often has little resemblance to the legacy, and often has the appearance of even making the previous disruptive technology appear to be legacy. The melding of old and new into a completely different solution often creates whole new categories of products and services, built upon a base of technology that appears completely different.
To those who have been through the first disruption, their knowledge or reference frame seems dated. There is also a feeling of being unable to keep up. The words are known, but the specifics seem like rocket science. Where there was comfort in the convergence of ideas, the newly reimagined world seems like a whole new generation, and so much more than a competitor.
In software, one way to think of this is generational. The disruptors studied the incumbents in university, and then went on to use that knowledge to build a better mousetrap. Those in university while the new mousetrap was being built benefited from learning from both a legacy and new perspective, thus seeing again how to disrupt. It is often this fortuitous timing that defines generations in technologies.
Reimagining is important because the breakthroughs so clearly subsume all that came before. What characterizes a reimagination most is that it renders the criteria used to evaluate the previous products irrelevant. Often there are orders of magnitude difference in cost, performance, reliability, service and features. Things are just wildly better. That’s why some have referred to this as the innovator’s curse. There’s no time to bask in the glory of the previous success, as there’s a disruptor following right up on your heels.
A recent example is cloud computing. Cloud computing is a reimagination of both the mini/mainframe and PC-server models. By some accounts, it is a hybrid of those two, taking the commodity hardware of the PC world and the thin client/data center view of the mainframe world. One would really have to squint in order to claim it is just that, however, as the fundamental innovation in cloud computing delivers entirely new scale, reliability and flexibility, at a cost that upends both of those models. Literally every assumption of the mainframe and client/server computing was revisited, intentionally or not, in building modern cloud systems.
For the previous incumbent, it is too late. There’s no way to sprinkle some reimagination on your product. The logical path, and the one most frequently taken, is to “mine the installed base,” and work hard to keep those customers happy and minimize the mass defections from your product. The question then becomes one of building an entirely new product that meets these new criteria, but from within the existing enterprise. The number of times this has been successfully accomplished is diminishingly small, but there will always be exceptions to the rule.
For the previous disruptor and new leader, there is a decision point that is almost unexpected. One might consider the drastic — simply learn from what you previously did, and essentially abandon your work and start over using what you learned. Or you could be more incremental, and get straight to the convergence stage with the latest technologies. It feels like the ground is moving beneath you. Can you converge rapidly, perhaps revisiting more assumptions, and show more flexibility to abandon some things while doing new things? Will your product architecture and technologies sustain this type of rethinking? Your customer base is relatively new, and was just feeling pretty good about winning, so the pressure to keep winning will be high. Will you do more than try to recast your work in this new light?
The relentless march of technology change comes faster than you think.
So What Can You Do?
Some sincerely believe that products, and thus companies, disrupt and then are doomed to be disrupted. Like a Starship captain when the shields are down, you simply tell all hands to brace themselves, and then see what’s left after the attack. Business and product development, however, are social sciences. There are no laws of nature, and nothing is certain to happen. There are patterns, which can be helpful signposts, or can blind you to potential actions. This is what makes the technology industry, and the changes technology bring to other industries, so exciting and interesting.
The following table summarizes the stages of disruption and the typical actions and reactions at each stage:
| Stage | Disruptor | Incumbent |
|---|---|---|
| Disruption of incumbent | Introduces new product with a distinct point of view, knowing it does not solve all the needs of the entire existing market, but advances the state of the art in technology and/or business. | New product or service is not relevant to existing customers or market, a.k.a. “deny.” |
| Rapid linear evolution | Proceeds to rapidly add features/capabilities, filling out the value proposition after initial traction with select early adopters. | Begins to compare full-featured product to new product and show deficiencies, a.k.a. “validate.” |
| Appealing convergence | Sees opportunity to acquire broader customer base by appealing to slow movers. Sees limitations of own new product and learns from what was done in the past, reflected in a new way. Potential risk is being leapfrogged by even newer technologies and business models as focus turns to “installed base” of incumbent. | Considers cramming some element of disruptive features to existing product line to sufficiently demonstrate attention to future trends while minimizing interruption of existing customers, a.k.a. “compete.” Potential risk is failing to see the true value or capabilities of disruptive products relative to the limitations of existing products. |
| Complete reimagination | Approaches a decision point because new entrants to the market can benefit from all your product has demonstrated, without embracing the legacy customers as done previously. Embrace legacy market more, or keep pushing forward? | Arguably too late to respond, and begins to define the new product as part of a new market, and existing product part of a larger, existing market, a.k.a. “retreat.” |
Considering these stages and reactions, there are really two key decision points to be tuned in to:
When you’re the incumbent, your key decision is to choose carefully what you view as disruptive or not. It is to the benefit of every competitor to claim they are disrupting your products and business. Creating this sort of chaos is something that causes untold consternation in a large organization. Unfortunately, there are no magic answers for the incumbent.
The business team needs to develop a keen understanding of the dynamics of competitive offerings, and know when a new model can offer more to customers and partners in a different way. More importantly, it must avoid an excess attachment to today’s measures of success.
The technology and product team needs to maintain a clinical detachment from the existing body of work to evaluate if something new is better, while also avoiding the more common technology trap of being attracted to the next shiny object.
When you’re the disruptor, your key decision point is really when and if to embrace convergence. Once you make the choices — in terms of business model or product offering — to embrace the point of view of the incumbent, you stand to gain from the bridge to the existing base of customers.
Alternatively, you create the potential to lose big to the next disruptor who takes the best of what you offer and leapfrogs the convergence stage with a truly reimagined product. By bridging to the legacy, you also run the risk of focusing your business and product plans on the customers least likely to keep pushing you forward, or those least likely to be aggressive and growing organizations. You run the risk of looking backward more than forward.
For everyone, timing is everything. We often look at disruption in hindsight, and choose disruptive moments based on product availability (or lack thereof). In practice, products require time to conceive, iterate and execute, and different companies will work on these at different paces. Apple famously talked about the 10-year project that was the iPhone, with many gaps, and while the iPad appears a quick successor, it, too, was part of that odyssey. Sometimes a new product appears to be a response to a new entry, but in reality it was under development for perhaps the same amount of time as another entry.
There are many examples of this path to disruption in technology businesses. While many seem “classic” today, the players at the time more often than not exhibited the actions and reactions described here.
As a social science, business does not lend itself to provable operational rules. As appealing as disruption theory might be, the context and actions of many parties create unique circumstances each and every time. There is no guarantee that new technologies and products will disrupt incumbents, just as there is no certainty that existing companies must be disrupted. Instead, product leaders look to patterns, and model their choices in an effort to create a new path.
Stages of Disruption In Practice
- Digital imaging. Mobile imaging reimagined a category that disrupted film (always available, low-quality versus film), while converging on the historic form factors and usage of film cameras. In parallel, there is a wave of reimagination of digital imaging taking place that fundamentally changes imaging using light field technology, setting the stage for a potential leapfrog scenario.
- Retail purchasing. Web retailers disrupted physical retailers with selection, convenience, community, etc., ultimately converging on many elements of traditional retailers (physical retail presence, logistics, house brands).
- Travel booking. Online travel booking is disrupting travel agents, then converging on historic models of aggregation and package deals.
- Portable music. From the Sony Walkman as a disruptor to the iPod and MP3 players, to mobile phones subsuming this functionality, and now to streaming playback, portable music has seen the disruptors get disrupted and incumbents seemingly stopped in their tracks over several generations. The change in scenarios enabled by changing technology infrastructure (increased storage, increased bandwidth, mobile bandwidth and more) have made this a very dynamic space.
- Urban transport. Ride sharing, car sharing, and more disrupt traditional ownership of vehicles or taxi services, and are in the process of converging models (such as Uber adding UberX).
- Productivity. Tools such as Quip, Box, Haiku Deck, Lucidchart, and more are being pulled by customers beyond early adopters to be compatible with existing tools and usage patterns. In practice, these tools are currently iterating very rapidly along their self-defined disruptive path. Some might suggest that previous disruptors in the space (OpenOffice, Zoho, perhaps even Google Docs) chose to converge with the existing market too soon, as a strategic misstep.
- Movie viewing. Netflix and others, as part of cord-cutting, are disrupting with low-priced, all-you-can-consume on-demand plans and by producing their own content. Previous disruptors such as HBO are working to provide streaming and similar services, while constrained by existing business models and relationships.
- Messaging/communications apps. SMS, which many consider disruptive to 2000-era IM, is being challenged by much richer interactions that disrupt the business models of carrier SMS and feature sets afforded by SMS.
- Network infrastructure. Software-defined networking and cloud computing are reimagining the category of networking infrastructure, with incumbent players attempting to benefit from these shifts in the needs of customers. Incumbents at different levels are looking to adopt the model, while some providers see it as fundamentally questioning their assumptions.
— Steven Sinofsky (@stevesi). This story originally appeared on Recode.
Disruption and woulda, coulda, shoulda
With the latest pivot for Blackberry, much has been said about disruption and what it can do to companies. The story, Inside the fall of BlackBerry: How the smartphone inventor failed to adapt, by Sean Silcoff, Jacquie McNish and Steve Ladurantaye in The Globe and Mail, is a wonderful account.
Disruption has a couple of characteristics that make it fun to talk about. While it is happening even with a chorus of people claiming it is happening, it is actually very difficult to see. After it has happened the chorus of “told you so” grows even louder and more matter of fact. After the fact, everyone has a view of what could have been done to “prevent” disruption. Finally, the description of disruption tends to lose all of the details leading up to the failure as things get characterized at the broad company level or a simple characteristic (keyboard v. touch) when the situation is far more complex. Those nuances are what product folks deal with day to day and where all the learning can be found.
Like many challenges in business, there’s no easy solution and no pattern to follow. The decision moments, technology changes, and business realities are all happening to people who have the same skills and backgrounds as the chorus, but who face the real-world constraints of actually doing something about them.
The case of Blackberry is interesting because the breadth of disruptive forces is so great. It is not likely that a case like this will be seen again for a while—a case where a company has such an incredible position of strength in technology and business gained over a relatively short time and then essentially erased in a short time.
I loved my Blackberry. The first time I used one was before they were released (because there was integration with Outlook I was lucky enough to be using one some time in 1998—I even read the entire DOJ filing against Microsoft on one while stopped on the tarmac at JFK). Using the original 850 was a moment when you immediately felt propelled into the future. Using one felt like the first time I saw a graphical interface (Alto) or a GPS. Upon using one you just knew our technology lives would be different.
What went wrong is almost exactly the opposite of what went right and that’s what makes this such an interesting story and unbelievably difficult challenge for those involved. Even today I look at what went on and think of how galactic the challenges were for that amazing group of people that transported us all to the future with one product.
Assumptions
When you build a product you make a lot of assumptions about the state of the art of technology, the best business practices, and potential customer usage/behavior. Any new product that is even a little bit revolutionary makes these choices at an instinctual level—no matter what news stories you read about research or surveys or whatever, I think we all know that there’s a certain gut feeling that comes into play.
This is especially the case for products that change our collective world view.
Whether made deliberately or not these assumptions play a crucial role in how a product evolves over time. I’ve never seen a new product developed where the folks wrote down a long list of assumptions. I wouldn’t even know where to start—so many of them are not even thought through and represent just an engineer or product manager “state of the art”, “best practice”, or “this is what I know”.
It turns out these assumptions, implicit or explicit, become your competitive advantage and allow you to take the market by storm.
But then along come technology advances, business model changes, or new customer behaviors and seemingly overnight your assumptions are invalidated.
In a relatively simple product (note, no product is simple to the folks making it) these assumptions might all be within the domain. Christensen famously studied the early days of the disk drive industry. To many of us these assumptions are all contained within one system or component and it is hard to see how disruption could take hold. Fast forward and we just assume solid-state storage, yet even this transition as obvious as it is to us, requires a whole new world view for people who engineer spinning disks.
In a complex product like the entirety of the Blackberry experience there are assumptions that cross hardware, software, communications networks, channel relationships, business models and more. When you bring all these together into a single picture one realizes the enormity of what was accomplished.
It is instructive to consider the many assumptions or ingredients of Blackberry success that go beyond the popular “keyboard v. touch”. In thinking about my own experience with the product, the following lists just a few of the things that were essentially revisited by the iPhone, from the perspective of the Blackberry device/team:
- Keyboard to touch. The most visible difference and most easily debated is this change. From crackberry thumbs to contests over who could type faster, your keyboard was clearly a major innovation. The move to touch would challenge you in technology, behavior, and more.
- Small (b&w) screens to large color. Closely connected with the shift to touch was a change in perspective that consuming information on a bigger screen would trump the use of the real estate for (arguably) more efficient input. Your whole notion of industrial design, supply chain, OS, and more would be challenged. As an aside, the power consumption of large screens immediately seemed like a non-starter to a team insanely focused on battery life.
- GPRS to 3G then LTE. Your heritage in radios, starting with the pager network, placed a premium on using the lowest power/bandwidth radio and focusing on efficiency therein. The iPhone, while 2G early, quickly turned around a game changing 3G device. You had been almost dragged into using the newer higher powered radios because your focus had been to treat radio usage as a premium resource.
- Minimize bandwidth to assume bandwidth is free. Your focus on reducing bytes over the wire was met with a device that just assumed bytes would be “free” or at least easily purchased. Many of the early comments on the iPhone focused on this but few assumed the way the communications companies would respond to an appetite for bandwidth. Imagine thinking how sloppy the iPhone was with bandwidth usage and how fast the battery would drain. Assuming a specific resource is high cost is often a path to disruption when someone makes a different assumption.
- No general web support v. general web support. Despite demand, the Blackberry avoided offering generalized web browsing support. The partnership with carriers also precluded this given their concern about network responsiveness and capacity. Again, few would have assumed a network buildout that would support mobile browsing the way it does today. The disruptor had the advantage of growing slowly (relatively) compared to flipping a switch on a giant installed base.
- WiFi as “present” to nearly ubiquitous. The physics of WiFi coverage (along with power consumption, chip surface area and more) led to the assumption that WiFi would be expensive and hard to find. Even with whole-city WiFi projects in the early 2000s, people didn’t see WiFi as a big part of the solution. Few thought about the presence of WiFi at home and new usage scenarios, or that every urban setting, hotel, airport, and more would have WiFi. Even the carriers built out WiFi to offload traffic and include it for free in their plans. The elegant and seamless integration of WiFi on the iPhone became a quick advantage.
- Device update/mgmt by tethering to over the air. Blackberry required tethering for some routine operations and for many the only way to integrate corporate mail was to keep a PC running all the time. The PC was an integral part of the Blackberry experience for many. While the iPhone was tethered for music and videos, the presence of WiFi and the march towards PC-free experiences was an early assumption in the architecture that just took time to play out.
- Business to consumer. Your Blackberry was clearly a business device. Through much of the period of high success consumers flocked to devices like the SideKick. While there was some consumer success, you anchored in business scenarios from Exchange and Notes integration to network security. The iPhone comes along and out of the gate is aimed at consumers with a camera, MMS, and more. This disruption hits at the hardware, the software, the service integration, and even how the device is sold at carriers.
- Data center based service to broad set of cloud based services. Your connection to the enterprise was anchored in a server that businesses operated. This was a significant business upside as well as a key part of the value proposition for business. This server became a source for valuable business information propagated to the Blackberry (rather than using the web). The absence of an iPhone server seemed like a huge opportunity, yet in fact it turned into an asset in terms of spreading the device. Instead the iPhone relied on the web (and subsequently apps) to deliver services rather than programmed and curated services.
- Deep channel partnership/revenue sharing to somewhat tense relationship. By most accounts, your Blackberry business was an incredible win-win with telcos around the world. Story after story talked of the amazing partnerships between carriers and Blackberry. At the same time, stories (and blame game) between Apple and AT&T in the US became somewhat legendary. Yet even with this tension, the iPhone was bringing very valuable customers to AT&T and unseating Blackberry customers.
- Ubiquitous channel presence to exclusives. Your global partnership strength was unmatched and yet disrupted. The iPhone launched with single carriers in limited markets, on purpose. Many viewed that as a liability, including Blackberry. Yet in hindsight this only increased the value to the selected partners and created demand from other potential partners (even with the tension).
- Revenue sharing to data plan. One of the main assets that was mostly invisible to consumers was the revenue to Blackberry for each device on the network. This was because Blackberry was running a secure email service as a major anchor of the offering. Most thought no one was going to give up this revenue, including the carriers’ ability to up-charge for your Blackberry. Few saw a transition to a heavily subsidized business model with high priced data plans purchased by consumers.
These are just a few and any one of these is probably debatable. The point is really the breadth of changes the iPhone introduced to the Blackberry offering and roadmap. Some of these are assumptions about the technology, some about the business model, some about the ecosystem, some about physics even!
Imagine you’ve just changed the world and everything you did to change it—your entire world view—has been upended by a new product. Now imagine that the new product is not universally applauded: many folks not only say your product is better and more useful, but also that the new product is simply inferior.
Put yourself in those shoes…
Disruption
Disruption happens when a new product comes along and changes the underlying assumptions of the incumbent, as we all know.
Incumbent products and businesses respond by often downplaying the impact of a particular feature or offering. And more often than folks might notice, disruption doesn’t happen so easily. In practice, established businesses and products can withstand a few perturbations to their offering. Products can be rearchitected. Prices can be changed. Features can be added.
What happens though when nearly every assumption is challenged? What you see is a complete redefinition of your entire company. Seeing this happen in real time is hard, and acknowledging it is even harder. Even in the case of Blackberry there was a time window of perhaps 2 years to respond—is that really enough time to re-engineer everything about your product, company, and business?
One way to look at this case is that disruption rarely happens from a single vector or attribute, even though the chorus might claim X disrupts Y because of price or a single feature, for example. We can see this in the case of something like desktop Linux—being lower priced/open source are interesting attributes but it is fair to say that disruption never really happened to the degree that might have been claimed early on.
However, if you look at Linux in the data center the combination of using Linux for proprietary data center architectures and services combined with the benefit of open source/low price brought with it a much more powerful disruptive capability.
One might take away from this case and other examples that the disruption to watch out for most is the one that combines multiple elements of the traditional marketing mix: product, price, place, and promotion. When considering these dimensions it is also worth understanding the full breadth of assumptions, both implicit and explicit, in your product and business when defending against disruption. Likewise, if you’re intending to disrupt you want to consider the multiple dimensions of your approach in order to bypass the intrinsic defenses of incumbents.
It is not difficult to talk about disruption in our industry. As product and business leaders it is instructive to dive into a case of disruption and consider not just all the factors that contributed but how you would respond personally. Could you really lead a team through the process of creating a product that literally inverted almost every business and technology assumption that created $80B or so in market cap over a 10 year period?
In The Sun Also Rises, Hemingway wrote:
How did you go bankrupt? Two ways. Gradually, then suddenly.
That is how disruption happens.
—Steven Sinofsky
Continuous Productivity: New tools and a new way of working for a new era
What happens when the tools and technologies we use every day become mainstream parts of the business world? What happens when we stop leading separate “consumer” and “professional” lives when it comes to technology stacks? The result is a dramatic change in the products we use at work and as a result an upending of the canon of management practices that define how work is done.
This paper says business must embrace the consumer world and see it not as different, less functional, or less enterprise-worthy, but as the new path forward for how people will use technology platforms, how businesses will organize and execute work, and how the roles of software and hardware will evolve in business. Our industry speaks volumes of the consumerization of IT, but maybe that is not going far enough given the incredible pace of innovation and depth of usage of the consumer software world. New tools are appearing that radically alter the traditional definitions of productivity and work. Businesses failing to embrace these changes will find their employees simply working around IT at levels we have not seen even during the earliest days of the PC. Too many enterprises are either flat-out resisting these shifts or hoping for a “transition”—disruption is taking place, not only to every business, but within every business.
Paradigm shift
Continuous productivity is an era that fosters a seamless integration between consumer and business platforms. Today, tools and platforms used broadly for our non-work activities are often used for work, but under the radar. The cloud-powered smartphone and tablet, as productivity tools, are transforming the world around us along with the implied changes in how we work to be mobile and more social. We are in a new era, a paradigm shift, where there is evolutionary discontinuity, a step-function break from the past. This constantly connected, social and mobile generational shift is ushering in a time period on par with the industrial production era or the information society of the 20th century. Together our industry is shaping a new way to learn, work, and live with the power of software and mobile computing—an era of continuous productivity.
Continuous productivity manifests itself as an environment where the evolving tools and culture make it possible to innovate more and faster than ever, with significantly improved execution. Continuous productivity shifts our efforts from the start/stop world of episodic work and work products to one that builds on the technologies that start to answer what happens when:
- A generation of new employees has access to the collective knowledge of an entire profession and experts are easy to find and connect with.
- Collaboration takes place across organization and company boundaries with everyone connected by a social fiber that rises above the boundaries of institutions.
- Data, knowledge, analysis, and opinion are equally available to every member of a team in formats that are digital, sharable, and structured.
- People have the ability to time slice, context switch, and proactively deal with situations as they arise, shifting from a world of start/stop productivity and decision-making to one that is continuous.
Today our tools force us to hurry up and wait, then react at all hours to that email or notification of available data. Continuous productivity provides us a chance at a more balanced view of time management because we operate in a rhythm with tools to support that rhythm. Rather than feeling like you’re on call all the time waiting for progress or waiting on some person or event, you can simply be more effective as an individual, team, and organization because there are new tools and platforms that enable a new level of sanity.
Some might say this is predicting the present and that the world has already made this shift. In reality, the vast majority of organizations are facing challenges or even struggling right now with how the changes in the technology landscape will impact their efforts. What is going on is nothing short of a broad disruption—even winning organizations face an innovator’s dilemma in how to develop new products and services, organize their efforts, and communicate with customers, partners, and even within their own organizations. This disruption is driven by technology, and is not just about the products a company makes or services offered, but also about the very nature of companies.
Today’s socialplace
The starting point for this revolution in the workplace is the socialplace we all experience each and every day.
We carry out our non-work (digital) lives on our mobile devices. We use global services like Facebook, Twitter, Gmail, and others to communicate. In many places in the world, local services such as Weibo, MixIt, mail.ru, and dozens of others are used routinely by well over a billion people collectively. Entertainment services from YouTube and Netflix to Spotify and Pandora dominate non-TV entertainment and the Internet itself. Relatively new services such as Pinterest or Instagram enter the scene and are used deeply by tens of millions in relatively short times.
While almost all of these services are available on traditional laptop and desktop PCs, the incredible growth in usage from smartphones and tablets has come to represent not just the leading edge of the scenario, but the expected norm. Product design is done for these experiences first, if not exclusively. Most would say that designing for a modern OS first or exclusively is the expected way to start on a new software experience. The browser experience (on a small screen or desktop device) is the backup to a richer, more integrated, more fluid app experience.
In short, the socialplace we are all familiar with is part of the fabric of life in much of the world and only growing in importance. The generation growing up today will of course only know this world and what follows. Around the world, the economies undergoing their first information revolutions will do so with these technologies as the baseline.
Historic workplace
Briefly, it is worth reflecting on and broadly characterizing some of the history of the workplace to help to place the dramatic changes into historic context.
Mechanized productivity
The industrial revolution that defined the first half of the 20th century marked the start of modern business, typified by high-volume, large-scale organizations. Mechanization created a culture of business derived from the capabilities and needs of the time. The essence of mechanization was the factory which focused on ever-improving and repeatable output. Factories were owned by those infusing capital into the system and the culture of owner, management, and labor grew out of this reality. Management itself was very much about hierarchy. There was a clear separation between labor and management, with management primarily focused on owners/ownership.
The information available to management was limited. Supply chains and even assembly lines themselves were operated with little telemetry or understanding of the flow of raw materials through to sales of products. Even great companies ultimately fell because they lacked the ability to gather insights across this full spectrum of work.
Knowledge productivity
The problems created by the success of mechanized production were met with a solution—the introduction of the computer and the start of the information revolution. The mid-20th century would kick off a revolution in business, one marked by global and connected organizations. Knowledge created a new culture of business derived from the information gathering and analysis capabilities of first the mainframe and then the PC.
The essence of knowledge was the people-centric office which focused on ever-improving analysis and decision-making to allocate capital, develop products and services, and coordinate the work across the globe. The modern organization model of a board of directors, executives, middle management, and employees grew out of these new capabilities. Management of these knowledge-centric organizations happened through an ever-increasing network of middle-managers. The definition of work changed and most employees were not directly involved in making things, but in analyzing, coordinating, or servicing the products and services a company delivered.
The information available to management grew exponentially. Middle-management grew to spend their time researching, tabulating, reporting, and reconciling the information sources available. Information spanned from quantitative to qualitative and the successful leaders were expert or well versed in not just navigating or validating information, but in using it to effectively influence the organization as a whole. Knowledge is power in this environment. Management took over the role of resource allocation from owners and focused on decision-making as the primary effort, using knowledge and the skills of middle management to inform those choices.
A symbol of knowledge productivity might be the meeting. Meetings came to dominate the culture of organizations: meetings to decide what to meet about, meetings to confirm that people were on the same page, meetings to follow up from other meetings, and so on. Management became very good at justifying meetings and the work that went into preparing, having, and following up from meetings. Power derived from holding meetings, creating follow-up items and more. The work products of meetings (the pre-reading memos, the presentations, the supporting analytics) began to take on epic proportions. Staff organizations developed that shadowed the whole process.
The essence of these meetings was to execute on a strategy—a multi-year commitment to create value, defend against competition, and execute. Much of the headquarters mindset of this era was devoted to strategic analysis and planning.
The very best companies became differentiated by their use of information technologies in now legendary ways such as to manage supply chain or deliver services to customers. Companies like Wal-Mart pioneered the use of technology to bring lower prices and better inventory management. Companies like the old MCI developed whole new products based entirely on the ability to write software to provide new ways of offering existing services.
Even with the broad availability of knowledge and information, companies still became trapped in the old ways of doing things, unable to adapt and change. The role of disruption as a function not just of technology development but of management decision-making showed the intricate relationship between the two. With this era of information technology came the notion of companies too big and too slow to react to changes in the marketplace even with information right there in front of collective eyes.
The impact of software, as we finished the first decade of the 21st century, is more profound than even the most optimistic software people would have predicted. As the entrepreneur and venture capitalist Marc Andreessen wrote two years ago, “software is eating the world”. Software is no longer just about the internal workings of business or a way to analyze information and execute more efficiently, but has come to define what products a business develops, offers, and serves. Software is now the product, from cars to planes to entertainment to banking and more. Every product not only has a major software component but it is also viewed and evaluated through the role of software. Software is ultimately the product, or at least a substantial part of differentiation, for every product and service.
Today’s workplace: Continuous Productivity
Today’s workplace is as different as the office was from the factory.
Today’s organizations are either themselves mobile or serving customers that are mobile, or likely both. Mobility is everywhere we look—from apps for consumers to sales people in stores and the cash registers to plane tickets. With mobility comes an unprecedented degree of freedom and flexibility—freedom from locality, limited information, and the desktop computer.
The knowledge-based organization spent much energy on connecting the dots between qualitative sampling and data sourced on what could be measured. Much went into trying to get more sources of data and to seek the exact right answer to important management decisions. Today’s workplace has access to more data than ever before, but along with that came the understanding that data isn’t right just because it came from a computer. Data is telemetry based on usage from all aspects of the system and goes beyond sampling and surveys. Today’s use of data replaces algorithms seeking exact answers with heuristics that guess the best answer using a moment’s worth of statistical data. Today’s answers change over time as more usage generates more data. We no longer spend countless hours debating causality because what is happening is right there before our eyes.
We see this all the time in the promotion of goods on commerce sites, the use of keyword search and SEO, even the way that search itself corrects spellings or the way maps use a vast array of data to narrow a potentially very large set of results from queries. Technologies like speech or vision have gone from trying to compute the exact answer to using real-time data to provide contextually relevant and even more accurate guesses.
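To make the contrast concrete, here is a minimal sketch (in Python) of a data-informed heuristic in the spirit of spelling correction. The `observed_queries` counter and its numbers are hypothetical stand-ins for aggregated usage telemetry; a real service would draw on vastly larger logs and richer models.

```python
from collections import Counter

# Hypothetical telemetry: how often users typed (and kept) each query.
# A real service would aggregate this from usage logs, not a literal table.
observed_queries = Counter({
    "blackberry": 9200,
    "blueberry": 4100,
    "blackbird": 800,
})

def candidate_edits(word: str) -> set[str]:
    """Generate simple one-character deletions and substitutions as candidates."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = {left + right[1:] for left, right in splits if right}
    substitutes = {left + c + right[1:] for left, right in splits if right for c in letters}
    return deletes | substitutes | {word}

def best_guess(typed: str) -> str:
    """Pick the candidate users actually search for most often: a guess, not an exact answer."""
    return max(candidate_edits(typed), key=lambda c: observed_queries.get(c, 0))

print(best_guess("blackberyy"))  # "blackberry", because telemetry favors it
```

The point is not the tiny edit model but the shift in posture: the answer is whatever the data currently favors, and it keeps improving as more usage flows in.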
The availability of these information sources is moving from a hierarchical access model of the past to a much more collaborative and sharing-first approach. Every member of an organization should have access to the raw “feeds” that could be material to their role. Teams become the focus of collaborative work, empowered by the data to inform their decisions. We see the increasing use of “crowds” and product usage telemetry able to guide improved service and products, based not on qualitative sampling plus “judgment” but on what amounts to a census of real-world usage.
Information technology is at the heart of all of these changes, just as it was in the knowledge era. The technologies are vastly different. The mainframe was about centralized information and control. The PC era empowered people to first take mainframe data and make better use of it and later to create new, but inherently local or workgroup specific information sources. Today’s cloud-based services serve entire organizations easily and can also span the globe, organizations, and devices. This is such a fundamental shift in the availability of information that it changes everything in how information is collected, shared, and put to use. It changes everything about the tools used to create, analyze, synthesize, and share information.
Management using yesterday’s techniques can’t seem to keep up with this world. People are overwhelmed by the power of their customers, who are armed with all this information (such as when social networks create a backlash about an important decision, or when we visit a car dealer equipped with local pricing information). Within organizations, managers are constantly trying to stay ahead of the curve. The “young” employees seem to know more about what is going on because of Twitter and Facebook or just being constantly connected. Even information about the company is no longer the sole domain of management, as the press are able to uncover or at least speculate about the workings of a company while employees see this speculation long before management communicates with them. Where people used to sit in important meetings and listen to important people guess about information, people now get real data from real sources in real-time while the meeting is taking place or even before.
This symbol of the knowledge era, the meeting, is under pressure because of the inefficiency of a meeting when compared to learning and communicating via the technology tools of today. Why wait for a meeting when everyone has the information required to move forward available on their smartphones? Why put all that work into preparing a perfect pitch for a meeting when the data is changing and is a guess anyway, likely to be further informed as the work progresses? Why slow down when competitors are speeding up?
There’s a new role for management that builds on this new level of information and employees skilled in using it. Much like those who grew up with the PC “natively” were quick to assume its usage in the workplace (some might remember the novelty of when managers first began to answer their own email), those who grow up with the socialplace are using it to do work, much to the chagrin of management.
Management must assume a new type of leadership that is focused on framing the outcome, the characteristics of decisions, and the culture of the organization and much less about specific decision-making or reviewing work. The role of workplace technology has evolved significantly from theory to practice as a result of these tools. The following table contrasts the way we work between the historic norms and continuous productivity.
Then | Now (Continuous Productivity)
--- | ---
Process | Exploration
Hierarchy, top down or middle out | Network, bottom up
Internal committees | Internal and external teams, crowds
Strategy-centric | Execution-centric
Presenting packaged and produced ideas, documents | Sharing ideas and perspectives continuously, service
Data based on snapshots at intervals, viewed statically | Data always real-time, viewed dynamically
Process-centric | Rhythm-centric
Exact answers | Approximation and iteration
More users | More usage
Today’s workplace technology, theory
Modern IT departments, fresh off the wave of PC standardization and broad homogenization of the IT infrastructure, developed the tools and techniques to maintain (some might say contain) the overall IT infrastructure.
A significant part of the effort involved managing the devices that access the network, primarily the PC. Management efforts ran the gamut from logon scripts, drive scanning, anti-virus software, standard (or only) software load, imaging, two-factor authentication and more. Motivating this has been the longstanding reliability and security problems of the connected laptop—the architecture’s openness so responsible for the rise of the device also created this fragility. We can see this expressed in two symbols of the challenges faced by IT: the corporate firewall and collaboration. Both of these technologies offer good theories but somewhat backfire in practice in today’s context.
With the rise of the Internet, the corporate firewall occupied a significant amount of IT effort. It also came to symbolize the barrier between employees and information resources. At some extremes, companies would routinely block known “time wasters” such as social networks and free email. Then over time as the popularity of some services grew, the firewall would be selectively opened up for business purposes. YouTube and other streaming services are examples of consumer services that transitioned to an approved part of enterprise infrastructure given the value of information available. While many companies might view Twitter as a time-wasting service, the PR departments routinely use it to track news and customer service might use it to understand problems with products so it too becomes an expected part of infrastructure. These “cracks” in the notion of enterprise v. consumer software started to appear.
Traditionally the meeting came to symbolize collaboration. The business meeting which occupied so much of the knowledge era has taken on new proportions with the spread of today’s technologies. Businesses have gone to great lengths to automate meetings and enhance them with services. In theory this works well and enables remote work and virtual teams across locations to collaborate. In practical use, for many users the implementation was burdensome and did not support the wide variety of devices or cross-organization scenarios required. The merger of meetings with the traditional tools of meetings (slides, analysis, memos) was also cumbersome as sharing these across the spectrum of devices and tools was also awkward. We are all familiar with the first 10 minutes of every meeting now turning into a technology timesink where people get connected in a variety of ways and then sync up with the “old tools” of meetings while they use new tools in the background.
Today’s workplace technology, practice
In practice, the ideal view that IT worked to achieve has been rapidly circumvented by the low-friction, high availability of a wide variety of faster-to-use, easier-to-use, more flexible, and very low-cost tools that address problems in need of solutions. Even though this is somewhat of a repeat of the introduction of PCs in the early 1990’s, this time around securing or locking down the usage of these services is far more challenging than preventing network access and isolating a device. The Internet works to make this so, by definition.
Today’s organizations face an onslaught of personally acquired tablets and smartphones that are becoming, or already are, the preferred device for accessing information and communication tools. As anyone who uses a smartphone knows, accessing your inbox from your phone quickly becomes the preferred way to deal with the bulk of email. How often do people use their phones to quickly check mail even while sitting in front of their PC (even when the PC is on and not in standby)? How much faster is it to triage email on a phone than it is on your PC?
These personal devices are seen in airports, hotels, and business centers around the world. The long battery life, fast startup time, maintenance-free (relatively), and of course the wide selection of new apps for a wide array of services make these very attractive.
There is an ongoing debate about “productivity” on tablets. In nearly all ways this debate was never a debate, but just a matter of time. While many look at existing scenarios to be replicated on a tablet as a measure of success of tablets at achieving “professional productivity”, another measure is how many professionals use their tablets for their jobs and leave their laptops at home or work. By that measure, most are quick to admit that tablets (and smartphones) are a smashing success. The idea that tablets are used only for web browsing and light email seems as quaint as claiming PCs cannot do the work of mainframes—a common refrain in the 1980s. In practice, far too many laptops have become literally desktops or hometops.
While tools such as AutoCAD, Creative Suite, or enterprise line of business tools will require PCs for many years to come, the definition of professional productivity will come to include all the tasks that can be accomplished on smartphones and tablets. The nature of work is changing and so the reality of the tools in use is changing as well.
Perhaps the most pervasive services for work use are cloud-based storage products such as DropBox, Hightail (YouSendIt), or Box. These products are acquired easily by consumers, have straightforward browser-based interfaces and apps on all devices, and most importantly solve real problems of modern information sharing. The basic scenario of sharing large files with customers or partners (or even fellow employees) across heterogeneous devices and networks is easily addressed by these tools. As a result, expensive and elaborate (or often much richer) enterprise infrastructure goes unused for this most basic of business needs—sharing files. Even the ubiquitous USB memory stick is used to get around the limitations of enterprise storage products, much to the chagrin of IT departments.
Tools beyond those approved for communication are routinely used by employees on their personal devices (except of course in regulated industries). Tools such as WhatsApp or WeChat have hundreds of millions of users. A quick look at Facebook or Twitter shows that for many of those actively engaged, the sharing of work information, especially news about products and companies, is a very real effort that goes beyond “the eggs I had for breakfast” as social networks have sometimes been characterized. LinkedIn has become the go-to place for sales people learning about customers and partners and for recruiters seeking to hire (or headhunt), and is increasingly becoming a primary source of editorial content about work and the workplace. Leading strategists are routinely read by hundreds of thousands of people on LinkedIn and their views are shared across the networks employees maintain with fellow employees. It has become challenging for management to “compete” with the level and volume of discourse among employees.
The list of devices and services routinely used by workers at every level is endless. The reality appears to be that for many employees the number of hours of usage in front of approved enterprise apps on managed enterprise devices is on the decline, unless new tablets and phones have been approved. The consumerization of IT appears to be very real, just by anecdotally observing the devices in use on public transportation, airports, and hotels. Certainly the conversation among people in suits over what to bring on trips is real and rapidly tilting towards “tablet for trips”, if not already there.
The frustration people have with IT to deliver or approve the use of services is readily apparent, just as the frustration IT has with people pushing to use insecure, unapproved, and hard to manage tools and devices. Whenever IT puts in a barrier, it is a big rock in the information river that is an organization, and information simply flows around it. Forward-looking IT is working diligently to get ahead of this challenge, but the models used to rein in PCs and servers on corporate premises will prove of limited utility.
A new approach is needed to deal with this reality.
Transition versus disruption
The biggest risk organizations face is thinking the transition to a new way of working will be just that, a transition, rather than a disruption. While individuals within an organization, particularly those that might be in senior management, will seek to smoothly transition from one style of work to another, the bulk of employees will switch quickly. Interns, new hires, or employees looking for an edge see these changes as the new normal or the only normal they’ve ever experienced. Our own experience with PCs is proof of how quickly change can take place.
In Only the Paranoid Survive, Andy Grove discussed breaking the news to employees of a new strategy at Intel only to find out that employees had long ago concluded the need for change—much to the surprise of management. The nature of a disruptive change in management is one in which management believes they are planning a smooth transition to new methods or technologies only to find out employees have already adopted them.
Today’s technology landscape is one undergoing a disruptive change in the enterprise—the shift to cloud based services, social interaction, and mobility. There is no smooth transition that will take place. Businesses that believe people will gradually move from yesterday’s modalities of work to these new ways will be surprised to learn that people are already working in these new ways. Technologists seeking solutions that “combine the best of both worlds” or “technology bridge” solutions will only find themselves comfortably dipping their toe in the water further solidifying an old approach while competitors race past them. The nature of disruptive technologies is the relentless all or nothing that they impose as they charge forward.
While some might believe that continuing to focus on “the desktop” will enable a smoother transition to mobile (or consumer) while the rough edges are worked out or capabilities catch up to what we already have, this is precisely the innovator’s dilemma – hunkering down and hoping things will not change as quickly as they appear to be changing. In fact, to solidify this point of view many will point to a lack of precipitous decline or to the mission-critical nature of traditional ways of working. The tail is very long, but innovation and competitive edge will not come from the tail. Too much focus on the tail will risk being left behind or at the very least distract from where things are rapidly heading. Compatibility with existing systems has significant value, but is unlikely to bring about more competitive offerings, better products, or step-function improvements in execution.
Culture of continuous productivity
The culture of continuous productivity enabled by new tools is literally a rewrite of the past 30 years of management doctrine. Hierarchy, top-down decision making, strategic plans, static competitors, single-sided markets, and more are almost quaint views in a world literally flattened by the presence of connectivity, mobility, and data. The impact of continuous productivity can be viewed through the organization, individuals and teams, and the role of data.
The social and mobile aspects of work finally gain the support of digital tools, and with those tools comes the realization of just how much of nearly all work is intrinsically social. The existence and paramount importance of “document creation tools” as the nature of work appear, in hindsight, to have been a slight detour for our collective focus. Tools can now work more the way we like to work, rather than forcing us to structure our work to suit the tools. Every new generation of tools comes with promises of improvements, but we’ve already seen how the newest styles of work lead to improvements in our lives outside of work. Where it used to be novel for the person with a PC to use those tools to organize a sports team or school function, now we see the reverse: the tools for the rest of life are being used to improve our work.
This existence proof makes this revolution different. We already experience the dramatic improvements in our social and non-work “processes”. With the support and adoption of new tools, just as our non-work lives saw improvements we will see improvements in work.
The cultural changes encouraged or enabled by continuous productivity include:
- Innovate more and faster. The bottom line is that by compressing the time between meaningful interactions among members of a team, we will go from problem to solution faster. Whether solving a problem with an existing product or service or thinking up a new one, the continuous nature of communication speeds up the velocity and quality of work. We all experience the pace at which changes outside work take place compared to the slow pace of change within our workplaces.
- Flatten hierarchy. The difficulty in broad communication, the formality of digital tools, and restrictions on the flow of information all fit perfectly with a strict hierarchical model of teams. Managers “knew” more than others. Information flowed down. Management informed employees. Equal access to tools and information, a continuous multi-way dialog, and the ease of bringing together relevant parties regardless of place in the organization flattens the hierarchy. But more than that, it shines a light on the ineffectiveness and irrelevancy of a hierarchy as a command structure.
- Improve execution. Execution improves because members of teams have access to the interactions and data in real-time. Gone are the days of “game of telephone” where information needed to “cascade” through an organization only to be reinterpreted or even filtered by each level of an organization.
- Respond to changes using telemetry / data. With the advent of continuous real-world usage telemetry, the debate and dialog move from deciding what the problems to be solved might be to solving the problem. You don’t spend energy arguing over the problem, but debating the merits of various solutions.
- Strengthen organization and partnerships. Organizations that communicate openly and transparently leave much less room for politics and hidden agendas. The transparency afforded by tools might introduce some rough and tumble in the early days as new “norms” are created but over time the ability to collaborate will only improve given the shared context and information base everyone works from.
- Focus on the destination, not the journey. The real-time sharing of information forces organizations to operate in real-time. Problems are in the here and now and demand solutions in the present. The benefit of this “pressure” is that a focus on the internal systems, the steps along the way, or intermediate results is, out of necessity, de-emphasized.
Organization culture change
Continuously productive organizations look and feel different from traditional organizations. As a comparison, consider how different a reunion (college, family, etc.) is in the era of Facebook usage. When everyone gets together there is so much more that is known—the reunion starts from shared context and “intimacy”. Organizations should be just as effective, no matter how big or how geographically dispersed.
Effective organizations were previously defined by rhythms of weekly, monthly and quarterly updates. These “episodic” connection points had high production values (and costs) and ironically relatively low retention and usage. Management liked this approach as it placed a high value on, and required, active management as distinct from the work itself. Tools were designed to run these meetings or email blasts, but over time these were far too often over-produced and tended to be used more for backward-looking pseudo-accountability.
Looking ahead, continuously productive organizations will be characterized by the following:
- Execution-centric focus. Rather than indexing on the process of getting work done, the focus will shift dramatically to execution. The management doctrine of the late 20th century was about strategy. For decades we all knew that strategy took a short time to craft, but in practice it almost took on a life of its own. This often led to an ever-widening gap between strategy and execution, with execution being left to those of less seniority. When everyone has the ability to know what can be known (which isn’t everything) and to know what needs to be done, execution reigns supreme. The opportunity to improve or invent will be everywhere and even with finite resources available, the biggest failure of an organization will be a failure to act.
- Management framing context with teams deciding. Because information required discovery and flowed (deliberately) inefficiently, management tasked itself with deciding “things”. The entire process of meetings degenerated into a ritualized process to inform management to decide amongst options while outside the meeting “everyone” always seemed to know what to do. The new role of management is to provide decision-making frameworks, not decisions. Decisions need to be made where there is the most information. Framing the problem to be solved out of the myriad of problems and communicating that efficiently is the new role of management.
- Outside is your friend. Previously the prevailing view was that inside companies there was more information than there was outside and often the outside was viewed as being poorly informed or incomplete. The debate over just how much wisdom resides in the crowd will continue and certainly what distinguishes companies with competitive products will be just how they navigate the crowd and simultaneously serve both articulated and unarticulated needs. For certain, the idea that the outside is an asset to the creation of value, not just the destination of value, is enabled by the tools and continuous flow of information.
- Employees see management participate and learn, everyone has the tools of management. It took practically 10 years from the introduction of the PC until management embraced it as a tool for everyday use. The revolution of social tools is totally different because today management already uses the socialplace tools outside of work. Using Twitter for work is little different from using Facebook for family. Employees expect management to participate directly and personally, whether the tool is a public cloud service or a private/controlled service. The idea of having an assistant participate on behalf of a manager with a social tool is as archaic as printing out email and typing in handwritten replies. Management no longer has separate tools or a different (more complete) set of books for the business, but rather information about projects and teams becomes readily accessible.
- Individuals own devices, organizations develop and manage IP. PCs were first acquired by individual tech enthusiasts or leading edge managers and then later by organizations. Over time PCs became physical assets of organizations. As organizations focused more on locking down and managing those assets and as individuals more broadly had their own PCs, there was a decided shift to being able to just “use a computer” when needed. The ubiquity of mobile devices almost from the arrival of smartphones and certainly tablets, has placed these devices squarely in the hands of individuals. The tablet is mine. And because it is so convenient for the rest of my life and I value doing a good job at work, I’m more than happy to do work on it “for free”. In exchange, organizations are rapidly moving to tools and processes that more clearly identify the work products as organization IP not the devices. Cloud-based services become the repositories of IP and devices access that through managed credentials.
Individuals and teams work differently
The new tools and techniques come together to improve upon the way individuals and teams interact. Just as the first communication tools transformed business, the tools of mobile and continuous productivity change the way interactions happen between individuals and teams.
- Sense and respond. Organizations through the PC era were focused on planning and reacting cycles. The long lead time to plan, combined with the time needed to plan a reaction to events that were themselves often measured with delay, characterized “normal”. New tools are much more real-time and the information presented represents the whole of the information at work, not just samples and surveys. The way people work will focus much more on everyone being sensors for what is going on and responding in real-time. Think of the difference between calling for a car or hailing a cab and using Uber or Lyft, from either a consumer perspective or from the business perspective of load balancing cars and awareness of the assets at hand, as representative of sensing and responding rather than planning.
- Bottom up and network centric. The idea of management hierarchy or middle management as gatekeepers is being broken down by the presence of information and connectivity. The modern organization working to be the most productive will foster an environment of bottom up—that is people closest to the work are empowered with information and tools to respond to changes in the environment. These “bottoms” of the organization will be highly networked with each other and connected to customers, partners, and even competitors. The “bandwidth” of this network is seemingly instant, facilitated by information sharing tools.
- Team and crowd spanning the internal and external. The barriers of an organization will take on less and less meaning when it comes to the networks created by employees. Nearly all businesses at scale are highly virtualized across vendors, partners, and customers. Collaboration on product development, product implementation, and product support take place spanning information networks as well as human networks. The “crowd” is no longer a mob characterized by comments on a blog post or web site, but can be structured and systematically tapped with rich demographic information to inform decisions and choices.
- Unstructured work rhythm. The highly structured approach to work that characterized the 20th century was created out of a necessity for gathering, analyzing, and presenting information for “costly” gatherings of time constrained people and expensive computing. With the pace of business and product change enabled by software, there is far less structure required in the overall work process. The rhythm of work is much more like routine social interactions and much less like daily, weekly, monthly staff meetings. Industries like news gathering have seen these radical transformations, as one example.
Data becomes pervasive (and big)
With software capabilities come ever-increasing data and information. While the 20th century enabled the collection of data and to a large degree the analysis of data to yield ever improving decisions in business, the prevalence of continuous data again transforms business.
- Sharing data continuously. First and foremost, data will now be shared continuously and broadly within organizations. The days when reports were something for management and management waited until the end of the week or month to disseminate filtered information are over. Even though financial data has been relatively available, we’re now able to see how products are used, troubleshoot problems customers might be having, understand the impact of small changes, and try out alternative approaches. Modern organizations will provide tools that enable the continuous sharing of data through mobile-first apps that don’t require connectivity to corporate networks or systems chained to desktop resources.
- Always up to date. The implication of continuously sharing information means that everyone is always up to date. When having a discussion or meeting, the real world numbers can be pulled up right then and there in the hallway or meeting room. Members of teams don’t spend time figuring out if they agree on numbers, where they came from or when they were “pulled”. Rather the tools define the numbers people are looking at and the data in those tools is the one true set of facts.
- Yielding best statistical approach informed by telemetry (induction). The notion that there is a “right” answer is as antiquated as the printed report. We can now all admit that going to a meeting with a printed out copy of “the numbers” is not worth the debate over the validity or timeframe of those numbers (“the meeting was rescheduled, now we have to reprint the slides.”) Meetings now are informed by live data using tools such as Mixpanel or live reporting from Workday, Salesforce, and others (see the sketch after this list). We all know now that “right” is the enemy of “close enough” given that the datasets we can work with are truly based on census and not surveys. This telemetry facilitates an inductive approach to decision-making.
- Valuing more usage. Because of the ability to truly understand the usage of products—movies watched, bank accounts used, limousines taken, rooms booked, products browsed and more—the value of having more people using products and services increases dramatically. Share matters more in this world because with share comes the best understanding of potential growth areas and opportunities to develop for new scenarios and new business approaches.
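As a minimal sketch of that shift, the Python snippet below pulls a number from a live (here, entirely hypothetical and in-memory) event stream at the moment someone asks, rather than from a report frozen at the start of the week. It is not the API of Mixpanel or any other product mentioned above; a real implementation would query such a service.

```python
from datetime import datetime, timedelta

now = datetime.now()

# Hypothetical in-memory telemetry; a real system would query a live
# analytics service rather than a weekly report frozen onto slides.
EVENTS = [
    {"event": "signup", "ts": now - timedelta(hours=2)},
    {"event": "signup", "ts": now - timedelta(hours=30)},    # outside the window
    {"event": "purchase", "ts": now - timedelta(minutes=20)},
]

def live_count(event: str, since_hours: int = 24) -> int:
    """Count matching events in the trailing window at the moment of asking."""
    cutoff = datetime.now() - timedelta(hours=since_hours)
    return sum(1 for e in EVENTS if e["event"] == event and e["ts"] >= cutoff)

# Pulled up in the hallway or the meeting room, not reprinted onto slides.
print(f"signups in the last 24 hours: {live_count('signup')}")
```

The number is approximate by construction: ask again an hour later and it will have changed, which is exactly the point.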
New generation of productivity tools, examples and checklist
Bringing together new technologies and new methods for management has implications that go beyond the obvious and immediate. We will all certainly be bringing our own devices to work, accessing and contributing to work from a variety of platforms, and seeing our work take place across organization boundaries with greater ease. We can look very specifically at how things will change across the tools we use, the way we communicate, how success is measured, and the structure of teams.
Tools will be quite different from those that grew up through the desktop PC era. At the highest level the implications about how tools are used are profound. New tools are being developed today—these are not “ports” of existing tools for mobile platforms, but ideas for new interpretations of tools or new combinations of technologies. In the classic definition of innovator’s dilemma, these new tools are less functional than the current state-of-the-art desktop tools. These new tools have features and capabilities that are either unavailable or suboptimal at an architectural level in today’s ubiquitous tools. It will be some time, if ever, before new tools have all the capabilities of existing tools. By now, this pattern of disruptive technologies is familiar (for example, digital cameras, online reading, online videos, digital music, etc.).
The user experience of this new generation of productivity tools takes on a number of attributes that contrast with existing tools, including:
- Continuous v. episodic. Historically work took place in peaks and valleys. Rough drafts created, then circulated, then distributed after much fanfare (and often watering down). The inability to stay in contact led to a rhythm that was based on high-cost meetings taking place at infrequent times, often requiring significant devotion of time to catching up. Continuously productive tools keep teams connected through the whole process of creation and sharing. This is not just the use of adjunct tools like email (and endless attachments) or change tracking used by a small number of specialists, but deep and instant collaboration, real-time editing, and a view that information is never perfect or done being assembled.
- Online and shared information. The old world of creating information was based on deliberate sharing at points in time. Heavyweight sharing of attachments led to a world where each of us became “merge points” for work. We worked independently in silos, hoping not to step on each other, never sure where the true document of record might be or even who had permission to see a document. New tools are online all the time and by default. By default information can be shared and everyone is up to date all the time.
- Capture and continue. The episodic nature of work products along with the general pace of organizations created an environment where the “final” output carried with it significant meaning (to some). Yet how often do meetings take place where the presenter apologizes for data that is out of date relative to the image of a spreadsheet or org chart embedded in a presentation or memo? Working continuously means capturing information quickly and in real-time, then moving on. There are very few end points or final documents. Working with customers and partners is a continuous process and the information is continuous as well.
- Low startup costs. Implementing a new system used to be a time consuming and elaborate process viewed as a multi-year investment and deployment project. Tools came to define the work process and, more critically, made it nearly impossible to change the work process. New tools are experienced the same way we experience everything on the Internet—we visit a site or download an app and give it a try. The cost of starting up is a low-cost subscription or even a trial. Over time more features can be purchased (more controls, more depth), but the key is the very low cost to begin to try out a new way to work. Work needs change as market dynamics change and the era of tools preventing change is over.
- Sharing inside and outside. We are all familiar with the challenges of sharing information beyond corporate boundaries. Management and IT are, rightfully, protective of assets. Individuals struggle with the basics of getting files through firewalls and email guards. The results are solutions today that few are happy with. Tools are rapidly evolving to use real identities to enable sharing when needed and cross-organization connections as desired. Failing to adopt these approaches, IT will be left watching assets leak out and workarounds continue unabated.
- Measured enterprise integration. The PC era came to be defined at first by empowerment as leading edge technology adopters brought PCs to the workplace. The mayhem this created was then controlled by IT that became responsible to keep PCs running, information and networks secure, and enforce consistency in organizations for the sake of sharing and collaboration. Many might (perhaps wrongly) conclude that the consumerization wave defined here means IT has no role in these tasks. Rather the new era is defined by a measured approach to IT control and integration. Tools for identity and device management will come to define how IT integrates and controls—customization or picking and choosing code are neither likely nor scalable across the plethora of devices and platforms that will be used by people to participate in work processes. The net is to control enterprise information flow, not enterprise information endpoints.
- Mobile first. An example of a transition between the old and new: many see the ability to view email attachments on mobile devices as a way forward. However, new tools show this to be merely a bridge solution, as mobility will come to trump most everything for a broad set of people. Deep design for architects, spreadsheets for analysts, or computation for engineers are examples that will likely be stationary or at least require unique computing capabilities for some time. We will all likely be surprised by the pace at which even these “power” scenarios transition in part to mobile. The value of being able to make progress while close to the site, the client, or the problem will become a huge asset for those that approach their professions that way.
- Devices in many sizes. Until there is a radical transformation of user-machine interaction (input, display), it is likely almost all of us will continue to routinely use devices of several sizes and those sizes will tend to gravitate towards different scenarios (see http://blog.flurry.com/bid/99859/The-Who-What-and-When-of-iPhone-and-iPad-Usage), though commonality in the platforms will allow for overlap. This overlap will continue to be debated as “compromise” by some. It is certain we will all have a device that we carry and use almost all the time, the “phone”. A larger screen device will continue to better serve many scenarios or just provide a larger screen area upon which to operate. Some will find a small tablet size meeting their needs almost all of the time. Others will prefer a larger tablet, perhaps with a keyboard. It is likely we will see somewhat larger tablets arise as people look to use modern operating systems as full-time replacements for existing computing devices. The implications are that tools will be designed for different device sizes and input modalities.
It is worth considering a few examples of these tools. As an illustration, the following lists tools in a few generalized categories of work processes. New tools are appearing almost every week as the opportunity for innovation in the productivity space is at a unique inflection point. These examples are just a few tools that I’ve personally had a chance to experience—I suspect (and hope) that many will want to expand these categories and suggest additional tools (or use this as a springboard for a dialog!).
- Creation. Quip, Evernote, Paper, Haiku Deck, Lucidchart
- Storage and Sharing. Box, Dropbox, Hightail
- Reporting. Mixpanel, Quantifind
- Communications. WhatsApp, Anchor, Voxer
- Tracking. Asana, Todoist, Relaborate
- Training. Udacity, Thinkful, Codecademy
The architecture and implementation of continuous productivity tools will also be quite different from the architecture of existing tools. This starts by targeting a new generation of platforms, sealed-case platforms.
The PC era was defined by a level of openness in architecture that created the opportunity for innovation and creativity, leading to the amazing revolution we all benefit from today. An unintended side effect of that openness was the inherent unreliability over time, security challenges, and general futzing that have come to define the experience many lament. The new generation of sealed-case platforms—that is, hardware, software, and services with different points of openness relative to previous norms in computing—provides for an experience that is more reliable over time, more secure and predictable, and less time-consuming to own and use. The tradeoff seems dramatic (or draconian) to those versed in old platforms where tweaking and customizing came to dominate. In practice, the movement of the platform up the stack, so to speak, will free up enormous amounts of IT budget and resources to allow a much broader focus on the business. In addition, choice, flexibility, simplicity of use, and ease of using multiple devices, along with a relative lack of futzing, will come to define this new computing experience for individuals.
The sealed-case platforms include iOS, Android, Chromebooks, Windows RT, and others. These platforms are defined by characteristics such as minimizing APIs that manipulate the OS itself, APIs that enforce lower power utilization (defined background execution), cross-application security (sandboxing), relative assurances that apps do what they say they will do (permissions, App Stores), defined semantics for exchanging data between applications, and enforced controls on access to both user data and app state data. These platforms are all relatively new, and the “rules” for just how sealed a platform might be, and how this level of control will evolve, are still being written by vendors. In addition, devices themselves embody the sealed-case ideal by restricting the attachment of peripherals and reducing the reliance on kernel-mode software written outside the OS itself. For many, this evolution is as controversial as the transition automobiles made from “user-serviceable” to electronically controlled engines, but the benefits to the humans using the devices are clear.
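To make these characteristics a bit more concrete, here is a minimal sketch in Kotlin, using Android purely as one example of a sealed-case platform (the specifics differ on iOS, Chromebooks, and others). It shows two of the attributes above: exchanging data between sandboxed apps through a platform-defined contract, and a capability granted by the person using the device rather than assumed by the app.

```kotlin
// Minimal Android/Kotlin sketch, used here only as one concrete example of a
// sealed-case platform. A sandboxed app hands data to another app through the
// platform's declared sharing contract (Intent.ACTION_SEND), and asks the user
// for a capability (location) at the moment it is needed.
import android.Manifest
import android.content.Intent
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class ShareExampleActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Defined semantics for exchanging data: the OS brokers the handoff,
        // so neither app reaches into the other's sandbox.
        val share = Intent(Intent.ACTION_SEND).apply {
            type = "text/plain"
            putExtra(Intent.EXTRA_TEXT, "Status update drafted on the go")
        }
        startActivity(Intent.createChooser(share, "Share via"))

        // Permissions: capabilities are granted explicitly by the user,
        // not assumed by the app.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
            != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.ACCESS_FINE_LOCATION), 1001
            )
        }
    }
}
```

The particular API matters less than its shape: the platform, not the app, brokers both the data handoff and the capability grant.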
Building on the sealed case platform, a new generation of applications will exhibit a significant number of the following attributes at the architecture and implementation level. As with all transitions, debates will rage over the relative strength or priority of one or more attributes for an app or scenario (“is something truly cloud” or historically “is this a native GUI”). Over time, if history is any guide, the preferred tools will exhibit these and other attributes as a first or native priority, and de-prioritize the checklists that characterized the “best of” apps for the previous era.
The following is a checklist of attributes of tools for continuous productivity:
- Mobile first. Information will be accessed and actions will be performed mobile first for a vast majority of both employees and customers. Mobile first is about native apps, which is likely to create a set of choices for developers as they balance different platforms and different form factors.
- Cloud first. Information we create will be stored first in the cloud and, when needed (or possible), will sync back to devices. The days of all of us focusing on the tasks of file management and thinking about physical storage are over, replaced by essentially unlimited cloud storage. With cloud storage come multi-device access and instant collaboration that spans networks. Search becomes an integral part of the user experience, along with labels and metadata, rather than a physical hierarchy presenting only a single dimension. Export to broadly used interchange formats and printing remain as critical, archival steps, but not the primary way we share and collaborate.
- User experience is platform native or browser exploitive. Supporting mobile apps is a decision to fully use and integrate with a mobile platform. While using a browser can and will be a choice for some, even then it will become increasingly important to exploit the features unique to a browser. In all cases, the usage within a customer’s chosen environment encourages the full range of support for that platform environment.
- Service is the product, product is the service. Whether an internal IT or a consumer facing offering, there is no distinction where a product ends and a continuously operated and improving service begins. This means that the operational view of a product is of paramount importance to the product itself and it means that almost every physical product can be improved by a software service element.
- Tools are discrete, loosely coupled, limited surface area. The tools used will span platforms and form factors. When used this way, monolithic tools that require complex interactions will fall out of favor relative to tools more focused in their functionality. Doing a smaller set of things with focus and alacrity will provide more utility, especially when these tools can be easily connected through standard data types or intermediate services such as sharing, storage, and identity.
- Data contributed is data extractable. Data that you add to a service as an end user is easily extracted for further use and sharing. A corollary is that data will be used more if it can also be extracted and shared. Putting barriers in the way of sharing data will drive usage of the data (and the tool) lower. (A minimal sketch of what an export path might look like follows this list.)
- Metadata is as important as data. In mobile scenarios the need to search and isolate information with a smaller user interface surface area and fewer “keystrokes” means that tools for organization become even more important. The use of metadata implicit in the data, from location to author to extracted information from a directory of people will become increasingly important to mobile usage scenarios.
- Files move from something you manage to something you use when needed. Files (and by corollary mailboxes) will simply become tools and not obsessions. We’re all seeing the advances in unlimited storage along with accurate search change the way we use mailboxes. The same will happen with files. In addition, the isolation and contract-based sharing that defines sealed platforms will alter the semantic level at which we deal with information. The days of spending countless hours creating and managing hierarchies and physical storage structures are over—unlimited storage, device replication, and search make for far better alternatives.
- Identity is a choice. Use of services, particularly consumer facing services, requires flexibility in identity. Being able to use company credentials and/or company sign-on should be a choice but not a requirement. This is especially true when considering use of tools that enable cross-organization collaboration. Inviting people to participate in the process should be as simple as sending them mail today.
- User experience has a memory and is aware and predictive. People expect their interactions with services to be smart—to remember choices, learn preferences, and predict what comes next. As an example, location awareness is not restricted to maps or specific services but applies broadly to all mobile interactions where the value of location can improve the overall experience.
- Telemetry is essential / privacy redefined. Usage is what drives incremental product improvements and the ability to deliver a continuously improving product/service. This usage will be measured by anonymous, private, opt-in telemetry (a minimal sketch follows this list). In addition, all of our experiences will improve because the experience will be tailored to our usage. This implies a new level of trust in the vendors we all use. Privacy will no doubt undergo (or has already undergone) definitional changes as we become either comfortable with, or at least informed about, the opportunities for better products.
- Participation is a feature. Nearly every service benefits from participation by those relevant to the work at hand. New tools will not just enable but encourage collaboration and communication in real time, connected to the work products. Working in one place (document editor) and participating in another (email inbox) has generally been suboptimal, and now we have alternatives. Participation is a feature of creating a work product, and ideally a seamless one.
- Business communication becomes indistinguishable from social. The history of business communication having a protocol distinct from social communication goes back at least to learning the difference between a business letter and a friendly letter in typing class. Today we use casual tools like SMS for business communication, and while we will certainly be more respectful and clear with customers, clients, and superiors, the reality is that the immediacy of tools enabling continuous productivity will create a new set of norms for business communication. The ability to conduct business communication from any device at any time, alongside social/personal communication on that same device, will also drive a convergence of communication styles.
- Enterprise usage and control does not make things worse. For enterprises to manage and protect the intellectual property that defines them, and the contributions employees make to that IP, data will need to be managed. This is distinctly different from managing tools—the days of trying to prevent or manage information leaks by controlling the tools themselves are likely behind us. People have too many choices and will simply pick tools (often against policy and budgets) that provide for frictionless work with coworkers, partners, customers, and vendors. The new generation of tools will enable the protection and management of information in ways that do not make the tools worse to use or cause people to seek alternatives. The best tools will seamlessly integrate with enterprise identity while maintaining the consumerization attributes we all love.
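As a concrete illustration of the “data contributed is data extractable” and metadata attributes, here is a minimal Kotlin sketch of a client pulling its own data back out of a service as JSON. The endpoint, header, and response shape are hypothetical placeholders rather than any particular product’s API; the point is that an export path, with metadata intact, is a first-class part of the tool.

```kotlin
// Hypothetical export call: the service hands back the user's own data,
// metadata included, in a broadly used interchange format (JSON).
// The URL, header, and response shape are illustrative placeholders only.
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun exportMyNotes(apiToken: String): String {
    val client = HttpClient.newHttpClient()
    val request = HttpRequest.newBuilder(
        URI.create("https://api.example-notes.com/v1/export?format=json")
    )
        // Identity is a choice: any sign-on that yields a token works here.
        .header("Authorization", "Bearer $apiToken")
        .GET()
        .build()

    val response: HttpResponse<String> = client.send(request, HttpResponse.BodyHandlers.ofString())
    require(response.statusCode() == 200) { "Export failed: ${response.statusCode()}" }

    // Expected (hypothetical) payload: the notes plus the metadata that makes
    // them searchable on a small screen, e.g.
    // [{ "text": "...", "author": "...", "location": "...", "modified": "..." }]
    return response.body()
}
```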
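In the same spirit, the telemetry attribute is less about any particular pipeline than about the defaults: nothing leaves the device unless the person opts in, and what does leave carries no personal identifiers. A minimal Kotlin sketch, with a hypothetical event shape:

```kotlin
// Minimal opt-in telemetry sketch. The event shape is hypothetical; the
// essential properties are the opt-in gate and the absence of personal identifiers.
import java.util.UUID

data class TelemetryEvent(
    val name: String,                   // what happened, e.g. "document_shared"
    val installId: String,              // random id standing in for a per-install id, not a person
    val timestampMs: Long = System.currentTimeMillis()
)

class Telemetry(private val optedIn: Boolean) {
    // A random identifier used only for aggregation; it never identifies a user account.
    private val installId: String = UUID.randomUUID().toString()

    fun record(name: String) {
        if (!optedIn) return  // nothing leaves the device without consent
        val event = TelemetryEvent(name, installId)
        // In a real tool this would be queued and sent to the vendor's
        // collection endpoint; here we just show the gated, anonymous payload.
        println("queued: $event")
    }
}

fun main() {
    Telemetry(optedIn = true).record("document_shared")
    Telemetry(optedIn = false).record("document_shared") // silently dropped
}
```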
What comes next?
Over the coming months and years, debates will continue over whether the new platforms and newly created tools will replace, augment, or see only occasional use relative to the tools with which we are all familiar. Changes as significant as those we are experiencing right now happen two ways, at first gradually and then quickly, to paraphrase Hemingway. Some might find little need or incentive to change. Others have already embraced the changes. Perhaps those right now on the cusp realize that the benefits of their new device and new apps are gradually taking over their most important work and information needs. All of these will happen. This makes for a healthy dialog.
It also makes for an amazing opportunity to transform how organizations make products, serve customers, and do the work of corporations. We’re on the verge of seeing an entire rewrite of the management canon of the 20th century. New ways of organizing, managing, working, collaborating are being enabled by the tools of the continuous productivity paradigm shift.
Above all, it makes for an incredible opportunity for developers and those creating new products and services. We will all benefit from the innovations in technology that we will experience much sooner than we think.
–Steven Sinofsky