Michael Hay

Innovating in Uncertain Times

Blog Post created by Michael Hay on Nov 22, 2016

INTRODUCTION

At the top of the headlines, ahead of Brexit, is Mr. Trump's ascendancy from Candidate Trump to President-Elect Trump.  However, hidden below these headlines are older news threads and data below the surface, which carry both lessons and messages.  Is there a new Darker Age emerging?  Well, as I learned a couple of years ago, the data doesn't seem to point in that direction; in fact the data trends towards something more positive¹.  So what do we make of this?  I think that instead of living in interesting times, we live in uncertain times!  To motivate our discussion let's look at a few headlines and in some cases their hidden dimensions.

 

[Image: uncertain news]

Samsung's exploding phones, recalls, and the withdrawal of products from the market are very visible. In fact, if you have traveled on an airplane recently you cannot avoid a reference to Samsung from the airlines.  Below the surface are other data points too, about, say, Samsung washing machines both in the US and in Australia. While we aren't privy to the discussions and debates inside Samsung, I bet they are looking at many things, from their supply chain to their processes for QA, and so on.  Maybe there is a lesson to learn here: sweating the details and testing complex devices is critical.  Heck, perhaps we can even say that hardware matters.

 

With Deutsche Bank now being required to pay the US government for past misdeeds, can a company ever really close its books on a given quarter or year?  Is this the new Lehman Shock?  Analysts just don’t know.

 

Apple may or may not owe back taxes; the conclusion has yet to be seen, and Brexit is looming here too.  Perhaps this whole thing is irrelevant, but then again Apple’s potential burden is nearly the same as Deutsche Bank’s.  There is correlation for sure, but I'm not sure we can say one caused the other.

 

When considering China, people thought the growth machine would go on forever, but if you talk to people in China there are real and new market dynamics now.  They are in a transition from being the factory of the world to, well, something else, and their growth pattern is now following that transition as well.

 

Ireland is the opposite of China: after accepting more than 60 billion euros in bailout funding, it is now one of the hottest economies in Europe.  Yet its key advantages are low corporate tax rates and, now, the looming Brexit.

 

On the technology side, who would have thought we'd see NoSQL projects espousing the ideas of scale-up, monolithic storage as they make the transition from scale-out at all costs to in-memory?  However, this isn't the only case of contrarian technology memes below the surface.  My colleague Vincent Franceschini pointed out an interesting company called Nervana, which makes an AI chip; Google has its Tensor Processing Unit (TPU) for TensorFlow; and Microsoft is pursuing FPGAs for Deep Learning.  What is specialized hardware doing coming from cloud computing companies?

 

In all of these cases the outcomes are always clear in hindsight, but ahead of time, prediction, let alone scenario planning, is more than hard unless you have a time machine. To me this means that the world is simultaneously more complex and more uncertain than ever, certainly more so than people want to admit.  Given these points and this line of thinking, I believe the stage is set for exploring the point implied in the title: Innovating in Uncertain Times.

 

[Image: uncertainty normal]

ASSUME A GIVEN AND THEN PROGRESS

For this post we'll assume Uncertainty is the New Normal, cannot be avoided, and is therefore our given.  So let's progress...  Obviously we'd like to at least intellectually understand why this is the case.  To do that we'll run a little thought experiment, brewing together some ideas from statistics and queuing theory, and reflect on what that means for innovation.

 

COPING WITH UNIFORM VARIANCE

[Image: uniform variance]

In the old world there was a sense of consistent or uniform variance, whereby specializations could be built and the inherent variation in needs was relatively low. This resulted in simplified service/product differentiation and necessarily favored depth over breadth.  Small or "tight" variance shaped market definitions and even helped provide a framework to compare and contrast actors in markets.  (For example, it allowed market analyst firms to define what a market like storage meant, identify who the actors were, and compare them to each other.)  Beyond market definitions and comparative models, this tight variance meant that legitimate stretches and leaps were possible.  In the case of ICT (Information and Communications Technologies), having a storage OR network OR server company stretch to be a converged systems company was realistic, and we can see these changes realized in that market today.  So the key point here: tight variance is a good thing, letting companies carve out a niche in a defined market, compete, and grow in a measured, semi-diversified manner.

 

To start our experiment let's review some of the properties of the Gaussian distribution.  Firstly, there are many things you can pull from the Gaussian, like the mean, median, mode, variance and so on.  Depending on the character of the data, these descriptive statistics can have more or less meaning.  Specifically, when there is tight variance, descriptive statistics like the mean have strong meaning.  In fact, the mean of a Gaussian in conjunction with a tight variance can almost represent the character of the data in the distribution on its own.  (We'll look at this a little later, but for now be aware that the mean of a Gaussian with a large variance represents information loss and has poor descriptive meaning.) The figure included in this section attempts to visualize these very points by illustrating both low statistical and low coloration variance -- actually all of these "quantized units" are shades of green.  So we can kind of say that the "mean" is green with a variance of green-1, green-2, green-3, etc.  Linking this back to market specialization and ICT specifically, this "tight" statistical and color variance is meant to imply that stretching to new, adjacent, and largely congruent areas is/was achievable!  That means you'd expect to see companies diversify from being storage specialized to converged systems and appliances capable, and we can find evidence of this in the likes of both Hitachi Data Systems and NetApp.
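To make the statistical side of the experiment concrete, here is a minimal Python sketch (the numbers are invented purely for illustration, not taken from any real market data) showing how well the mean represents a tight Gaussian versus a wide one:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical "old world": needs cluster tightly around the mean (all greens).
tight = rng.normal(loc=0.0, scale=0.5, size=10_000)

# Hypothetical "new world": same mean, but a much wider spread of needs.
wide = rng.normal(loc=0.0, scale=5.0, size=10_000)

for name, sample in [("tight", tight), ("wide", wide)]:
    mean, std = sample.mean(), sample.std()
    # Fraction of needs lying within one unit of the mean: a rough stand-in
    # for how representative the mean actually is of the whole distribution.
    near_mean = np.mean(np.abs(sample - mean) < 1.0)
    print(f"{name:>5}: mean={mean:+.2f}, std={std:.2f}, near mean={near_mean:.0%}")
```

Under these assumed parameters roughly 95% of the tight distribution sits within a unit of its mean, while only around 16% of the wide one does, which is the information loss referred to above.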

 

Continuing our thought experiment, let's take each one of these quantized green squares to mean an abstracted need, and then imagine how it would be handled.  In our experiment the needs will be queued up and a "single server" will exhaust the queue of needs.  Looking at the image below, we'll take all of the various shades of green to represent a family of needs.  In fact, because some elements in the family are the same, these elements can be processed as batches.  That is to say, a single response or unit of work can resolve multiple need quanta at once.  Using that line of thinking, for all instances in this example queue that are marked as “same” we can perform one action and meet the needs of many individual instances, shrinking the actual number of unique needs in the queue from 23 to 8.
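A tiny sketch of that batching step follows; the specific need labels and their counts are made up, chosen only so the totals mirror the 23-to-8 reduction in the figure:

```python
from collections import Counter

# Hypothetical queue of 23 need quanta; repeated labels are "same" needs
# that a single unit of work can resolve in one batch.
queue = (
    ["green-1"] * 5 + ["green-2"] * 4 + ["green-3"] * 4 +
    ["green-4"] * 3 + ["green-5"] * 3 + ["green-6"] * 2 +
    ["green-7"] * 1 + ["green-8"] * 1
)

batches = Counter(queue)  # one unit of work per unique need

print(len(queue))    # 23 individual need quanta
print(len(batches))  # 8 unique needs, i.e. 8 units of work
for need, count in batches.items():
    print(f"one action on {need} satisfies {count} queued instance(s)")
```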

 

[Image: processing uniform]

 

Need consistency makes it easier for a single centralized organization, team or individual, to cope on a global scale and semi-diversify -- as in the cases of Hitachi Data Systems and NetApp.  However, what happens when both the hypothetical statistical variance of the distribution is higher and the spread of colors is greater?

 

COPING WITH NON-UNIFORM VARIANCE?

In the new world, needs are more unique, moving quickly beyond either real or perceived certainty and consistency.  Expressed needs also come from varying parts of businesses, a wider array of locations, and new parties, meaning a depressingly gargantuan variance.  With the variance being both larger mathematically and more geographically dispersed, it is next to impossible for a single centralized organization to respond to incoming needs.  This is because it isn’t a game of handling adjacent, similar needs, but instead of coping with major chasms between the needs compounded with location dispersal.  For example, let’s say you were an organization selling Big Data Software Stacks, but now an O&G company is asking you to participate in the field of seismic sensing and interpretation by using AI, past data (from within the O&G company), etc.  At first glance this is doable, right?  However, when you understand the problem more deeply you realize: not so fast.  Working through this problem actually mandates the on-boarding of Seismic Interpreters, Geophysicists, and Seismic data handlers.  The specialization ask has another dimension, though: the teams can only be fielded in select geographic locations.  This combination, specialization plus dispersed location, defeats the premise of a centralized, consolidated organization; mix in tight timelines and who knows.  Instead, a distributed team with the right skills in the right locations plus a matching “centralized curator” is likely a better fit.  Let's explore this.

 

[Image: nonuniform variance]

 

This fictitious Gaussian distribution is significantly different from the previous one.  The variance is wider and the color palette is broader.  As alluded to earlier, a large variance results in descriptive statistics losing meaning and information.  So by having a wider suggested variance and a broader color palette I'm trying to emphasize that point.  While the mean is green, as in the previous hypothetical distribution, we cannot say that the variance is green-1, green-2, green-3; we must instead say it is blue-1, blue-2, green-1, green-2, brown-1, yellow-1, yellow-2, and so on.  In other words, while there is a clear mean, using it to describe the overall distribution without understanding the wide (color) variance would result in lost information.  Similarly, if we imagine the impact on a server optimized to handle a queue of green needs with low variance, we'll find that while the greens can be processed, it is the blues, yellows, and browns that will persist in the queue unprocessed.  Let's examine this behavior as our thought experiment continues.

 

[Image: processing nonuniform]

Since the server can process any greens, the queue is initially reduced, but needs that are not green remain.  Therefore, if non-green needs continue to arrive, the queue will grow without bound.  Beyond boundless growth, when new green needs enter the queue they may be occluded, resulting in either elongated time to service or those with green needs abandoning the queue.  Typically, in any of these cases a negative impression of the server is created.  Now let's equate this situation to a fictitious organization, with the green server representing the organization.  Initially the organization can handle the green needs, but soon enough, when non-green needs arrive, processing fails and the entities adding both green and non-green needs to the queue become dissatisfied.  You may want to ask why this could happen, or why an organization would do this.  The answer stems from uncertainty.  To remedy the uncertainty, organizations are looking beyond their current stable of customers, competitors, vendors and suppliers to new ones, because entities outside of an organization’s domain may help them invent something novel.  This naturally creates a scenario where novel needs are likely accepted into the organization, and novel needs carry unknowns and have emergent properties (e.g. firm associations to specific geographical regions which are likely not apparent at an initial review).
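A small simulation sketch of this occlusion effect follows; the arrival mix and processing rates are invented purely for illustration.  A server that can only resolve green needs keeps up with the greens while everything else accumulates:

```python
import random
from collections import deque

random.seed(7)
queue = deque()
backlog_history = []

CAN_PROCESS = {"green"}  # the server is specialized: greens only

for tick in range(100):
    # A few new needs arrive each tick; most are green, but not all.
    for _ in range(3):
        queue.append(random.choices(
            ["green", "blue", "yellow", "brown"],
            weights=[0.7, 0.1, 0.1, 0.1],
        )[0])

    # The server sweeps the queue but can only clear a few greens per tick;
    # non-green needs are put back and occlude later arrivals.
    processed = 0
    for _ in range(len(queue)):
        need = queue.popleft()
        if need in CAN_PROCESS and processed < 3:
            processed += 1
        else:
            queue.append(need)

    backlog_history.append(len(queue))

print("backlog every 10 ticks:", backlog_history[::10])
```

Under these assumed rates the backlog climbs steadily, because the roughly one non-green need arriving per tick never leaves the queue.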

 

ADDING MORE CENTRAL CAPABILITY

[Image: processing nonuniform central]

A likely response by many organizations today is to hire new individuals and teams capable of handling novel (non-green) needs.  Such a formation would look like multiple servers handling multiple queues, and the image above illustrates this formation.  Given enough time, plus a lack of geographic variation, forming a centralized set of teams is both possible and economical.

However, early in this post I referenced President-Elect Trump and Brexit.  Why?  Beyond the uncertainty, these two events represent a rejection of decades of globalization.  In fact, there appears to be a desire to move towards a more intense local country focus -- at least for America and the U.K. so far -- at the expense of centralized global responses.  More evidence of this comes from my interactions this year with customers in various countries.  Most recently, I gave a talk in Europe where I dubbed this movement re-localization, to both open the dialogue and gauge interest in increased country-specific innovation.  What I found from the audience was a desire to learn about happenings globally, but optimize them for the local country.  Moreover, my visits during the remainder of the week resulted in some interactions with the country's telecommunications company.  What I learned is that this company had shed nearly all of its global holdings and was spending the resulting money locally to develop an IoT practice.  We spent the evening with this telco and I was able to connect with one of their team members to talk about strange things like Google's Tensor Processing Unit (TPU), FPGAs and so on.  During that discussion it was obvious to me that the goal was to learn about things from around the world, and apply the results locally, unencumbered by centralized global control.  While this is mostly about my recent visit, I've again found customers, partners, and Hitachi's field teams all mobilizing to support localities first and global responses second.

If we take this new information and overlay it on our experimental idea of using more, different types of servers in a centralized location, in my opinion it runs afoul of the emerging trend of re-localization.  Firstly, this formation is more than challenged to develop empathy for any specific locality, and that means local optimizations are next to impossible.  Secondly, there is a hierarchy implied by any such formation, which runs afoul of needing to respond locally and autonomously.
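To see why adding more central specializations only partially helps, here is a minimal sketch (all names and need labels are invented) of a "more servers, still centralized" dispatcher.  It can route by need type, but it has no notion of locality, which is exactly what re-localization demands:

```python
# Hypothetical central formation: one specialized team (server) per need type.
servers = {"green": [], "blue": [], "yellow": [], "brown": []}

incoming = ["green", "blue", "green", "brown", "yellow",
            "blue@site-2"]  # a need tied to a specific locality

needs_local_response = []
for need in incoming:
    colour = need.split("@")[0]
    if colour in servers and "@" not in need:
        servers[colour].append(need)       # a central team can absorb it
    else:
        needs_local_response.append(need)  # specialization alone is not enough

for team, work in servers.items():
    print(f"{team} team queue: {work}")
print("needs a local, autonomous response:", needs_local_response)
```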

 

So if the minor stretches suggested by the handling of tight variance and shades of green aren't enough to respond to the market, and if allowing in more variance -- non-green variance in our thought experiment -- with multiple, discretely specialized servers in a centralized location also fails, what is the response?  Disruptive innovation, right?

 

DISRUPTIVE INNOVATION?

[Image: agility]

To understand whether Disruptive Innovation is a response to uncertainty, let’s start from the definition of disruption and see where that takes us.

Disruption - to cause (something) to be unable to continue in the normal way : to interrupt the normal progress or activity of (something) [Merriam-Webster, Disrupt | Definition of Disrupt by Merriam-Webster]

So there is a “normal” condition assumed in the definition, and disruptions push against or move away from that normal.  Yet these are uncertain times; does that suggest we can resolve uncertainty with disruptive thinking and innovations?  Again, let's turn to a dictionary, this time for the definition of uncertain, to see.

Uncertain - not certain to occur :  problematical <his success was uncertain> : not reliable :  untrustworthy <an uncertain ally> : not known beyond doubt :  dubious <an uncertain claim> :  not having certain knowledge : doubtful <remains uncertain about her plans> : not clearly identified or defined <a fire of uncertain origin> : not constant. [Merriam-Webster, Uncertain | Definition of Uncertain by Merriam-Webster]

In this case I used a longer definition of "uncertain" to avoid any sense that "uncertainty" contains a reliable normal.  Given that we're living in uncertain times, I think it is fair to say that disruptive innovation is unlikely to work in the Era of Uncertainty.  We're going to have to deliver something decidedly different from what we've seen previously.  But what, then?  Fortunately, sometimes simple responses are best; more specifically, agility, more intensive localization, and new kinds of pairing can help.

 

  • Agility is about flexible adaptation to the surrounding environment.  Those who are more agile tend to field more frequent responses to changing and uncertain market conditions, allowing them to vet these responses and find the best fit for the time.  Elimination at too early a stage means potential solutions to known and unknown problems may be removed before they can be observed in the wild.
  • Additionally, due to noisy uncertainty with intense local variation, a centralized strategy to rule all strategies is doomed from the outset.  Instead, more experimentation in various localities, with a central discipline that helps more in matchmaking between the localities, may be a better “strategy” to employ.  This could also mean, given the bazaar² approach to open source development, that a key discipline for avoiding institutional thinking and internal politics is an open source-like model.  In other words, being a bit more unruly and letting sales and field teams themselves engage in the “innovation funnel" could be more productive than top-down thinking.  Even in natural systems, simple and profound locally executed rules can have a dramatic impact.  An example is ants: given some basic rules, like finding food and reacting to danger using scents, ants can appear to be more intelligent than they actually are.  Structure and complexity emerge in ant societies through these simple rules exchanged in a one-on-one, local manner.  The same thing can be said about organizations.
  • Pairing is not new at all, but what is new about pairing or partnering is who organizations are doing it with.  Having a co-pilot who, in a nonthreatening manner, can share how they are transforming and at the same time navigate with you is important.  Sometimes when distinct organizations work together, each can point out a new finding to the other, simply because each shows up with different biases and engages in a new pairing.  It is almost like the “consulting effect,” where an outsider can see something or capture an organizational wisdom you’re not paying attention to internally.  At a micro level, agile programming takes advantage of so-called pair programming for “increased code quality: ‘programming out loud’ leads to clearer articulation of the complexities and hidden details in coding tasks, reducing the risk of error or going down blind alleys.”

 

The key will be how to balance these three areas and, most importantly, how to manage the implied changes to centralized teams, field organizations, partners and customers.  The result is likely many new ecosystems which vary per geography.  To be clear, it is likely not possible to tackle innovating in uncertain times through incremental enhancements to corporate functions.  In fact, my experience shows that corporate-to-corporate collaborations are often challenged.  Even when there is willpower invested to do something, should a corporate team you're collaborating with reorganize, you end up having to reset the engagement, thwarting the original design and stifling innovation.  So I'm making a kind of assumption that collaboration on the edge, with field organizations rather than central corporate teams, is the ideal place to start.  Once again we can be inspired by the example of ants: generally speaking, food isn’t found in the ant nest but out in the field, and the same is true of the food of business, revenue.  So exactly what organizational structure can we explore that has the potential to aid in the quest to innovate in this Era of Uncertainty?

 

CONCLUSIONS

As to a structure that would be compatible with this Era of Uncertainty, I think we can draw lessons from both the natural and digital worlds.  Specifically, the idea of simple rules in insect societies, like ants, is very attractive, but what's missing is how to bridge larger geographic dispersion.  Obviously, centralized command and control mechanisms -- referred to as the Cathedral by Eric Raymond -- are attractive because we can simply "order" certain actions in various geographies.  Yet, as I spent a long time explaining, a Cathedral-like structure isn't suitable for today.  So again, what to do?  Somehow we have to merge these two seemingly opposite approaches into one structure.  This is where, in my opinion, we can look towards digital systems for inspiration.  More specifically, I think the internet's Domain Name System³ (DNS) provides a likely example of how to build an organizational structure fit for these times.  While you can get into the details of the DNS, including its rules and commands, I think that inside the mission of the DNS, resolving names to addresses, there are some basic rules that are inspiring.  Notably: check locally to see if you've got the address cached; if you do, use it; if you don't, ask the next nearest neighbor on the list for help.  This simple rule is repeated recursively until the name is resolved to an address and eventually you get to your targeted service.  One could in fact argue that this venerable, mature service, while having very simple rules, has endured the tests and challenges of the internet over time, including uncertainty.  In terms of what the structure would be and how need processing would occur, below is a series of diagrams that point to that very model.
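That check-locally-then-ask-a-neighbor rule is easy to express in a few lines.  Here is a toy model of the idea (not real DNS code; the node names and cached entries are invented):

```python
def resolve(name, node, path=()):
    """Toy model of the DNS-style rule: use the local cache if possible,
    otherwise recursively ask the next nearest neighbor for help."""
    if name in node["cache"]:
        return node["cache"][name], path + (node["name"],)
    for neighbor in node["neighbors"]:
        result = resolve(name, neighbor, path + (node["name"],))
        if result is not None:
            node["cache"][name] = result[0]  # remember the answer locally
            return result
    return None

# Hypothetical topology: a locality, its regional neighbor, and a "root".
root = {"name": "root", "cache": {"brown-need": "team@site-7"}, "neighbors": []}
region = {"name": "region", "cache": {}, "neighbors": [root]}
local = {"name": "local", "cache": {"green-need": "team@site-1"}, "neighbors": [region]}

print(resolve("green-need", local))  # answered straight from the local cache
print(resolve("brown-need", local))  # escalates local -> region -> root
print(resolve("brown-need", local))  # the second ask is now cached locally
```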

 

In this first diagram you'll find centralized curation and distributed execution across two sites, one labeled 1 and the other labeled 2.  As we step through each state you can see that under normal operations the need-processing capability works well when needs that can be processed locally arrive.  However, when unfamiliar needs arrive in both sites 1 and 2, what should be done?

 

[Image: org 1]

 

If we imagine that the role of the centralized curation team is to help specific localities find the capability to process new needs, the next set of images starts to make sense.  Essentially, in the first pane the localities ask the central team to help them find someone who can handle the brown and blue needs.  In response, the centralized curation team replies to each locality with the address of a particular locality capable of handling those needs.  From there, in the last step, the localities exchange work on a peer-to-peer basis to finish meeting the needs.

 

[Image: org 2]

This means that the impression of handling the response locally is maintained, the work is accomplished, and the entities expressing these needs are satisfied.  With root servers, resolvers, and so on, this organizational structure -- inspired by a digital system -- has an uncanny similarity to the DNS.  I'm sure that when the reader looks at this there are obvious misses, such as overworking individuals, the need to add skills in particular localities should workloads increase, and so on.  Honestly, I'm not trying to resolve those issues yet; I'm more interested in getting a well-thought-out idea into the wild for review and comment.  I suppose I'm following Eric Raymond's idea of release early and release often!
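For the curious, here is a minimal sketch of the curation-plus-peer-to-peer flow shown in the diagrams.  All of the site names, capabilities, and need colors are invented; the point is only that the curator supplies an address while the localities do, and exchange, the actual work:

```python
# Central curator: a directory of which locality can handle which need type.
CURATOR_DIRECTORY = {"green": "site-1", "yellow": "site-1",
                     "blue": "site-2", "brown": "site-2"}

sites = {
    "site-1": {"handles": {"green", "yellow"}, "inbox": ["green", "brown", "green"]},
    "site-2": {"handles": {"blue", "brown"}, "inbox": ["blue", "yellow"]},
}

def curator_lookup(need):
    # The central curator only returns an address; it never does the work itself.
    return CURATOR_DIRECTORY.get(need)

for name, site in sites.items():
    for need in site["inbox"]:
        if need in site["handles"]:
            print(f"{name}: handled '{need}' locally")
        else:
            peer = curator_lookup(need)
            # Peer-to-peer hand-off: the curator supplies *who*; the sites then
            # exchange the work directly between themselves.
            print(f"{name}: curator pointed to {peer}; '{need}' exchanged peer-to-peer")
```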

 

 

FOOTNOTES

  1. I read a great piece by Steven Pinker and Andrew Mack over at Slate which looks at the question of an emerging Darker Age.  Their article, The world is not falling apart: The trend lines reveal an increasingly peaceful period in history, uses data to suggest that the reality is quite the opposite.  Essentially, hidden below the noise of bad-news headlines are data pointing to the reduction of crime, war and death to historic lows.  This obviously contradicts the nearly hourly reporting of all things bad, and it is the data that unwinds the intuitive leap from bad news to a new Darker Age.
  2. Eric Raymond's timeless essay, The Cathedral and the Bazaar, chronicles musings about unruly open source projects including Linux and his own Fetchmail.  While the stories are largely about building Open Source communities, I believe there are organizational lessons and implications beyond just creating code.  Specifically, I think that some of the ideas about letting a project evolve quickly and in the open are essential to consider in the Era of Uncertainty.
  3. There are new movements afoot to move the ideas of the DNS onto newer stacks like the Blockchain -- one such effort is called Namecoin.  "Namecoin was also the first solution to Zooko’s Triangle, the long-standing problem of producing a naming system that is simultaneously secure, decentralized, and human-meaningful." (https://namecoin.org)  Such a system would enable a more resilient internet, thwart opportunities for censorship, and improve overall usability.  However, given the theory of the Blockchain, I cannot easily imagine what a structure based upon it would look like.  Would we end up having many, many small organizations/companies that band together to solve market problems?  Who knows.  I can say that I believe the DNS in its current form to be a sufficient leap both for the Era of Uncertainty and for building organizations enabling Digital Transformation.  In fact, one could argue that to effectively enable Digital Transformation you'd need to organize based upon Digital and not Natural systems, but perhaps this is a topic for a future post.
