A Model for Transition to IoE in Manufacturing

In a recent interview, executives from Robert Bosch GmbH and McKinsey discussed the Internet of Everything (IoE) and its impact on manufacturing.  They described significant changes to the production process and to the management of supply chains from this “fourth industrial revolution.”  The IoE allows for the interconnection of factories within and across regions and the exposure or “display” of the status of each component of each product for each customer via each distribution method.  Sensors in machines and in components will be able to stay universally in sync on what has to be done, what has been done, and how well it was done.

A global decentralization of production control is now possible. Creating this reality will require new forms of intercompany and interdisciplinary collaboration.  The buyer, seller and distributor will all be involved in product design, engineering, and logistics.

GE Industrial Internet

Today, the physical, financial, and information flows of manufacturing run separately.  The IoE vision has them increasingly fusing together.  This transformation to what GE calls the Industrial Internet raises a set of questions: In this future, how will orders be placed, and with whom?  Who or what verifies the accuracy of an order or a deliverable across a network of suppliers, manufacturers and distributors that is formed, in an instant, down to the level of a single order?

In this coming future state, information will be available in real time, via the cloud, to all concerned parties.  The decisions to be made based on this information will be subtle, situation-sensitive, and so voluminous and time-dependent that people won’t be making them. Algorithms running in machine-to-machine (M2M) systems will.  On first consideration this all seems overwhelmingly complicated.  We’ll need a model, an example to build from, on how to make the transition.  It turns out we have one.
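Before turning to that model, it is worth making the M2M idea concrete. The following is a minimal, purely hypothetical Python sketch of the kind of decision such algorithms might make: a component’s sensor readings drive a replenishment and quality decision with no human in the loop. All names, thresholds, and data structures here are invented for illustration.

```python
# Hypothetical sketch: an M2M rule that turns sensor readings into supply decisions.
# All names, thresholds, and data structures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ComponentStatus:
    part_id: str
    units_on_hand: int
    daily_consumption: int   # reported by machine sensors
    defect_rate: float       # fraction of recent output failing inspection

def replenishment_decision(status: ComponentStatus, lead_time_days: int = 3) -> dict:
    """Decide, machine to machine, whether to reorder and whether to flag quality drift."""
    days_of_cover = status.units_on_hand / max(status.daily_consumption, 1)
    reorder = days_of_cover < lead_time_days       # stock runs out before resupply can arrive
    escalate_quality = status.defect_rate > 0.02   # quality drift needs supplier or human attention
    return {
        "part_id": status.part_id,
        "reorder": reorder,
        "reorder_qty": status.daily_consumption * lead_time_days * 2 if reorder else 0,
        "escalate_quality": escalate_quality,
    }

print(replenishment_decision(ComponentStatus("valve-17", units_on_hand=40,
                                             daily_consumption=25, defect_rate=0.035)))
```

In practice such rules would be far subtler and more situation-sensitive, which is exactly why they will run as algorithms rather than in front of people.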

The changes to Wall Street’s trading cycles are recent, real examples that provide a roadmap for the manufacturing transition.  In that world the number of days allowed to settle a trade, the “settlement cycle,” has undergone major transitions. The most notable, from five days to three (so-called T+5 to T+3), occurred in 1995. That change required almost every firm in the US to make some changes to their processing flows and systems.  Since the move to T+3, various exchanges have made further improvements towards T+1.  The table below shows some of the major changes, the before and after, that were accomplished:

T+5 to T+1 Table

T+1, even if never mandated, can be viewed as an example of industry opportunity through dislocation. At some level, IoE capabilities can enable dramatic cycle time gains by unlinking end-to-end dependencies (e.g. I no longer need to “affirm” trades based upon evaluating “confirm trade” messages). Some entities/roles will become more independent, some more dependent. Some may disappear if they no longer add value.

The parallels for manufacturing in an Internet of Everything world are clear (though some elements used in trading may not be used here or at the same level of emphasis).  Cross-industry governance will be needed on the format and import of transactions, acceptable technical modes of sending and receiving the messages, management of the quality and timing of the messages both in content and technically, and how to handle disputes.

Douglas Brockway
doug.brockway@returnonintelligence.com

Ira Feinberg
ira.feinberg@returnonintelligence.com

July 16, 2015

You Are HERE

Now what?..

The Internet of Everything has recently joined Big Data Analytics, Social and Mobile technologies and the Cloud as subjects that one can bring up in a general business or social situation and be reasonably sure people will know what they are or quickly understand them.  What is also becoming generally understood is that these elements are connected.  We call them I-SMAC.  They feed on each other and the combinations are creating new businesses and “disrupting” old ones.

That there is a new opportunity, or, if you’re of a different mind-set, a new threat, raises the question among business leaders, “where are we and what should we be doing?”  There’s a framework that dates back to the days before “Enterprise IT” was called Enterprise IT that can help.  First laid out in a Harvard Business Review article in the mid-1970s, the “Stages Theory” proposes four “growth processes” that managers can use to track the evolution of IT in support of business.

On the “Demand Side” are the Using Community, with its use, participation in, and understanding of technology, and the “Applications Portfolio”, now spanning both applications and services, which makes up the functional, process, and analytic capabilities that an organization (or market) does or could use.

Growth Processes

On the “Supply Side” are the Resources brought to bear:  technologies, personnel inside and, now, outside the organization, and other elements like facilities and supplies, along with Management Practices which range from strategy and governance through development and support to daily operation and break/fix.

On a cross-industry basis the Applications Portfolio for I-SMAC is still in an early stage.  In some companies and industries, like retail bookselling or personal photography, it has passed the early experimentation stage and a full ramp-up in capability is underway.  In no case are these portfolios “mature” Stage IV portfolios. Over recent months we have seen a subtle but clear shift in the awareness of I-SMAC opportunities.  Still, the Using Community tends to be either unaware, artificially enthusiastic, or doubtful and combative.  This is consistent with the early-stage nature of the portfolios: lots of promise, but not yet enough history to show unquestioned benefit.

For the most part the Resources being brought to bear are new and rapidly changing.  In most cases the preferred vendor or technology for a given task has a very short half-life, or no implicit standard has yet emerged.  The staff, in-house or in service providers, are skilled in what they are working on but, as the technologies around them are kaleidoscopically changing, are having to spend large amounts of time keeping up.  Management Practices are currently updates-with-Band-Aids of what went before.  The best way to build I-SMAC systems and to manage them at scale is not yet proven.

You Are HERE Stages

What should you do in your case?  First, set a baseline that reflects your industry or market overall and shows the position of your company.  However detailed and analytic you wish or need to make it, the baseline should cover the state of each of the Growth Processes.  You will typically find that they are at a similar stage, but not identical.  Spend more think time if one growth process is at Stage III and another at Stage I.  Such mismatches are trouble.  Do a compare-and-contrast analysis between your status and an industry synopsis.  Make decisions about whether you are ahead or behind and what you should do about it.
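One lightweight way to record such a baseline is to assign a stage (I through IV) to each growth process for your company and for an industry synopsis, then flag internal mismatches and industry gaps. The sketch below is illustrative only; the process names come from the Stages Theory discussion above, and the stage values are invented examples.

```python
# Illustrative growth-process baseline; the stage values are invented examples.
GROWTH_PROCESSES = ["Using Community", "Applications Portfolio", "Resources", "Management Practices"]

company  = {"Using Community": 1, "Applications Portfolio": 3, "Resources": 2, "Management Practices": 2}
industry = {"Using Community": 2, "Applications Portfolio": 2, "Resources": 2, "Management Practices": 2}

def review(company_stages: dict, industry_stages: dict) -> None:
    spread = max(company_stages.values()) - min(company_stages.values())
    if spread >= 2:
        print(f"Warning: growth processes are {spread} stages apart internally; such mismatches are trouble.")
    for process in GROWTH_PROCESSES:
        gap = company_stages[process] - industry_stages[process]
        position = "ahead of" if gap > 0 else "behind" if gap < 0 else "level with"
        print(f"{process}: Stage {company_stages[process]}, {position} the industry synopsis.")

review(company, industry)
```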

Second, with a “light touch,” explore the I-SMAC efforts underway within your company today. This basic inventory, by the way, is a Stage II practice. You are building organizational awareness of how you are trying to take advantage of, or face down a threat from, I-SMAC. You need to know what these efforts are, but as they are almost certainly early-Stage efforts you need to avoid the urge to pull the plug because you can’t yet see the mature market value. Make sure you’re in the game. If you are not at least trying to make use of some combination of Internet of Everything-generated data via mobile platforms, leveraging social technology via the cloud, you are exposed to competitors and new entrants who will.

Douglas Brockway
July 15, 2013
doug.brockway@returnonintelligence.com

 

Experience and Best Practices – Keys To Software Implementability

Implementing software into a complex insurance carrier environment can be risky and fraught with impediments that increase cost and timelines. At Return On Intelligence, having managed software implementations on a global basis, we asked ourselves the question: are there any experiences or best practices that, if employed, can lower the risk of software implementation and ensure a rapid and cost-effective implementation?

What we learned after looking at a wide sample of implementations suggests that a foundation for implementation success consists of experience in three categories:

  1. Insurance knowledge. Broad and deep;
  2. Understanding the data. Both the data to be processed and its relationship to the business;
  3. Technical acumen.  Not just with the software product alone, but with the conversion and interfaces required.

In addition to these three categories, we found that focusing on a number of best practices increases the chances of a successful core systems implementation project:

  1. Organizational Readiness. Assessing the readiness levels of all participating parties as they relate to planning and timeframes, as well as the required participation of the recipient department or departments;
  2. Governance. Managing overall governance of the software implementation and adherence to agreed priorities;
  3. Accelerators. The proper choice and use of any “implementation accelerators” such as data conversion or testing tools.

Our conclusion is that when undertaking a core systems transformation, experience and best practices should be required considerations, and may in fact help avoid protracted and costly software implementations that endanger the progress of the carrier’s strategy at the least, and its profitability at the worst.

Jim Janavich

 

 

Understanding the Dynamics of IT Spending in an I-SMAC World

We recently read another analysis of the purported inefficiency and waste in IT spending; this time the authors were aghast that 80-85% of IT spending is used, in Gartner Group terms, to “Keep the Lights On.”  They described this spending as wasteful maintenance and “troubling” ongoing enhancements.

One could argue that maintenance and break-fix is “keeping the lights on,” but making continual adjustments in application function to align with market and business need is a competitive imperative.  It’s not waste.

In many IT organizations the total IT budget is very tightly managed, often fixed and rarely rising more than a percentage point or three.  If a company keeps spending money on new systems it quickly comes to a crossroads: either the amount of new systems development must be cut, or the amount of maintenance, enhancement and operations must be cut, or enhancement and operations must become much more efficient.

Spending Model
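To make those dynamics concrete, here is a minimal sketch of the funding squeeze under stated assumptions: the budget is flat, and every dollar of newly developed function adds a fraction of recurring run cost in later years. The figures are invented for illustration, not benchmarks.

```python
# Illustrative sketch of a fixed-budget IT funding model (all figures are invented).
BUDGET = 100.0                  # flat total IT budget, in arbitrary units
RUN_COST_PER_NEW_DOLLAR = 0.20  # each dollar of new development adds recurring annual run cost

run_cost = 60.0  # existing maintenance, enhancement, and operations spend
for year in range(1, 6):
    new_development = max(BUDGET - run_cost, 0.0)  # new work gets whatever is left
    print(f"Year {year}: run/enhance = {run_cost:.1f}, new development = {new_development:.1f}")
    run_cost += new_development * RUN_COST_PER_NEW_DOLLAR  # this year's builds add next year's run cost
```

Under these assumed rates, new development shrinks every year unless the budget grows or the run cost per unit of delivered function falls, which is the lever I-SMAC services promise to pull.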

Beyond the radical up-tick in systems capabilities, a key reason companies are pursuing I-SMAC-based solutions is that they have the ability to radically change the IT Activity-based Funding Model’s dynamics.  On the one hand, I-SMAC tends to result in building systems faster, which means the ongoing enhancement and operations costs arrive sooner.  On the other hand, the social, mobile and cloud costs are delivered as services with ongoing unit cost reductions.  The analytic costs mirror traditional enhancement costs, but the returns are worth it.

Just as when we transitioned from the mainframe/mini era to client server, then to the internet and now to I-SMAC, the scope of what can be developed for a dollar invested has taken a significant leap ahead.  Also, the unit and gross costs of maintaining and enhancing each unit, or dollar, of developed function have gone down.  The economics of this mean that everything will change.  When I-SMAC matures we’ll be able to look back and see that significant amounts of spending still go to “keeping the lights on.” But there will be so many more lights that it will clearly be money well spent.

Telematics Data – Changing The Insurance Underwriting and Actuarial Environment

Telematics, and specifically the usage-based data it generates, significantly improves the ability to rate and price automobile insurance by adding a deeper level of granularity to the data commonly used today.

Companies at the forefront of using telematics data are beginning to understand the value of its many indicators as they relate to policyholder driving behavior, and how that behavior, positive or negative, directly affects overall policy administration cost.

This advantage, though, also comes with a possible disadvantage: higher volumes of data being added to already burdened processing resources. A single vehicle generates approximately 2.6 MB of data per week.  If 50,000 auto policies are on the books, accumulating that data results in roughly 6.8 TB per year.
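The arithmetic behind those figures is straightforward; here is a small sketch of it, using only the per-vehicle rate cited above:

```python
# Back-of-the-envelope telematics data volume, using the figures cited above.
MB_PER_VEHICLE_PER_WEEK = 2.6
WEEKS_PER_YEAR = 52
POLICIES = 50_000

total_mb = MB_PER_VEHICLE_PER_WEEK * WEEKS_PER_YEAR * POLICIES
total_tb = total_mb / 1_000_000  # decimal terabytes
print(f"{total_tb:.1f} TB per year")  # roughly 6.8 TB
```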

Pay How You Drive Data

Given that the use of telematics data from automobiles is on the rise in insurance companies, to be followed by telematics data generated from wireless sensors in personal and commercial use, a solution for processing huge volumes of data quickly is indicated.

Most likely that solution is SAP HANA-based: it processes the data and analytics together in main memory, giving underwriters and actuaries a technological advantage for their business, real-time rating and pricing, that doesn’t exist with traditional methods.
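As an illustration of what processing data and analytics together in memory can mean for rating, here is a minimal, vendor-neutral Python sketch that turns a stream of telematics trip events into a usage-based rating factor. It is not SAP HANA, and the event fields and scoring weights are invented; it only shows the shape of the computation an underwriter or actuary would want in real time.

```python
# Vendor-neutral sketch: deriving a usage-based rating factor from telematics events.
# The event fields and scoring weights are invented for illustration.
from statistics import mean

events = [  # one policyholder's recent trips (hypothetical data)
    {"miles": 12.0, "hard_brakes": 1, "night_miles": 0.0},
    {"miles": 30.0, "hard_brakes": 4, "night_miles": 10.0},
    {"miles": 8.0,  "hard_brakes": 0, "night_miles": 0.0},
]

def usage_based_factor(trips: list) -> float:
    """Blend exposure and behavior into a multiplicative rating factor around 1.0."""
    miles = sum(t["miles"] for t in trips)
    brakes_per_100mi = 100 * sum(t["hard_brakes"] for t in trips) / miles
    night_share = sum(t["night_miles"] for t in trips) / miles
    avg_trip = mean(t["miles"] for t in trips)
    return round(1.0 + 0.02 * brakes_per_100mi + 0.3 * night_share + 0.001 * avg_trip, 3)

print(usage_based_factor(events))  # 1.277 with the sample data above
```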

Jim Janavich

The Big Data Challenge – What’s Your Point?

Recently I attended a seminar at MIT’s Enterprise Forum on data and analytics, Big Data, in the pharmaceutical and health care industries, and I learned a thing or two.  The panel included an investor, and leaders in research and IT from Pfizer, AstraZeneca and a joint effort between Harvard University and MIT called the Broad Institute.

The moderator described how there are a number of success stories that relate to the harvesting of previously unknown or unmanaged data.  Still, there are many technical, human, and organizational challenges to widespread success. Unwisely, in his view, many, instead of participating, are sitting on the sidelines waiting for the clear path to be sorted out. In a bit of hyperbole he said, “If you don’t like change, you’re going to really hate being irrelevant.”

Everyone agreed that we are in the early days of big data and related analytics.  Whatever we think of the volume, velocity and variety of the data we’re dealing with, our knowledge regarding what it is and what to do is in its infancy.  That said, in the past few years of trying the panelists have learned to distinguish between the technical issues of data volume and velocity and the human capital issue of data variety.  They believe that the large, constantly changing data universe will be increasingly manageable as our technologies try and catch up.

An Industry Note

Within the Pharmaceutical and Healthcare spheres per se there are challenges involving creating knowledge/data about drugs or genomes and the willingness of others to pay for access to that knowledge, which comes at a cost. This is especially true with genomic testing. Normally you pay each time you run a blood test or a CAT scan… each test is different and the analysis relates to that test. With genomes, how do we create policy around data where you test once and the data are used and viewed many times by others?

The variety of data is something that must be dealt with by people. It comes in different forms (one example: structured v unstructured) from different sources, some from the analysis you just invented, and the uses and potentials are constantly changing. The panelists believe that our ability to understand, examine and use the variety of data is limited mostly by human skills, insight, experience and knowledge.

There was an extended discussion about the volume of data that got me thinking.  It is agreed by all that we have more data available than we know what to do with.  And each time we do an analysis we create more data.  The data volumes are increasing faster than our abilities to store, manage, query, and analyze them; they are already beyond our ability to cope, and the gap keeps widening.

For all practical purposes this means that data volumes are infinite.  Whatever our skill and technology scope, the volume of data exceeds it today and will do so for some time.  We have to keep trying to catch up, but understanding and analyzing all of our data will never be a productive goal.

The strategic differentiation in analytics will come from what my colleague Allan Frank describes as “answering outcome-based questions.” In the context of the panel’s observations, the skills and insights needed to address big data may well include technical data scientists and writers of algorithms and more. But, success will certainly hinge on the ability to distill what business outcomes you want, why, and what you need to know in order to service those outcomes. Our friend Bruce Rogow puts it perhaps more emphatically.  He associates success with “defining your purpose.”

If you want to have strategic success in the area of big data and analytics we recommend some familiar frameworks applied to this space:

  1. Whether you’re responsible for a small business unit or for an enterprise, understand your business vision.  If one is already prepared, get a copy.  Break it down into the strategic vectors and “do-wells” or, if you prefer, your critical success factors, and describe the business capabilities needed to succeed and the technology ecosystem, in this case the data and analytical ecosystem, needed to support them.
  2. Start organizing and iterating 6-12 week cycles that the scaled agile world calls “release trains.”  Have a subset of the business narratives and the related segment of the ecosystem taken to the next level, designed and built.  At the end of this cycle you have a working environment that examines real data and produces real results.
  3. Determine what about the effort was successful and what needed help, more data, more analysis, or a better-defined business purpose.  Define another analytical release train. Do it again; a simple sketch of this loop follows below.
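As a purely illustrative sketch of that loop (the narrative names and the “lessons” are invented), each release train takes one business narrative, builds against real data, and feeds what it learned into the next train:

```python
# Illustrative loop over analytical "release trains"; all names and outputs are invented.
backlog = [
    "churn drivers by customer segment",
    "claims fraud signals in adjuster notes",
    "pricing sensitivity by channel",
]

def run_release_train(narrative: str, weeks: int = 8) -> dict:
    """Stand-in for a 6-12 week cycle that designs, builds, and evaluates against real data."""
    print(f"Building '{narrative}' over {weeks} weeks...")
    return {"narrative": narrative, "lessons": ["needs a better-defined business purpose"]}

for train_number, narrative in enumerate(backlog, start=1):
    outcome = run_release_train(narrative)
    # What needed help, more data, or a sharper purpose shapes the next train's scope.
    print(f"Train {train_number} done; carrying forward: {outcome['lessons']}")
```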

Doug Brockway
doug.brockway@returnonintelligence.com

Consumerization and BYOD – Transformation Catalysts

By Doug Brockway and Ilja Vinogradov

The consumerization of IT, including the use of third-party cloud services and applications such as cloud storage and social media, and Bring Your Own Device (BYOD) are driving an irreversible trend in the way businesses and their staffs produce and consume information.  The impact goes far beyond satisfying the desires of individuals to use their own devices and not be hassled about it.  In acceding to that trend, business also faces the need to change the way the information and transactions in corporate systems are consumed. This leads to transformations in applications portfolios, in business process and business results.

The first steps in BYOD came with Wintel notebooks, then Macs were added, and now mobile devices, tablets and smart phones.  As-is, the information displayed is not consumable by mobile devices.  The different screen sizes require different UI layouts.  The point-and-click interactions upon which “industrial” systems rely are confounded by the touch interaction of a tablet; your iPad has no right-click, and it’s hard to double-click on your Android.

But, while most companies now spend considerable time and energy on UI and UX for a multi-vendor mobile device world, there’s a deeper issue, a deeper opportunity at play.  The core systems that run our corporations and our institutions are “Functional Systems” or, as Clay Shirky has called them, “Web School” systems, where scalability, generality, and completeness were the key virtues. They use “enterprise design practices”: from the back end to the UX, the designs include all the function one might need to cover all the situations one might encounter across a homogeneous set of “users.”  Web School systems are “closed” systems.  Their function is designed for consumption only in a pre-defined manner using an application UI. These web-enabled apps are designed to provide maximum functionality with a minimal number of screens, for the most part to reduce development cost per user.

Increasingly we are finding that breaking this paradigm by combining “situational design” front-end systems with “cleanly implemented” core systems creates the optimal solutions.  This means designing mobile apps that are optimized to allow increasingly targeted groups to accomplish particular tasks as quickly as possible.  These apps sacrifice some functionality found in Web School systems in return for targeted relevance (economies of scope). These systems are “open” in that their function can be consumed not only by different humans but by other applications as well. Think of localization not just for nationalities and languages.  Think of localization in the sense that the engineering data needed in the field is different from that needed in the lab, or that the timeliness of CRM data and the sales reporting needs of an SME channel are different from those of selling to large corporations.

In this world good design keeps the mobile part of the technology ecosystem as simple as possible from an implementation point of view.  The complexity is pushed to the back end, to middleware and to so-called “smart process apps.” This is where the different transactions are created, the different views of data.  A useful analog is the concept of “software agents”: business process components that respond to individualized environments, i.e. software that enables decisions of real and tangible value.
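A minimal sketch of this split follows: a general core-system function of record, and a narrow “situational” view built on top of it for one targeted group. Everything here, the function names, the data, the field-engineer scenario, is a hypothetical illustration rather than a reference architecture.

```python
# Hypothetical sketch: a general core-system function, plus a narrow situational view
# for field engineers that consumes only the slice of it they need.

CORE_EQUIPMENT_DB = {  # stand-in for the enterprise core system of record
    "pump-042": {"site": "Plant 7", "firmware": "3.1.4", "last_service": "2015-05-02",
                 "warranty_until": "2017-01-01", "purchase_order": "PO-88231"},
}

def core_get_equipment(equipment_id: str) -> dict:
    """'Web School' style: return everything about the asset, for any consumer."""
    return CORE_EQUIPMENT_DB[equipment_id]

def field_service_view(equipment_id: str) -> dict:
    """Situational app: only what a field engineer needs on a small screen, nothing more."""
    record = core_get_equipment(equipment_id)
    return {key: record[key] for key in ("site", "firmware", "last_service")}

print(field_service_view("pump-042"))
```

The same core function could serve a warranty-claims view, a procurement view, or another application entirely; that openness, not the mobile screen itself, is the deeper opportunity.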

Because users are on the move and business needs are in constant flux, they need capabilities to be developed quickly, customized to immediate need, and at a low enough cost to have a very short payback period.  These systems may be in use for some time, but the economics allow them to be treated as throw-away solutions.  Marc Andreessen showed in “Why Software Is Eating the World” that the costs of building such targeted-use systems have dropped and will continue to drop precipitously.  This also allows for design and development emphasis on time to market, especially time to materially, positively affect employee productivity.

It is the customization, the lower costs of ownership, the continuous alignment to business need, the “enterprise agility” that makes strategically thoughtful actions to take advantage of consumerization and BYOD transformational.

Why Innovation Matters

In any competitive market innovation is a requirement. Whenever a product or service is introduced it must struggle to catch on with the market.  Once it does there is a near magical window of time when the company that offers the product has an open field without material competition. This is the Land of Opportunity:

Opportunity to Exposure on Life Cycle

But success breeds imitation, and soon there are multiple similar offerings; the competitive advantage transitions from the unique product, to unique customer service, to, eventually, a commodity product or service.  Unless something is done, company revenues from the product will decline even as unit sales increase.  This is the Land of Exposure.

Someone is going to come up with something better; a newer product with broader applications and features.  It could be the original company, it could be a competitor, it could be a new entry from outside the industry.  What is certain is that the revenue growth from the current offering will peak and the long-term winners are companies that are working on and introducing the next innovation as the market is approaching the peak.  It is those companies that ride the next upsurge in revenues when competitors are fewer and margins are at their best.  Any company that can continuously take advantage of the next market before their current products are commoditized is a winner.  Just ask Apple.

At the same time, one can over innovate, reach too far.  A company with too little coordination and discipline regarding what ideas are being worked on, which are being placed into the market and how much of the company’s future is tied up in their success or failure is asking for trouble.  Just ask AIG.

Innovation matters because for most businesses in most industries it is the key to sustained success.  Without it, management is steering a path to a long decline and eventual exit from the market.  Innovation matters because it can be directed.  Management can say how much of current revenue to retain for the next idea, what kinds of ideas to foster, and what to look for in customer requests and in the ideas and actions of competitors and tinkerers alike. Innovation matters because, left unguided, it can damage and even break entire companies.

Innovation matters.

Doug Brockway
doug.brockway@returnonintelligence.com

Innovation and Creativity Sources

At Return on Intelligence we help our clients assess their readiness, plans and tactics to enhance their creativity and innovation. Often, this work is made compelling and interesting through the insights and examples of innovative thought and the innovation constructs and theories that are published or available via video.  If you’re interested in exploring the subject of innovation, below are some places to start.  If you have more or others, please share them.

For readers, I recommend Steven Johnson’s compendium, The Innovator’s Cookbook.  It is “an anthology of classic essays on innovation” with “important essays by some of [Johnson’s] heroes — Stewart Brand, John Seely Brown, Erik Von Hippel.” The first selection is Peter Drucker’s classic article, “The Discipline of Innovation.”  It is in this article that most modern constructs about innovation are laid down. Another of my favorites is, “How to Kill Creativity” by Teresa Amabile. It is, of course, a primer on the opposite topic.

Videos

TED Talks on YouTube offers a number of useful and insightful video lectures. Some of my favorites are:

  • Charles Leadbeater’s talk “On Innovation” about collaborative creativity.
  • Steven Johnson’s “Where Good Ideas Come From” is about how “the long slow hunch” is required for the eventual ‘Eureka!’ moment.
  • No student of innovation can be complete without watching Matt Ridley’s “When Ideas Have Sex” which explores the astounding power of compounded innovation.

Outside of TED Talks, Steven Johnson has a short YouTube video on Essentials for Inventing What’s Next.  It talks about the importance of getting out of your normal environment, your cognitive rut, in order to find new insights. For a longer examination of these ideas, read Jonah Lehrer’s “Imagine: How Creativity Works,” accompanied by a short animation.

Blogs as Sources:

Along the lines of Johnson and Lehrer, Scott Berkun wrote a nice piece called “Why you get ideas in the shower” about ideas and when they appear. There is a lot of talk, or presumption, that innovation is a transformational activity.  Sometimes it is, but we believe that only those that can innovate on a day-to-day basis can see, much less execute on, longer-term or transformational opportunities.  Here’s a blog post by Jeffrey Phillips on “incremental innovation” and some commentary from me.  We made a very simple video that builds off these and other AKA ideas derived from our client work.

Culture and Human Issues

If you’re interested in culture, human issues and innovation you might start with a blog series from the Harvard Business Review. This link is to an HBR article that is chock-full of incredible resources… the blog comments themselves are a rich vein of thinking from some real thought leaders.

A friend, who happens to share my name, has written a number of articles addressing the interconnection between happiness/well-being and business success.

In summary, the other Doug Brockway says that stressed, overly pressured people are not performing at their best.  ‘Couldn’t have said it better myself….

Sample McKinsey Articles –

McKinsey writes extensively on innovation and creativity.  A sampling of their articles is referenced below.  In order to get a sampling suited to your case in your market, it is best to go to their web site and search around for items such as:

“Lessons Learned in Innovation” by Mervyn’s PSI

“McKinsey Innovation Metrics Survey”

“Leadership and Innovation” – McKinsey Quarterly 2008

“Succeeding at Open-source Innovation”

Good Old Search

Lastly, a basic way to find out more is a good old web search.  For matters like this, I tend to start with Wikipedia.  Articles there may be written by consultants or vendors, but still, definitional articles, such as the one on innovation, tend to be academically neutral.  If you take the names of the authors and speakers above, or the subjects or titles of their works, it is easy to find articles about them and their subjects, laudatory or critical, and usually, to some degree, informative.

If you have other favorites, please share them.  We are always interested in furthering valuable commentary.

Doug Brockway
doug.brockway@returnonintelligence.com