Following up on my last post about IEEE ERT 2018, here are a couple of charts for my “discussion brief”: a Human-Machine-System Capability Mapping chart (above) and concept illustrations of the Experiential Decision Support System (below). The charts’ text conveys context-setting remarks, which I am also providing here.
The goal of furthering machine intelligence is to make humans more able and smarter: the opposite engineering approach typically becomes a source of self-defeating technical myopia and missed opportunities. This simple mapping exercise can be customized to assess and roadmap capability levels.
The more sophisticated automation becomes, the more obvious the criticality of the human factor in both consumer and enterprise environments… rather than less. And, in any case, customer acceptance and adoption criteria remain Quality’s litmus test for emerging technologies.
Digitalization is fostering (a) XaaS, (b) Self-Service, (c) the Shared Economy and (d) the Maker Movement. All elevate human involvement and drive the push for opening and democratizing technologies, enabling (e) citizen scientists and citizen developers to shape the next generation of prosumers at mass-market scale.
Digital Transformation initiatives embracing the above allow (f) nimbler enterprise teams to operate at far greater scale, scope and speed, and shift focus from routine operations to dynamic value creation coupled with extreme efficiencies.
This entails (g) interdisciplinary workstyles and collaborative organizational behaviors that include (h) customer co-creation models. In this new context, humans remain (i) the ultimate critical element in system reliability and safety. Left shifting Quality by Design (QbD) prioritizes Human-Centered-Design tools and processes to deliver high performance workforce automation systems.
Cost-effective Lean Ops systems intertwine analytics, automation, programmability and flexible systems integration, all optimized for dynamic behaviors given Soft Systems’ perpetual motion. This means designing “for-ever” rapid and seamless reconfigurability instead of just engineering “day 1” implementations.
Operational Excellence dictates system-wide as well as subsystem-level visualization, and a combination of centralized & distributed closed-loop controls under user-friendly operational modes. Cognitive models involve Situational Awareness (SA), Sense Making (SM), Root Cause Analysis (RCA), Scenario Planning (SP), and Real Options Analysis (ROA).
The Experiential element is not just about programming known rules and policies: most importantly, it grows by assimilating iterative learning in the context of cyclical automation. Routine decisions and manual operations can be streamlined and collapsed, switching to “exception”-based management for that particular event.
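As a rough illustration of that cyclical pattern (the event names and actions below are hypothetical, not an actual DSS implementation), routine events are automated while unrecognized ones are escalated, and each manually resolved exception is assimilated as a new rule for the next cycle:

```python
# Sketch of exception-based management with iterative learning:
# known event types are handled automatically; unknown ones are
# escalated to a human operator, whose resolution becomes a rule.
class ExperientialAutomation:
    def __init__(self):
        self.rules = {}        # event type -> automated action
        self.escalations = []  # exceptions routed to operators

    def handle(self, event_type):
        if event_type in self.rules:
            return f"automated: {self.rules[event_type]}"
        self.escalations.append(event_type)
        return "exception: escalated to operator"

    def assimilate(self, event_type, action):
        # A manually resolved exception becomes a routine decision.
        self.rules[event_type] = action

ops = ExperientialAutomation()
print(ops.handle("link-flap"))             # unknown -> escalated
ops.assimilate("link-flap", "reset port")  # operator teaches the system
print(ops.handle("link-flap"))             # now handled automatically
```

The point of the sketch is the loop itself: each cycle shrinks the set of events that require manual attention.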
Productivity calls for streamlining operations so that (a) waste can be eliminated & prevented, and (b) value-based tasks can be performed effortlessly, in fewer steps, at speed & without error. High performance behaviors and sustainable competitiveness also call for the ability to (c) experiment and create new capabilities, as well as leveraging (d) process mining for customer journeys & value stream mapping (CJM & VSM) to continuously optimize them and guarantee service levels.
Service Operations Centers (SOC) should be equipped with Experiential Decision Support Systems (DSS) featuring (e) collaborative filtering, (f) actionable data stories conveying hindsight, insight & foresight and (g) adaptive cybernetics. Advanced visualization for both (h) intuitive & highly abstracted infographics and (i) scientific views is of the essence.
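To make the collaborative filtering element concrete, here is a minimal user-based sketch (the operators, runbooks and ratings are invented for illustration): items that similar operators found useful are surfaced to an operator who has not used them yet.

```python
import math

# Hypothetical usefulness ratings given by SOC operators to runbooks
# and dashboards; collaborative filtering suggests unseen items based
# on what similar operators rated highly.
ratings = {
    "ana":  {"runbook-a": 5, "runbook-b": 3, "dash-x": 4},
    "ben":  {"runbook-a": 4, "runbook-b": 3, "dash-y": 5},
    "carl": {"runbook-b": 2, "dash-x": 5},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    return dot / (math.sqrt(sum(x * x for x in u.values())) *
                  math.sqrt(sum(x * x for x in v.values())))

def recommend(user):
    """Score items the user has not rated, weighted by peer similarity."""
    scores = {}
    for other, items in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], items)
        for item, r in items.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return max(scores, key=scores.get) if scores else None

print(recommend("ana"))  # suggests an item ana has not used yet
```

A production DSS would of course use far richer signals (context, outcomes, recency), but the mechanism is the same.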
Quality is best addressed as a human experience, which determines meaning and, therefore, the degree to which a system is lean vs. over-engineered or subpar (both being defective and carrying obvious and hidden costs). A new take on QbD for Soft Systems, which are inherently fluid, emphasizes acceptance testing probing for: usefulness & utility, usability & affectivity, consumability & serviceability and safety through use cases and lifecycle events.
“A great idea is only the beginning. The Back End of Innovation provides a strategic road map to successful commercialization. Learn how to bring new products to market and commercialize them for maximum impact on the bottom line. Uncover new ways to solve problems we all encounter in today’s dynamic business world.”
Back End of Innovation #BEICONF
I am working on the talk that I will deliver at Back End of Innovation 2016 and just came across BEI’s banner on prominent sites, such as CNN’s Innovation section (left screenshot).
The organizers have made available a discount code, which I can share if you were interested in attending. If so, feel free to send me a message on LinkedIn.
The conference’s agenda features speakers from 3M, Cisco, Coca-Cola, Fidelity, Johnson & Johnson, Keurig, Pepsi, Vodafone and Xerox among others and I will be there proudly representing Nokia.
My talk’s title is “Lean Ops Innovation: Dynamic Service Delivery,” which is scheduled for November 17 at 11:30. Here is the abstract:
“Network Operators in the telecommunications industry operate complex sets of technologies and environments. This sector’s future relies on furthering software defined systems supporting the next wave of pervasive digital services, which all of us come to rely on in our day-to-day lives.
Nokia’s Applications & Analytics (A&A) team has evolved and redefined Lean principles to intertwine advanced analytics, automation, programmability and human factors engineering, the four pillars of a new LeanOps’ framework. The outcome is effective service delivery enabled by highly efficient systems that remain nimble and agile at any scale and at any point in the life-cycle.
Join Jose for this session to learn:
- A new Lean Ops framework intertwining analytics, automation, programmability and human factors.
- How to effectively interweave Design Thinking, Lean, DevOps and Agile to deliver breakthrough innovation.
- Unlocking the value of Human Factors Engineering in the cloud age and, therefore, expanding the human possibilities of technology.”
Earlier in the year I gave a talk at IEEE Communications Quality & Reliability – CQR 2016 also on Nokia’s Lean Ops.
Back then, my focus was HCI (Human-Computer Interaction) and operational efficiencies. As an example, immersive user interfaces taking advantage of 3D data visualization, coupled with autonomation and assisted automation, as well as continuous optimization, lead to effective decision support systems (DSS) that mitigate human error and elevate value-based tasks.
That was discussed in the context of the kind of complex operational environments experienced in the telecommunications industry by network operators. As shared above, my presentation at BEI will focus on the underlying construct instead.
This is my “75 word” bio for this event: “Jose is a Design Director at Nokia’s Applications & Analytics Group. His 15+ years of experience feature leadership responsibilities in strategy, product management, R&D, and marketing. Jose worked with Bell Labs and holds three patents. He is a Member of the Advisory Board at MIT IDSS and is the recipient of an MBA from Chicago’s DePaul University as a Honeywell Europe’s Be Brilliant Scholar. Jose holds a postgraduate degree in Human Factors Engineering from BarcelonaTech.”
This is the second time that I’m featured as part of BEI’s Speaker Faculty and I would like to take this chance to thank the team at Informa for their kind invitation.
I will be happy to meet at BEI and hope to see you there : )
“The Mother of All Demos is a name given retrospectively to Douglas Engelbart’s December 9, 1968 […] The live demonstration featured the introduction of a complete computer hardware and software system called the oN-Line System or more commonly, NLS. The 90-minute presentation essentially demonstrated almost all the fundamental elements of modern personal computing: windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revisions control, and a collaborative real-time editor (collaborative work). Engelbart’s presentation was the first to publicly demonstrate all these elements in a single system. The demonstration was highly influential and spawned similar projects at Xerox PARC in the early 1970s. The underlying technologies influenced both the Apple Macintosh and Microsoft Windows graphical user interface operating systems in the 1980s and 1990s.” – The Mother of All Demos, Wikipedia.
Compelling demonstrations can make all the difference when introducing emerging technologies. There is no slideware or paper substitute for the kind of revelations, quality insights, and lasting emotions that we all get when experiencing things live and first hand. On the research side, interactive demonstrations have become invaluable tools that expose and test concepts. Moreover, they prompt invaluable feedback by questioning, validating, unveiling unsuspected items as well as winning hearts and minds to further advance a cause.
Those are some of the reasons why I prioritize demo development and my research process involves activities such as field trips and ethnographic insights captured in environments like the Museum of Science and Industry (MSI) in Chicago and open-door showcases at renowned institutions like Fermilab. Successful science exhibits make complex topics approachable and engaging. They are carefully designed with craftsmanship pride to be perceived as astute, immersive and to appeal to our brain’s intuition and intellect.
The above graphic features quotes from Albert Einstein and Nicholas Negroponte on the left, coupled with Salvador Dalí and Arthur C. Clarke on the right. I created that poster’s first version a few years ago and it has since become my reference framework for prototyping and demonstration. The photographs are courtesy of Wikipedia. Here are further insights on what these quotes mean to me:
1.- DEMO OR DIE – The introduction of inventions and diffusion of innovations relies on effectively conveying clear and concise value. Interacting with engaging demonstrations can be best supported by well thought out whiteboarding sessions. This communication strategy works best when allowing dynamic conversations instead of long agendas packed with presentation monologues. Most people can talk about the many times when they were either overwhelmed, underwhelmed or just bored to death by slideware… and became suspicious of hype. Note that we all deal with an unfavorable Signal-to-Noise (S/N) ratio in today’s information rich environment and, therefore, compete for customers and/or users’ undivided attention. Once again, memorable hands-on demonstrations can make all the difference.
2.- GROW TO LOOK LIKE THE PORTRAIT – High tech is a fast paced industry. One can be left wondering if the technology, toolset, application and/or overall system being discussed will grow and scale as needed beyond day one. There can also be concerns around maturity levels, roadmapping options and future proofing when working with emerging technologies. Demos can be used to convey a tangible vision based on attainable end-goals. They can also be used for what-if analysis, sunny and rainy day scenarios (which can include full lifecycle and stress tests) and to plot plausible journeys to go from A to B and any steps in between. Helping everyone come to terms with what lies ahead is key to defining product strategies and planning decisions “to grow to look like the portrait.”
3.- EXPLAIN IT SIMPLY – Apparently unavoidable jargon and well-intended technical kumbaya can become easily entangled. Complex explanations suffer from information overload. Convoluted narratives pleasing the presenter’s ego can make it unclear what specific problem or pain point he/she is solving, and what the sought-after benefits and priorities are. When “less is more,” it definitely pays to define a vantage point, zoom out, distill fundamentals and synthesize the essence. Knowing your audience and getting the job done in the clearest and most effective terms possible means striking a balance and staying away from oversimplifying or complicating matters. This is an iterative exercise that often demands more time, effort and reviews than the usual information dump. We also need to be able to step-zoom to deliver the next level of detail and to conduct deep dives… without incurring information overload. Humanizing technology, storytelling techniques and ease of information visualization are key to developing a coherent narrative.
“The meaning of a communication is defined by the Change and Affect it creates for the audience. Stories are concerned with transformation. In stories something Changes to create an emotion […] The Change has to resonate with the Audience to generate an Affect; a feeling, a reaction or an insight […] We shall consider these two defining characteristics of narrative to clarify the purpose of any communication […] Change and Affect create meaning.” – “Crackle and Fizz: Essential Communication and Pitching Skills for Scientists,” Caroline van den Brul. Imperial College Press.
4.- IT’S MAGIC – This is all about the so-called X-FACTOR: an unsuspected quality making something different and special in unequivocal terms. To be more precise, the X-FACTOR’s experience can be broken down as follows:
- SURPRISE FACTOR – this relies on managing perceptions and the discovery process, the tipping point being delivered by a timely and unsuspected clever twist and a defining punch line – the “aha” moment.
- WOW FACTOR – high impact, impressive, awe-inspiring outcome, benefits and results that can be easily understood and embraced – the “I didn’t know we could do that” and “I want to know more” moment.
- COOL FACTOR – elegant sophistication and grace, clear object of desire – the “I want that” moment, this being most demos’ ultimate Call-To-Action (CTA).
The art and science behind the above is known as “affective design.” Techniques such as perceptual learning and emotional intelligence in design (emotional design in short) are applied in Human-Computer-Interaction (HCI) to foster pleasant ease of use, drive further engagement and productive usage in the process. Widespread digitalization and the advent of wearables make HCI commonplace, which is influencing product design.
The above is a demo’s “full disclosure” chart, which breaks down what’s real and what’s not. This is needed because vaporware can be an issue of concern.
1.- PRIOR ART – In the above example, a given percentage of the demonstration system involved known technologies, some from third party partners.
2.- STATE OF THE ART – The greatest and latest features, cutting edge delivered by technologies that are available today.
3.- FUTURE ART – A sneak preview of new features and capabilities that are planned, undergoing development and/or committed, but not yet available.
4.- ART OF THE POSSIBLE – Proof of Concept illustrating experimentation results and potential, bleeding edge capabilities that are not yet committed.
By the way, vaporware is the result of positioning 3 and 4 as part of 2. Avoiding unpleasant misunderstandings prompts the need for disclosing these four different maturity levels. Note that one graphic applies to a comprehensive demonstration system encompassing those four aspects and their relative weight.
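As a toy worked example (the component names and tags below are hypothetical), the relative weight per maturity level on such a “full disclosure” chart can be derived simply by tagging each demo component and tallying the levels:

```python
# Hypothetical demo components, each tagged with one of the four
# maturity levels from the full-disclosure chart.
components = {
    "partner OSS stack":   "prior art",
    "orchestration core":  "state of the art",
    "auto-scaling policy": "future art",
    "analytics PoC":       "art of the possible",
    "virtualized IMS":     "state of the art",
}

def disclosure_weights(components):
    """Relative weight (percent) of each maturity level in the demo."""
    counts = {}
    for level in components.values():
        counts[level] = counts.get(level, 0) + 1
    total = len(components)
    return {level: round(100 * n / total) for level, n in counts.items()}

print(disclosure_weights(components))  # percentages per maturity level
```

Anything tagged “future art” or “art of the possible” that gets presented as “state of the art” is, by the definition above, vaporware.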
One other thought: there is a difference between incremental and disruptive innovation. The first delivers improved qualities, such as better performance in A/B comparison testing, “A” being prior art and “B” state of the art. Most would agree on defining disruptive innovations as game changers which deliver unique capabilities that clearly supersede legacy and conventional systems. That alone renders “A” obsolete. A/B comparison testing leads to discussions on the difference between Present Mode of Operations (PMO) and Future Mode of Operations (FMO).
“Humanists must be educated with a deep appreciation of modern science. Scientists and engineers must be steeped in humanistic learning. And all learning must be linked with a broad concern for the complex effects of technology on our evolving culture.” – Jerome B. Wiesner.
“See inner relationships and make connections that others usually don’t see; we learn to think the unthinkable. On the other hand we may be uncomfortable with the insights that arise from seeing the world differently. However, we need innovation and creativity that stems from seeing things differently […] I recommend that you start to manage your own dilemmas.” – Get There Early: Sensing the Future to Compete in the Present by Bob Johansen. 2007 Edition published by Berrett-Koehler.
“They preferred to think they worked not in a laboratory but in what Kelly once called ‘an institute of creative technology.’ This description aimed to inform the world that the line between the art and science of what Bell Labs scientists did wasn’t always distinct […] many of Kelly’s colleagues might have been eccentrics […] working within a culture, and within an institution, where the very point of new ideas was to make them into new things.” – The Idea Factory by Jon Gertner. 2012 edition published by The Penguin Press.
“He kept asking Kay and others for an assessment of trends that foretold what the future might hold for the company. During one maddening session, Kay, whose thoughts often seemed tailored to go directly from his tongue to wikiquotes, shot back a line that was to become PARC’s creed: the best way to predict the future is to invent it.” – The Innovators by Walter Isaacson. 2014 edition published by Simon & Schuster.
Inventions involve the creation of a novelty: something new and different. Note that innovations take matters further since they entail realization, introduction and adoption processes. I am fortunate enough to have experienced both. My research work is credited in patents and awards, where I am named as inventor or co-inventor. But that alone does not necessarily imply actual development. Innovating as such came to fruition when I undertook product management responsibilities.
Those of us thinking of the commercialization of inventions and the so-called diffusion of innovations are attracted to qualitative and quantitative metrics. These are valuable insights and data speaking to the correlation between inventing and innovating, which leads to articulating best practices, processes, budget and resource allocations. However, it is also true that success can, often times, be powered by outliers.
As the “black swan theory” states: there can be easily dismissed and hard to predict impactful events that end up changing everything. Long story short, the art of serial innovation is a dynamic endeavor: just relying on what you think you knew well can cloud and betray one’s otherwise better judgment. This is when “objects in the mirror are closer than they appear,” metaphorically speaking, and things just happen at unprecedented speed.
Most would agree that good ideation can come to the surface anytime and anywhere from subject experts, users themselves as well as unusual suspects. Inventing takes a higher commitment level to address how things should work… and there can be alternative and competing solutions to a given problem. Serial innovation becomes a greater challenge since it is measured by repeated success.
I created the above framework in the context of the high tech sector. It conveys a need for striking an equilibrium point between unmanageable complexity (right) and either self-defeating oversimplification or undifferentiated simplicity for that matter (left).
Semantics matter: anyone can argue the merits and faults of simplicity and complexity. Though, delivering elegant sophistication commands consensus thanks to a clear level of quality and refinement, functional depth and differentiation, effortless operations and ease of use. One other thought: I would also like to claim that purposely engineering effortless ops and ease of use drives everyone’s energy to focus on value-based activities. We democratize innovation in the process.
The first chart became a vehicle to discuss the difference between invention and serial innovation. Let’s now look at the difference between incremental and disruptive innovation.
Innovating drives changes. Nonetheless, legacy systems can continue to benefit from incremental innovation. This means bettering and further optimizing current technologies and operations. Existing footprint and know-how combined with economies of scale, as well as risk aversion, expensive switching costs when considering emerging tech and possible resistance to change… all favor that phenomenon. So, it pays to understand Daniel C. Snow’s teaching on “old technologies’ last gasp” when outlining transition and/or transformation plans.
The lower right quadrant is where new paradigms are set to deliver disruptive innovation. B2 is clearly set beyond the reach of legacy systems: diseconomies of scale and diminishing competitiveness with declining returns being key reasons. B2 means that legacy tech is clearly outdated and superseded.
Disruptive innovation is the game changer. That’s the kind of paradigm shift that new entrants and green field players will take advantage of. The so-called industry establishment can continue to skim incremental innovation, though only up to a point at which they are rendered “old guard” and obsolete. That is the essence behind Clayton Christensen’s Innovator Dilemma.
The upper row shows quadrants A and B1, and an obvious intersection zone in between. Established players can operate hybrid environments to cross G.A. Moore’s chasm. They can gradually transform or fully re-invent themselves at that intersection. The above chart is designed to help leaders and management consultants plot portfolios in each quadrant as well as their evolution (e.g., course and speed) based on KPIs (Key Performance Indicators) or set phased discontinuities.
Quick recap. Incremental innovation delivers better (technical, operational, financial) performance, which is usually presented in the form of A/B (before and after) comparison tests. Disruptive innovation brings about unique capabilities that legacy systems cannot match. We are talking about emerging technologies, so capability and maturity models come into play. I will discuss that in one of my next posts on Lean Ops Redefined.
We have discussed insights around invention and serial innovation, incremental and disruptive innovation. My next tool is designed to map out where value exists, where new value is created, and value migration across the two.
No doubt, disruptive innovation alters the landscape: value migrates (or circles back) to any of the above quadrants. Some markets are placing a premium on the upper right quadrant already. That’s where end-to-end solutions and services create new value and dominate, which commands higher margins. Service focus seeks understanding and developing customers’ experiences instead of a product push or pull approach. Solution focus forces a more holistic systems engineering approach encompassing the value (supply) chain and relevant ecosystems.
That combination delivers significant competitive advantages with the advent of virtualization and cloud computing technologies. Early draft versions of that chart showed a different breakdown, namely: hardware, platforms, applications and services. When testing and putting this kind of chart to work, I could plot everything by applying color coding, then size of the addressable market, revenue and growth would determine each circle’s size. In any case, that basic template can be customized as needed.
“Inventing the future” can certainly take unique instincts, skills, workstyles and eccentric behaviors. When acknowledging that talent is a critical success factor, we then need to get serious about equipping individuals to make a difference while understanding that it takes a cross-functional team to make things happen. Serial innovation takes foresight, situational awareness, leadership and organizational agility. I hope that the above tools helped with mapping and discussing concepts such as (c) defining value, (b) transformation, and (a) moving the needle with elegant sophistication as the defining delivery.
Wondering about the last chart on Lean Ops? That one is just a sneak preview ahead of an upcoming post, also centered on “Innovation Management Essentials.”
As usual, looking forward to comments and emails, as well as meeting at any of these venues:
“Intel® Network Builders is an ecosystem of independent software vendors (ISVs), operating system vendors (OSVs), original equipment manufacturers (OEMs), telecom equipment manufacturers (TEMs), system integrators and carriers, coming together to accelerate the adoption of network functions virtualization (NFV)- and software-defined networking (SDN)- based solutions in Telco networks, public, private enterprise and hybrid clouds.” – About Intel Network Builders.
“We see IDF15 as a partnership. Intel and Developers/Makers/Technologists. We’ll share our vision and technology leadership […] Join us on August 18-20, San Francisco, Moscone Center.” – Intel Developer Forum.
Glad to share that our team is returning to IDF. We had a terrific experience last year and are looking forward to IDF15. This event is quickly approaching: just 11 days away at the time of writing this.
IDF14 was kind to us. In addition to opportunities to meet with customers and partners, as well as discussing the latest on Network Functions Virtualization (NFV), Alcatel-Lucent was featured as “Best in Show” jointly with Microsoft and Lenovo. Moreover, our CloudBand platform was the recipient of the Software and Services Award.
The above short video introduces the demonstration system that we are deploying this year. By the way, TelecomTV displayed this clip in a Proof of Concept (PoC) section created for NFV demonstrations from a variety of vendors.
However, please note that our demonstration is not a PoC. The Lean NFV Ops system features commercially available solutions from Alcatel-Lucent and our partners, all running on CloudBand 3.0. Clicking on the right picture will take you to Intel’s page on our platform.
CloudBand comprises two distinct solutions: Nodes that can be easily deployed as part of the carrier’s Network Functions Virtualization Infrastructure (NFVI) and the prominent Management System, which delivers the Management and Orchestration (MANO) platform.
By the way, NFVI and MANO are terms outlined in the NFV reference architecture provided by the working group focusing on this topic at the European Telecommunications Standards Institute (ETSI).
CloudBand’s node automation software runs on Commercial-Off-The-Shelf (COTS) hardware: x86 systems. Intel’s CPUs power Alcatel-Lucent’s Cloud Innovation Center (CIC) showcase.
At IDF15 we will also discuss boosting packet processing in the context of the Data Plane Development Kit (DPDK), engineered to enable 10x performance. In the meantime, Alan’s article provides quick insights on what this means to Virtual Network Functions (VNF) such as the Evolved Packet Core (vEPC). See reference links below.
This is relevant because the Lean NFV Ops demo deploys a fully virtualized and completely functional Voice over Long Term Evolution (VoLTE) system from the ground up. This needs the vEPC and the IP Multimedia Subsystem (IMS) working together.
Long story short, we’ll be making live 4G video calls onsite with this system, which we showed at IDF14 already. This time around our team will also conduct a number of sophisticated lifecycle operations involving maintenance events with full service continuity: zero downtime, all transparent to the end user mobile broadband experience.
Basically, you will see not last year’s PoC but a live demonstration system with real solutions in action. We are now operating in an end-to-end environment tested in real time by a variety of rainy day scenarios. Additionally, we will cover high availability (HA), smart placement, dynamic scaling and root cause analysis (RCA) among other key topics. Last but not least, we’ll share Bell Labs’ research findings on automation and NFV economics. There is even more new stuff…
Better yet, instead of just making a VoLTE call with 4G phones as shown in the above video, we will be using Web Real Time Communications (WebRTC) as part of the experience. This means using ubiquitous web browsers on any kind of mobile device and/or conventional desktop.
My understanding is that RealSense comes from Intel Perceptual Computing looking into immersive communications and gesture-based user interfaces. By using Personify’s application our demo captures, cuts out and projects the end user’s face and body, which is then seen as a video overlay. This means that we can use any background of our choice.
This can be experienced as a form of Augmented Reality (AR) where a person, who is at his/her home, is seen by the other user as if he/she was in a museum room: moving around, stopping next to pictures of interest and having a real time video conversation. This happens in the context of a video call where both end users are comfortably talking from home.
By the way, we’ll be bringing another interactive gadget with us which works with Intel’s Galileo board. But, you will have to come to IDF to play with that one. Just ask for the “Whack-a-Mole” : )
We will be glad to meet at IDF. Feel free to stop by our booth and/or to schedule a meeting:
EVENT: Intel Developer Forum 2015.
VENUE: Moscone Center in San Francisco. August 18-21.
BOOTH: Network Builders Community #173.
I will be speaking at:
EVENT: Intel Network Builders Summit.
VENUE: The Westin Saint Francis. August 17, 1:30 pm. Room Elizabeth.
PANEL: “The State of Management & Orchestration (MANO)”
I also plan to attend the following two IDF Mega Sessions:
“5G: Innovation from Client to Cloud” with Sandra Rivera and Aicha Evans.
“Making The Future… with You” presented by Genevieve Bell.
See you there.