Following up on my last post about IEEE ETR 2018, here are a couple of charts for my “discussion brief,” which include a Human-Machine-System Capability Mapping chart (above) and concept illustrations of the Experiential Decision Support System (below). The charts’ text conveys context-setting remarks, which I am also providing here.
The goal of furthering machine intelligence is to make humans more able and smarter: the opposite engineering approach typically becomes a source of self-defeating technical myopia and missed opportunities. This simple mapping exercise can be customized to assess and roadmap capability levels.
The more sophisticated automation becomes, the more obvious the criticality of the human factor grows in both consumer and enterprise environments… not less. And, in any case, customer acceptance and adoption criteria remain Quality’s litmus test for emerging technologies.
Digitalization is fostering (a) XaaS, (b) Self-Service, (c) the Shared Economy and (d) the Maker Movement. All elevate human involvement and drive the push for opening and democratizing technologies, enabling (e) citizen scientists and citizen developers to shape the next generation of prosumers at mass-market scale.
Digital Transformation initiatives embracing the above allow (f) nimbler enterprise teams to operate at far greater scale, scope and speed, and shift focus from routine operations to dynamic value creation coupled with extreme efficiencies.
This entails (g) interdisciplinary workstyles and collaborative organizational behaviors that include (h) customer co-creation models. In this new context, humans remain (i) the ultimate critical element in system reliability and safety. Left-shifting Quality by Design (QbD) prioritizes Human-Centered Design tools and processes to deliver high-performance workforce automation systems.
Cost-effective Lean Ops systems intertwine analytics, automation, programmability and flexible systems integration, all optimized for dynamic behaviors given Soft Systems’ perpetual motion. This means designing “for-ever” rapid and seamless reconfigurability instead of just engineering “day 1” implementations.
Operational Excellence dictates system-wide as well as subsystem-level visualization, and a combination of centralized & distributed closed-loop controls under user-friendly operational modes. Cognitive models involve Situational Awareness (SA), Sense Making (SM), Root Cause Analysis (RCA), Scenario Planning (SP) and Real Options Analysis (ROA).
The Experiential element is not just about programming known rules and policies; most importantly, it grows by assimilating iterative learning in the context of cyclical automation: routine decisions and manual operations can be streamlined and collapsed, then switched to “exception”-based management for that particular event.
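That cyclical pattern can be sketched as a small loop: routine events map to learned automated actions, anything unrecognized is escalated as an exception, and proven resolutions are folded back into the playbook so the next cycle handles them automatically. The sketch below is a minimal Python illustration under my own assumptions; the event types, actions and playbook structure are hypothetical, not part of any specific product.

```python
# Hypothetical playbook mapping routine event types to automated actions.
ROUTINE_PLAYBOOK = {
    "link_flap": "auto_restart_port",
    "disk_usage_high": "auto_rotate_logs",
}

def handle_event(event_type, playbook=ROUTINE_PLAYBOOK):
    """Routine events collapse into automated actions; anything
    unrecognized is escalated as an exception for the human operator."""
    action = playbook.get(event_type)
    if action is not None:
        return ("automated", action)              # streamlined routine decision
    return ("exception", "escalate_to_operator")  # human in/on the loop

def assimilate(event_type, resolution, playbook=ROUTINE_PLAYBOOK):
    """Iterative learning: once a resolution is proven for an event type,
    fold it back into the playbook so the next cycle runs hands-off."""
    playbook[event_type] = resolution
```

Once an exception such as a hypothetical `"fiber_cut"` has been resolved and assimilated, subsequent occurrences are handled as routine, which is the “collapse then manage by exception” cycle described above.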
Productivity calls for streamlining operations so that (a) waste can be eliminated & prevented, and (b) value-based tasks can be performed effortlessly, in fewer steps, at speed & without error. High-performance behaviors and sustainable competitiveness also call for the ability to (c) experiment and create new capabilities, as well as leveraging (d) process mining for customer journeys & value stream mapping (CJM & VSM) to continuously optimize them and guarantee service levels.
Service Operations Centers (SOC) should be equipped with Experiential Decision Support Systems (DSS) featuring (d) collaborative filtering, (e) actionable data stories conveying hindsight, insight & foresight and (f) adaptive cybernetics. Advanced visualization for both (g) intuitive & highly abstracted infographics and (h) scientific views is of the essence.
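As a concrete illustration of the collaborative filtering ingredient, here is a minimal user-based recommender sketch in pure Python: operators who resolved similar situations in similar ways are used to suggest actions a given operator has not yet tried. The operators, actions and ratings are invented for illustration, standing in for “how useful each operator found each action.”

```python
import math

# Hypothetical history: how useful (1-5) each operator found each action.
ratings = {
    "op_ana": {"restart_vnf": 5, "scale_out": 4, "rollback": 1},
    "op_ben": {"restart_vnf": 4, "scale_out": 5, "reroute": 4},
    "op_cho": {"rollback": 5, "reroute": 2},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors (dicts)."""
    shared = set(u) & set(v)
    num = sum(u[k] * v[k] for k in shared)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def recommend(target, data):
    """Rank actions the target has not tried, weighted by how similar
    each other operator's action history is to the target's."""
    scores = {}
    for other, prefs in data.items():
        if other == target:
            continue
        sim = cosine(data[target], prefs)
        for action, rating in prefs.items():
            if action not in data[target]:
                scores[action] = scores.get(action, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)
```

In this toy data set, `recommend("op_ana", ratings)` surfaces “reroute” because similar operators rated it well, which is the same mechanism an Experiential DSS would apply at scale to suggest next-best actions.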
Quality is best addressed as a human experience, which determines meaning and, therefore, the degree to which a system is lean vs. over-engineered or subpar (both being defective and carrying obvious and hidden costs). A new take on QbD for Soft Systems, which are fluid by definition, emphasizes acceptance testing probing for usefulness & utility, usability & affectivity, consumability & serviceability, and safety through use cases and lifecycle events.
ETR turned out to be a very productive undertaking and I would like to thank IEEE’s Spilios Makris, Chi-Ming Chen and Chris Mayer for all the help provided prior to and during the workshop.
My contribution focused on addressing the unprecedented flexibility of advanced software-defined systems and artificial intelligence. That intersection defines game-changing technologies leading to zero-touch automation and, therefore, fostering self-service opportunities at both operational and service-consumption levels.
“Zero touch” implies taking automation to its fullest, while self-service reveals that this new order elevates the criticality of HMS (Human-Machine Systems). More touch points surface compared to what legacy technologies allowed given their constrained and restricted nature. That prompts a new take on HCI (Human-Computer Interaction) and QbD (Quality by Design) to best deliver service quality throughout concept exploration and service definition, fulfillment and adaptation, assurance and security… across multi-domain, highly decomposed, re-configurable and exceptionally dynamic end-to-end systems involving integration and service delivery in continuous motion.
These are thought out to (a) dramatically optimize support personnel ratios and (b) shift staff’s attention and efforts to value-based activities and innovation. These are small agile teams and new talent tasked with jobs involving (c) far greater scale with (d) a wider interdisciplinary scope, and all to be performed at (e) digital speed. In this next-level productivity and more demanding and challenging context, success relies on new tools embracing Design Thinking’s HCD (Human-Centered Design).
That is applied to capability models and subsequent modes of operation for (f) HITL (Human “IN” the Loop) Computing, largely devoted to deep domain expertise supported by Science Visualization, as well as (g) HOTL (Human “ON” the Loop) for system-wide supervisory responsibilities and ease of service creation and onboarding. HOTL draws from highly abstracted Visualization techniques and Low-Code Development revealing the behavior of end-to-end systems and subsystems and adequate flow control.
These are coupled with effective Cybernetics gearing up for context aware 360-closed-loop-control, zooming in and out between distributed and central levels. Last but not least, effective and efficient tools that are characterized by ease of use and consumability do attract many more new users from many more different domains to interact with these systems in a self-service fashion and create new business opportunities as a result.
IEEE CQR-ETR 2018: “Discuss and identify the RAS (Reliability, Availability and Serviceability) challenges, requirements and methodologies in the emerging technology areas like the Cloud Computing, Wireless/Mobility (with focus on 5G technologies), NFV (Network Functions Virtualization), SDN (Software Defined Networking), or similar large-scale distributed and virtualization systems.”
“Discuss the RAS requirements and technologies for mission-critical industries (e.g., airborne systems, railway communication systems, the banking and financial communication systems, etc.), with the goal to promote the interindustry sharing of related ideas and experiences. Identify potential directions for resolving identified issues and propose possible solution.”
Session Title: A Programmatic Approach for an Artificial Intelligence Code of Conduct.
Today’s DX, Digital Transformation, programs are all the rage, but it takes a fair amount of double clicking and inquisitive questioning to separate facts from vaporware. DX typically involves a wide variety of game changing initiatives intersecting analytics, automation, programmability, software-defined systems, end-to-end integration, service-level composition and controls… all coming together to optimize for Quality as a differentiated and value-based Human Experience. Therefore, Customer Delight metrics (rather than outmoded customer satisfaction ones) are set to redefine the “Q” in CQR, Communications Quality & Reliability in 5G.
While the Telecoms industry rallies toward a zero-touch automation paradigm, which some happen to position as a Human-“OFF”-the-Loop panacea, this session will expose the need for considering, and possibly pivoting to, the kind of Operational Excellence that can only be delivered by adaptive HMS, Human-Machine Systems, instead.
Note that the rise of Dataviz (Data and Science Visualization), ML’s (Machine Learning’s) Collaborative Filtering, AI’s (Artificial Intelligence’s) RecSys (Recommender Systems) and a renewed take on Cybernetics is driving innovation in HITL and HOTL (Human-“IN”-the-Loop and Human-“ON”-the-Loop Computing), as well as delivering effective mass-personalization with Affective Computing powered by Human Dynamics’ analytics.
Telecoms pioneered HFE, Human Factors Engineering: a holistic systems engineering discipline addressing people (culture, workstyle, skills), processes (procedures, methods, practices) and technologies (crafts, tools, systems) so that we can best humanize technology and make a compelling difference across the value chain at all levels. We are now embarked on a new journey.
The sought-after outcome of any Digital Service Provider, DSP, is to be instrumental to our Citizens’ Quality Experiences with new service experimentation, transactions and growth models. This takes agility and dynamic system-wide (horizontal and vertical) behaviors, which prompt effortless operability at unprecedented speed, scale and scope. Our work permeates design, development, delivery and serviceability, and continuous intertwined lifecycles instead of lock-step waterfalls.
In this context, AI, Artificial Intelligence, enables us, humans, to envision and implement capabilities beyond the reach of legacy systems’ last gasps. By the same token, practices that might have appeared to serve us well in the past are exposing their limitations when becoming latency-prone barriers. A successful path forward takes augmented Human-Machine Intelligence. A programmatic approach for an AI Code of Conduct would enable us to best model AI’s behavior, design better human-network interactions and collaborate on standardization.
I am on my way to Mobile World Congress and last night I had the opportunity to speak at DevMynd’s “Agile Software in a Hardware World.” That panel discussion featured BMW Technology Corporation (BMW, Mini, Rolls-Royce), Monsanto’s “The Climate Corporation,” and Nokia Software Group, which I was proud to represent. The venue, 1KFulton, is a century-old former cold-storage building in the Fulton Market neighborhood, home to Google’s Chicago campus.
Reflecting on that panel discussion, small group conversations and one-on-one chats before and after the event, I think that it is fair to state the following:
(A) software is undergoing a defining moment while re-shaping industries. “Software-defined instruments and systems” have superseded the capabilities of hardware-centric deployments.
In other words, economic value and profitability are migrating from conventional products to software dominated environments that control tools, systems, and processes.
In this new context, (B) collaborative undertakings (co-creation, open source,) platforms, modularization and mashups are paving the way for rapid experimentation and for a wide-range of services to surface.
Back to economics… a venture capital firm in Silicon Valley shared with me that, comparing current investments with equivalent old-school ones, they experienced 3x the time-to-market speed at one-third of the investment, which allows them to better diversify risk and fund more start-ups in the process.
Moreover, we are now operating at (C) unprecedented speed, scale and scope. For that reason alone, software should improve our ability to “pivot” and dynamically adapt to changing circumstances.
Most plans don’t survive first contact and many start-ups and emerging technologies don’t survive the so-called “crossing-the-chasm” or “Valley of Death.” So, remaining lean and embracing continuous/iterative improvement are of the essence. That’s a quality mantra rather than an excuse for forgoing best quality practices.
Back to economics again: quality management’s definition of “customer satisfaction” is now table-stakes and compliance in that area drives low-cost commoditization. “Customer delight” is the higher benchmark that commands a premium and the kind of margins enabling us to re-invest to further innovate.
Let’s now state the obvious, “customers” are human beings, aren’t they? Interestingly enough, the more sophistication and diversification, the higher the need for (D) humanizing technology so that we can better create, consume, use and democratize any digital services. In turn, this has fostered (E) Design Thinking as a leading innovation practice that intersects art and science. Design Thinking addresses HMS, Human-Machine-Systems, by prioritizing HCD, Human-Centered-Design.
In terms of economic effectiveness and efficiency, that means outcome-oriented system sizing, rather than over-engineering waste. It also means the definition of meaningful and purposeful requirements: some are designed to meet customer satisfaction metrics, while others are explicitly thought out to exceed that baseline and, hence, to actually deliver the X-Factor prompting customer delight. All key to customer acceptance and adoption growth.
Better yet, one of the event’s participants volunteered the fact that “good design” factoring in intuitive interaction, advanced dataviz (data visualization) and effortless controls was proven to shrink the sales cycle by literally half: not only did customers perceive and experience the service’s tangible value early, but the sales team was also able to approach more customers in that timeframe. Innovative Human-Computer Interaction based on information design, value-based tasks, streamlined processes, intuitive data visualization, effortless controls and overall UX, User Experience, doubles as a compelling demonstration tool.
This is a side note: that has already become a critical success factor in Artificial Intelligence’s new developments, AI being software’s top transformational exponent as DSS, Decision Support Systems for humans and/or machines become quintessential. I will detail that in another post.
One last thought… (F) software’s pervasiveness has also brought along Agile development practices. These include “user stories,” borrowing a Design Thinking technique by which application features are defined by synthesizing human optics (persona/outcome/rationale) to keep technical myopia at bay.
After all, we should all be in the business of making tech human. Otherwise, what would negating or ignoring that say about each of us and our collective culture?
“Reflecting the diversity of the agenda, we are thankful for the support of our advisory board. The board is integral to the development and execution of Design Thinking, supporting the strategic positioning of the brand and advising to the content and participants that matter most. Hear from some of the greatest minds in Design Thinking as they shed a light on its mysteries and separate fact from fiction.”