Following up on my last post about IEEE ETR 2018, here are a couple of charts for my “discussion brief,” which include a Human-Machine-System Capability Mapping chart (above) and concept illustrations of the Experiential Decision Support System (below). The charts’ text conveys context-setting remarks, which I am also providing here.
The goal of furthering machine intelligence is to make humans more able and smarter: the opposite engineering approach typically becomes a source of self-defeating technical myopia and missed opportunities. This simple mapping exercise can be customized to assess and roadmap capability levels.
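As a purely illustrative sketch (the capability names and scores below are my own, not from the chart), the mapping exercise can be expressed as rating each capability for human versus machine strength and then flagging where augmentation beats automation alone:

```python
# Hypothetical capability-mapping sketch: score each capability for human
# and machine strength (0-5), then classify it on a simple roadmap.

CAPABILITIES = {
    # capability: (human_level, machine_level) -- illustrative scores only
    "pattern detection at scale": (2, 5),
    "contextual judgement":       (5, 2),
    "routine task execution":     (2, 5),
    "exception handling":         (4, 2),
    "ethical accountability":     (5, 0),
}

def roadmap(capabilities):
    """Classify each capability: automate, human-led, or augment."""
    plan = {}
    for name, (human, machine) in capabilities.items():
        if machine >= 4 and human <= 2:
            plan[name] = "automate"
        elif human >= 4 and machine <= 2:
            plan[name] = "human-led"
        else:
            plan[name] = "augment (human + machine)"
    return plan

for name, action in roadmap(CAPABILITIES).items():
    print(f"{name:30s} -> {action}")
```

The thresholds are arbitrary; the point is that the exercise produces a roadmap, not a binary automate/don’t-automate verdict.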
The more sophisticated automation becomes, the more critical (not less) the human factor proves to be in both consumer and enterprise environments. And, in any case, customer acceptance and adoption criteria remain Quality’s litmus test for emerging technologies.
Digitalization is fostering (a) XaaS, (b) Self-Service, (c) the Sharing Economy and (d) the Maker Movement. All elevate human involvement and drive the push for opening up and democratizing technologies, which, in turn, enables (e) citizen science and citizen developers to shape the next generation of prosumers at mass-market scale.
Digital Transformation initiatives embracing the above allow (f) nimbler enterprise teams to operate at far greater scale, scope and speed, and shift focus from routine operations to dynamic value creation coupled with extreme efficiencies.
This entails (g) interdisciplinary workstyles and collaborative organizational behaviors that include (h) customer co-creation models. In this new context, humans remain (i) the ultimate critical element in system reliability and safety. Left-shifting Quality by Design (QbD) prioritizes Human-Centered Design tools and processes to deliver high-performance workforce automation systems.
Cost-effective Lean Ops systems intertwine analytics, automation, programmability and flexible systems integration, all optimized for dynamic behaviors given Soft Systems’ perpetual motion. This means designing for “for-ever” rapid and seamless reconfigurability instead of just engineering “day 1” implementations.
Operational Excellence dictates system-wide as well as subsystem-level visualization, and a combination of centralized and distributed closed-loop controls under user-friendly operational modes. Cognitive models involve Situational Awareness (SA), Sense Making (SM), Root Cause Analysis (RCA), Scenario Planning (SP) and Real Options Analysis (ROA).
The Experiential element is not just about programming known rules and policies; most importantly, it grows by assimilating iterative learning in the context of cyclical automation: routine decisions and manual operations can be streamlined and collapsed, then switched to “exception”-based management for that particular event.
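One way to picture that cycle (a minimal sketch of my own, not an implementation from the brief) is an event handler that automates recognized routine events, escalates unrecognized ones as exceptions, and assimilates the human’s decision so the next occurrence becomes routine:

```python
# Illustrative cyclical-automation sketch: routine events run on learned
# rules; unrecognized events are escalated as "exceptions" to a human,
# and the chosen action is assimilated for future cycles.

learned_rules = {}  # event signature -> automated action

def handle(event_signature, ask_human):
    """Automate routine events; escalate and learn from exceptions."""
    if event_signature in learned_rules:
        return f"auto: {learned_rules[event_signature]}"
    # Exception-based management: defer this event to the operator...
    action = ask_human(event_signature)
    # ...then assimilate the decision so it becomes a routine rule.
    learned_rules[event_signature] = action
    return f"manual: {action}"

# First occurrence escalates; the repeat is streamlined automatically.
print(handle("link-flap:port7", lambda e: "reset port"))  # manual: reset port
print(handle("link-flap:port7", lambda e: "reset port"))  # auto: reset port
```

Real systems would add rule expiry and confidence thresholds, but the loop above is the essence of “learning by exception.”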
Productivity calls for streamlining operations so that (a) waste can be eliminated and prevented, and (b) value-based tasks can be performed effortlessly, in fewer steps, at speed and without error. High-performance behaviors and sustainable competitiveness also call for the ability to (c) experiment and create new capabilities, as well as to leverage (d) process mining for customer journey mapping and value stream mapping (CJM & VSM) to continuously optimize them and guarantee service levels.
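At its simplest, the process-mining idea in (d) amounts to reconstructing each customer’s journey from an event log and counting path variants to see which journeys dominate and where waste may hide. A minimal sketch, with made-up sample data:

```python
# Hypothetical process-mining sketch: group (case_id, step) events into
# per-customer journeys, then count how often each path variant occurs.
from collections import Counter, defaultdict

event_log = [  # (case_id, step) -- sample data for illustration only
    (1, "browse"), (1, "cart"), (1, "pay"),
    (2, "browse"), (2, "cart"), (2, "abandon"),
    (3, "browse"), (3, "cart"), (3, "pay"),
]

def journey_variants(log):
    """Return a Counter of journey paths, most frequent first."""
    cases = defaultdict(list)
    for case_id, step in log:
        cases[case_id].append(step)
    return Counter(tuple(steps) for steps in cases.values())

for path, count in journey_variants(event_log).most_common():
    print(" -> ".join(path), f"x{count}")
```

Production process-mining tools go much further (timing, conformance checking), but variant counting is the usual starting point for value stream mapping.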
Service Operations Centers (SOC) should be equipped with Experiential Decision Support Systems (DSS) featuring (e) collaborative filtering, (f) actionable data stories conveying hindsight, insight and foresight, and (g) adaptive cybernetics. Advanced visualization for both (h) intuitive and highly abstracted infographics and (i) scientific views is of the essence.
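To make the collaborative-filtering feature concrete, here is a minimal user-based sketch (operator names, items and ratings are invented for illustration): a DSS could suggest dashboards or runbooks an operator has not yet used, weighted by how similar their usage profile is to their peers’.

```python
# Minimal user-based collaborative filtering sketch for a DSS.
from math import sqrt

ratings = {  # operator -> {dashboard/runbook: usefulness score}
    "ana":  {"latency-view": 5, "rca-runbook": 4, "capacity-view": 1},
    "ben":  {"latency-view": 4, "rca-runbook": 5, "sla-report": 4},
    "cruz": {"capacity-view": 5, "sla-report": 2},
}

def cosine(a, b):
    """Cosine similarity between two sparse rating dicts."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den

def recommend(user):
    """Rank items the user hasn't rated, weighted by peer similarity."""
    scores = {}
    for peer, peer_ratings in ratings.items():
        if peer == user:
            continue
        sim = cosine(ratings[user], peer_ratings)
        for item, score in peer_ratings.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana"))  # -> ['sla-report']
```

The same mechanics underpin the RecSys approaches mentioned later in this post; a real DSS would use implicit signals (clicks, dwell time) rather than explicit ratings.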
Quality is best addressed as a human experience, which determines meaning and, therefore, the degree to which a system is lean vs. over-engineered or subpar (both being defective and carrying obvious and hidden costs). A new take on QbD for Soft Systems, which are inherently fluid, emphasizes acceptance testing that probes for usefulness & utility, usability & affectivity, consumability & serviceability, and safety through use cases and lifecycle events.
I am reviewing the Manifesto on Human Factors Engineering and making updates. In the meantime, what follows below was a draft introduction letter, which was left unpublished when releasing the Manifesto last year. Blue text shows new updates. As far as this post’s title is concerned, DX refers to Digital Experiences. The same acronym is also used for Digital Transformation initiatives.
Claude E. Shannon, the father of information theory, is credited with being the first to use the word “bit” in a ground-breaking paper published in the Bell System Technical Journal in 1948. He established a mathematical framework defining information and how to encode and transmit it over communication networks.
John E. Karlin, the father of Human Factors Engineering and a peer of Shannon’s at Bell Labs, is credited with assembling the first business organization addressing the human side of the equation just a year earlier. His interdisciplinary team focused on how people interface with communication systems and, therefore, on how to best design them to account for cognitive and behavioral matters, as well as form-factor considerations that make devices user friendly.
In the Age of Digital Transformation, the notion of “being digital” has transcended the sophisticated handling of binary digits and what we can do with tangible hardware. Data-driven automation and the notion of zero-touch lead to the development of end-to-end digital systems that are largely software defined and autonomic. These are engineered to be highly efficient and to operate without human intervention… or so we thought.
That feat can only be accomplished by undertaking a holistic design approach which, paradoxically, highlights the larger context and the new nature of Human-Machine-Systems. Otherwise, we would incur a technical myopia where presumably good technology ends up addressing the wrong problems or causing new ones that offset the intended benefits. In the digital age, technical prowess alone no longer guarantees success: impressive inventions can fail to “cross the chasm,” fall into the “valley of death,” and never become true innovations, to their creators’ and investors’ dismay. Passing the Turing Test only to plunge into the uncanny valley also reinforces that point.
Note: the above draft chart is not self-explanatory, requires some updating and I will better address it on another post… but I’d like to leave this version here for ongoing discussions and feedback.
Being digital entails a new breed of jobs enabled by workforce automation. Any of us may become a citizen developer who can leverage self-service and intuitive decision support systems to create, blend and program services, because automation does the heavy lifting under the hood. Interdisciplinary collaboration is now within reach as teams involving individuals from different domains can effectively share these tools and the underlying resources to overcome the pitfalls and diminishing returns of organizational fragmentation. Enterprises can better innovate and further business opportunities by engaging in rapid experimentation with nimbler teams working at greater scale and scope, and by doing so at unprecedented speed.
At the time of writing, and in the foreseeable future, no enterprise system is left alone without a human being accountable for its performance (or lack thereof), since our skills and judgement remain key to critical and ultimate decision making. The more sophisticated the environment, the more obvious this becomes, as smaller agile teams become responsible for systems operating at greater scale, scope and speed. Dr. Alonso Vera, Chief of NASA’s Human Systems Integration (HSI) Division, states that “humans are the most critical element in system safety, reliability and performance […] across the gamut of applications even as increasingly intelligent software systems come on-line,” Human-Centered Design and Operations of Complex Aerospace Systems, 2017.
It should also be noted that best practices in A.I. are promoting the kind of augmented and collaborative intelligence that only Human-On-The-Loop and Human-In-The-Loop Computing can deliver. A.I. is also powering up Affective Computing to make day-to-day digital services contextual, empathic and adaptive, allowing for mass-personalization at scale. We are also leveraging Natural Language Processing coupled with Dataviz to better search, discover and visualize insight and foresight with interactive infographic quality, instead of just rendering data-overloaded screens and overwhelming navigation.
These are all good reasons to further our understanding of how to best leverage analytics, automation and programmability to design enterprise and consumer systems driven by a human-centered approach. The desired outcome is greater utility, frictionless consumability, dynamic adaptation and, last but not least, extreme ease of use at any level throughout a service’s lifecycle. That’s the fertile ground for new cross-pollination opportunities and a better future, which continuous improvement sets in constant motion and which, hence, is always in the making.
Being digital is a human experience and, as such, it involves human affects. That relates to how we perceive our predominantly analog world and the diversity of our social and cultural fabrics. We interact with a great many objects and services of all sizes, which can, and will, be digitized and automated in some fashion. We will continue to lead our lives in a variety of changing contexts and perform numerous tasks throughout the day, some routinely and some exercising more demanding skills, with both low and high tech in that mix. So, it pays to think of Human Factors Engineering not only as having pioneered human-centered design, but as an endless source of serial innovation for Creative Technologists to address our evolving lifestyles and quality of life in the DX Age.
IEEE CQR-ETR 2018: “Discuss and identify the RAS (Reliability, Availability and Serviceability) challenges, requirements and methodologies in the emerging technology areas like the Cloud Computing, Wireless/Mobility (with focus on 5G technologies), NFV (Network Functions Virtualization), SDN (Software Defined Networking), or similar large-scale distributed and virtualization systems.”
“Discuss the RAS requirements and technologies for mission-critical industries (e.g., airborne systems, railway communication systems, the banking and financial communication systems, etc.), with the goal to promote the inter-industry sharing of related ideas and experiences. Identify potential directions for resolving identified issues and propose possible solutions.”
Session Title: A Programmatic Approach for an Artificial Intelligence Code of Conduct.
Today’s DX, Digital Transformation, programs are all the rage, but it takes a fair amount of double clicking and inquisitive questioning to separate facts from vaporware. DX typically involves a wide variety of game changing initiatives intersecting analytics, automation, programmability, software-defined systems, end-to-end integration, service-level composition and controls… all coming together to optimize for Quality as a differentiated and value-based Human Experience. Therefore, Customer Delight metrics (rather than outmoded customer satisfaction ones) are set to redefine the “Q” in CQR, Communications Quality & Reliability in 5G.
While the Telecoms industry rallies toward zero-touch automation paradigms, which some happen to position as a Human-“OFF”-the-Loop panacea, this session will expose the need to consider, and possibly pivot to, the kind of Operational Excellence that can only be delivered by adaptive HMS, Human-Machine-Systems, instead.
Note that the rise of Dataviz (Data and Science Visualization), ML’s (Machine Learning’s) Collaborative Filtering, AI’s (Artificial Intelligence’s) RecSys (Recommender Systems) and a renewed take on Cybernetics are driving innovation in HITL and HOTL (Human-“IN”-The-Loop and Human-“ON”-The-Loop Computing), as well as delivering effective mass-personalization with Affective Computing powered by Human Dynamics’ analytics.
Telecoms pioneered HFE, Human Factors Engineering: a holistic systems engineering discipline addressing people (culture, workstyle, skills), processes (procedures, methods, practices) and technologies (crafts, tools, systems) so that we can best humanize technology and make a compelling difference across the value chain at all levels. We are now embarked on a new journey.
The sought-after outcome of any Digital Service Provider, DSP, is to be instrumental to our Citizens’ Quality Experiences with new service experimentation, transaction and growth models. This takes agility and dynamic system-wide (horizontal and vertical) behaviors, which prompt effortless operability at unprecedented speed, scale and scope. Our work permeates design, development, delivery and serviceability, and spans continuous, intertwined lifecycles instead of lock-step waterfalls.
In this context, AI, Artificial Intelligence, enables us, humans, to envision and implement capabilities beyond the reach of legacy systems’ last gasps. By the same token, practices that might have appeared to serve us well in the past are exposing their limitations as they become latency-prone barriers. A successful path forward takes augmented Human-Machine Intelligence. A programmatic approach to an AI Code of Conduct would enable us to best model AI’s behavior, design better human-network interactions and collaborate on standardization.
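One hedged reading of “programmatic” here (my own interpretation, not a proposal from the session abstract) is that conduct rules become executable checks that gate an AI agent’s proposed actions before they touch the network. The rule names and action fields below are purely illustrative:

```python
# Hypothetical sketch: encode a Code of Conduct as executable predicates
# and vet each proposed AI action against them before execution.

CODE_OF_CONDUCT = [
    # (rule name, predicate over a proposed action) -- illustrative rules
    ("human accountability", lambda a: a.get("approver") is not None
                                       or a.get("risk") == "low"),
    ("no service blackout",  lambda a: a.get("impacted_users", 0) == 0
                                       or a.get("maintenance_window", False)),
    ("explainability",       lambda a: bool(a.get("rationale"))),
]

def vet(action):
    """Return the list of conduct rules the proposed action violates."""
    return [name for name, ok in CODE_OF_CONDUCT if not ok(action)]

proposal = {"risk": "high", "impacted_users": 1200,
            "rationale": "rebalance load"}
print(vet(proposal))  # violations to resolve before execution
```

Making the rules data, rather than scattering them through the codebase, is what would make cross-vendor standardization of such a code tractable.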
“Reflecting the diversity of the agenda, we are thankful for the support of our advisory board. The board is integral to the development and execution of Design Thinking, supporting the strategic positioning of the brand and advising to the content and participants that matter most. Hear from some of the greatest minds in Design Thinking as they shed a light on its mysteries and separate fact from fiction.”
“The world of IoT and connected devices is expanding rapidly. We all carry super computers in our pockets and interact with everything from home automation, cars, consumer electronics, and healthcare devices.”
“In this complex hardware + software environment the product development cycle can be tricky. For example, you can’t just follow agile software practices by the book when you’re building a connected pacemaker. So how do we approach product development when the stakes are high and the moving parts are many? During this discussion we’ll be tackling topics such as:”
“How do you roadmap a product which includes both hardware and software components? How does agile development fit in? How does the regulatory landscape affect how we approach development and iteration? How do you build teams around these integrated products? And how do you keep them in sync and working together?”
I’d first like to thank the team at DevMynd for their kind invitation. I am looking forward to joining the panel discussion in Chicago this coming Thursday, February 22. In the meantime, I will welcome any comments and insights as I gear up for this discussion.
I’m working on outlining some of the myths, dilemmas and trade-offs that I have encountered as an Industrial Designer and in Product Management.
From a design perspective, there are two topics worth examining: Design Thinking as a Human-Centered methodology, and its outcomes in terms of (a) utility, (b) usability, (c) consumability, (d) affectivity and (e) the composite and differential value of the resulting digital experiences that involve software and hardware.
This “brave new world” equips us with the freedom to explore new form factors, cognitive models and, most importantly, the development of human × technology networks. Some of the specifics come down to design semantics redefining HMS, Human-Machine-Systems, in the context of multi-modal user interfaces and innovative interactions where Machine Learning and new visualization paradigms surface.
From a Product Management viewpoint, there is also a need to ponder how to best leverage Design Thinking beyond Industrial Design and Software Development to tackle product and service strategy. Here my focus gravitates toward addressing (a) success factors and (b) limiting factors under our control, as well as (c) other determining factors beyond our area of influence that can impact the diffusion of innovations either positively or negatively. Moreover, I like to couple business model innovation with behavioral economics and information network effects.
This construct really boils down to capturing the essence behind (d) stakeholders’ acceptance criteria and (e) users’ engagement, adoption and growth rates. This means defining capability and maturity levels and factoring in the fact that they adapt and evolve over time. Obviously, this leads to taking a close look at how to best intersect Lean and Agile practices, among others, so that we can lead and navigate constantly changing environments in “digital time.”
Let’s get down to a more tactical level: end-to-end system design entails a mix of loosely and tightly coupled elements, and a platform approach to operate at greater speed, scale and scope than black boxes can match. A reality check unveils a hybrid world where decisions on capacity and performance levels, as well as on serviceability and dependency levels, drive decisions toward optimizing for distributed systems and, therefore, the rising value of end-to-end solutions vs. point solutions only.
In that context, inter-disciplinary teams involving creative technologists and domain experts make our organizations effectively diverse, smarter and more innovative. Otherwise, self-defeating arrogance, conflicting silos and technical myopia can make pre-production and production costlier by promoting unnecessary friction and getting everyone to work harder and harder rather than smarter. Typically, that negates productivity, forces a number of corrective actions, and significantly shifts and/or downsizes sought-after results.
The beauty of the Studio’s human-experience-centered practice is a healthy obsession with delivering “meaning.” The definition of “meaningful outcomes” (rather than churning out outputs) makes these organizations behave based on value and impact. We strive to foster not just customer satisfaction and net promoter scores, but measurable customer delight and network effects (superior service-level performance indicators) which, in turn, set and streamline technical requirements.
Long story short, the Studio’s mindset (critical thinking / wonder & discovery / problem solving) and workstyle (collaborative / experiential / iterative / adaptive) help explain why creative technologists are instrumental, serial innovation engines for the digital age.
Footnote: the term “team of creative technologists” was first coined by Bell Labs back in the 1940s to single out the differentiated value of inter-disciplinary undertakings. In the late forties, Bell Labs’ Claude Shannon pioneered Information Theory and John Karlin set up the first Human Factors Engineering team in industry. That HFE team was formed by a psychologist, a statistician (the father of quality control visualization), an engineer, and a physicist.