Tagged: Human Factors

NOKIA HFE18 Conference (2) GI: Genuine Intelligence


Every once in a while we get to experience Murphy’s (dreaded) Law. This time around it took the form of stability issues with a media webcasting platform. We are now working on rescheduling NOKIA HFE18 under a different format. In parallel, we are also kicking off planning for HFE19… and we will take full advantage of lessons learned.


HFE18 Banner

Ref: Nokia HFE18 Conference (1) #MakeTechHuman


We regret any inconvenience that this eleventh-hour change in plans might cause, and we remain extremely grateful to the speakers and volunteers who have already invested time and effort, which should not go to waste.

In the meantime, I’d like to volunteer a handful of insights on the session that I was scheduled for and, in doing so, keep the discussion going. The objective is to further improve what’s already available and allow for an even better session when we get to reconvene. To begin with, here is my session’s abstract.


THE SOFT & HARD NATURE OF ANYTHING DIGITAL

“Our quest to deliver productivity tools yielding operational excellence for DSPs (Digital Service Providers) leads to the design of signature experiences by innovating in the process.”

“The Studio at Nokia Software’s Solutions Engineering is set to work with deceptively simple techniques and elegant sophistication… because neither oversimplification nor self-defeating complexity allow end-to-end systems to efficiently operate at digital speed and global scale.”

“This discussion intersects the soft and hard natures of dynamic systems through the modeling of Human-Machine Systems (HMS) and the design of cybernetics. This practice focuses on critical success factors for the early acceptance and broader adoption of emerging technologies.”

“The work at the Studio embraces a renewed approach to QbD, Quality by Design, which is set to left-shift and unveil instrumental considerations at early design stages. The result is Nokia Studio’s QXbD, Quality Experiences by Design, optimizing for customer delight rather than table-stakes customer satisfaction.”


GI4HMS Jose de Francisco


NI – WHAT IS NATURAL INTELLIGENCE? At the time of this writing, we humans possess NI, Natural Intelligence. NI involves naturally developed cognitive functions and models leveraged by the sort of biological beings that humans happen to be. Intelligence (a) captures, (b) generates, (c) applies and (d) evolves knowledge. Our individual and collective brainpower can be gauged in terms of (e) skills and (f) talent levels, jointly with an understanding of (g) the underlying decisioning process and (h) our perceived experiences in context.


AI – WHAT IS ARTIFICIAL INTELLIGENCE? Intelligence that is not naturally occurring: simulated knowledge, in other words. It is generated by programmable artifacts consuming, processing and producing data under closed-loop models. Whether working with individual or networked machine intelligence, there is neither information derived from mindfulness nor the type of general-purpose sense making that matches the human experience. The year is 2018… and that is where the state of the art stands today.


GI4HMS Jose de Francisco 2


GI – WHAT IS GENUINE INTELLIGENCE? Earlier in the year I introduced this topic at Design Thinking 2018 (plenary session) and at the IEEE Emerging Technologies Roundtable (invitation-only workshop). Coincidentally, both were held in Austin, TX, back in May. I proposed thinking about GI as the outcome of NI powered by AI.

By the way, “genuine” means acting bona fide. To be clearer: with honesty and without the intention to deceive. Given the trade-offs (pros and cons) that NI and AI bring to the table, GI gets us a step closer to productive, bona fide systems.

GI is, therefore, the outcome of purposely crafting optimal technology solutions that augment human possibilities. This is addressed by Human Factors Engineering (HFE), an interdisciplinary science, given its holistic approach and focus on value-driven Human-Machine Systems (HMS).

Quick side note: those of you into Lean and Lean Six Sigma can approach this topic with Jidoka (autonomation). Ditto for anyone working on Human-in-the-Loop Computing, Affective Computing, RecSys (Recommender Systems), Human Dynamics and Process Mining with Machine Learning or, better yet, XAI, Explainable Artificial Intelligence.
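For those who prefer a concrete illustration, here is a minimal sketch of the human-in-the-loop (jidoka-style) pattern referenced above: automation handles high-confidence, routine events, stops itself on anything it is unsure about, and records the operator’s decision for later learning. The function names, threshold and data are illustrative assumptions, not an actual implementation.

```python
# Illustrative human-in-the-loop ("jidoka") gate: automation acts on
# high-confidence events and stops to ask a human otherwise.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class HumanInTheLoopGate:
    score_event: Callable[[dict], Tuple[str, float]]  # returns (proposed_action, confidence)
    ask_operator: Callable[[dict, str], str]          # human reviews and returns the final action
    threshold: float = 0.8                            # assumed confidence cut-off
    feedback: List[Tuple[dict, str]] = field(default_factory=list)

    def handle(self, event: dict) -> str:
        action, confidence = self.score_event(event)
        if confidence >= self.threshold:
            return action                             # routine case: automate
        final = self.ask_operator(event, action)      # "stop the line": escalate to the human
        self.feedback.append((event, final))          # keep the label for later retraining
        return final

# Example usage with stubbed scoring and operator input.
gate = HumanInTheLoopGate(
    score_event=lambda e: ("restart_service", 0.65 if e.get("novel") else 0.95),
    ask_operator=lambda e, proposed: "escalate_to_tier2",
)
print(gate.handle({"alarm": "latency", "novel": False}))  # -> restart_service
print(gate.handle({"alarm": "latency", "novel": True}))   # -> escalate_to_tier2
```

The same gate generalizes to affective or recommender signals: whatever the scorer, the human remains the fallback and the source of new training labels.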


DDESS Nokia Studio


DDESS – The most tangible design work entails the delivery of DDESS, Digital Decision & Execution Support Systems. This is where GI gets interesting because we need to apply new optics to take a fresh look at what Operational Excellence is (and is not) moving forward.

In a nutshell, DDESS’s purpose is to reveal and inform decisions, and to make decisions, all in context. But I will pause here, as this topic will be better covered in subsequent posts… just one more thought: DDESS addresses decision support for (NI) humans, (AI) machines, and (GI) human-machine systems. Coming to terms with that one insight alone becomes a critical success factor.
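As a purely illustrative sketch of that last insight (decision support for humans, machines and human-machine systems), consider routing each decision to one of three paths depending on context; the policy, names and thresholds below are assumptions made for the sake of the example, not a description of any shipping system.

```python
# Hypothetical sketch of the three DDESS decision paths: reveal/inform a
# human (NI), decide automatically (AI), or do both jointly (GI),
# depending on how the decision is classified in context.
from enum import Enum, auto

class DecisionPath(Enum):
    HUMAN = auto()     # NI: surface evidence, the human decides
    MACHINE = auto()   # AI: closed loop, the machine decides
    JOINT = auto()     # GI: the machine proposes, the human disposes

def route_decision(impact: str, confidence: float, reversible: bool) -> DecisionPath:
    """Toy policy: impact labels and thresholds are illustrative assumptions."""
    if impact == "low" and confidence >= 0.9 and reversible:
        return DecisionPath.MACHINE
    if impact == "high" and confidence < 0.6:
        return DecisionPath.HUMAN
    return DecisionPath.JOINT

print(route_decision("low", 0.95, True))    # DecisionPath.MACHINE
print(route_decision("high", 0.4, False))   # DecisionPath.HUMAN
print(route_decision("medium", 0.7, True))  # DecisionPath.JOINT
```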


Another thought… it turns out that, in this day and age, heavily techno-centric projects only succeed a fraction of the time, 10% or so by some estimates. Selective memories tend to focus on and celebrate the 10% that make it… but that is a terrible ROI, Return on Investment, which inflicts (1) severe technical debt, (2) latency costs in systems engineering and (3) a huge opportunity cost, since the funding and good efforts could have been put to work on more productive endeavors.
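To see why a 10% hit rate makes for a terrible ROI even before counting technical debt and latency costs, here is a back-of-the-envelope calculation with purely assumed numbers (equal project costs and a 5x gross payoff for the projects that do succeed):

```python
# Back-of-the-envelope expected ROI with an assumed 10% success rate.
cost_per_project = 1.0        # normalized investment per project (assumption)
payoff_if_success = 5.0       # gross return of a successful project (assumption)
success_rate = 0.10           # "10% or so by some estimates"

expected_return = success_rate * payoff_if_success
expected_roi = (expected_return - cost_per_project) / cost_per_project
print(f"Expected ROI: {expected_roi:.0%}")   # -> -50%: half the invested capital is lost
```

Under those assumptions, half the invested capital evaporates before the opportunity cost of the foregone alternatives is even considered.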

By many other well-documented and more recent accounts, HCD, Human-Centered Design, happens to flip that ratio, as designers are obsessed with optimizing for user acceptance and frictionless adoption from day one. HFE involves painstaking work on purposeful and value-driven technological solutions, where a smart combination of outside-in innovation and inside-out ingenuity makes all the difference.

 

QbD and Digitalization’s need for Designing Quality into Solutions


“[They] lost their quality leadership to new, aggressive competition. The most obvious consequence was loss of market share (…) [due to] quality features that were perceived as better meeting customer needs [and] they did not fail in service as often.”

“Loss of market share is not the only reason behind [it] (…) a second major force has been the phenomenon of life behind the quality dikes. We have learned that living in a technological society puts us at the mercy of the continuing operation of the goods and services that make a society possible (…) without such quality we have failure of all sorts (…) at the least these failures involve annoyances and minor costs. At their worst they are terrifying.”

“A third major force has been the gathering awareness by companies that they have been enduring excessive costs due to chronic quality-related wastes (…) about a third of what we do consists of redoing work previously done (…) lacking expertise in the quality disciplines, they are amateurs in the best sense of that word.”

J.M. Juran’s assessment of Quality issues in the 1960s–70s.


 

What follows are some of the insights driving the work that I’m doing on reviewing, leveraging and updating QbD (Quality by Design) in the context of today’s fast growing and all-encompassing digitalization.

I am dusting off my research from 2010 on the 3Q Model. Back then I was a senior manager at Alcatel-Lucent’s Solutions & Technology Introduction Department. My current role is Senior Studio Director at Nokia Software’s Solutions Engineering. Note that the scope is End-to-End Solutions. These are holistic system-wide (cross-sectional and longitudinal) undertakings intersecting different domains to deliver the higher value of the whole. I have discussed QbD for Digital Transformation projects at the Design Thinking 2018 event and at the IEEE (Institute of Electrical and Electronics Engineers) conference on CQR (Communications Quality and Reliability) back in April and May of this year. Interestingly enough, both events were held in Austin, Texas.


Juran on QbD book

QbD was first coined by Juran, a renowned pioneer of quality practices, whose work on that specific topic started in the mid-1980s. He linked Quality to customer satisfaction and reliability as the two dimensions to focus on:

“Features” were defined as “quality characteristics,” which meant properties intended to satisfy specific customer needs. That would also include “promptness of delivery,” “ease of maintenance,” and “courtesy of service” to name some examples. “The better the features, the higher the quality in the eyes of customers.”

As for reliability and, therefore, replicability and consistent performance, “freedom from deficiencies” conveyed the fact that “the fewer the deficiencies the better the quality in the eyes of customers.” A “deficiency” is a failure that triggers dissatisfaction and calls for incurring higher costs to redo prior work.

“Fitness for use” was mentioned as an attempt to capture the above two together. The so-called Juran Trilogy entails Quality Planning, Quality Control, and Quality Improvement.


More than three decades have passed since Juran started to work on “New Steps for Planning Quality into Goods and Services.” Let’s decompose QbD’s acronym at face value and distill its essence.

As a designer, my belief & practice system focuses on “serial innovation” consistently delivering superior value. This is achieved by means of purposeful and elegant solutions equipped with capability models and optimal functionality leading to Quality Experiences.

Customer Delight, rather than just satisfaction, is the sought-after outcome. This applies to both small and large undertakings and, as A. Kay, a pioneer in graphical user interfaces, best put it, “simple things should be simple, complex things should be possible.”

Following that train of thought, “Designing Quality into Solutions” should take center stage in: (a) collaborative and iterative ideation, (b) agile development, (c) continuous delivery and (d) the dynamic diffusion of (e) new and mass-customizable digital services for consumer and enterprise markets, as well as not-for-profits. Overall, QbD is key to Operational Excellence.


Usability Testing Project


In a world where “Continuous Improvement” leads to incremental and breakthrough innovations, Quality’s critical KPI, Key Performance Indicator, can be expressed in terms of measurable advances in QoUX, the Quality of the Users’ Experiences. These are lagging (outcome) metrics that are far from static because they evolve within and over lifecycles. Therefore, reliability is not just applied to production operations, but also to the solution’s consistent performance and serviceability over time and under changing scenarios and events.
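One illustrative way to make “measurable advances in QoUX” concrete is a weighted composite index tracked release over release, so that advances become explicit deltas; the dimensions and weights below are assumptions, not a Nokia-defined metric.

```python
# Illustrative QoUX index: a weighted blend of experience measures,
# tracked release over release so "advances" become measurable deltas.
WEIGHTS = {"task_success": 0.4, "satisfaction": 0.3, "effort_inverse": 0.3}  # assumed weights

def qoux(scores: dict) -> float:
    """Scores are normalized to 0..1; returns a 0..1 composite index."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

releases = {
    "R1": {"task_success": 0.70, "satisfaction": 0.60, "effort_inverse": 0.55},
    "R2": {"task_success": 0.82, "satisfaction": 0.71, "effort_inverse": 0.66},
}
r1, r2 = qoux(releases["R1"]), qoux(releases["R2"])
print(f"QoUX R1={r1:.2f}, R2={r2:.2f}, advance={r2 - r1:+.2f}")
```

Because these are lagging measures, the index only moves after users have lived with a release, which is exactly why it has to be re-baselined across the lifecycle.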


Given Quality’s unequivocal narrative around the “experiential” paradigm and, therefore, human-centric optics, QbD’s best work should optimize for user “delight,” which is defined as superior “satisfaction,” rather than just aiming for requirements compliance.

It is very tempting to rally around core competencies within existing comfort zones and then settle for just aiming at “customer satisfaction” around “must-meet” baseline requirements. However, that might not suffice given the necessity to innovate and better compete by leveraging unique sources of sustainable differentiation.


Let’s now state the obvious: “designing” Quality Experiences into digital solutions is best addressed by means of Human-Centered methodologies that optimize for (f) users’ “acceptance criteria” and (g) the kind of “adoption levels” that foster user base growth.

The opposite approach would risk the adverse effects (and hidden costs) that can be incurred when technical myopia leads the way. A. Cooper’s “The Inmates are Running the Asylum” captures that very well. His book is referenced below.


Accenture Survey


Just for the record, the year is 2018 and we are gearing up for a pervasive digital world dominated by software-defined systems. The 4th Industrial Revolution’s floodgates are set wide open.

Low and high tech perform best when playing a supporting role. Technology enables “Services,” which justify it; otherwise, the so-called Chasm and Valley of Death wait around the corner. It pays to emphasize that “Services” are defined by “Use Cases.” So, it shouldn’t take much effort to see that “Use(case)ability” (“usability” being the proper term) is a CSF, Critical Success Factor. “Fitness for use,” in other words.

Let’s take that further and couple “usability” with designing for “usefulness,” “utility,” “consumability & serviceability” as well as “affectivity,” because perception and human affects orient satisfaction and dissatisfaction levels.

QbD cannot be put to work without adequately addressing Human Dynamics, which entails psychological (e.g. cognitive models, information architecture), physiological (e.g. device form factor, workstation ergonomics) and social dimensions (e.g. network effects increasing value for users). That happens to be the SoW (Scope of Work) of HFE’s (Human Factors Engineering) interdisciplinary teams in Design Studios… and the topic of my next post on QbD’s Intellectual Capital.


 

A few more thoughts…

Whether one’s day-to-day work and/or belief system is closer to or further removed from the kinds of jobs and tasks that make tech human, it makes sense to engage in meaningful, outcome-oriented and goal-driven practices by applying HCD, Human-Centered Design. The purpose is delivering quality and achieving customer acceptance and delight, given that customers are human beings. That is the reason why Design Thinking has outgrown the field of industrial design and is applied to a wide variety of domains and disciplines nowadays.

Tech’s roller-coaster industry is packed with well-intended technologies that fail. We all know that this is a fiercely competitive environment in constant change. Yet it is also true that, in many of those cases, UX, User Experience, professionals were not engaged at any part of the process, were only involved at the back end, or were called to come to the rescue at the eleventh hour. That leaves no room for Design to make a difference. Superficial changes just amount to bells-and-whistles and shiny objects that disguise underlying quality issues, which are likely to resurface at some point.

QbD’s top objective should be excelling at effectively & efficiently addressing our customers’ acceptance and adoption criteria. That remains true even in the context of full automation: humans still get promoted and demoted (or fired) based on those systems’ performance. D. Newman’s recent article in Forbes rightly states that “you cannot run your business without people (…) you cannot operate technology without people (…) research has shown that people are a critical component for digital transformation.”

Today’s best practice calls for “reverse engineering” solutions by working from that human-centered understanding of Human-Machine Systems (HMS). That is substantially different from relying solely on a far riskier “if you build it, they will come” model and its costlier brute-force mindset.

When dealing with challenging, intractable and complex projects, overlooking that fact typically results in exponential project risk and plenty of otherwise avoidable zig-zagging course corrections ahead (e.g. opportunity costs in financial analysis, and hidden and latency costs in systems engineering).

Agile’s iterative development and ability to pivot shouldn’t be a refuge for subpar or absent design effort, but a vehicle to best implement QbD and augment development capacity while minimizing technical debt. This is why this revision of QbD for today’s tech industry calls for Design Sprints to lead the way.

Last but not least, before dismissing this QbD revision as a merely philanthropic and humanistic endeavor, I suggest deep thinking about its (1) business criticality and (2) contribution to risk mitigation.

 

J. de Francisco

Bell Labs, Distinguished Member of Technical Staff

Nokia Software, Senior Studio Director @ Solutions Engineering


Disclaimer:

The above comments are my own, and I welcome your feedback via LinkedIn messaging and Nokia’s Yammer, which can double as input for further revisions as well as collaboration opportunities.


References:

A. Cooper. The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. Sams Publishing, 2004.

D. Newman. 3 Reasons People are Critical for Digital Transformation Success. Forbes, June 2018.

J. de Francisco. IEEE ETR 2018, Emerging Technologies Reliability Roundtable – Human Factors Session (2). Innovarista: Innovation at Work, July 2018. innovarista.org

J. de Francisco. IEEE ETR 2018, Emerging Technologies – Human Factors Session. Innovarista: Innovation at Work. May 2018 innovarista.org

J.M. Juran. Juran on Quality by Design: the New Steps for Planning Quality into Goods and Services, The Free Press, 1992.

 

IEEE ETR 2018, Emerging Technologies Reliability Roundtable – Human Factors Session (2)


Following up on my last post about IEEE ETR 2018, here are a couple of charts from my “discussion brief,” which include a Human-Machine-System Capability Mapping chart (above) and concept illustrations of the Experiential Decision Support System (below). The charts’ text conveys context-setting remarks, which I am also providing here.


Slide1


The goal of furthering machine intelligence is to make humans more able and smarter: the opposite engineering approach typically becomes a source of self-defeating technical myopia and missed opportunities. This simple mapping exercise can be customized to assess and roadmap capability levels.
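As a hypothetical example of how that mapping exercise could be captured, the sketch below records, per capability, today’s human and machine maturity levels and a target joint level, then ranks the gaps for roadmapping; the capability names and the 0–5 scale are assumptions rather than the chart’s actual contents.

```python
# Hypothetical Human-Machine-System capability map: for each capability,
# record today's human and machine maturity and the target joint level
# (0-5 scale assumed), then rank the gaps to prioritize the roadmap.
capability_map = {
    # capability:          (human_level, machine_level, target_joint_level)
    "anomaly detection":   (3, 4, 5),
    "root cause analysis": (4, 2, 4),
    "scenario planning":   (4, 1, 3),
    "routine remediation": (2, 4, 5),
}

def roadmap_gaps(cmap):
    """Rank capabilities by the gap between today's best level and the target."""
    gaps = {name: target - max(h, m) for name, (h, m, target) in cmap.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

for name, gap in roadmap_gaps(capability_map):
    print(f"{name}: gap {gap}")
```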

The more sophisticated automation becomes, the more obvious the criticality of the human factor in both consumer and enterprise environments… rather than less. And, in any case, customer acceptance and adoption criteria remain Quality’s litmus test for emerging technologies.

Digitalization is fostering (a) XaaS, (b) Self-Service, (c) the Shared Economy and (d) the Maker Movement. All of these elevate human involvement and drive the push for opening up and democratizing technologies, enabling (e) citizen science and citizen developers to shape the next generation of prosumers at mass-market scale.

Digital Transformation initiatives embracing the above allow (f) nimbler enterprise teams to operate at far greater scale, scope and speed, and shift focus from routine operations to dynamic value creation coupled with extreme efficiencies.

This entails (g) interdisciplinary workstyles and collaborative organizational behaviors that include (h) customer co-creation models. In this new context, humans remain (i) the ultimate critical element in system reliability and safety. Left-shifting Quality by Design (QbD) prioritizes Human-Centered Design tools and processes to deliver high-performance workforce automation systems.


Slide2


Cost-effective Lean Ops systems intertwine analytics, automation, programmability and flexible systems integration, all optimized for dynamic behaviors given Soft Systems’ perpetual motion. This means designing for “forever” rapid and seamless reconfigurability instead of just engineering “day 1” implementations.

Operational Excellence dictates system-wide as well as subsystem-level visualization, and a combination of centralized & distributed closed-loop controls under user-friendly operational modes. Cognitive models involve Situational Awareness (SA), Sense Making (SM), Root Cause Analysis (RCA), Scenario Planning (SP) and Real Options Analysis (ROA).

The Experiential element is not just about programming known rules and policies; most importantly, it grows by assimilating iterative learning in the context of cyclical automation: routine decisions and manual operations can be streamlined and collapsed, switching to “exception”-based management for that particular class of events.
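A minimal sketch of that cyclical, exception-based pattern might look as follows; the runbook contents and event names are invented for illustration.

```python
# Illustrative exception-based management loop: events with a known,
# previously learned runbook entry are handled automatically; anything
# else is treated as an exception, resolved manually, and the resolution
# is assimilated so the next occurrence becomes routine.
runbook = {"link_flap": "reroute_traffic"}   # learned routine responses (assumed)

def handle(event_type: str, resolve_manually) -> str:
    if event_type in runbook:                # routine: collapsed and automated
        return runbook[event_type]
    action = resolve_manually(event_type)    # exception: the human handles it
    runbook[event_type] = action             # assimilate for next time
    return action

print(handle("link_flap", None))                        # -> reroute_traffic (routine)
print(handle("disk_full", lambda e: "purge_logs"))      # exception, resolved and learned
print(handle("disk_full", None))                        # now routine -> purge_logs
```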

Productivity calls for streamlining operations so that (a) waste can be eliminated & prevented, and (b) value-based tasks can be performed effortlessly, in fewer steps, at speed & without error. High-performance behaviors and sustainable competitiveness also call for the ability to (c) experiment and create new capabilities, as well as for (d) leveraging process mining for customer journeys & value stream mapping (CJM & VSM) to continuously optimize them and guarantee service levels.
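For readers unfamiliar with process mining, the toy sketch below shows its simplest building block applied to a customer journey: deriving a directly-follows graph from an event log, where frequent edges approximate the as-is flow and rare edges or self-loops flag rework and waste. The journey steps and cases are invented.

```python
# Minimal process-mining flavor: derive a directly-follows graph from an
# event log of customer-journey steps (toy data; step names are assumptions).
from collections import Counter

event_log = [
    ("case-1", ["browse", "order", "provision", "activate"]),
    ("case-2", ["browse", "order", "order", "provision", "activate"]),           # rework loop
    ("case-3", ["browse", "order", "provision", "trouble_ticket", "activate"]),  # exception path
]

directly_follows = Counter()
for _case_id, steps in event_log:
    directly_follows.update(zip(steps, steps[1:]))

# Frequent edges approximate the as-is journey; rare edges and self-loops flag waste.
for (a, b), count in directly_follows.most_common():
    print(f"{a} -> {b}: {count}")
```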

Service Operations Centers (SOC) should be equipped with Experiential Decision Support Systems (DSS) featuring (e) collaborative filtering, (f) actionable data stories conveying hindsight, insight & foresight and (g) adaptive cybernetics. Advanced visualization for both (h) intuitive & highly abstracted infographics and (i) scientific views is of the essence.
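To give the collaborative-filtering feature some shape, here is a toy neighborhood-style recommender (strictly speaking a k-nearest-neighbor sketch over incident features rather than a classic user-item matrix): it suggests an action for a new incident based on the most similar past incidents. All feature names and data are invented.

```python
# Toy neighborhood recommender for a decision-support system: suggest an
# action for a new incident from the actions taken on the most similar
# past incidents (features and data are purely illustrative).
import math

def cosine(a: dict, b: dict) -> float:
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

past_incidents = [
    ({"latency": 1.0, "packet_loss": 0.2}, "reroute_traffic"),
    ({"latency": 0.1, "cpu": 0.9},         "scale_out"),
    ({"latency": 0.9, "packet_loss": 0.8}, "replace_link"),
]

def recommend(new_incident: dict, k: int = 2) -> str:
    ranked = sorted(past_incidents, key=lambda p: cosine(new_incident, p[0]), reverse=True)
    votes = {}
    for _features, action in ranked[:k]:
        votes[action] = votes.get(action, 0) + 1
    return max(votes, key=votes.get)

print(recommend({"latency": 0.8, "packet_loss": 0.3}))  # -> reroute_traffic
```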

Quality is best addressed as a human experience, which determines meaning and, therefore, the degree to which a system is lean vs. over-engineered or subpar (both being defective and carrying obvious and hidden costs). A new take on QbD for Soft Systems, which are inherently fluid by definition, emphasizes acceptance testing probing for usefulness & utility, usability & affectivity, consumability & serviceability, and safety through use cases and lifecycle events.