“[They] lost their quality leadership to new, aggressive competition. The most obvious consequence was loss of market share (…) [due to] quality features that were perceived as better meeting customer needs [and] they did not fail in service as often.”
“Loss of market share is not the only reason behind [it] (…) a second major force has been the phenomenon of life behind the quality dikes. We have learned that living in a technological society puts us at the mercy of the continuing operation of the goods and services that make a society possible (…) without such quality we have failure of all sorts (…) at the least these failures involve annoyances and minor costs. At their worst they are terrifying.”
“A third major force has been the gathering awareness by companies that they have been enduring excessive costs due to chronic quality-related wastes (…) about a third of what we do consists of redoing work previously done (…) lacking expertise in the quality disciplines, they are amateurs in the best sense of that word.”
J.M. Juran’s assessment of Quality issues in the 1960s–70s.
What follows are some of the insights driving my current work on reviewing, leveraging and updating QbD (Quality by Design) in the context of today’s fast-growing and all-encompassing digitalization.
I am dusting off my research from 2010 on the 3Q Model. Back then I was a senior manager at Alcatel-Lucent’s Solutions & Technology Introduction Department. My current role is Senior Studio Director at Nokia Software’s Solutions Engineering. Note that the scope is End-to-End Solutions. These are holistic system-wide (cross-sectional and longitudinal) undertakings intersecting different domains to deliver the higher value of the whole. I have discussed QbD for Digital Transformation projects at the Design Thinking 2018 event and at the IEEE (Institute of Electrical and Electronics Engineers) conference on CQR (Communications Quality and Reliability) back in April and May of this year. Interestingly enough, both events were held in Austin, Texas.
The term QbD was coined by Juran, a renowned pioneer of quality practices, whose work on that specific topic started in the mid-1980s. He linked Quality to customer satisfaction and reliability as the two dimensions to focus on:
“Features” were defined as “quality characteristics,” which meant properties intended to satisfy specific customer needs. That would also include “promptness of delivery,” “ease of maintenance,” and “courtesy of service” to name some examples. “The better the features, the higher the quality in the eyes of customers.”
As for reliability and, therefore, replicability and consistent performance, “freedom from deficiencies” conveyed the fact that “the fewer the deficiencies the better the quality in the eyes of customers.” A “deficiency” is a failure that triggers dissatisfaction, which calls for incurring higher costs to redo prior work.
“Fitness for use” was mentioned as an attempt to capture the above two together. The so-called Juran Trilogy entails Quality Planning, Quality Control, and Quality Improvement.
More than three decades have passed since Juran started to work on “New Steps for Planning Quality into Goods and Services.” Let’s decompose QbD’s acronym at face value and distill its essence.
As a designer, my belief & practice system focuses on “serial innovation” that consistently delivers superior value. This is achieved by means of purposeful and elegant solutions equipped with capability models and optimal functionality leading to Quality Experiences.
Customer Delight, rather than just satisfaction, is the sought-after outcome. This applies to both small and large undertakings and, as A. Kay, a pioneer in graphical user interfaces, best put it, “simple things should be simple, complex things should be possible.”
Following that train of thought, “Designing Quality into Solutions” should take center stage in: (a) collaborative and iterative ideation, (b) agile development, (c) continuous delivery and (d) the dynamic diffusion of (e) new and mass-customizable digital services for consumer and enterprise markets, as well as not-for-profits. Overall, QbD is key to Operational Excellence.
In a world where “Continuous Improvement” leads to incremental and breakthrough innovations, Quality’s critical KPI, Key Performance Indicator, can be expressed in terms of measurable advances in QoUX, the Quality of the Users’ Experiences. These are lagging (outcome) metrics that are far from static because they evolve within and over lifecycles. Therefore, reliability is not just applied to production operations, but also to the solution’s consistent performance and serviceability over time and under changing scenarios and events.
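To make “measurable advances in QoUX” concrete, here is a minimal sketch of how lagging experience metrics could be rolled up and compared across releases. This is a hypothetical composite for illustration only: the metric names (task success rate, satisfaction, defect escapes), the weights and the formula are my assumptions, not a standardized QoUX definition.

```python
# Hypothetical QoUX composite: combines lagging (outcome) metrics into a
# single 0-1 score so that advances between lifecycle iterations can be
# measured. Metric choice and weights are illustrative assumptions.

def qoux_score(task_success_rate, satisfaction, defect_escape_rate,
               weights=(0.4, 0.4, 0.2)):
    """Weighted composite of lagging experience metrics, each on a 0-1 scale.
    Defect escapes count against the score, so their complement is used."""
    w_success, w_sat, w_defects = weights
    return (w_success * task_success_rate
            + w_sat * satisfaction
            + w_defects * (1.0 - defect_escape_rate))

# Comparing two releases shows the kind of measurable advance the KPI tracks.
release_1 = qoux_score(task_success_rate=0.82, satisfaction=0.75,
                       defect_escape_rate=0.10)
release_2 = qoux_score(task_success_rate=0.90, satisfaction=0.81,
                       defect_escape_rate=0.05)
improved = release_2 > release_1  # release_2 scores higher: a measurable advance
```

Because these metrics evolve within and over lifecycles, such a score is only meaningful as a trend re-measured at each iteration, not as a one-off number.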
Given Quality’s unequivocal narrative around the “experiential” paradigm and, therefore, human-centric optics, QbD’s best work should optimize for user “delight,” defined as superior “satisfaction,” rather than just aiming for requirements compliance.
It is very tempting to rally around core competencies within existing comfort zones and then settle for “customer satisfaction” around “must-meet” baseline requirements. However, that might not suffice given the necessity to innovate and better compete by leveraging unique sources of sustainable differentiation.
Let’s now state the obvious: “designing” Quality Experiences into digital solutions is best addressed by means of Human-Centered methodologies that optimize for (f) users’ “acceptance criteria” and (g) the kind of “adoption levels” that foster user base growth.
The opposite approach would risk the adverse effects (and hidden costs) that can be incurred when technical myopia leads the way. A. Cooper’s “The Inmates are Running the Asylum” captures that very well. His book is referenced below.
Just for the record, the year is 2018 and we are gearing for a pervasive digital world dominated by software defined systems. The 4th Industrial Revolution’s floodgates are set wide-open.
Low and high tech perform best when playing a supporting role. Technology enables “Services” which justify it, otherwise the so-called Chasm and Valley of Death wait around the corner. It pays to emphasize that “Services” are defined by “Use Cases.” So, it shouldn’t take much effort to see that “Use(case)ability” (“usability” being the proper term) is a CSF, Critical Success Factor. “Fitness for use” in other words.
Let’s take that further and couple “usability” with designing for “usefulness,” “utility,” “consumability & serviceability,” as well as “affectivity,” because perception and human affects orient satisfaction and dissatisfaction levels.
QbD cannot be put to work without adequately addressing Human Dynamics, which entails psychological (e.g. cognitive models, information architecture), physiological (e.g. device form factor, workstation ergonomics) and social dimensions (e.g. network effects increasing value for users.) That happens to be the SoW (Scope of Work) of HFE’s (Human Factors Engineering) interdisciplinary teams in Design Studios… and the topic of my next post on QbD’s Intellectual Capital.
A few more thoughts…
Whether one’s day-to-day work and/or belief system is closer to or removed from the kinds of jobs and tasks that make tech human, it makes sense to engage in meaningful, outcome-oriented and goal-driven practices by applying HCD, Human-Centered Design. The purpose is delivering quality and achieving customer acceptance and delight, given that customers are human beings. That is why Design Thinking has outgrown the field of industrial design and is now applied to a wide variety of domains and disciplines.
Tech’s roller-coaster industry is packed with well-intentioned technologies that fail. We all know that this is a fiercely competitive environment in constant change. Though, it is also true that, in many of those cases, UX, User Experience, professionals were not engaged at any part of the process, were purposely involved only at the back-end, or were called to the rescue at the eleventh hour. That leaves no room for Design to make a difference. Superficial changes just amount to bells-and-whistles and shiny objects that disguise the underlying quality issues, which are likely to re-surface at some point.
QbD’s top objective should be excelling at effectively & efficiently addressing our customers’ acceptance and adoption criteria. That remains true even in the context of full automation. Humans still get promoted and demoted (or fired) based on those systems’ performance. D. Newman’s recent article in Forbes rightly states that “you cannot run your business without people (…) you cannot operate technology without people (…) research [has] shown that people are a critical component for digital transformation.”
Today’s best practice calls for “reverse engineering” solutions by working from that human-centered understanding around Human Machine Systems (HMS.) That is substantially different from only relying on a far riskier “if you build it, they will come” model and its costlier brute-force mindset.
When dealing with challenging, intractable and complex projects, overlooking that fact typically results in exponential project risk and plenty of otherwise avoidable zig-zagging course corrections ahead (e.g. opportunity costs in financial analysis, and hidden and latency costs in systems engineering.)
Agile’s iterative development and ability to pivot shouldn’t be a refuge for either subpar or no design effort, but a vehicle to best implement QbD and augment development capacity while minimizing technical debt. This is why this revision of QbD for today’s tech industry calls for Design Sprints to lead the way.
Last but not least, before dismissing this QbD revision as a merely philanthropic and humanistic endeavor, I suggest deep thinking around its (1) business criticality and (2) contribution to risk mitigation.
J. de Francisco
Bell Labs, Distinguished Member of Technical Staff
Nokia Software, Senior Studio Director @ Solutions Engineering
A. Cooper. The Inmates are Running the Asylum. Why High-Tech Products Drive Us Crazy and How to Restore the Sanity, Sams Publishing, 2004.
D. Newman. 3 Reasons People are Critical for Digital Transformation Success. Forbes, June 2018.
J. de Francisco. IEEE ETR 2018, Emerging Technologies Reliability Roundtable – Human Factors Session (2). Innovarista: Innovation at Work, July 2018. innovarista.org
J. de Francisco. IEEE ETR 2018, Emerging Technologies – Human Factors Session. Innovarista: Innovation at Work, May 2018. innovarista.org
J.M. Juran. Juran on Quality by Design: the New Steps for Planning Quality into Goods and Services, The Free Press, 1992.
I am reviewing the Manifesto on Human Factors Engineering and making updates. In the meantime, what follows below is a draft introduction letter that was left unpublished when the Manifesto was released last year. Blue text shows new updates. As far as this post’s title is concerned, DX refers to Digital Experiences. The same acronym is also used for Digital Transformation initiatives.
Claude E. Shannon, the father of information theory, is credited with being the first to use the word “bit” in a ground-breaking paper published in the Bell System Technical Journal in 1948. He formulated a mathematical framework that defines information and how to encode and transmit it over communication networks.
John E. Karlin, the father of Human Factors Engineering and a peer of Shannon’s at Bell Labs, is credited with assembling, just a year earlier, the first business organization addressing the human side of the equation. His interdisciplinary team focused on how people interface with communication systems and, therefore, on how to best design them, accounting for cognitive and behavioral matters as well as form-factor considerations so that devices would be user friendly.
In the Age of Digital Transformation, the notion of “being digital” has transcended the sophisticated handling of binary digits and what we can do with tangible hardware. Data driven automation and the notion of zero-touch lead to the development of end-to-end digital systems that are largely software defined and autonomic. These are engineered to be highly efficient and to operate without human intervention… or so we thought.
That feat can only be accomplished by undertaking a holistic design approach which, paradoxically, highlights the larger context and the new nature of Human-Machine-Systems. Otherwise, we would incur technical myopia, where presumably good technology ends up addressing the wrong problems or causing new ones that offset the intended benefits. In the digital age, technical prowess alone no longer guarantees success: impressive inventions can fail to “cross the chasm,” fall into the “valley of death,” and never become true innovations, to their creators’ and investors’ dismay. Passing the Turing Test just to plunge into the uncanny valley paradox also reinforces that point.
Note: the above draft chart is not self-explanatory and requires some updating; I will address it better in another post… but I’d like to leave this version here for ongoing discussion and feedback.
Being digital entails a new breed of jobs enabled by workforce automation. Any of us may become a citizen developer who can leverage self-service and intuitive decision support systems to create, blend and program services, because automation does the heavy lifting under the hood. Interdisciplinary collaboration is now within reach as teams involving individuals from different domains can effectively share these tools and the underlying resources to overcome the pitfalls and diminishing returns of organizational fragmentation. Enterprises can better innovate and further business opportunities by engaging in rapid experimentation with nimbler teams working at greater scale and scope, and by doing so at unprecedented speed.
At the time of writing this, and in the foreseeable future, no enterprise system is left alone without a human being accountable for its performance (or lack thereof) since our skills and judgement remain key to critical and ultimate decision making. The more sophisticated the environment, the more obvious that becomes, as smaller agile teams become responsible for systems operating at greater scale, scope and speed. Dr. Alonso Vera, Chief of NASA’s HSI (Human Systems Integration) Division, states that “humans are the most critical element in system safety, reliability and performance […] across the gamut of applications even as increasingly intelligent software systems come on-line,” Human-Centered Design and Operations of Complex Aerospace Systems, 2017.
It should also be noted that best practices in A.I. are promoting the kind of augmented and collaborative intelligence that only Human-On-The-Loop and Human-In-The-Loop Computing can deliver. A.I. is also powering up Affective Computing to make day-to-day digital services contextual, empathic and adaptive, allowing for mass-personalization at scale. We are also leveraging Natural Language Processing coupled with Dataviz to help us better search, discover and visualize insight and foresight with interactive infographic quality, instead of just rendering data-overloaded screens with overwhelming navigation.
These are all good reasons to further our understanding of how to best leverage analytics, automation and programmability to design enterprise and consumer systems driven by a human-centered approach. The desired outcome is greater utility, frictionless consumability, dynamic adaptation and, last but not least, extreme ease of use at any level throughout a service’s lifecycle. That’s the fertile ground for new cross-pollination opportunities toward a better future, which continuous improvement sets in constant motion and which, hence, is always in the making.
Being digital is a human experience and, as such, it involves human affects. That relates to how we perceive our predominantly analog world and the diversity of our social and cultural fabrics. We interact with a great many objects and services of all sizes, which can and will be digitized and automated in some fashion. We will continue to lead our lives in a variety of changing contexts and perform numerous tasks throughout the day, some routinely and some exercising more demanding skills, with both low and high tech in that mix. So, it pays to think of Human Factors Engineering as not only having pioneered human-centered design, but as an endless source of serial innovation for Creative Technologists to address our evolving lifestyles and quality of life in the DX Age.
ETR turned out to be a very productive undertaking and I would like to thank IEEE’s Spilios Markis, Chi-Ming Chen and Chris Mayer for all the help provided prior to and during the workshop.
My contribution focused on addressing the unprecedented flexibility of advanced software defined systems and artificial intelligence. That intersection defines game-changing technologies leading to zero-touch automation and, therefore, fostering self-service opportunities at both operational and service consumption levels.
“Zero touch” implies extreme automation, while self-service reveals that this new order elevates the criticality of HMS (Human Machine Systems.) More touch points surface compared to what legacy technologies allowed, given their constrained and restricted nature. That prompts a new take on HCI (Human Computer Interaction) and QbD (Quality by Design) to best deliver service quality throughout: concept exploration and service definition, fulfilment and adaptation, assurance and security… across multi-domain, highly decomposed, re-configurable and exceptionally dynamic end-to-end systems involving integration and service delivery in continuous motion.
These are thought out to (a) dramatically optimize support personnel ratios and (b) shift staff’s attention and efforts to value based activities and innovation. These are small agile teams and new talent tasked with jobs involving (c) far greater scale with (d) a wider interdisciplinary scope, and all to be performed at (e) digital speed. In this next-level productivity and more demanding and challenging context, success relies on new tools embracing Design Thinking’s HCD (Human-Centered-Design.)
That is applied to capability models and subsequent modes of operation for (f) HITL (Human “IN” The Loop) Computing largely devoted to deep domain expertise supported by Science Visualization, as well as (g) HOTL (Human “ON” the Loop) for system-wide supervisory responsibilities and ease of service creation and onboarding. HOTL draws from highly abstracted Visualization techniques and Low Code Development revealing the behavior of end-to-end systems and subsystems and adequate flow control.
These are coupled with effective Cybernetics gearing up for context aware 360-closed-loop-control, zooming in and out between distributed and central levels. Last but not least, effective and efficient tools that are characterized by ease of use and consumability do attract many more new users from many more different domains to interact with these systems in a self-service fashion and create new business opportunities as a result.