
IEEE ETR 2018, Emerging Technologies Reliability – Human Factors Session



ETR turned out to be a very productive undertaking and I would like to thank IEEE’s Spilios Markis, Chi-Ming Chen and Chris Mayer for all the help provided prior to and during the workshop.

My contribution focused on addressing the unprecedented flexibility of advanced software-defined systems and artificial intelligence. That intersection defines game-changing technologies leading to zero-touch automation and, therefore, fostering self-service opportunities at both the operational and service-consumption levels.

“Zero touch” implies automation taken to its fullest, while self-service reveals that this new order elevates the criticality of HMS (Human-Machine Systems). More touch points surface than legacy technologies allowed, given their constrained and restricted nature. That prompts a new take on HCI (Human-Computer Interaction) and QbD (Quality by Design) to best deliver service quality throughout: concept exploration and service definition, fulfillment and adaptation, assurance and security… across multi-domain, highly decomposed, re-configurable and exceptionally dynamic end-to-end systems involving integration and service delivery in continuous motion.

These systems are thought out to (a) dramatically optimize support-personnel ratios and (b) shift staff attention and effort to value-based activities and innovation. Small agile teams and new talent are tasked with jobs involving (c) far greater scale and (d) a wider interdisciplinary scope, all to be performed at (e) digital speed. In this more demanding, next-level-productivity context, success relies on new tools embracing Design Thinking’s HCD (Human-Centered Design).

That is applied to capability models and subsequent modes of operation for (f) HITL (Human “IN” the Loop) computing, largely devoted to deep domain expertise supported by science visualization, as well as (g) HOTL (Human “ON” the Loop) computing for system-wide supervisory responsibilities and ease of service creation and onboarding. HOTL draws from highly abstracted visualization techniques and low-code development, revealing the behavior of end-to-end systems and subsystems and enabling adequate flow control.
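To make the distinction concrete, here is a minimal, hypothetical sketch of the two supervision modes (the function names, risk scores and threshold are illustrative assumptions, not taken from any specific product): in HITL mode nothing executes without an explicit expert decision, while in HOTL mode automation acts autonomously within an approved envelope and only escalates to the supervisor when an action falls outside it.

# Hypothetical HITL vs. HOTL sketch (illustrative only; not a product API).
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    risk: float  # 0.0 (routine) .. 1.0 (critical)

def hitl_execute(action, ask_expert):
    # Human IN the loop: every action waits for an explicit human decision.
    if ask_expert(action):
        return f"executed {action.name} (approved by expert)"
    return f"held {action.name} (rejected by expert)"

def hotl_execute(action, notify_supervisor, risk_threshold=0.7):
    # Human ON the loop: automation acts on its own within an approved
    # envelope and only escalates when the risk exceeds the threshold.
    if action.risk > risk_threshold:
        notify_supervisor(action)
        return f"escalated {action.name} for supervisory review"
    return f"executed {action.name} autonomously"

# Trivial stand-ins for the human interfaces, just to show the flow.
print(hitl_execute(Action("reconfigure core slice", 0.9), ask_expert=lambda a: True))
print(hotl_execute(Action("scale out edge cache", 0.2), notify_supervisor=print))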

These are coupled with effective cybernetics geared toward context-aware, 360-degree closed-loop control, zooming in and out between distributed and central levels. Last but not least, effective and efficient tools characterized by ease of use and consumability attract many more new users from many more domains, who interact with these systems in a self-service fashion and create new business opportunities as a result.
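As a rough illustration of that closed-loop idea (a generic observe-compare-correct cycle with a made-up proportional gain, not any particular system), each pass measures the current state, compares it against the declared intent and applies a correction:

# Generic closed-loop (observe -> compare -> correct) sketch; purely illustrative.
def closed_loop_step(observed, target, gain=0.5):
    error = target - observed   # compare the observed state against the intent
    return gain * error         # proportional correction for the actuator to apply

state, target = 10.0, 50.0
for _ in range(5):
    state += closed_loop_step(state, target)  # each cycle nudges the system toward the intent
    print(round(state, 1))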

 

Software’s Defining Age


1k Fulton Market


I am on my way to Mobile World Congress and last night I had the opportunity to speak at DevMynd’s “Agile Software in a Hardware World.” That panel discussion featured BMW Technology Corporation (BMW, Mini, Rolls-Royce), Monsanto’s “The Climate Corporation,” and Nokia Software Group, which I was proud to represent. The venue, 1KFulton, is a century-old former cold-storage building in the Fulton Market neighborhood, home to Google’s Chicago campus.


DEVMYND EVENT


Reflecting on that panel discussion, small group conversations and one-on-one chats before and after the event, I think that it is fair to state the following:

(A) Software is undergoing a defining moment while re-shaping industries. “Software-defined instruments and systems” have superseded the capabilities of hardware-centric deployments.

In other words, economic value and profitability are migrating from conventional products to software-dominated environments that control tools, systems, and processes.


In this new context, (B) collaborative undertakings (co-creation, open source), platforms, modularization and mashups are paving the way for rapid experimentation and for a wide range of services to surface.

Back to economics… a venture capital firm operating in Silicon Valley shared with me that, when comparing current investments with equivalent old-school ones, they experienced 3x faster time to market at one third of the investment, which allows them to better diversify risk and fund more start-ups in the process.


Moreover, we are now operating at (C) unprecedented speed, scale and scope. For that reason alone, software should improve our ability to “pivot” and dynamically adapt to changing circumstances.

Most plans don’t survive first contact, and many start-ups and emerging technologies don’t survive the so-called “crossing of the chasm” or the “Valley of Death.” So, remaining lean and embracing continuous, iterative improvement are of the essence. That’s a quality mantra rather than an excuse for forgoing quality best practices.

Back to economics again: quality management’s definition of “customer satisfaction” is now table-stakes and compliance in that area drives low-cost commoditization. “Customer delight” is the higher benchmark that commands a premium and the kind of margins enabling us to re-invest to further innovate.


Let’s now state the obvious: “customers” are human beings, aren’t they? Interestingly enough, the more sophistication and diversification, the greater the need for (D) humanizing technology so that we can better create, consume, use and democratize digital services. In turn, this has fostered (E) Design Thinking as a leading innovation practice that intersects art and science. Design Thinking addresses HMS, Human-Machine Systems, by prioritizing HCD, Human-Centered Design.

In terms of economic effectiveness and efficiency, that means outcome-oriented system sizing rather than wasteful over-engineering. It also means defining meaningful and purposeful requirements: some are designed to meet customer-satisfaction metrics, while others are explicitly thought out to exceed that baseline and, hence, to actually deliver the X-factor prompting customer delight. Both are key to customer acceptance and adoption growth.

Better yet, one of the event’s participants volunteered that “good design” factoring in intuitive interaction, advanced dataviz (data visualization) and effortless controls was proven to cut the sales cycle in half: not only did customers perceive and experience the service’s tangible value early, the sales team was also able to approach more customers in the same timeframe. Innovative Human-Computer Interaction based on information design, value-based tasks, streamlined processes, intuitive data visualization, effortless controls and overall UX (User Experience) doubles as a compelling demonstration tool.


As a side note, that has already become a critical success factor in Artificial Intelligence’s new developments, AI being software’s top transformational exponent as DSS (Decision Support Systems) for humans and/or machines become quintessential. I will detail that in another post.


One last thought… (F) software’s pervasiveness has also brought along Agile development practices. These include “user stories,” which borrow a Design Thinking technique by which application features are defined by synthesizing human optics (persona/outcome/rationale) to keep technical myopia at bay.
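For instance (a made-up illustration rather than an example from the event), a user story typically reads: “As a network operations lead [persona], I want a single view of end-to-end service health [outcome], so that I can spot degradations before customers do [rationale].”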

After all, we should all be in the business of making tech human. Otherwise, what would negating or ignoring that say about each of us and our collective culture?

Nokia @ Service Design Week 2017


Exploring Other Methods. November 7, 4:00 PM. Understanding How Design Thinking, Lean and Agile Play within Service Design.

“Since service design serves as the umbrella discipline for delivering service experiences, there are many sub methods to address different types of problems. For example, Design Thinking is helpful on the front end to empathize and identify customer needs where Agile is helpful in software development and digital experience design. This group explores well-known methods and how they play a role in the service design universe.”



I’m back in Chicago and I would first like to thank everyone who joined my session on “Exploring Other Methods” for their participation (full house) and encouraging feedback. I hope to cross paths again in the near future. In the meantime, we can take advantage of LinkedIn to stay in touch. I would also like to express my gratitude to Michael DeJager and Tyler Peterson for all of their tireless help.

Here are the links for a couple of the items that I briefly discussed when providing context for Exploring Other Methods: a photo album of where I work, Nokia’s Chicago Technology Center, and the first version of the Human Factors Engineering Manifesto. Regarding requests about the slideware for my talk… I ran an interactive whiteboarding session with my iPad connected to the projector and I did not produce formal slides.


The discussion’s narrative was centered on how to best approach HMS, Human-Machine Systems, to craft a compelling Service Experience. In that context, “Human” refers to relevant stakeholders and “Machine” to any technology involved. The “Systems” approach prompts a holistic undertaking that includes front-stage and back-stage factors and the continuum across the two.

Service Design is about innovation, whether, capability-wise, that qualifies as incremental, breakthrough and/or disruptive innovation. Today’s Service Design also entails a wide range of low- and high-tech at any point in the process. While this is just anecdotal evidence, when I asked everyone who could do without any technology, there was an implicit understanding of the rhetorical nature of my question and, therefore, of the obvious pervasiveness of digital experiences.

We are a technological society. Good design is concerned with human factors and crafts technological solutions to enable human experiences that contribute to our quality of life and the quality of the work we do. That is Human Factors Engineering’s (HFE) reason for being, a field pioneered by Nokia Bell Labs in 1947.


From that perspective, it pays to intertwine any relevant practices and tools for the healthy purpose of figuring out what combination works best for any given Service Design project. While process repeatability is a desired outcome, what makes an interdisciplinary team smart is the ability to mix, match and blend what’s needed for each undertaking.

We can think of it as an a-la-carte menu featuring elements from Design Thinking, Agile and Lean methodologies, just to name a popular handful to start with. I did not discuss some others, such as Concept of Operations, Goal-Directed Design or Outcome-Driven Innovation, but I do recommend expanding one’s horizons beyond the aforementioned few. Note that, while they feature commonalities, each one works with different optics. A holistic approach to Service Design also requires a composite method, leveraging as much (or as little) as needed from any of them, with any needed adaptations.


Rather than summarizing what I shared at Service Design Week, I’m taking this chance to further reflect on those insights. So, given that we operate in highly dynamic environments, why wouldn’t designers also apply dynamic methodologies?

I’d like to think twice about cookie-cutter and one-size-fits-all approaches because Service Design typically prompts problems and opportunities where fixed-gear techniques that might have worked well in the past can end up betraying one’s confidence: they might no longer serve, or be the best fit for, whatever purpose they were originally conceived for. Design typically takes us beyond our comfort level, and that makes it an exciting profession.

Statistically speaking, the more one does the very same thing, the closer one gets to mastering that craft (e.g. the deliberate-practice model). But, paradoxically, one also gets closer and closer to confronting environmental deviations, anomalies and rare events in an ever-changing world with ever-growing moving parts and targets (e.g. the black-swan model). Besides, Service Design practitioners shouldn’t deny themselves the benefits that come with continuous improvement. So, here is a quick recap: innovation in Service Design’s outcomes and innovation in its methods go hand in hand. As Einstein put it:

“Insanity is doing the same thing over and over and expecting a different result.”

“If we knew what it was we were doing, it would not be called research, would it?”