
NOKIA HFE18 Conference (2) GI: Genuine Intelligence


Every once in a while we get to experience Murphy’s (dreaded) Law. This time around it took the form of stability issues with our media webcasting platform. We are now working on rescheduling NOKIA HFE18 under a different format. In parallel, we are also kicking off planning for HFE19… and we will take full advantage of lessons learned.


HFE18 Banner

Ref: Nokia HFE18 Conference (1) #MakeTechHuman


We regret any inconvenience this eleventh-hour change in plans may cause, and we remain extremely grateful to the speakers and volunteers who have already invested time and effort, which should not go to waste.

In the meantime, I’d like to volunteer a handful of insights on the session I was scheduled for and, in doing so, keep the discussion going. The objective is to further improve what’s already available and allow for an even better session when we get to reconvene. Here is my session’s abstract to begin with.


THE SOFT & HARD NATURE OF ANYTHING DIGITAL

“Our quest to deliver productivity tools yielding operational excellence for DSPs, Digital Service Providers, leads to the design of signature experiences by innovating in the process.”

“The Studio at Nokia Software’s Solutions Engineering is set to work with deceptively simple techniques and elegant sophistication… because neither oversimplification nor self-defeating complexity allow end-to-end systems to efficiently operate at digital speed and global scale.”

“This discussion intersects the soft and hard natures of dynamic systems by modeling Human Machine Systems (HMS) and the design of cybernetics. This practice focuses on critical success factors for the early acceptance and broader adoption of emerging technologies.”

“The work at the Studio embraces a renewed approach to QbD, Quality by Design, which is set to left-shift and unveil instrumental considerations at early design stages. The result is Nokia Studio’s QXbD, Quality Experiences by Design, optimizing for customer delight rather than table-stakes customer satisfaction.”


GI4HMS Jose de Francisco


NI – WHAT IS NATURAL INTELLIGENCE? At the time of writing, we humans possess NI, Natural Intelligence. NI involves naturally developed cognitive functions and models leveraged by the kind of biological beings that humans happen to be. Intelligence (a) captures, (b) generates, (c) applies and (d) evolves knowledge. Our individual and collective brainpower can be gauged in terms of (e) skills and (f) talent levels, jointly with an understanding of (g) the underlying decision-making process and (h) our perceived experiences in context.


AI – WHAT IS ARTIFICIAL INTELLIGENCE? Intelligence that is not naturally occurring: simulated knowledge, in other words. It is generated by programmable artifacts consuming, processing and producing data under closed-loop models. Whether working with individual or networked machine intelligence, there is neither information derived from mindfulness nor the kind of general-purpose sense-making that matches the human experience. The year is 2018… and that is where the state of the art stands today.
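To make the closed-loop idea concrete, here is a minimal, purely illustrative Python sketch of a programmable artifact that consumes feedback, updates its internal estimates, and lets what it has learned drive its next action. Every name and number in it is my own assumption for illustration, not anything drawn from an actual system.

```python
import random

# Purely illustrative: a tiny closed-loop learner (an epsilon-greedy bandit).
# It consumes feedback (rewards), updates its internal estimates, and its next
# action depends on what it has learned: data in, data out, loop closed.
# The "true_rates" environment is a made-up stand-in, not any real system.

def run_bandit(true_rates=(0.2, 0.5, 0.8), steps=2000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    estimates = [0.0] * len(true_rates)   # learned value of each possible action
    counts = [0] * len(true_rates)

    for _ in range(steps):
        if rng.random() < epsilon:                         # occasionally explore
            arm = rng.randrange(len(true_rates))
        else:                                              # otherwise exploit what was learned
            arm = max(range(len(true_rates)), key=lambda i: estimates[i])
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0    # environment feedback
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # incremental mean update

    return estimates

if __name__ == "__main__":
    print(run_bandit())   # the estimates converge toward the hidden rates
```

The point is not the algorithm itself but the shape of the loop: no mindfulness, no general-purpose sense-making, just data cycling through a model.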


GI4HMS Jose de Francisco 2


GI – WHAT IS GENUINE INTELLIGENCE? Earlier in the year I introduced this topic at Design Thinking 2018 (plenary session) and at the IEEE Emerging Technologies Roundtable (invitation-only workshop). Coincidentally, both were held in Austin, TX, back in May. I proposed thinking about GI as the outcome of NI powered by AI.

By the way, “genuine” means acting bona fide, that is, with honesty and without the intention to deceive. Given the trade-offs (pros and cons) that NI and AI bring to the table, GI gets us a step closer to productive, bona fide systems.

GI is, therefore, the outcome of purposely crafting optimal technology solutions that augment human possibilities. This is addressed by Human Factors Engineering (HFE), an interdisciplinary science with a holistic approach and a focus on value-driven Human-Machine Systems, HMS.

Quick side note: those of you into Lean and Lean Six Sigma can approach this topic through Jidoka (autonomation). Ditto for anyone working on Human-in-the-Loop Computing, Affective Computing, RecSys (Recommender Systems), Human Dynamics, and Process Mining with Machine Learning or, better yet, XAI, Explainable Artificial Intelligence.


DDESS Nokia Studio


DDESS – The most tangible design work entails the delivery of DDESS, Digital Decision & Execution Support Systems. This is where GI gets interesting because we need to apply new optics to take a fresh look at what Operational Excellence is (and is not) moving forward.

In a nutshell, the purpose of DDESS is to reveal and inform decisions, and to make decisions, all in context. But I will pause here, as this topic will be better covered in subsequent posts… just one more thought: DDESS addresses decision support for (NI) humans, (AI) machines, and (GI) human-machine systems. Coming to terms with that one insight alone becomes a critical success factor.
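As a teaser, here is one hypothetical way to express that three-way split in code: a confidence-based routing sketch in Python where the machine decides on its own, recommends for human confirmation, or defers entirely to the human. The thresholds, names and actions are assumptions of mine, not Nokia Studio’s DDESS design.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch only (not Nokia's DDESS design): one way to express the
# idea that decision support serves humans (NI), machines (AI) and mixed
# human-machine loops (GI). Thresholds, names and actions are my assumptions.

@dataclass
class Decision:
    action: str
    decided_by: str   # "machine", "human+machine" or "human"

def decide(confidence: float,
           proposed_action: str,
           ask_human: Callable[[str], bool],
           auto_threshold: float = 0.95,
           assist_threshold: float = 0.60) -> Decision:
    """Route a decision according to how confident the model is in its proposal."""
    if confidence >= auto_threshold:
        # AI: confident enough to decide and execute autonomously
        return Decision(proposed_action, "machine")
    if confidence >= assist_threshold:
        # GI: the machine recommends, the human confirms or overrides
        approved = ask_human(f"Approve '{proposed_action}' (confidence {confidence:.2f})?")
        return Decision(proposed_action if approved else "escalate", "human+machine")
    # NI: too uncertain, so surface the context and let the human decide
    return Decision("escalate", "human")

if __name__ == "__main__":
    always_yes = lambda prompt: True   # stand-in for a real approval workflow
    for c in (0.99, 0.75, 0.30):
        print(decide(c, "restart-service", always_yes))
```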


Another thought… it turns out that, in this day and age, heavily techno-centric projects only succeed a fraction of the time, 10% or so by some estimates. Selective memories tend to focus on and celebrate the 10% that make it… but that is a terrible ROI, Return on Investment, which inflicts (1) severe technical debt, (2) latency costs in systems engineering and (3) a huge opportunity cost, since the funding and good efforts could have been put to work on more productive endeavours.

By many other well-documented and more recent accounts, HCD, Human-Centered Design, happens to flip that ratio, as designers obsess over optimizing for user acceptance and frictionless adoption from day one. HFE entails painstaking work on purposeful, value-driven technological solutions, where a smart combination of outside-in innovation and inside-out ingenuity makes all the difference.

 

Executive Forum on Digital Transformation (DX) – Chicago, September 12, 2017


“Argyle Executive Forum is bringing together senior digital & IT executives from a variety of industry verticals for our biannual CIO Chicago Forum. Throughout a full day of content and networking, we will focus on the most pressing issues facing IT executives with regards to leading the business through digital transformation, with an agenda geared specifically towards Chief Information Officers, Chief Data Officers, Chief Digital Officers, as well as Data/Analytics/MIS VPs, Directors, and Architects in a leading role.”

Leading the Business Through Digital Transformation – Argyle.


 


 

First, thanks to the team at Argyle for what turned out to be a timely and insightful conference on DX, Digital Transformation. Nokia was one of the Executive Forum’s sponsors as a Senior Supporter.

It is worth noting that this event featured partners we work with, such as HP Enterprise, Thought Leader Sponsor, and IBM, Breakout Session Sponsor.

That speaks to the criticality of collaborative undertakings as Digital Transformation becomes a pressing objective across industries, academia, public service and government sectors.

What follows are my notes and personal insights. While all the sessions and discussions were quite relevant, I would like to highlight the opening keynote, which set the tone and narrative of the event.


James P. MacLennan, SVP & CIO at IDEX, discussed “The Five Components of a Great Digital Strategy,” addressing the fact that “Design Thinking,” “Human Factors” and a collaborative culture involving interdisciplinary workstyles and “Great Teams” have become of the essence.

Moreover, he stated that “a Digital Business will only succeed when it understands how to connect with people.” The “human element” and, therefore, “people centered” strategies turn out to be critical success factors.

I would like to add that this entails engineering a continuum of (a) stakeholders, who are all human personas by definition, and to do so across (b) UX (user experience) and CX (customer experience) domains.

This job takes (c) a holistic understanding of customer facing (front end) and resource facing (back end) elements forming a coherent end-to-end system. Otherwise, operational fragmentation will take a toll and will deny the intended DX benefits.


James’ presentation displayed the convoluted UI (user interface) shown in this picture to illustrate the paradox of well-intended yet counterproductive implementations that negate transformation initiatives.

Here is another valuable insight coming out of Argyle’s Executive Forum: information technologies (IT) and operational technologies and processes can no longer be worlds apart, which demands superb cross-functional teamwork.

Cognitive overload, deficient information architecture, and poor usability translate into human error, risk aversion, costly budget overruns, missed or deviated goals, and so on and so forth.

Any and all of these issues, combined, can silently impact quality or simply lower the bar as the business muddles through noisy and cluttered operational environments. That is hardly the stuff operational excellence calls for.


Obviously, in the context of CX, customer satisfaction becomes harder and harder to attain, let alone to deliver consistently.

Predictability and consistency are key objectives for any Quality Management program. If that scenario alone wasn’t troublesome enough, Customer Delight (rather than just satisfying agreed upon requirements) is Design Thinking’s ultimate performance indicator, which commands a premium clearly beyond reach under those circumstances.

Quality management wise, “satisfaction” is the fulfilment of expected specifications while “delight” is about great pleasure, or great satisfaction if you will. “Satisfaction” can be rationalized and is the acceptance ticket to be in business. “Delight” accounts for human affects (emotions) and is a powerful source of differentiation. Those who think that’s just about splitting hairs should take a pause and think twice because DX is set to enable game changing experiences on all counts and fronts.


Throughout the forum, session after session, Jim’s “Design for Humans” principle gained more and more critical mass as presenters and panelists discussed the reasons why we should be mindful of the user journey and how best to improve all touch points along the way.

In one of the panel discussions this became even more evident when a question on aligning people, processes and technologies pointed to difficult prioritization exercises. Note that there was immediate consensus on the need to put people first and to humanize technology and processes by applying Design Thinking, a human-centered methodology that is a cornerstone of the job of creative technologists.

That means projects that are driven by clear missions and specific experiential outcomes and lifecycles (Goal Directed Design) rather than just an I/O approach. It also means rapid experience prototyping and A/B multivariate testing to explore possibilities, since Design Thinking is a serial innovation engine.
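For the statistically inclined, here is a small illustrative Python sketch of the arithmetic behind a basic A/B comparison of two prototype variants, a two-proportion z-test; the counts are made-up placeholders.

```python
from math import sqrt
from statistics import NormalDist

# Illustrative sketch of the arithmetic behind a simple A/B test on a prototype:
# compare the task-success (or conversion) rates of variants A and B with a
# two-proportion z-test. The counts below are made-up placeholders.

def ab_test(success_a: int, n_a: int, success_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for the difference in rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

if __name__ == "__main__":
    z, p = ab_test(success_a=120, n_a=1000, success_b=152, n_b=1000)
    print(f"z = {z:.2f}, p = {p:.4f}")   # a small p-value suggests a real difference
```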



Let’s connect some more dots.

Chicago’s NPR station aired a rerun of The Power of Design this past weekend. The discussion was centered on “How Can We Design For A Better Experience.”

By the way, TED’s acronym actually stands for the convergence of Technology, Entertainment and… Design.


Interview with Tony Fadell, one of the main designers of the iPod (Apple) and founder of Nest (Google).

“Design begins by also noticing all those little problems that many ignore (…) we go through our lives accepting these design flaws that actually don’t improve our lives.”

“Steve Jobs challenged us to see our products through the eyes of the customer, the new customer, the one that has fears and possible frustrations, and hopes and exhilaration that the new technology can work straight away for them. He called it “staying beginners” and wanted to make sure that we focused on those tiny little details to make things work faster and seamless for the new customers.”

“There is this positive emotional momentum that builds on itself at each step of the process (…) when you hit a brick wall you lose all the momentum (…) and throw away an entire great experience.”

“There are two halves to design, just as there are two halves to your brain, the emotional part and the rational part. If you want people to truly adopt your product it has to have an emotional component, something that grabs you (…) that unlocks your curiosity; it also needs to rationally work (…) because people see value beyond the sexiness.”


Interview with Joe Gebbia, Airbnb cofounder.

“Any time that you see duct tape in the world, that’s a design opportunity (…) it’s an indicator that something is broken, that something did not perform the way it was designed to and that there is an opportunity to improve it.”

“Design is the key to (Airbnb) success (…) and as a competitive advantage, design is the thing that can separate you (…) the next thing that can differentiate you. All things being equal, two comparable products side by side with the same technical features and components… you would be crazy to choose the one that is harder to use.”

“Airbnb’s design decisions not only made the service easy to use but also helped millions of complete strangers trust each other (…) and open their homes (…) design is more than the look and feel of something, it is the whole experience.”


Related Posts:

Human Factors Engineering: Big Data & Social Analytics to #MakeTechHuman


“Netflix’s analytical orientation has already led to a high level of success and growth. But the company is also counting on analytics to drive it through a major technological shift […] by analytics we mean the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions”. Competing on Analytics by Thomas H. Davenport and Jeanne G. Harris.

“Big data changes the nature of business, markets, and society […] the effects on individuals may be the biggest shock of all […] this will force an adjustment to traditional ideas of management, decision making, human resources and education”. Big Data by Viktor Mayer-Schonberger and Kenneth Cukier.

“Social physics functions by analyzing patterns of human experience and idea exchange within the digital breadcrumbs we all leave behind as we move through the world […] the process of analyzing the patterns is called reality mining […] one of the ten technologies that will change the world [according to MIT Technology Review]”. Social Physics by Alex Pentland.



It’s Saturday night and I am happy to share that I just submitted my last two Jupyter notebooks and, therefore, completed MIT’s first certificate course on Big Data and Social Analytics.

This was one intensive summer with very little time left for anything beyond work, day-to-day family life, and spending most evenings and weekends studying. The MIT BD&SA course developers estimated a weekly workload of 8 to 12 hours over 9 weeks, though many of us spent north of 15 hours a week covering videos and readings, Python programming and written assignments, quizzes, and forum discussions. All definitely worthwhile, by the way.

While taking this course, I couldn’t help recalling the kind of scarce data we used to work with when I completed my postgraduate degree in Human Factors Engineering at BarcelonaTech in the early 90s, graduating with its first class.

By way of example, one of our industrial ergonomics projects kicked off with statistical data provided by the military: stats on Marines fit for service were the only readily available physiological data from which to design a local civilian application. We knew that was not a representative model of the target user base for the industrial workstation under design. Back then, undertaking a proper data collection study was costly and beyond the project’s means.

Our group worked with small data by testing things on ourselves and leveraging in-house dogfooding to some extent. Unfortunately, though, this kind of finding might not adequately reflect the reality of human variability. If overlooked, that can result in designs that optimize for a “proficient some” while undermining ease of use for many others, missing the mark in the process. Let’s keep in mind that, as clearly outlined in Crossing the Chasm, early success among devoted early adopters might not translate into mainstream praise and popularity, leading to a failure to grow the user base and, ultimately, failure in the market.
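To illustrate the representativeness point with numbers, here is a short Python sketch that computes a 5th-95th percentile accommodation range from two synthetic samples, one narrow and fit, one broader; the figures are invented for illustration and are not real anthropometric data.

```python
import random
import statistics

# Illustrative only: design ranges (say, 5th to 95th percentile of stature)
# computed from a narrow, fit sample differ from those of the broader civilian
# population. Both samples below are synthetic normal draws, not real
# anthropometric data.

def design_range(sample, low=5, high=95):
    """Return the low/high percentile accommodation range for a 1-D measure."""
    qs = statistics.quantiles(sample, n=100)   # qs[k-1] is the k-th percentile
    return round(qs[low - 1], 1), round(qs[high - 1], 1)

if __name__ == "__main__":
    rng = random.Random(0)
    fit_recruits = [rng.gauss(178, 5) for _ in range(5000)]   # narrow, tall-ish sample
    general_pop = [rng.gauss(170, 9) for _ in range(5000)]    # broader variability

    print("recruit-based range:", design_range(fit_recruits))
    print("population range:   ", design_range(general_pop))
    # A workstation sized to the first range under-accommodates much of the second.
```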


To be clear, working with secondary research (e.g. reference data sets from third parties) and conducting primary research by testing things on ourselves, coupled with in-house dogfooding, are all valuable practices. They are not necessarily enough, though, to make a compelling difference in today’s “big data” day and age.

MIT BD&SA discusses the benefits of working with living labs driven by UCD, User Centered Design. We now have commercial off-the-shelf technologies (smartphones, Internet of Things, sensing networks, machine learning) at our disposal, which allow us to capture user actions and behavior on location and, most importantly, with greater data resolution.

Couple that with ethnographic research focused on understanding human factors by observing users in their own environment and usage context and, just as importantly, capturing their PoV, Point of View, at each step.
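As a flavor of what that higher data resolution enables, here is a small illustrative pandas sketch that turns raw, timestamped events, the kind a living-lab smartphone study might collect, into a per-participant hourly activity profile; the events are synthetic placeholders.

```python
import pandas as pd

# Illustrative "reality mining"-style aggregation: turn raw, timestamped events
# (the kind a living-lab smartphone study collects) into an hourly activity
# profile per participant. The events below are synthetic placeholders.

events = pd.DataFrame({
    "participant": ["p1", "p1", "p1", "p2", "p2"],
    "timestamp": pd.to_datetime([
        "2016-09-10 08:15", "2016-09-10 08:40", "2016-09-10 19:05",
        "2016-09-10 08:20", "2016-09-10 22:45",
    ]),
    "place": ["home", "transit", "home", "home", "work"],
})

# Count events per participant per hour of day: a coarse daily-rhythm signature.
profile = (
    events.assign(hour=events["timestamp"].dt.hour)
          .groupby(["participant", "hour"])
          .size()
          .unstack(fill_value=0)
)

print(profile)
```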

So, those of us working on Human Factors Engineering and driven by User Centered Design to deliver processes, tools, products and services can create new experiences that take the human possibilities of technology to unprecedented levels, with analytics becoming of the essence to #MakeTechHuman.




Big Data Revolution. TED Radio Hour. NPR.


The Human Face of Big Data. PBS.


Source: Business Innovation Demands Accelerated Insights. Intel.


MakeTechHuman. Nokia.


See you at RecSys 2016 next week : )

#MakeTechHuman