IEEE CQR-ETR 2018: “Discuss and identify the RAS (Reliability, Availability and Serviceability) challenges, requirements and methodologies in the emerging technology areas like the Cloud Computing, Wireless/Mobility (with focus on 5G technologies), NFV (Network Functions Virtualization), SDN (Software Defined Networking), or similar large-scale distributed and virtualization systems.”
“Discuss the RAS requirements and technologies for mission-critical industries (e.g., airborne systems, railway communication systems, the banking and financial communication systems, etc.), with the goal to promote the inter-industry sharing of related ideas and experiences. Identify potential directions for resolving identified issues and propose possible solutions.”
Session Title: A Programmatic Approach for an Artificial Intelligence Code of Conduct.
Today’s DX, Digital Transformation, programs are all the rage, but it takes a fair amount of double clicking and inquisitive questioning to separate facts from vaporware. DX typically involves a wide variety of game changing initiatives intersecting analytics, automation, programmability, software-defined systems, end-to-end integration, service-level composition and controls… all coming together to optimize for Quality as a differentiated and value-based Human Experience. Therefore, Customer Delight metrics (rather than outmoded customer satisfaction ones) are set to redefine the “Q” in CQR, Communications Quality & Reliability in 5G.
While the Telecoms industry rallies toward zero-touch automation paradigms, which some happen to position as a Human-“OFF”-the-Loop panacea, this session will expose the need for considering, and possibly pivoting to, the kind of Operational Excellence that can only be delivered by adaptive HMS, Human-Machine Systems, instead.
Note the rise of Dataviz (Data and Science Visualization), ML’s (Machine Learning’s) Collaborative Filtering, AI’s (Artificial Intelligence’s) RecSys (Recommender Systems) and a renewed take on Cybernetics, which are driving innovation in HITL and HOTL (Human-“IN”-the-Loop and Human-“ON”-the-Loop computing), as well as delivering effective mass-personalization with Affective Computing powered by Human Dynamics’ analytics.
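The Collaborative Filtering mentioned above can be illustrated with a minimal item-based sketch. The ratings matrix, function names and values below are invented for illustration only; production recommender systems add normalization, implicit-feedback handling and scale-out similarity indexing.

```python
import numpy as np

# Toy user-item ratings matrix (rows: users, cols: items); 0 = unrated.
# All values are illustrative, not from any real system.
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 0, 4],
    [0, 1, 5, 4],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two vectors; 0 if either is all zeros."""
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return float(a @ b / (na * nb)) if na and nb else 0.0

def predict(user, item):
    """Item-based prediction: similarity-weighted average of the
    user's own ratings on the other items."""
    sims, vals = [], []
    for j in range(ratings.shape[1]):
        if j != item and ratings[user, j] > 0:
            sims.append(cosine_sim(ratings[:, item], ratings[:, j]))
            vals.append(ratings[user, j])
    return sum(s * v for s, v in zip(sims, vals)) / sum(sims) if sims else 0.0

print(round(predict(0, 2), 2))  # → 1.73, predicted rating for user 0, item 2
```

The same weighted-average idea generalizes to “people who liked X also liked Y” recommendations once the predictions are ranked per user.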
Telecoms pioneered HFE, Human Factors Engineering: a holistic systems engineering discipline addressing people (culture, workstyle, skills), processes (procedures, methods, practices), and technologies (crafts, tools, systems) so that we can best humanize technology and make a compelling difference across the value chain at all levels. We are now embarked on a new journey.
The sought-after outcome of any Digital Service Provider, DSP, is to be instrumental to our Citizens’ Quality Experiences with new service experimentation, transactions and growth models. This takes agility and dynamic system-wide (horizontal and vertical) behaviors, which prompt effortless operability at unprecedented speed, scale and scope. Our work permeates design, development, delivery and serviceability, with continuous intertwined lifecycles instead of lock-step waterfalls.
In this context, AI, Artificial Intelligence, enables us, humans, to envision and implement capabilities beyond the reach of legacy systems’ last gasps. By the same token, practices that might have appeared to serve us well in the past are exposing their limitations when becoming latency-prone barriers. A successful path forward takes augmented Human-Machine Intelligence. A programmatic approach to an AI Code of Conduct would enable us to best model AI’s behavior, design better human-network interactions and collaborate on standardization.
I am on my way to Mobile World Congress and last night I had the opportunity to speak at DevMynd’s “Agile Software in a Hardware World.” That panel discussion featured BMW Technology Corporation (BMW, Mini, Rolls-Royce), Monsanto’s “The Climate Corporation,” and Nokia Software Group, which I was proud to represent. The venue, 1KFulton, is a century-old former cold storage building in the Fulton Market neighborhood, home to Google’s Chicago campus.
Reflecting on that panel discussion, small group conversations and one-on-one chats before and after the event, I think that it is fair to state the following:
(A) software is undergoing a defining moment while re-shaping industries. “Software-defined instruments and systems” have superseded the capabilities of hardware-centric deployments.
In other words, economic value and profitability are migrating from conventional products to software dominated environments that control tools, systems, and processes.
In this new context, (B) collaborative undertakings (co-creation, open source), platforms, modularization and mashups are paving the way for rapid experimentation and for a wide range of services to surface.
Back to economics… a venture capital firm operating in Silicon Valley shared with me that, when comparing current investments with equivalent old-school ones, they experienced 3x time-to-market speed at one third of the investment, which allows them to better diversify risk and fund more start-ups in the process.
Moreover, we are now operating at (C) unprecedented speed, scale and scope. For that reason alone, software should improve our ability to “pivot” and dynamically adapt to changing circumstances.
Most plans don’t survive first contact and many start-ups and emerging technologies don’t survive the so-called “crossing-the-chasm” or “Valley of Death.” So, remaining lean and embracing continuous/iterative improvement are of the essence. That’s a quality mantra rather than an excuse for forgoing best quality practices.
Back to economics again: quality management’s definition of “customer satisfaction” is now table-stakes and compliance in that area drives low-cost commoditization. “Customer delight” is the higher benchmark that commands a premium and the kind of margins enabling us to re-invest to further innovate.
Let’s now state the obvious, “customers” are human beings, aren’t they? Interestingly enough, the more sophistication and diversification, the higher the need for (D) humanizing technology so that we can better create, consume, use and democratize any digital services. In turn, this has fostered (E) Design Thinking as a leading innovation practice that intersects art and science. Design Thinking addresses HMS, Human-Machine-Systems, by prioritizing HCD, Human-Centered-Design.
In terms of economic effectiveness and efficiency, that means outcome-oriented system-sizing, rather than over-engineering waste. It also means the definition of meaningful and purposeful requirements: some are designed to meet customer satisfaction metrics, while others are explicitly thought out to exceed that baseline and, hence, to actually deliver the X-Factor prompting customer delight. All key to customer acceptance and adoption growth.
Better yet, one of the event’s participants volunteered the fact that “good design” factoring in intuitive interaction, advanced dataviz (data visualization) and effortless controls was proven to shrink the sales cycle by literally half: not only did customers perceive and experience the service’s tangible value early, the sales team was also able to approach more customers in that timeframe. Innovative Human-Computer Interaction based on information design, value-based tasks, streamlined processes, intuitive data visualization, effortless controls and overall UX, User Experience, doubles as a compelling demonstration tool.
This is a side note: that has already become a critical success factor in Artificial Intelligence’s new developments, AI being software’s top transformational exponent as DSS, Decision Support Systems for humans and/or machines become quintessential. I will detail that in another post.
One last thought… (F) software’s pervasiveness has also brought along Agile development practices. These include “user stories,” borrowing a Design Thinking technique by which application features are defined by synthesizing human optics (persona/outcome/rationale) to keep technical myopia at bay.
After all, we should all be in the business of making tech human. Otherwise, what would negating or ignoring that say about each of us and our collective culture?
“The world of IoT and connected devices is expanding rapidly. We all carry super computers in our pockets and interact with everything from home automation, cars, consumer electronics, and healthcare devices.”
“In this complex hardware + software environment the product development cycle can be tricky. For example, you can’t just follow agile software practices by the book when you’re building a connected pace maker. So how do we approach product development when the stakes are high and the moving parts are many? During this discussion we’ll be tackling topics such as:”
“How do you roadmap a product which includes both hardware and software components? How does agile development fit in? How does the regulatory landscape affect how we approach development and iteration? How do you build teams around these integrated products? And how do you keep them in sync and working together?”
I’d first like to thank the team at DevMynd for their kind invitation. I am looking forward to joining the panel discussion in Chicago this coming Thursday, February 22. In the meantime, I will welcome any comments and insights as I gear up for this discussion.
I’m working on outlining some of the myths, dilemmas and trade-offs that I have encountered as an Industrial Designer and in Product Management.
From a design perspective, there are two topics worth looking at: Design Thinking as a Human-Centered methodology and its outcomes in terms of: (a) utility, (b) usability, (c) consumability, (d) affectivity and (e) the composite and differential value of the resulting digital experiences that involve software and hardware.
This “brave new world” equips us with the freedom to explore new form factors, cognitive models and, most importantly, the development of human × technology networks. Some of the specifics come down to design semantics re-defining HMS, Human-Machine Systems, in the context of multi-modal user interfaces and innovative interactions where Machine Learning and new visualization paradigms happen to surface.
From a Product Management viewpoint, there is a need for also pondering how to best leverage Design Thinking beyond Industrial Design and Software Development to tackle product and service strategy. Here my focus gravitates toward addressing: (a) success factors and (b) limiting factors under our control, as well as (c) other determining factors beyond our area of influence that can impact the diffusion of innovations either positively or negatively. Moreover, I like to couple business model innovation with behavioral economics and information network effects.
This construct really boils down to capturing the essence behind (d) stakeholders’ acceptance criteria and (e) the users’ engagement, adoption and growth rates. This means defining capability and maturity levels and factoring in how they adapt and evolve over time. Obviously, this leads to taking a close look at how to best intersect Lean and Agile practices, among others, so that we can lead and navigate constantly changing environments in “digital time.”
Let’s get down to a more tactical level: end-to-end system design entails a mix of loosely and tightly coupled elements, and a platform approach to operate at speed, scale and wider scope than black boxes can match. A reality check unveils a hybrid world where decisions on capacity and performance levels, as well as serviceability and dependency levels, drive optimization for distributed systems and, therefore, the rising value of end-to-end solutions vs. point solutions only.
In that context, inter-disciplinary teams involving creative technologists and domain experts make our organizations effectively diverse, smarter and more innovative. Otherwise, self-defeating arrogance, conflicting silos and technical myopia can make pre-production and production costlier by promoting unnecessary friction and getting everyone to work harder and harder rather than smarter. Typically, that negates productivity, forces a number of corrective actions, and significantly shifts and/or downsizes sought-after results.
The beauty of the Studio’s human-experience-centered practice is a healthy obsession with delivering “meaning.” The definition of “meaningful outcomes” (rather than churning outputs) makes these organizations behave based on value and impact. We strive to foster not just customer satisfaction and net promoter scores, but measurable customer delight and network effects (superior service-level performance indicators) which, in turn, set and streamline technical requirements.
Long story short, the Studio’s mindset (critical thinking / wonder & discovery / problem solving) and workstyle (collaborative / experiential / iterative / adaptive) help explain why creative technologists are instrumental, serial innovation engines for the digital age.
Footnote: the term “team of creative technologists” was first coined by Nokia Bell Labs back in the 1940s to single out the differentiated value of inter-disciplinary undertakings. In the late forties, Bell Labs’ Claude Shannon pioneered Information Theory and John Karlin set up the first Human Factors Engineering department in industry. That HFE team was formed by a psychologist, a statistician (the father of quality control visualization), an engineer, and a physicist.
“Inventing the Future with a focus on groundbreaking innovation, Nokia has been a catalyst for the world’s most powerful, game-changing technology shifts. We are committed to innovating for people and developing new technologies and solutions for the world we live in. With our Technology Vision 2020, we are helping operators deal with extreme traffic growth, simplify network operations and provide the ultimate personal gigabyte experience.” https://networks.nokia.com/innovation
Last month I joined Chicago’s Science Fair as a judge in the Computer Science category. I am glad to share that I received a plaque for my fifth year of service. Then, just a month later, I found myself on the other side of things as a contestant at Nokia’s Innovation Event in Espoo, Finland.
This year’s competition registered about 500 submissions worldwide. LeanOps qualified among the Top 3 Finalists in the Product & Solution Innovation Category. Ted East and I made the trip from Chicago to present on behalf of the team. We were all happy enough with LeanOps’ Finalist position. Moreover, any of the other finalist and shortlisted projects would have been a deserving recipient of the first prize anyway. That speaks to Nokia’s renewed ingenuity and technical prowess.
But, those of us scheduled to be on stage could also feel the kind of mounting pressure that comes from making the most of this sort of high visibility opportunity. So, Ted and I spent a considerable amount of effort crafting and improving our delivery until the very last minute. We had the benefit of invaluable coaching and genuine advice while gearing up for this event. That should not be taken for granted and, therefore, we are humble and grateful for it. The fact is that Barry’s, Fabian’s, Kelvin’s, Corinna’s and Tuuli’s consideration and words of wisdom paid off. We came back home with the First Prize and our gratitude should be extended to everyone making this year’s event happen. My apologies for not having listed everyone’s names here.
Communicating science and technology is a challenge: any of us can risk alienating audiences willing to listen and individuals who would otherwise be excited about what our project entails. Information overload, convoluted jargon and failing to convey what the actual impact would be can jeopardize anyone’s good work due to lack of clarity. Moreover, it can compromise funding opportunities and drive collaboration and talented people away. So, it shouldn’t be hard to concur with Alan Alda, founder of the Center for Communicating Science at Stony Brook University, when he states that “science communication” is as important as science itself (watch from minute 01:20 onward).
On my own notes’ cover page I always scribble a couple of Einstein’s quotes: “if you cannot explain it simply, you don’t understand it well enough” and “everything should be made as simple as possible, but not simpler.” The former reminds me of the negative effect of self-defeating complexity. The latter cautions about the diminishing returns of over-simplification and nonsense. Audiences can spot either issue right away, which negatively impacts a speaker’s credibility and reputation. Recovering from that bad impression becomes an uphill battle and, unfortunately, bridges can also be burned for no good reason.
Communicating science and technology works best when striking an equilibrium between (a) a well-structured flow populated with (b) meaningful and engaging information of interest that is (c) purposely abstracted at the right level for each audience. Admittedly, being in Human Factors Engineering, I cannot help but think that Information and Cognition Theory principles, which serve us well when addressing the design of UI, User Interfaces, also become of the essence in any activity where we happen to be the medium to disseminate concepts, achievements, possibilities, constraints and what’s needed to move forward with a given project.
There is also a need for visual communication that can effectively deliver far more information than words alone. We created backdrops of infographic quality that helped set the stage at each step. Halfway through the talk we played a short video clip that illustrated a key and differentiated project element.
Our discussion flow followed a basic creative brief breakdown covering what, why, how, who and when, and the Q&A section helped us provide the next level of detail. Long story short, relevant content of substance remains “conditio sine qua non” – which means distilling indispensable items down to need-to-know: anything you cannot do without.
We also had an impactful demo station in the so-called bazaar area, featuring work that had been unveiled and praised by experts at Mobile World Congress 2017 back in March. Last but not least, full credit for this award goes to one of the best teams in our industry. These are craftspeople who put their diverse talent to work by solving new and hard problems and, most importantly, making stuff work in no time.
“Cloud technologies virtualize your network to allow intelligent automation that instantly reacts to fluctuating demand and accelerates new services. Cloud is the foundation for IoT and 5G. But to realize the potential of a software-defined network, you need to operate a software-defined business – with the integrated performance you can depend on. Our cloud solutions and services featured at Mobile World Congress will demonstrate how you can transform your network, operations and business for agility, automation, security and instant service innovation.” – Realizing the agility of software defined business through the Cloud. Nokia, February 2017.
LeanOps was showcased in the booth’s private area. We had a good show and our team was involved in a number of discussions with network operators, ecosystem partners, industry analysts and public officials.
LeanOps’ mission is to “Make Sophisticated Operations Effortless.” Our team assembles end-to-end solutions to deliver the greater value of the whole. This is a systems integration job that takes advantage of Nokia’s portfolio depth, our ecosystem and open source tools. LeanOps interlaces (a) analytics, (b) automation, (c) programmability and (d) human factors engineering: our solution’s DNA.
We unveiled our new Decision Support System (DSS). This is a “solution level” single pane of glass, a metaphorical and multi-modal user interface purposely optimized for inter-disciplinary teamwork. LeanOps’ DSS renders complex systems and delivers multi-dimensional data visualization following the project’s “operations friendly” design directive.
From a Goal-Directed Engineering standpoint, we have set a “4I Framework” that entails (1) Intuitive use, (2) Immersive and (3) Interactive maneuverability, delivering (4) Insightful experiences rather than just data. Moreover, all the magic is fully abstracted and, therefore, the underlying sophistication is completely transparent to the users. LeanOps’ SAIL, Smart Abstraction and Integration Layer, takes care of that under the hood. DSS and SAIL are intertwined and integral to LeanOps’ end-to-end solutions; they are not sold independently as standalone products.
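SAIL’s internals are not public, but the “abstraction and integration layer” pattern it names can be sketched generically: adapters wrap each heterogeneous backend behind one common interface, and an aggregation layer presents a uniform view to whatever sits above it. Every class, method and backend name below is a hypothetical illustration of the pattern, not LeanOps code.

```python
from abc import ABC, abstractmethod

class BackendAdapter(ABC):
    """Common interface hiding each subsystem's native API.
    All names here are illustrative, not LeanOps internals."""
    @abstractmethod
    def health(self) -> dict: ...

class LegacyNmsAdapter(BackendAdapter):
    def health(self) -> dict:
        # Would wrap, e.g., an SNMP poll of a legacy NMS; stubbed here.
        return {"source": "legacy-nms", "status": "ok"}

class CloudVnfAdapter(BackendAdapter):
    def health(self) -> dict:
        # Would wrap, e.g., a REST call to a VNF manager; stubbed here.
        return {"source": "cloud-vnf", "status": "ok"}

class IntegrationLayer:
    """Aggregates adapters so the layer above sees one uniform view."""
    def __init__(self):
        self._adapters: list[BackendAdapter] = []

    def register(self, adapter: BackendAdapter) -> None:
        self._adapters.append(adapter)

    def snapshot(self) -> list[dict]:
        # One normalized call fans out to every registered backend.
        return [a.health() for a in self._adapters]

layer = IntegrationLayer()
layer.register(LegacyNmsAdapter())
layer.register(CloudVnfAdapter())
print(layer.snapshot())
```

The point of the pattern is that a single-pane-of-glass UI only ever talks to `IntegrationLayer`, so legacy, current and virtualized assets can be added or swapped without touching the presentation tier.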
I would also like to share that LeanOps’ DSS transcends conventional HCI, Human-Computer-Interaction, to bring about CNI, Collaborative-Network-Intelligence, instead. I personally believe that switching gears from HCI to CNI makes all the difference given the value of human networks and machine networks, where collective intelligence becomes the outcome.
Taking into consideration LeanOps’ next-gen positioning, our MWC demo station was located in the “Cloud Zone,” though it is worth highlighting that LeanOps’ mission entails “operational transformation” with end-to-end solutions addressing hybrid environments and bridging legacy, current and emerging technologies, physical and virtual assets. “Lean” is a holistic undertaking involving practices, processes, technologies, tools and human factors, and so is Nokia LeanOps.
This year’s video is not publicly available. So, if you happen to be a network operator, an enterprise wrestling with complex environments, or a partner interested in LeanOps, please send me a message over LinkedIn to set up a call.
By the way, since I keep getting questions about Nokia’s new phones… I need to refer you to our peers at HMD Global, the independent Finnish company behind the Nokia branded phones. Nokia Corporation focuses on technologies zeroing in on network systems, analytics, applications, and services at the time of writing this. LeanOps is part of Nokia Corporation and our team, Solutions & Partners, is in the Applications & Analytics Group.