Tagged: Agile

Software’s Defining Age


1k Fulton Market


I am on my way to Mobile World Congress, and last night I had the opportunity to speak at DevMynd’s “Agile Software in a Hardware World.” That panel discussion featured BMW Technology Corporation (BMW, Mini, Rolls-Royce), Monsanto’s “The Climate Corporation,” and Nokia Software Group, which I was proud to represent. The venue, 1KFulton, is a century-old former cold-storage building in the Fulton Market neighborhood, home to Google’s Chicago campus.


DEVMYND EVENT


Reflecting on that panel discussion, small group conversations and one-on-one chats before and after the event, I think that it is fair to state the following:

(A) Software is undergoing a defining moment as it reshapes industries: software-defined instruments and systems have superseded the capabilities of hardware-centric deployments.

In other words, economic value and profitability are migrating from conventional products to software-dominated environments that control tools, systems, and processes.


In this new context, (B) collaborative undertakings (co-creation, open source), platforms, modularization and mashups are paving the way for rapid experimentation and for a wide range of services to surface.

Back to economics… a venture capital firm operating in Silicon Valley shared with me that, when comparing current investments with equivalent old-school ones, they see three times the time-to-market speed at one-third of the investment, which allows them to better diversify risk and fund more start-ups in the process.
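The arithmetic behind that observation is worth making explicit. A minimal sketch with hypothetical numbers (these are my own illustrative figures, not the firm’s actual ones): if each venture needs one-third of the capital, the same fund backs three times as many start-ups, each reaching the market three times sooner.

```python
# Illustrative arithmetic only; all figures below are hypothetical.
legacy_cost_per_startup = 9_000_000   # "old-school" investment per venture
legacy_months_to_market = 36

new_cost_per_startup = legacy_cost_per_startup / 3   # 1/3 of the investment
new_months_to_market = legacy_months_to_market / 3   # 3x time-to-market speed

fund_size = 27_000_000
legacy_portfolio = int(fund_size // legacy_cost_per_startup)  # ventures funded before
new_portfolio = int(fund_size // new_cost_per_startup)        # ventures funded now

print(legacy_portfolio, new_portfolio)          # 3 9
print(new_months_to_market)                     # 12.0 months instead of 36
```

The same capital pool funds a larger, faster-learning portfolio, which is exactly the risk-diversification effect the firm described.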


Moreover, we are now operating at (C) unprecedented speed, scale and scope. For that reason alone, software should improve our ability to “pivot” and dynamically adapt to changing circumstances.

Most plans don’t survive first contact, and many start-ups and emerging technologies don’t survive “crossing the chasm” or the so-called “valley of death.” So remaining lean and embracing continuous, iterative improvement are of the essence. That is a quality mantra rather than an excuse for forgoing best quality practices.

Back to economics again: quality management’s definition of “customer satisfaction” is now table stakes, and compliance in that area drives low-cost commoditization. “Customer delight” is the higher benchmark that commands a premium and the kind of margins that enable us to re-invest and further innovate.


Let’s now state the obvious: “customers” are human beings, aren’t they? Interestingly enough, the more sophistication and diversification, the higher the need for (D) humanizing technology so that we can better create, consume, use and democratize digital services. In turn, this has fostered (E) Design Thinking as a leading innovation practice at the intersection of art and science. Design Thinking addresses Human-Machine Systems (HMS) by prioritizing Human-Centered Design (HCD).

In terms of economic effectiveness and efficiency, that means outcome-oriented system sizing rather than over-engineering waste. It also means defining meaningful and purposeful requirements: some are designed to meet customer satisfaction metrics, while others are explicitly thought out to exceed that baseline and, hence, deliver the X-factor that prompts customer delight. All are key to customer acceptance and adoption growth.

Better yet, one of the event’s participants volunteered that “good design” factoring in intuitive interaction, advanced data visualization and effortless controls was proven to shrink the sales cycle by half: not only did customers perceive and experience the service’s tangible value early, the sales team was also able to approach more customers in the same timeframe. Innovative Human-Computer Interaction (HCI) based on information design, value-based tasks, streamlined processes, intuitive data visualization, effortless controls and overall User Experience (UX) doubles as a compelling demonstration tool.


A side note: that has already become a critical success factor in new Artificial Intelligence developments, AI being software’s top transformational exponent as Decision Support Systems (DSS) for humans and/or machines become quintessential. I will detail that in another post.


One last thought… (F) software’s pervasiveness has also brought along Agile development practices. These include “user stories,” which borrow a Design Thinking technique by which application features are defined by synthesizing human optics (persona/outcome/rationale) to keep technical myopia at bay.
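As an illustration, the persona/outcome/rationale triad maps naturally onto the familiar “As a…, I want…, so that…” user-story template. A minimal sketch (the field names and the example story are my own, not a formal standard):

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    """An Agile user story synthesizing the three human optics."""
    persona: str    # who the feature serves
    outcome: str    # what they want to accomplish
    rationale: str  # why it matters to them

    def as_sentence(self) -> str:
        return f"As a {self.persona}, I want {self.outcome}, so that {self.rationale}."

# Hypothetical story for a connected-vehicle feature:
story = UserStory(
    persona="fleet operator",
    outcome="to see battery health at a glance",
    rationale="I can schedule maintenance before a vehicle fails",
)
print(story.as_sentence())
```

Framing each feature this way keeps the conversation anchored on the person and the outcome, with the technical design derived from them rather than the other way around.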

After all, we should all be in the business of making tech human. Otherwise, what would negating or ignoring that say about each of us and our collective culture?

Agile Software in a Hardware World

 

“The world of IoT and connected devices is expanding rapidly. We all carry super computers in our pockets and interact with everything from home automation, cars, consumer electronics, and healthcare devices.”

“In this complex hardware + software environment the product development cycle can be tricky. For example, you can’t just follow agile software practices by the book when you’re building a connected pacemaker. So how do we approach product development when the stakes are high and the moving parts are many? During this discussion we’ll be tackling topics such as:”

“How do you roadmap a product which includes both hardware and software components? How does agile development fit in? How does the regulatory landscape affect how we approach development and iteration? How do you build teams around these integrated products? And how do you keep them in sync and working together?”



I’d first like to thank the team at DevMynd for their kind invitation. I am looking forward to joining the panel discussion in Chicago this coming Thursday, February 22. In the meantime, I will welcome any comments and insights as I gear up for this discussion.

I’m working on outlining some of the myths, dilemmas and trade-offs that I have encountered as an Industrial Designer and in Product Management.

From a design perspective, there are two topics worth looking at: Design Thinking as a human-centered methodology, and its outcomes in terms of (a) utility, (b) usability, (c) consumability, (d) affectivity and (e) the composite and differential value of the resulting digital experiences that involve software and hardware.

This “brave new world” equips us with the freedom to explore new form factors, cognitive models and, most importantly, the development of human × technology networks. Some of the specifics come down to design semantics re-defining Human-Machine Systems (HMS) in the context of multi-modal user interfaces and innovative interactions where Machine Learning and new visualization paradigms surface.

From a Product Management viewpoint, there is also a need to ponder how to best leverage Design Thinking beyond Industrial Design and Software Development to tackle product and service strategy. Here my focus gravitates toward addressing (a) success factors and (b) limiting factors under our control, as well as (c) other determining factors beyond our area of influence that can impact the diffusion of innovations either positively or negatively. Moreover, I like to couple business model innovation with behavioral economics and information network effects.

This construct really boils down to capturing the essence behind (d) stakeholders’ acceptance criteria and (e) users’ engagement, adoption and growth rates. This means defining capability and maturity levels, and factoring in the fact that they adapt and evolve over time. Obviously, this leads to taking a close look at how to best intersect Lean and Agile practices, among others, so that we can lead and navigate constantly changing environments in “digital time.”

Let’s get down to a more tactical level: end-to-end system design entails a mix of loosely and tightly coupled elements, and a platform approach to operate at a speed, scale and scope wider than what black boxes can match. A reality check unveils a hybrid world where decisions on capacity and performance levels, as well as on serviceability and dependencies, drive optimization for distributed systems and, therefore, the rising value of end-to-end solutions vs. point solutions only.

In that context, inter-disciplinary teams involving creative technologists and domain experts make our organizations effectively diverse, smarter and more innovative. Otherwise, self-defeating arrogance, conflicting silos and technical myopia can make pre-production and production costlier by promoting unnecessary friction and getting everyone to work harder rather than smarter. Typically, that negates productivity, forces a number of corrective actions, and significantly delays and/or downsizes sought-after results.

The beauty of the Studio’s human-experience-centered practice is a healthy obsession with delivering “meaning.” Defining “meaningful outcomes” (rather than churning out outputs) makes these organizations behave based on value and impact. We strive to foster not just customer satisfaction and net promoter scores, but measurable customer delight and network effects (superior, service-level performance indicators) which, in turn, set and streamline technical requirements.

Long story short, the Studio’s mindset (critical thinking / wonder & discovery / problem solving) and workstyle (collaborative / experiential / iterative / adaptive) help explain why creative technologists are instrumental as serial innovation engines for the digital age.

 


Footnote: the term “team of creative technologists” was first coined by Nokia Bell Labs back in the 1940s to single out the differentiated value of inter-disciplinary undertakings. In the late forties, Bell Labs’ Claude Shannon pioneered Information Theory and John Karlin set up the first Human Factors Engineering group in industry. That HFE team was formed by a psychologist, a statistician (the father of quality-control visualization), an engineer, and a physicist.

Innovation Management Essentials: Situational Awareness


“Develop foresight, to sense and understand the context around the dilemmas that challenge you. The goal is not to predict what’s going to happen but to provoke your creativity and prepare you for your biggest challenges, many of which are going to come in the form of dilemmas (…) leaders are sense makers, and they help others make sense- often by asking penetrating questions.” Get There Early by Bob Johansen.


[image: Situational Awareness chart]


Situational Awareness (SA) involves sensemaking. SA deals with critical information on what’s going on with a project as well as around it. Know-how, past experiences, lessons learned and best practices are of the essence, and they work well when addressing incremental innovation. Our perception, though, is also shaped by motivation, expectations and filters, as well as by organizational behaviors (culture, workstyle, decision making, roles and responsibilities, processes) and, possibly, conflicting priorities.

Taking things to new levels, disruptive innovation gets us immersed in what turn out to be game changing environments. In this specific context, creative destruction takes place and so do errors in judgment. Dealing with uncertainty, ambiguity and rapidly superseding cascading events can quickly render one’s viewpoint out of focus and even out of place.

Those just sticking to what they know because their “assumptions and belief system” have consistently served them well might now suffer from complacency, myopia and tunnel vision instead… experiencing blindsiding denial in the process. Clayton Christensen’s “The Innovator’s Dilemma” and Nassim Taleb’s “The Black Swan” and “Antifragile” are worth understanding.


Early awareness takes continuous listening and monitoring. Let’s first think of project sensors gathering data and probes strategically placed to explore and discover clues that might not yet be visible. Leading indicators form a set of metrics that change in advance of a given event taking hold and can be used to raise alerts. Lagging indicators signal that conditions are in place for changes to take hold and become the new pattern.
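To make the leading-indicator idea concrete, here is a minimal sketch (the metric and threshold are hypothetical, chosen only for illustration) that raises an alert when the moving average of a leading indicator drifts past its threshold:

```python
def leading_indicator_alert(samples, threshold, window=3):
    """Alert when the moving average of a leading indicator crosses its
    threshold, i.e. the metric is changing ahead of the event taking hold."""
    if len(samples) < window:
        return False  # not enough data yet to judge
    recent = samples[-window:]
    return sum(recent) / window > threshold

# Hypothetical leading indicator: weekly defect-escape rate on a project.
readings = [0.02, 0.03, 0.04, 0.07, 0.09, 0.12]
print(leading_indicator_alert(readings, threshold=0.05))  # True
```

A lagging indicator would be handled the other way around: rather than triggering early alerts, it confirms after the fact that a new pattern has settled in.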

Defining a narrow set of key performance indicators (KPIs) improves visibility, saving us from clutter and information overload. KPIs can correlate and synthesize need-to-see data and can work with high-level abstractions. They are usually delivered as “dashboards” that we can easily work with. Here is a “6 R” framework on KPI quality to mitigate distortions:


  • Relevancy: validity and utility level in context.
  • Resolution: meaningful detail and abstractions.
  • Range: scope (fields) and scale dimensions.
  • Recency: lifecycle – growth, decay and refresh velocity; ephemeral vs. durable.
  • Robustness: complete or sufficient to support the analysis; portrays what’s being measured.
  • Reliability: data integrity and freedom from error; factors in signal-to-noise ratio; accounts for outliers.

The above is based on a “5 R” version I first learned in an MIT course on big data and social analytics.
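One simple way to operationalize the framework is to turn the six Rs into a per-KPI review checklist. A minimal sketch (the scoring scheme and field names are my own, not part of the framework itself):

```python
# Each R is scored 0.0-1.0 by a reviewer; the overall score averages them.
SIX_RS = ("relevancy", "resolution", "range", "recency", "robustness", "reliability")

def kpi_quality(scores: dict) -> float:
    """Average the six R scores; a missing R counts as zero to flag the gap."""
    return sum(scores.get(r, 0.0) for r in SIX_RS) / len(SIX_RS)

# Hypothetical review of one dashboard KPI:
review = {"relevancy": 1.0, "resolution": 0.8, "range": 0.9,
          "recency": 0.6, "robustness": 0.7, "reliability": 0.9}
print(round(kpi_quality(review), 2))  # 0.82
```

Even a coarse score like this makes distortions visible, e.g. a KPI that is highly relevant but stale (low recency) stands out immediately.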

I would also like to share that perfect data might be elusive, and different quality levels can apply. Hence, we talk in terms of things being “directionally correct” or “good enough” to keep things moving. In other cases, over-engineering data by going beyond what’s really needed (data overload) can shortchange focus, efforts and budgets, which would be better allocated to other priority and/or pressing tasks. We can also face crunch-time situations where we need to operate without the benefit of more data, since delays would trigger higher risks.


Nonetheless, acknowledging that we need to make those kinds of judgment calls does not excuse giving up on perfecting how we work with data. But data alone will not deliver SA: it involves intertwining analysis and synthesis cycles as well as fine-tuning sensemaking, which is an iterative and continuous improvement process.

Keeping cognitive biases at bay is a challenge. Subjective statements supporting adversarial stances such as “been there, done that, it just doesn’t work” (even if that experience happened in a different context and a distant past) or the “not-invented-here” (NIH) and “not-one-of-us” syndromes can be easy to spot. But there is a wide range of logical fallacies and “devil’s advocate” plays that can be perceived as reasonable even though the underlying logic is flawed.


I designed the above chart drawing from the familiar Strengths-Weaknesses-Opportunities-Threats (SWOT) model. As far as frequently asked questions go, the one I get the most is about the difference between “clash” and “shift.” Basically, the “clash” bucket is there to outline ongoing mismatches and adversarial confrontations. Those having reached critical mass can be plotted in the “clash × critical” quadrant.

The “shift” column captures game-changing items that are still evolving, where a succession of disruptive beliefs and assumptions reshapes the context and prompts new environments that can render a project obsolete… should we not gear up in advance or course-correct as needed. Looking into impact levels, correlations and outliers, and then sorting things accordingly, is part of the thought process.

The next FAQ relates to how best to address “core” vs. “beyond the comfort zone.” A core competence is an existing skill and capability: traits worth leveraging and further developing provided that they continue to make a difference. Yet asking any person, organization or system to just focus on what they already know and do well might not be the best approach in today’s rapidly changing and uncertain environments. Hence the need to assess what and how to continuously grow beyond a given comfort zone, and at what point a new capability can be rolled up as a core competency.

One other thought: let’s keep in mind that being aware and alert are good things. Taking no action or sitting tight while waiting for the dust to settle happen to be options available to us, though paralysis by analysis and paralyzing fear are not.

What about “organic” vs. “inorganic”? The former entails opportunities that can be approached with existing competencies and, possibly, scaled by growing resources. The latter speaks to efforts that involve collaboration (co-creation with customers and partners, coopetition with competitors) and even acquiring other ecosystem players in the value chain, mergers being another example.


Last but not least, perspective is of the essence, and the journey comprises experiences (where we come from), situational awareness (where we are) and foresight (where we are headed). Antonio Machado (Spanish poet, 1875-1939) wrote that we make our path as we walk, something anyone working on innovation projects can relate to. Delineating and providing a sense of direction involves the following “journey points,” which I will discuss in another post on agile project planning:

  • kick-off / start point
  • takeoff limits
  • checkpoints
  • milestones
  • crossing chasms
  • touch points
  • breakeven point
  • tipping point
  • chain reaction
  • inflection point
  • turbulences
  • crossroads
  • decision points
  • pivoting
  • point of no return
  • breaking point
  • moving targets
  • valley of death
  • dead ends
  • aftermath

Hope this remains of interest. As usual, I will welcome your comments and emails to continue our discussion.
