Tagged: Big Data

Chicago’s CIO Forums on Digital Transformation, Customer Experience and Big Data

This is just a quick note to share that I’m planning to attend the following forums on Digital Transformation, Customer Experience and Big Data in Chicago. Let me know if you will be there and let’s plan to meet. Thanks to Argyle’s team for their kind invitations.



Leading the Business through Digital Transformation / The Future of Cognitive Customer Engagement is Now 



Customer Experience Think Tank



Leadership in Big Data & Analytics Forum


See you there : )

Human Factors Engineering: Leveraging the Internet of Things and Mobile Sensing Networks with Big Data and Social Analytics


“The ultimate test of a practical theory, of course, is whether or not it can be used to build working systems. Is it good enough to use in the real world? […] Almost uniquely among the social sciences, this new social physics framework provides quantitative results at scales ranging from small groups, to companies, to cities, and even to entire societies […] it provides people (e.g., government and industry leaders, academics, and average citizens) a language that is better than the old vocabulary of markets and classes, capital and production […] the engine that drives social physics is big data: the newly ubiquitous digital data now available about all aspects of human life. Social physics functions by analyzing patterns of human experience and idea exchange.” Social Physics by Alex Pentland.


[image]


Back in 2010 I worked on the Amazing Learning Unit, a research project leading to a proof of concept demonstration. The anecdote behind its name was that, by calling it A.L.U., we played with the fact that those same three letters formed Alcatel-Lucent’s stock ticker. On a more serious note, we partnered with Lego and the Illinois Math & Science Academy (IMSA) to unveil a simulation at Mobile World Congress in 2011, which was very well received.

The Amazing Learning Unit’s concept entailed “Lego robotics” equipped with Touchatag’s RFID readers and Android phones and tablets. As you can see in the above picture, these “mobile units” were designed to look, behave and roam around like autonomous screens, cameras and sensors with wheels.

Driven by human factors engineering principles, the thinking behind the project was centered not on technology, but on taking down the classroom’s physical walls, which can make today’s schools and school districts behave like “geofenced silos”. This is an environment that can constrain kids’ exposure to an outside world that’s growing more connected and diverse. The project’s main goal was to enable boundariless collaborative learning, our technologies being the means to that end.

The concept called for the robots to roam around the classroom and sense what a kid was playing with, or what book she or he was reading. Classroom objects and books would feature Touchatag stickers to that end. The result is a mobile sensing network that falls into the Internet of Things (IoT) category.
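
For illustration only, here is a minimal Python sketch of how one of those sensing events might be represented; the tag UIDs, object names and the `SensingEvent` structure are hypothetical, not the actual Touchatag integration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical mapping from RFID tag UIDs (Touchatag stickers) to classroom objects.
TAG_CATALOG = {
    "04:A2:19:B1": {"object": "fractions workbook", "topic": "math"},
    "04:7F:33:C8": {"object": "volcano model kit", "topic": "earth science"},
}

@dataclass
class SensingEvent:
    """One observation made by a roaming unit: which kid is near which tagged object."""
    student_id: str
    tag_uid: str
    timestamp: str

def to_activity(event: SensingEvent) -> dict:
    """Resolve a raw RFID read into a higher-level 'activity' record."""
    entry = TAG_CATALOG.get(event.tag_uid, {"object": "unknown", "topic": "unknown"})
    return {
        "student_id": event.student_id,
        "object": entry["object"],
        "topic": entry["topic"],
        "timestamp": event.timestamp,
    }

if __name__ == "__main__":
    event = SensingEvent(
        student_id="student-42",
        tag_uid="04:A2:19:B1",
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    print(to_activity(event))
```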

Leveraging social analytics, we envisioned a “serendipity engine” that would then connect the kid with another child from any other school who was engaged in a similar activity, and whose skills and learning behaviors happened to be a good match for them to play together. The smartphone screens would prompt interactive online activities, jointly with video calls, engaging them in context-aware, peer-to-peer collaborative learning.
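
A minimal sketch of the pairing logic behind such a serendipity engine might look like the following; the students, the scoring rule and the “moderate skill gap” heuristic are assumptions for illustration, not the project’s actual algorithm.

```python
from itertools import combinations

# Hypothetical current-activity snapshots gathered by the sensing network.
students = [
    {"id": "ada",   "school": "north", "topic": "fractions", "skill": 0.8},
    {"id": "bruno", "school": "south", "topic": "fractions", "skill": 0.5},
    {"id": "chen",  "school": "east",  "topic": "volcanoes", "skill": 0.7},
]

def match_score(a: dict, b: dict) -> float:
    """Score a potential pairing: same topic, different schools, compatible skill levels."""
    if a["topic"] != b["topic"] or a["school"] == b["school"]:
        return 0.0
    skill_gap = abs(a["skill"] - b["skill"])
    # Reward a moderate gap (peer teaching) over identical or very distant skill levels.
    return 1.0 - abs(skill_gap - 0.3)

def best_pair(snapshot: list[dict]):
    """Return the highest-scoring pair of students in the current snapshot."""
    return max(combinations(snapshot, 2), key=lambda p: match_score(*p), default=None)

if __name__ == "__main__":
    a, b = best_pair(students)
    print(f"Suggest pairing {a['id']} with {b['id']} on {a['topic']}")
```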

We discussed what’s now known as collaborative filtering, as well as matchmaking options to promote role-model behaviors, how to adequately display them to help realize everyone’s potential, and how to do so in everyone’s best interest. We also looked into sensitive matters centered on behavioral analytics, privacy, and the pros and cons of emotional and persuasive design features.
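
For readers less familiar with collaborative filtering, here is a minimal user-based sketch over a made-up student-by-activity engagement matrix; the data, the cosine similarity measure and the recommendation rule are illustrative assumptions, not what we built.

```python
import numpy as np

# Hypothetical student x activity engagement matrix (rows: students, columns: activities).
activities = ["robot_build", "fractions_game", "story_writing", "map_quiz"]
engagement = np.array([
    [5, 3, 0, 1],   # student A
    [4, 0, 0, 1],   # student B
    [1, 1, 5, 4],   # student C
], dtype=float)

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def recommend(target_row: int, matrix: np.ndarray, names: list[str]) -> str:
    """Suggest the activity the most similar student engaged with that the target has not tried."""
    sims = [
        (cosine_similarity(matrix[target_row], matrix[i]), i)
        for i in range(len(matrix)) if i != target_row
    ]
    _, neighbor = max(sims)
    untried = [j for j in range(matrix.shape[1]) if matrix[target_row, j] == 0]
    if not untried:
        return "nothing new to suggest"
    best = max(untried, key=lambda j: matrix[neighbor, j])
    return names[best]

if __name__ == "__main__":
    print(recommend(1, engagement, activities))  # a suggestion for student B
```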

As part of the project’s research, gamification techniques were thought out to incentivize players, such as competitive challenges, progressive skill levels, in-game rewards and scoreboards. Circling back to a recent post on working with personas, the ones created for this project were modeled after our own children, and my kid both inspired the project and enjoyed participating in its living lab.
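
A toy sketch of the points-and-levels side of that kind of gamification could look like this; the point values, level names and events are invented for illustration.

```python
from collections import defaultdict

# Hypothetical point values and level thresholds for the classroom challenges.
POINTS = {"challenge_completed": 50, "peer_session": 30, "book_tagged": 10}
LEVELS = [(0, "Explorer"), (100, "Builder"), (250, "Inventor")]

scores: dict[str, int] = defaultdict(int)

def award(student_id: str, event: str) -> None:
    """Add the points associated with an event to a student's total."""
    scores[student_id] += POINTS.get(event, 0)

def level(points: int) -> str:
    """Return the highest level whose threshold the student has reached."""
    name = LEVELS[0][1]
    for threshold, title in LEVELS:
        if points >= threshold:
            name = title
    return name

def scoreboard() -> list[tuple[str, int, str]]:
    """Descending leaderboard with each student's current level."""
    return sorted(((s, p, level(p)) for s, p in scores.items()),
                  key=lambda row: row[1], reverse=True)

if __name__ == "__main__":
    award("ada", "challenge_completed")
    award("ada", "peer_session")
    award("bruno", "book_tagged")
    print(scoreboard())
```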

The prototype unveiled at Mobile World Congress showcased some of the above concepts. It is worth sharing that the business goal was to help people experience something as complex as the IP Multimedia Subsystem (IMS) in a new and radically different light back in 2010. I strived to humanize what can otherwise come across as overly technical and rather obscure sets of technologies behind network infrastructure, platforms and telecommunication services, the essence of our company’s product portfolio. Therefore, we purposely placed the emphasis on creating new experiences such as the one delivered by the Amazing Learning Unit. Our inventiveness and technologies became transparent and were in place to deliver the magic.

Interestingly enough, this research project led to discussions with MIT and a leading global network operator. This time around, we looked at how this kind of experience can be applied in enterprise environments to raise productivity and foster collaborative and multi-disciplinary workstyles; in other words, enabling new organizational and decision-making cultures. The following phase of the research was titled Immersive Mobile Systems, IMS in short : )


Human Factors Engineering: Big Data & Social Analytics to #MakeTechHuman


“Netflix’s analytical orientation has already led to a high level of success and growth. But the company is also counting on analytics to drive it through a major technological shift […] by analytics we mean the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions”. Competing on Analytics by Thomas H. Davenport and Jeanne G. Harris.

“Big data changes the nature of business, markets, and society […] the effects on individuals may be the biggest shock of all […] this will force an adjustment to traditional ideas of management, decision making, human resources and education”. Big Data by Viktor Mayer-Schönberger and Kenneth Cukier.

“Social physics functions by analyzing patterns of human experience and idea exchange within the digital breadcrumbs we all leave behind as we move through the world […] the process of analyzing the patterns is called reality mining […] one of the ten technologies that will change the world [according to MIT Technology Review]”. Social Physics by Alex Pentland.



It’s Saturday night and I am happy to share that I just submitted my last two Jupyter notebooks and, therefore, completed MIT’s first certificate course on Big Data and Social Analytics.

This was one intensive summer, with very little time left for anything else beyond work, day-to-day family life, and spending most evenings and weekends studying. MIT BD&SA course developers estimated a weekly workload of 8 to 12 hours over 9 weeks, though many of us spent north of 15 hours a week to cover videos and readings, Python programming and written assignments, quizzes, and forum discussions. All definitely worthwhile, by the way.

While taking this course, I couldn’t help recalling the kind of scarce data we used to work with when I got my postgrad in Human Factors Engineering at BarcelonaTech in the early 90s, also graduating with its first class.

By means of an example, one of the industrial ergonomics projects got kicked off with statistical data provided by the military: stats on Marines fit for service were the only readily available physiological data for us to design a local civilian application. We knew that wasn’t a representative model of the target user base for the industrial workstation under design. Back then, undertaking a proper data collection study was costly and beyond the project’s means.
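
To make that sampling-bias point concrete, here is a small illustration using Python’s statistics.NormalDist; the stature means and standard deviations are made up for the example, not actual military or civilian anthropometric data.

```python
from statistics import NormalDist

# Illustrative stature distributions in cm (values are invented for this sketch).
marines  = NormalDist(mu=176.0, sigma=6.0)   # narrow, fit-for-service population
civilian = NormalDist(mu=168.0, sigma=9.5)   # broader general population

# A workstation dimension designed to accommodate the 5th-percentile Marine...
design_threshold = marines.inv_cdf(0.05)

# ...leaves out a much larger share of the civilian population.
excluded = civilian.cdf(design_threshold)
print(f"Design stature threshold: {design_threshold:.1f} cm")
print(f"Share of civilian users below the design threshold: {excluded:.0%}")
```

With these made-up numbers, a cutoff that excludes only 5% of the reference population ends up excluding roughly four in ten users of the broader one, which is exactly the kind of mismatch we worried about.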

Our group worked with small data by testing things on ourselves and leveraging in-house dogfooding to some extent. Unfortunately, though, this kind of finding might not adequately reflect the reality of human variability. If overlooked, that can result in implementing designs that optimize for a “proficient some” while undermining ease of use for many others, missing the mark in the process. Let’s keep in mind that, as clearly outlined in Crossing the Chasm, early success among devoted early adopters might not translate into mainstream praise and popularity, thus failing to grow the user base and failing in the market.


To be clear, working with secondary research (e.g. reference data sets from third parties) and conducting primary research by testing things on ourselves, coupled with in-house dogfooding, are all valuable practices, though not necessarily enough to make a compelling difference in today’s “big data” day and age.

MIT BD&SA discusses the benefits of working with living labs driven by User Centered Design (UCD). We now have commercial off-the-shelf technologies (smartphones, Internet of Things, sensing networks, machine learning) at our disposal, which allow us to capture user actions and behavior on location and, most importantly, with greater data resolution.
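
As a rough sketch of what that higher-resolution capture can look like, the snippet below collapses a hypothetical stream of “digital breadcrumbs” into per-user daily summaries; the event schema and field names are assumptions for illustration, not the course’s or any product’s actual format.

```python
from collections import Counter
from datetime import datetime

# Hypothetical "digital breadcrumbs": timestamped events captured by a living-lab app.
breadcrumbs = [
    {"user": "p01", "ts": "2016-09-10T09:02:11", "event": "app_open",   "context": "home"},
    {"user": "p01", "ts": "2016-09-10T09:03:40", "event": "task_start", "context": "home"},
    {"user": "p01", "ts": "2016-09-10T12:15:05", "event": "task_start", "context": "office"},
    {"user": "p02", "ts": "2016-09-10T10:30:00", "event": "app_open",   "context": "transit"},
]

def daily_summary(events: list[dict]) -> dict:
    """Collapse the raw stream into per-user counts of (event, context) pairs per day."""
    summary: dict = {}
    for e in events:
        day = datetime.fromisoformat(e["ts"]).date().isoformat()
        key = (e["user"], day)
        summary.setdefault(key, Counter())[(e["event"], e["context"])] += 1
    return summary

if __name__ == "__main__":
    for (user, day), counts in daily_summary(breadcrumbs).items():
        print(user, day, dict(counts))
```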

Couple that with ethnographic research focused on understanding human factors by observing users in their own environment and usage context and, crucially, capturing their point of view (PoV) at each step.

So, those of us working on Human Factors Engineering and driven by User Centered Design to deliver processes, tools, products and services can create new experiences that take the human possibilities of technology to unprecedented levels, with analytics becoming of the essence to #MakeTechHuman.




Big Data Revolution. TED Radio Hour. NPR.


The Human Face of Big Data. PBS.


Source: Business Innovation Demands Accelerated Insights. Intel.


MakeTechHuman. Nokia.


See you at RecSys 2016 next week : )

#MakeTechHuman