Big Data 2020 : Augmenting Humans

"Apple wants to measure your mood: A patent filing shows the company is exploring the possibility of deriving mood from body sensors, user habits, and consumed data."

 

"How Your Computer Will Read You Like A Book: The term for this technology is 'affective computing' and it involves reading, interpreting, and even simulating emotion."

 

"This Google Glass App Will Detect Your Emotions"

 

"Rise of the machines that read your mind: They will do so by using ever smaller and cheaper sensors that measure pulse, sweat and other physiological indicators." (Murad Ahmed, thetimes.co.uk)

 

What is going on?

 

All these announcements are part of a trend - the use of big data to “augment humans”. Specifically, to augment information workers.

 

The last 100 years have seen massive physical augmentation of humans. For example, research from the Agricultural Council of America reports that in 1930, one person could harvest 11 bushels of wheat. Today, that same person can harvest 900 bushels. 

 

Improvements in the productivity of information workers have been nowhere near as dramatic. McKinsey estimates that mainframes improved the productivity of information workers by 2.8 percent, the mini-computer and PC by 1.49 percent, the Internet by 2.5 percent, and mobility by 2.7 percent. (ref: McKinsey, "Big data: The next frontier for innovation, competition and productivity", 2012)

 

I believe that the key application of big data in the next seven years will be to improve the productivity of information workers. This won’t happen by automating what they do, but through augmenting what they do, as we shall see below.

 

There are three steps to augmenting humans:

  1. knowing their current state and intentions
  2. knowing how they like to work and receive information, and
  3. providing recommendations and informational help.

Let’s look at each of these in turn.
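Taken together, the three steps form a simple loop: sense the user's state, consult their preferences, then recommend. The sketch below illustrates that loop; every name and threshold in it (infer_state, load_profile, recommend, the heart-rate cutoff) is invented for illustration, not taken from any real product.

```python
# A hypothetical end-to-end pass through the three augmentation steps.

def infer_state(signals):
    """Step 1: guess current state and intention from raw signals."""
    mood = "stressed" if signals.get("heart_rate", 60) > 100 else "calm"
    return {"mood": mood, "intent": signals.get("activity", "unknown")}

def load_profile(user_id, profiles):
    """Step 2: look up how this user likes to work and receive information."""
    return profiles.get(user_id, {"prefers": "short summaries"})

def recommend(state, profile):
    """Step 3: turn state plus preferences into a concrete suggestion."""
    if state["mood"] == "stressed":
        return "defer non-urgent items; " + profile["prefers"]
    return "offer full detail; " + profile["prefers"]

profiles = {"mike": {"prefers": "bullet-point summaries"}}
state = infer_state({"heart_rate": 110, "activity": "shopping"})
suggestion = recommend(state, load_profile("mike", profiles))
```

The point of the sketch is the division of labour: steps 1 and 2 only gather context, and only step 3 acts on it.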

 

1. Your current state and intentions

Imagine you are having an email or instant-messaging conversation with a couple of colleagues. Now imagine a big data system monitoring that conversation and understanding its intent (trying to create a marketing campaign around memristors, for example). Today's smartphones already contain an impressive array of sensors.

 

Jump forwards in time and they may be able to understand your brain waves, your voice tones, and your facial tics. So there is a good chance that such systems will be able to use this information to infer what you are currently trying to achieve as you travel or shop.

[Image: personal sensors]
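As a toy illustration of what that kind of sensor fusion might look like, here is a deliberately crude rule-based mood guesser. The thresholds are invented for illustration; real affective-computing systems use trained models rather than hand-picked cutoffs.

```python
def infer_mood(pulse_bpm, skin_conductance_us, voice_pitch_hz):
    """Score arousal from three sensor readings; all thresholds are invented."""
    arousal = 0
    if pulse_bpm > 90:                 # elevated heart rate
        arousal += 1
    if skin_conductance_us > 5.0:      # microsiemens, a common sweat proxy
        arousal += 1
    if voice_pitch_hz > 180:           # raised voice pitch
        arousal += 1
    return "agitated" if arousal >= 2 else "calm"
```

A reading like `infer_mood(110, 6.0, 200)` would score as agitated, while resting-level readings score as calm.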

 

This is where all the announcements at the start of this blog post come in.

 

2. Your Personal Avatar

Imagine an intelligence that knows all about you. Think of it as the best personal assistant you ever had; a personal assistant who has been with you for years. It knows how you like to travel, how you like to arrange and attend meetings, how you like to shop – everything. This is not simply a set of application preferences. It’s much smarter and “fuzzier” than that, and it goes across all your applications.

 

Many people are already working on this "personal avatar" concept. For example, a team at Cambridge University in the UK is working on a tool that will interact with you, and in doing so, will build up a model of your world and the way you like to operate in that world.
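One very simple way to sketch such a personal avatar is as a preference model in which old habits fade and repeated habits strengthen. The class below is a hypothetical illustration of that idea, not the Cambridge tool or any real product.

```python
from collections import defaultdict

class PersonalAvatar:
    """A fuzzy preference model: old habits fade, repeated habits strengthen."""

    def __init__(self, decay=0.9):
        self.decay = decay
        self.weights = defaultdict(float)

    def observe(self, choice):
        # Fade every existing preference a little...
        for key in self.weights:
            self.weights[key] *= self.decay
        # ...then reinforce the choice just observed.
        self.weights[choice] += 1.0

    def favourite(self):
        return max(self.weights, key=self.weights.get)
```

Observing "aisle seat" three times and "window seat" once leaves "aisle seat" as the favourite, but a long run of "window seat" choices would eventually overturn it; the model adapts rather than being a fixed set of application preferences.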

 

3. The augmentation engine

Today, when you shop on Amazon, its recommendation engine takes your current state (what you are trying to buy) and your personal preferences (how you have shopped in the past), combines these with information about what is available to buy right now, and makes recommendations - it "augments" your shopping experience.

 

Now take this functionality to the enterprise, make it much more generic, and allow it to use corporate information and information about how you behave at work, and you have applications that augment humans, both at work and at play. Some business-orientated augmentation engines do exist, but they tend to be hard-coded and expensive to adapt to new data sources.
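A generic augmentation engine, then, is a function of three inputs: current state, personal profile, and the information available right now. A minimal sketch, assuming a simple invented tag-overlap scoring scheme:

```python
def augment(state, profile, catalog):
    """Rank available items by current intent first, past preferences second."""
    def score(item):
        points = 0
        if state["intent"] in item["tags"]:
            points += 2          # the current goal dominates the ranking
        points += sum(1 for tag in item["tags"] if tag in profile["liked_tags"])
        return points
    return sorted(catalog, key=score, reverse=True)

catalog = [
    {"name": "budget template", "tags": ["finance"]},
    {"name": "campaign deck", "tags": ["marketing", "slides"]},
]
ranked = augment({"intent": "marketing"}, {"liked_tags": ["slides"]}, catalog)
```

Because the scoring is data-driven rather than hard-coded per source, adding a new data source only means adding new items to the catalog.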

 

A couple of examples

Let’s consider an example. Two colleagues in an interactive messaging conversation are trying to create a marketing plan for a new drug.

 

The augmentation system would infer the intention of the conversation itself. It would then supply the participants with a list of others in their organization who are having similar conversations. It would summarize the conclusions these groups had already reached. It might offer to summarize their dialog to share it with other groups. And it might search for industry best practices in this area, so that the team didn't end up reinventing the wheel.
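Finding "others having similar conversations" could be bootstrapped with something as simple as word-overlap (Jaccard) similarity. Real systems would use far richer semantic matching, but this sketch shows the shape of the idea; the threshold is invented.

```python
def similarity(text_a, text_b):
    """Jaccard overlap between the word sets of two conversations."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def similar_conversations(current, archive, threshold=0.2):
    """Return archived conversations that overlap enough with the current one."""
    return [c for c in archive if similarity(current, c) >= threshold]

archive = ["pricing plan for new drug", "updating the holiday rota"]
matches = similar_conversations("marketing plan for new drug launch", archive)
```

Here the drug-pricing conversation matches and the holiday rota does not, which is exactly the grouping behaviour the scenario above needs.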

 

The diagram below shows this. 

 

 

[Image: augmented conversation]

 

Or, a harassed parent is zooming around a grocery store as fast as he can before his parking runs out and/or he is late for soccer practice. He's also angry because "someone" took his loose change for the parking meter from the car. His cell phone's Wi-Fi is used to position him - where he is in the store and how fast he is zooming around. He has his hand on his phone, and this allows the phone to know he's not in a peaceful, zen state. The augmentation system therefore does all it can to make his grocery shop as fast as possible.

 

Next week, however, he goes back to the same store. His partner has the kids, he has time on the parking meter and life is good. The augmentation system recommends a white wine to go with his fish, and he’s grateful for the suggestion.

 

Had the system made the same suggestion the week earlier, he would probably have thrown his phone at the fish counter!
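The two shopping trips differ only in the inferred context, so the same assistant logic produces opposite suggestions. A toy sketch of that context switch (the walking-speed threshold is invented for illustration):

```python
def shopping_assistant(walking_speed_mps, basket):
    """Pick a suggestion based on inferred shopper context."""
    if walking_speed_mps > 1.2:          # zooming around: stay out of the way
        return "fastest route to checkout"
    if "fish" in basket:                 # relaxed trip: safe to make suggestions
        return "suggest a white wine to go with the fish"
    return "browse freely"
```

With the same basket, a hurried shopper gets routed to the checkout while a relaxed one gets the wine pairing - the recommendation depends on state, not just on preferences.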

 

 

[Image: man shopping]

 

 

Summary

When researching Mobility 2020 and this chapter, the number one theme that came through to me was that of augmenting humans. The combination of mobility and big data will allow information workers to do more than they do today - it will increase the gearing of information workers, just as the tractor has massively increased the gearing of agricultural workers.

 

We are already starting to see "information working augmentation systems" popping up.

 

A startup in California is creating education tools that adjust to the capabilities and needs of each child (they understand the current state and preferences of the child and create learning experiences based upon that).

 

As a customer interacts with anyone using HP’s SaaS service desk product, HP Service Anywhere, the interaction is analysed by our HP Autonomy product. The meaning of the conversation is automatically derived and relevant knowledge base articles, incidents, problems and changes are pulled from the service desk’s data store. (This is shown in the diagram below)
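The pattern described here - derive the meaning of a conversation, then rank stored articles against it - can be sketched generically. To be clear, this is not the HP Autonomy API; it is a word-overlap stand-in for the meaning-based matching the product performs.

```python
def pull_relevant(conversation, knowledge_base):
    """Rank stored articles by shared vocabulary with the live conversation."""
    words = set(conversation.lower().split())

    def overlap(article):
        return len(words & set(article["text"].lower().split()))

    ranked = sorted(knowledge_base, key=overlap, reverse=True)
    return [article["title"] for article in ranked if overlap(article) > 0]

kb = [
    {"title": "Fixing printer wifi issues",
     "text": "steps when a printer will not connect to wifi"},
    {"title": "Password reset", "text": "how to reset your password"},
]
titles = pull_relevant("my printer will not connect to wifi", kb)
```

The highest-overlap article surfaces first, which is the behaviour a service-desk agent needs mid-conversation.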

 

[Image: grabbing relevant content]

 

Want more?

For a lovely graphical listing of all Big Data 2020-related postings, please go to my Scoop.it Big Data 2020 web page.

 

To find out what HP Big Data can do for you today, please go to our HP HAVEn page.

Comments
parminder_sohal | ‎05-25-2014 09:24 PM

Really interesting blog... The industry is coming up with a new term to describe what you have mentioned in this article, called Code Halos. This is a trend which will become predominant in the future.

hughesthe1st | ‎05-29-2014 11:32 AM

This is a great article Mike, and very relevant in the context of the Human Information apps developers are already building today. We just ran a hackathon on http://idolondemand.topcoder.com and had a Google Glass entry with an app that uses the HP IDOL OnDemand APIs to analyze the mood of the user. Very cool.

 

You have also provided inspiration for the first discussion post on the HP IDOL OnDemand Developers group page on LinkedIn... https://www.linkedin.com/groups/Innovation-focus-on-human-information-6704619.S.5877840940613255170?...

 

Thanks for sharing and I look forward to your next post!

 

Regards,


Sean.

About the Author
Mike has been with HP for 30 years. Half of that time was in R&D, mainly as an architect. The other 15 years has been spent in product manag...

