In the near future, I think Google Now could tell you that you are pregnant, or diagnose a medical condition, before your doctor ever could. Humans are great at recognising patterns, but only if we know we are creating them or where to look. Remember the Target story, where the retailer knew a young girl was pregnant before she or her father did? Advances in technology like smartwatches and the trend of "the quantified self" mean messages like being told you are pregnant aren't impossible in the near future. So how do we go from weather reports and traffic updates to a medical diagnosis?
Strings-to-things, things-to-actions
When Google, Yahoo and Bing announced Schema.org in 2011, search engines were still in the strings-to-things phase. In my opinion, Google, in particular, are already moving on from that goal. The most recent addition to the Schema.org vocabulary is actions. See the Schema.org site or my SMX Munich deck for more details.
In my presentation, I made the point that the future of structured data isn’t about understanding what a thing is, it’s about understanding what a thing can do. If search engines can understand what your website, app or other interfaces can do, and they can understand user intent, they can match queries to the best place to do that action. How does Google know what we want to do?
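To make the idea concrete, here is roughly what an action looks like in Schema.org's JSON-LD syntax: a thing, plus a `potentialAction` declaring what can be done with it and where. The business name and URL here are made up for illustration.

```json
{
  "@context": "http://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro",
  "potentialAction": {
    "@type": "ReserveAction",
    "target": "http://example.com/reserve"
  }
}
```

With markup like this, a search engine doesn't just know the page is about a restaurant; it knows the restaurant can be booked, and where to send a user who wants to book it.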
Actions-to-anticipation
Many people have said that Google wants to become the ultimate personal assistant. Things like Google Now and conversational search reinforce this standpoint. However, a prerequisite for that position is the concept of time. For a computerised personal assistant to be truly as useful as the real thing, it needs to be aware of the past, the present and, more importantly, the future.
Historically, Google and other search engines have dealt with things from the past. Webpages are, by their nature, in the past or, at best, live. This makes the anticipation and initiative you would expect from a great personal assistant difficult for Google. They have very little data to predict what you might want to do, or are going to do, in the future. Gmail and Google Calendar are the two most obvious sources of future-facing data that come to mind (if you use them).
Forgetting privacy or intellectual property for a second, imagine Google had access to every app on your phone and the data within them. What might they be able to know about you?
Just a handful of apps could give Google access to:
- What music I’ve listened to in the past
- What movies I’ve watched
- What I’ve been eating and drinking recently
- How much exercise I do
- What articles I might read in the near future
- Flights I have booked
- Houses I might want to buy
Google Now – An IFTTT for your life
While I was in Munich, I saw an announcement that Google had opened the Google Now API to a selection of hand-picked third-party apps.
This got me thinking. I do not know what the relationship will be, or what data Google would have access to, but one of the apps that has been accepted to work with Google Now is Lyft. The example Google gave in the article was a generic prompt to order a cab: you arrive at an airport, and Google Now might push you a notification to get a ride.
Some more examples
While the Lyft example above is interesting, it made me realise that allowing apps to talk to each other via Google Now would essentially turn your smartphone into an IFTTT for your life. So rather than a generic Lyft alert, what if they combined a few apps? They could use my British Airways app to see I have an upcoming flight, Google Maps to know when I've arrived in Munich, and my Gmail account to see where I am staying. There are probably specific hotel apps they could use too. Using this, rather than a generic "get a car" card, I get one with the quote already personalised to where I'm going.
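The flight-plus-hotel rule above can be sketched as a few lines of code. Everything here is a hypothetical stand-in: the app names, fields and the `ride_card` function are all invented for illustration, not any real Google Now API.

```python
from dataclasses import dataclass

@dataclass
class Flight:
    """Stand-in for data pulled from an airline app."""
    arrival_airport: str
    arrival_time: str

@dataclass
class HotelBooking:
    """Stand-in for a booking confirmation found in Gmail."""
    address: str

def ride_card(flight, hotel, current_location):
    """Combine signals from separate apps into one personalised card.

    Fires only when location data says you've actually arrived at the
    airport your flight lands at, and a destination is known.
    """
    if current_location == flight.arrival_airport and hotel:
        return f"Need a ride? Car from {flight.arrival_airport} to {hotel.address}"
    return None

card = ride_card(Flight("MUC", "14:05"),
                 HotelBooking("Hotel am Platzl, Munich"),
                 current_location="MUC")
print(card)
```

The interesting part is not any single data source but the join: each app alone produces a generic alert, while the combination produces a card that already knows the destination.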
Anticipation to diagnosis
The ultimate personal assistant would not only tell us what we expect; they would tell us things we never thought to consider. This would only be made possible by pattern recognition, anticipation and initiative beyond the capability of a human.
What patterns do you already create but don't currently correlate? If you feel sluggish or tired on a Thursday, you do not necessarily connect that to something you may be allergic to that you ate on Monday. Many people spend years with conditions such as gluten or lactose intolerance but never make the connection between what they eat and how they feel. Humans cannot easily track and analyse that much data; computers can.
So how can Google tell you are pregnant? I am not a doctor, but I suspect that, as in the Target example, there may be early signs of pregnancy that we do not think about at the moment (biological or otherwise). For a start, Google could increase the priority that a particular pattern receives. For example, there may be lots of small things that people change before trying to get pregnant. If you're using a lot of different apps, combined with hardware like heart rate and blood pressure monitors, it wouldn't be too difficult for Google to take an educated guess. Just using the information in the Target article, we know people do things like:
- Change their diet – This would be easy to see through apps like MyFitnessPal
- Change their buying habits – Amazon app or other store apps
- They may do more exercise – Several places they could get this
After all of the above, let's not forget Google knows everything you've searched for online, and your browsing history if you use Chrome. I do not think it would take much to guess someone is thinking about starting a family based on their search history alone.
Let's assume that, based on the above, Google lowers the "pregnancy card" trigger threshold. This means they look more closely at changes that might suggest you're pregnant. I am not a doctor, so bear with me while I think out loud. Other than urine or blood samples, what other quantitative data is there that might indicate you are pregnant?
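The trigger-threshold idea can be sketched as a toy scoring rule: each observed signal adds a weight, and prior context (search history, shopping changes) lowers the bar rather than proving anything on its own. Every weight, signal name and number below is invented for illustration.

```python
# Hypothetical weights for weak signals; none is conclusive alone.
SIGNAL_WEIGHTS = {
    "diet_change": 0.2,
    "purchase_change": 0.2,
    "resting_heart_rate_up": 0.3,
    "sleep_pattern_change": 0.2,
}

def should_show_card(observed_signals, prior_context_strength, base_threshold=0.7):
    """Fire a card when combined signal weight crosses a threshold.

    `prior_context_strength` in [0, 1] models background evidence
    (e.g. search history) and lowers the trigger threshold by up to half.
    """
    score = sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in observed_signals)
    threshold = base_threshold * (1.0 - 0.5 * prior_context_strength)
    return score >= threshold

# With no prior context, two weak signals are not enough (0.4 < 0.7)...
print(should_show_card(["diet_change", "purchase_change"], 0.0))   # → False
# ...but strong prior context halves the threshold (0.4 >= 0.35).
print(should_show_card(["diet_change", "purchase_change"], 1.0))   # → True
```

That is the whole point of "lowering the threshold": the same ambiguous signals that would normally be ignored become actionable once the background evidence is strong enough.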
For context, I recently learned that eating something you have an intolerance to can show as an elevated heart rate for two hours after eating. One test to check for allergies is to track your heart rate throughout the day. This was where the idea came from in the first place. Using a smartwatch with a heart rate monitor plus MyFitnessPal, Google could suggest you are intolerant to foods you had never considered, by recognising patterns of elevated heart rate after your meals. This made me wonder what else could be possible. There's a ton of tech for tracking this kind of data.
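That two-hour heart-rate test is simple enough to sketch directly. The samples, baseline and the 15 bpm "elevated" cut-off below are all made up; the shape of the check follows the rule described above.

```python
def elevated_after_meal(samples, meal_minute, baseline_bpm, window=120, rise=15):
    """True if mean heart rate in the `window` minutes after a logged meal
    sits at least `rise` bpm above the resting baseline.

    `samples` is a list of (minute, bpm) pairs, e.g. from a smartwatch;
    `meal_minute` would come from a food-logging app like MyFitnessPal.
    """
    post = [bpm for minute, bpm in samples
            if meal_minute < minute <= meal_minute + window]
    if not post:
        return False  # no readings in the window; can't conclude anything
    return sum(post) / len(post) >= baseline_bpm + rise

# Invented readings: resting ~63 bpm, meal logged at minute 60,
# heart rate clearly elevated for the two hours afterwards.
samples = [(0, 62), (30, 64), (70, 85), (100, 88), (130, 86), (200, 65)]
print(elevated_after_meal(samples, meal_minute=60, baseline_bpm=63))  # → True
```

Run over weeks of meals, a pattern of `True` results clustered around one ingredient is exactly the kind of correlation a computer spots effortlessly and a human almost never does.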
Could Google make a guess from this data alone? I cannot stress my lack of medical qualifications enough, but I wonder whether pregnancy impacts things like REM and deep sleep, or causes significant blood pressure or heart rate changes at certain times of the day. Who knows; maybe none of these things alone would be enough to know for sure, but combined, I think it will not be long before pregnancy prediction, or something like it, is possible.
Enough about pregnancy (Google probably thinks I am looking to start a family). What else? What could Google diagnose or push to us in Google Now using heart rate alone? Could they push notifications to people who are diabetic to remember to take insulin? Could they diagnose diabetes? Could they flag heart problems before it is too late? I have no idea, but I'm excited to see where things go in the next few years.