Yo! Minimalist Notifications, API and the Internet of Things

Thought it was a joke, but 4 hours of coding resulting in $1m of VC funding, at an estimated $10m company valuation, raised quite a few eyebrows. The Yo! project team have now released their API, and with it some possibilities over and above the initial ability to just say “Yo!” to a friend. At the time he provided some of the funds, John Borthwick of Betaworks said that there is a future in delivering binary status updates, or even commands to objects to throw an on/off switch remotely (blog post here). The first green shoots are now appearing.

The main enhancement is the ability to carry a payload with the Yo!, such as a URL. Hence your Yo!, when received, can be used to invoke an application or web page with a bookmark already put in place. That provides a notification, effectively guaranteed to have arrived, that says “look at this”. Probably extensible to all sorts of other tasks.

The other big change is the provision of an API, which allows anyone to create a Yo! list of people to notify against a defined name. So, in theory, I could create a virtual user called “IANWARING-SIMPLICITY-SELLS” and publicise it to my blog audience. If any user wants to subscribe, they just send a “Yo!” to that user, and bingo, they are subscribed and it is listed (as another contact) on their phone handset. If I then release a new blog post, I can use a couple of lines of JavaScript or PHP to send the notification to the whole subscriber base, carrying the URL of the new post; one key press to view. If anyone wants to unsubscribe, they just drop the username on their handset, and the subscriber list updates.
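Those couple of lines look something like the Python sketch below (JavaScript or PHP would be similarly short). The endpoint and parameter names (`api.justyo.co/yoall/`, `api_token`, `link`) follow Yo’s API documentation as published at the time, so treat them as illustrative rather than authoritative; the token and URL are placeholders:

```python
import json
import urllib.request

# Endpoint per Yo's originally published API docs; may have changed since.
YO_ALL_URL = "https://api.justyo.co/yoall/"

def build_yoall_request(api_token, link):
    """Build the POST request that Yo!'s everyone subscribed to your
    virtual user, carrying a URL payload (e.g. a new blog post)."""
    payload = {"api_token": api_token, "link": link}
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        YO_ALL_URL,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_yoall_request("MY-API-TOKEN", "https://example.com/new-post")
# urllib.request.urlopen(req)  # uncomment to actually send the Yo!
```

The send itself is left commented out; everything up to that point just assembles the request, so it can be inspected or tested without hitting the network.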

Other applications described include:

  • Getting a Yo! when a FedEx package is on its way
  • Getting a Yo! when your favourite sports team scores – “Yo us at ASTONVILLA and we’ll Yo when we score a goal!”
  • Getting a Yo! when someone famous you follow tweets or posts to Instagram
  • Breaking News from a trusted source
  • Tell me when this product comes into stock at my local retailer
  • To see if there are rental bicycles available near to you (it can Yo! you back)
  • You receive a payment on PayPal
  • To be told when it starts raining in a specific town
  • Your stocks positions go up or down by a specific percentage
  • Tell me when my wife arrives safely at work, or our kids at their travel destination

but I guess there are other “Internet of Things” applications to switch on home lights, open garage doors, switch on (or turn off) the oven. Or to Yo! you if your front door has opened unexpectedly (carrying a link to the picture of who’s there?). Simple one click subscriptions. So, an extra way to operate Apple HomeKit (which today controls home appliance networks only through Siri voice control).

Early users are showing simple RESTful URLs and HTTP GET/POSTs to trigger events via the Yo! API. I’ve also seen someone say that it will work with CoAP (the Constrained Application Protocol), a lightweight protocol stack suitable for use within simple electronic devices.

Hence, notifications that are implemented easily and over which you have total control. Something Apple appear to be anal about, particularly in a future world where you’ll be walking past low energy bluetooth beacons in retail settings every few yards. Your appetite to be handed notifications will degrade quickly with volumes if there are virtual attention beggars every few paces. Apple have been locking down access to their iBeacon licensees to limit the chance of this happening.

The Yo! API thus gives us the first of many notification services (alongside Google Now and Apple’s own notification services), and a simple one at that. One that can be mixed with IFTTT (“if this, then that”), a simple web-based logic and task automation service. And one that may well be accessible directly from embedded electronics around us.

The one remaining puzzle is how the authors will be able to monetise their work (their main asset is an idea of the type and frequency of notifications you welcome receiving, and that you seek). Still a bit short of Google’s core business (which historically was to monetise purchase intentions) at this stage in Yo!’s development. So, suggestions in the case of Yo! most welcome.

 

Microbiomes and a glimpse of doctors becoming a small niche

Microbiomes, Gut and Spot the Salmonella

When I get up in the morning, I normally follow a path on my iPad through email, Facebook, LinkedIn, Twitter, Google+, Feedly (for my RSS feeds) and Downcast (to update my podcasts for later listening). This morning, Kevin Kelly served up a comment on Google+ that piqued my interest, and that led to a long voyage of discovery. Much to my wife’s disgust, as I quoted gory details about digestive systems while she was trying to eat her breakfast. He said:

There are 2 reasons this great Quantified Self experiment is so great. One, it shows how important your microbial ecosystem is. Two, it shows how significant DAILY genome sequencing will be.

He then gave a pointer to an article about Microbiomes here.

The Diet Journey

I’ve largely built models based on innocent attempts to lose weight, dating back to late 2000 when I tried the Atkins diet. That largely stalled after 3 weeks and one stone of loss. I was then fairly liberated in 2002 by a regime at my local gym, where I was introduced (as part of a six-week program) to the Weight Loss Resources website. This got me into the habit of recording my food intake and exercise very precisely, translating branded foods and weights into a daily intake of carbs, protein and fat. That gave me my calorie consumption and nutritional balance, tracked alongside weekly weight readings. I’ve kept that data flowing for over 12 years now, and it continues to this day.

Things I’ve learnt along the way:

  • Weight loss is heavily dependent on me consuming fewer calories than my Basal Metabolic Rate (BMR), while keeping the energy derived from carbs, protein and fat at a specific balance (50% from carbs, 20% from protein, 30% from fat)
  • 1g of protein is circa 4.0 Kcals, 1g of carbs around 3.75 Kcals, and fat around 9.0 Kcals.
  • Muscle is denser than fat, so the same volume of it weighs considerably more
  • There is a current fixation at gyms with upping your muscle content at first, nominally to increase your energy burn rate (even at rest)
  • The digestive system is largely first in, first out; protein is largely processed in acidic conditions, and carbs later down the path in alkaline equivalents. Fat is used as part of both processes.
  • There are a wide variety of symbiotic (opposite of parasite!) organisms that assist the digestive process from beginning to end
  • Weight loss is both heat and exhaust. Probably other forms of radiation too, given we are all like a light bulb in the infrared spectrum (I always wonder how the SAS manage to deploy small teams in foreign territory and remain, for the most part, undetected)
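The first two bullets above are enough to turn a daily calorie target into gram targets per macronutrient. A minimal sketch using those figures (the 2,000 kcal target is just an example):

```python
# Energy densities from the list above (UK labelling convention for carbs).
KCAL_PER_GRAM = {"carbs": 3.75, "protein": 4.0, "fat": 9.0}

# Target energy split: 50% from carbs, 20% from protein, 30% from fat.
SPLIT = {"carbs": 0.50, "protein": 0.20, "fat": 0.30}

def grams_for_target(total_kcal):
    """Convert a daily calorie target into grams of each macronutrient."""
    return {
        macro: round(total_kcal * share / KCAL_PER_GRAM[macro], 1)
        for macro, share in SPLIT.items()
    }

# For a 2,000 kcal day: roughly 266.7 g carbs, 100 g protein, 66.7 g fat.
grams = grams_for_target(2000)
```

Recording intake against targets like these is essentially what the Weight Loss Resources site automated.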

I’ve always harboured a suspicion that taking antibiotics has an indiscriminate carpet-bombing effect on the population of microbes there to assist you. Likewise the effect of what used to be my habit of drinking (very acidic) Diet Coke. But I’d never seen anyone classify the variety and numbers of microbes present, and track this over time.

The two subjects of the article had the laboratory resources to examine samples of their own saliva and their own stool samples, and to map things over time. Fascinating to see what happened when one of them suffered Salmonella (the green in the above picture), and the other got “Delhi Belly” during a trip abroad.

The links around the article led to other articles in National Geographic, including one where the author reported a much wider analysis of the microbiomes found in 60 different people’s belly buttons (here) – he had a zoo of 58 different species in his own. And then to another article where the existence of certain microbiome mutations in the bloodstream was an excellent leading indicator of the presence of cancerous tumours in the individual (here).

Further dips into various Wikipedia articles cited examples of microbiome populations showing up in people suffering from various debilitating illnesses such as ME, Fibromyalgia and Lyme disease, in some instances directly driving imbalances that cause depression. Separately, that what you ate often altered the relative sizes of parts of the microbiome population in short order.

There was another article that suggested new research was going to study the Microbiome Zoo present in people’s armpits, but I thought that an appropriate time to do an exit stage left on my reading. Ugh.

Brain starts to wander again

Later on, I reflected for a while on how I could apply some skills I’ve got to building up data resources – at least should suitable sensors be able to measure, sample and sequence microbiomes systematically every day. I have the mobile phone programming, NoSQL database deployment and analytics skills. But what if we had sensors that everyone could carry 24/7 that could track the microbiome zoo that is you (internally, and I guess externally too)? Load the data resources centrally, and I suspect the Wardley Map of what is currently the NHS would change fundamentally.

I also suspect that age-old Chinese medicine will demonstrate its positive effects on further analysis. It was about the only thing that solved my wife’s psoriasis on her hands and feet; she was told about the need to balance yin/yang and remove heat to put things back to normal, which was achieved by consuming various herbs and vegetation. It would have been fascinating to see how the profile of her microbiomes changed during that process.

Sensors

I guess the missing piece is the ability to have sensors that can help both identify and count the types of microbes present on a continuous basis. It looks like a laboratory job at the moment. I wonder if there are other characteristics or conditions that could short-cut the process. Health apps about to appear from Apple and Google initiatives tend to be effective at monitoring steps and heart rate. There looks to be provision for sensing blood glucose levels non-invasively by shining infrared light on certain parts of the skin (the inner elbow is a favourite); meanwhile Google have patented contact lenses that can measure glucose levels in the blood vessels of the wearer’s eyes.

The local gym has a Boditrax machine that fires an electrical signal up one foot, senses the signal received back through the other, and can derive body water, muscle and fat content. Not yet small enough for a mobile phone. And Withings produce scales that can report weight back to the user’s handset over Bluetooth (I sometimes wonder if the jarring of the body as you tread could let a handset’s sensors deduce approximate weight, but that’s for another day).

So, the mission is to see if anyone can produce sensors (or an edible, communicating pill) that can effectively work, in concert with someone’s phone and the interwebs, to reliably count and identify biome mixes and to store these for future analysis, research or notification purposes. Current research appears to be in monitoring biome populations in:

  1. Oral Cavity
  2. Nasal
  3. Gastrointestinal Organs
  4. Vaginal
  5. Skin

each with its own challenges in providing a representative sample surface sufficient for regular, consistent and accurate readings – if indeed we can miniaturise or simplify the lab process reliably. The real progress will come when we can do this and large populations can be sampled, and cross-referenced with any medical conditions that become apparent in the data providers. Skin and the large intestine appear to have the most interesting microbiome profiles to look at.

Long term future

The end result, if done thoroughly, is that the skills and error rates of GP-provided treatment would become largely relegated, just as farm work was in the 19th century (which went from 98% of the population working the land to less than 2% within 100 years).

With that, I think Kevin Kelly is 100% correct in his assessment – that the article shows how significant DAILY genome sequencing will be. So, what do we need to do to automate the process, and make the fruits of its discoveries available to everyone 24/7?

Footnote: there look to be many people attempting to automate subsets of the DNA/RNA identification process. One example highlighted by MIT Technology Review today being this.

How that iPhone handset knows where I am

I’ve done a little bit of research to see how an Apple iPhone tracks my location – at least once I’m running iOS 8 later this autumn. It looks like it picks clues up from lots of places as you go:

  1. The signal from your local cell tower. If you switch your iPhone on after a flight, that’s probably the first thing it sees. This is what the handset uses to set your timezone and adjust your clock immediately.
  2. WiFi signals. As with Google, there is a location database that translates WiFi router MAC addresses into an approximate geographic location where they’ve been sensed before. At least for the static ones.
  3. The Global Positioning System sensors, which work with both the US GPS and Russian GLONASS satellite networks. If you can stand in a field and see the horizon all around you, your phone may have up to 14 satellites visible. Operationally, if it can see three, it can fix your x and y co-ordinates; if it can see four or more, it gets x, y and z co-ordinates, enough to give your elevation above sea level as well, typically to within a few metres.
  4. Magnetometer and gyroscope. The iPhone has an electronic compass and a gyroscope inside, so the system software can sense your direction, orientation (in 3D space) and movement. So, when you move from outdoors to an indoor location (like a shopping centre or building), the iPhone can remember the last known accurate GPS fix, and deduce your current position based on your direction and speed since that last sample.
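Apple’s actual sensor fusion is far more sophisticated than this, but the core dead-reckoning step described in point 4 can be sketched simply: advance the last known fix by speed × time along the compass heading. A minimal illustration (all values here are made up):

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt_s):
    """Advance a last-known (x, y) fix in metres using a compass heading
    and walking speed. Compass convention: 0 deg = north (+y),
    90 deg = east (+x)."""
    theta = math.radians(heading_deg)
    return (x + speed_mps * dt_s * math.sin(theta),
            y + speed_mps * dt_s * math.cos(theta))

# Walk east at 1.5 m/s for 10 s from the last GPS fix at (0, 0):
x, y = dead_reckon(0.0, 0.0, 90.0, 1.5, 10.0)
```

In practice the error of such an estimate grows the longer you go without a fresh fix, which is why the reported error scale matters.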

The system software on iOS 8 just returns your location and an indication of error scale based on all of the above. For some reason, indoor positioning with the gyroscope gives high resolution for your x and y position, but returns the z position as a floor number only (0 being the ground floor, -1 one below that, 1 the first floor above, and so on).

In doing all the above, if it senses you’ve moved indoors, then it shuts down the GPS sensor – as it is relatively power hungry and saves the battery at a time when the sensor would be unusable anyway.

Beacons

There are a number of applications where it would be nice to sense your proximity to a specific location indoors, and to do something clever in an application. For example, when you turn up in front of a Starbucks outlet, Apple Passbook could put your loyalty/payment card onto the lock screen for immediate access; same with a Virgin Atlantic check-in desk, where Passbook could bring up your boarding pass in the same way.

One of the ways of doing this is to deploy low energy bluetooth beacons. These normally advertise three numbers: a 128-bit UUID specific to the licensee (such as “Starbucks”), plus 16-bit “major” and “minor” values that the licensee assigns. These may identify a specific outlet in their own application’s database, or a department location within a store. It is up to the company deploying the beacons to encode this for their own iPhone applications (and to reflect the positions of the beacons in their app if they redesign their store or location layouts).

Your iPhone can sense beacons around it to four levels:

  1. I can’t hear a beacon
  2. I can sense one, but I’m not close to it yet
  3. I can sense one, and I’m within 3 metres (10 feet) of it right now
  4. I can sense one, and my iPhone is immediately adjacent to the beacon

Case (4) is for things like cash register applications. Cases (2) and (3) are probably good enough for your store-specific application to get fired up as you approach.
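The handset works these levels out internally, but the idea can be sketched with a standard log-distance path-loss model that converts received signal strength into an approximate range. The calibrated 1-metre power and path-loss exponent below are typical assumed values, not Apple’s actual algorithm:

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough range in metres from received signal strength, using a
    log-distance path-loss model. tx_power_dbm is the calibrated signal
    strength at 1 m that beacons advertise; 2.0 is free-space loss."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def proximity_level(rssi_dbm):
    """Map signal strength to the four levels described above."""
    if rssi_dbm is None:
        return 1  # can't hear a beacon
    d = estimate_distance(rssi_dbm)
    if d < 0.5:
        return 4  # immediately adjacent (e.g. cash register)
    if d < 3.0:
        return 3  # within ~3 metres
    return 2      # sensed, but not close yet
```

Real readings fluctuate wildly indoors (see the 2.4GHz caveats below), so implementations smooth over many samples rather than trusting a single reading.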

There are some practical limitations, as low energy bluetooth uses the same 2.4GHz spectrum that WiFi does, and hence suffers the same restrictions. That frequency agitates water molecules (as a microwave oven does), which is part of the reason it was picked for indoor applications; things like rain, moisture in walls and indeed human beings standing in the signal path tend to attenuate the signal strength quite dramatically.

The iPhone 5S itself has an inbuilt Low Energy Bluetooth Beacon, but in line with the way Apple protect your privacy, it is not enabled by default. Until it is explicitly switched on by the user (who is always given an ability to decline the location sharing when any app requests this), hardware in store cannot track you personally.

Apple appear to have restricted licensees to using iBeacons for their own applications only, so only users of Apple iOS devices can benefit. There is an alternative “Open Beacon” effort in place, designed to enable applications that run across multiple vendor devices (see here for further details).

The Smart Watch Future

With the recent announcement and availability of various Android watches from Samsung, LG and Motorola, it’s notable that they all appear to have a compass and gyroscope but no GPS (I’ve got to guess, for reasons of limited battery capacity and the GPS sensor’s power appetite). Hence I expect that any direction-sensing smartwatch application will need to talk to an app on the mobile phone handset in the user’s pocket, over low energy bluetooth. Once established, the app on the watch will know the device’s orientation in 3D space and the direction it is headed; probably enough to keep pointing you towards a destination correctly as you walk along.

The only thing we don’t yet know is whether Apple’s own rumoured iWatch will break the mould, or, like its Android equivalents, act as a peripheral to the network hub that is the user’s phone handset. We should know later on this year.

In the meantime, it’s good to see that Apple’s model is to protect the user’s privacy unless they explicitly allow a vendor app to track their location, which they can agree to or decline at any time. I suspect a lot of vendors would like to track you, but Apple have picked a very “it’s up to the iPhone user and no-one else” approach – for each and every application, one by one.

Footnote: Having thought about it, I think I missed two things.

One is that I recall reading somewhere that if the handset battery is running low, the handset will bleat its current location to the cloud. Hence if you dropped your handset and it was lost in vegetation somewhere, it would at least have logged its last known geographic location, so the “Find my iPhone” service could pinpoint it as best it could.

Two is that there is a visit history stored in the phone, so your iPhone’s travels (locations, timestamps, length of time stationary) are logged as a series of move vectors between stops. These are GPS-type locations, not mapped to any physical location name or store identifier (or even a position within a store!). The user has to give specific permission for this data to be exposed to a requesting app. Besides remembering distances for expenses, I can think of few user-centric applications where you would want to know precisely where you’ve travelled in the last few days. Maybe a version of the “secret” app available for MacBooks, where if you mark your device on a cloud service as having been stolen, you can get specific feedback on its movements since.
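The “distances for expenses” use is straightforward to sketch: sum the great-circle legs between successive stops in the visit history. The co-ordinates below are made up for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def total_distance_m(stops):
    """Sum the legs between successive (lat, lon) stops."""
    return sum(haversine_m(*a, *b) for a, b in zip(stops, stops[1:]))

# Two stops 0.01 degrees of latitude apart: roughly 1.1 km
legs = [(51.50, -0.12), (51.51, -0.12)]
```

Straight-line legs understate road distance, of course, but they are a reasonable lower bound for a mileage claim.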

The one thing that often bugs me is people putting out calls on Facebook to help find their stolen or mislaid phones. Every iPhone should have “Find my iPhone” enabled (which is offered at iOS install customisation time) or the equivalent for Android (Android Device Manager) activated likewise. These devices should be difficult to steal.