New Learnings, 12 week Boditrax Challenge, still need Tableau

The Barn Fitness Club Cholsey

One of the wonderful assets at my excellent local gym – The Barn Fitness Club in Cholsey – is that they have a Boditrax machine. This looks like a pair of bathroom scales with metal plates where you put your feet, hooked up to a PC. It bounces a small charge through one foot and measures the signal that comes back through the other. Measuring your weight at the same time and having previously been told your age, it can then work out the composition of your body in terms of fat, muscle, water and bone. The results are dropped on the Boditrax web site, where you can monitor your progress.

For the last 12 weeks, the gym has run a Boditrax challenge. Fortunately, I pay James Fletcher for a monthly Personal Training session there, where he takes readings using this machine and adjusts my 3x per week gym routine accordingly. The end results after 12 weeks are below (the top graph shows my weight progress, the bottom my composition changes):

Boditrax Challenge Ian W Weight Tracking

Boditrax Challenge Ian W Final Results

The one difference from previous weight loss programmes i’ve followed is the amount of weight work i’d been given this time around. I was always warned that muscle weighs considerably more than fat, so to stick to cardio work and avoid adding it. The thinking these days appears to be to increase your muscle mass a little, which increases your metabolic rate – so you burn more calories, even at standstill.

The one thing i’ve done since June 3rd 2002 is to tap my food intake and exercise daily into the excellent Weight Loss Resources web site. Hence I have a 12 year history of exact figures for fat, carbs and protein intake, weight and corresponding weight changes throughout. I used these in a recent Google Course on “Making sense of Data”, which used Google Fusion tables, trying to spot what factors led to a consistent 2lbs/week weight loss.

There are still elements of the storyboard I need to fit in to complete the picture, as Fusion Tables can draw a scatter plot okay, but can’t throw a weighted trend line through that cloud of data points. This would give me a set of definitive stories to recite; what appears so far is that I make sustainable 2lbs/week losses below a specific daily calorie value if I keep my carbs intake down at a specific level at the same time. At the moment, i’m tracking at around 1lb/week, which is half the rate I managed back in 2002-3 – so i’m keen to expose the exact numbers I need to follow. Too much, no loss; too little, the body goes into a siege mentality – and hence the need for a happy medium.

I tried to get a final fix on the exact nett intake and carb levels in Google Spreadsheets, which isn’t so adept at picking data subsets with filters – so you end up having to create a spreadsheet for each “I wonder if” question. So, i’ve largely given up on that until I can get my mitts on a Mac version of Tableau Desktop Professional, or can rent a Windows Virtual Desktop on AWS for $30 for 30 days to do the same on its Windows version. Until then, I can see the general picture, but there are probably many data points from my 3,800+ days of sampled data that plot on top of each other – hence the need for the weighted trend line in order to expose the definitive truth.
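In the meantime, the kind of weighted trend line i’m after is simple enough to sketch in code. A minimal example, assuming a CSV export of my Weight Loss Resources history with made-up column names (net_calories and weight_change_lbs), where the weights stop overlapping points from being under-counted:

import pandas as pd
import numpy as np

df = pd.read_csv("wlr_history.csv")  # hypothetical export of the logged figures

# Round net intake to the nearest 50 kcal so near-identical entries stack up,
# then weight each distinct point by how often it occurs.
df["cal_bin"] = (df["net_calories"] / 50).round() * 50
grouped = df.groupby("cal_bin")["weight_change_lbs"].agg(["mean", "count"]).reset_index()

# Weighted least-squares fit of weekly weight change against net daily calories.
slope, intercept = np.polyfit(grouped["cal_bin"], grouped["mean"], deg=1, w=grouped["count"])

# Net daily intake at which the trend line predicts a 2 lbs/week loss.
print((-2.0 - intercept) / slope)

Tableau would do the same with a couple of clicks, which is rather the point.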

The nice thing about the Boditrax machine is that it knows your muscle and fat composition, so it can give you an accurate reading for your BMR – your Basal Metabolic Rate. This is the minimum level of energy your body needs at rest to function effectively, including running your respiratory and circulatory systems, nervous system, liver, kidneys and other organs. This is typically circa 70% of your daily calorie intake, the balance being used to power you along.

My BMR according to the standard calculation method (which assumes a ‘typical’ % muscle content) runs about 30 kcals under what Boditrax says it actually is. So, I burn an additional 30 kcals/day due to my increased muscle composition since James Fletcher’s training went into place.
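Boditrax don’t publish their exact method, but as an illustration of why knowing your composition moves the number, compare the widely used Mifflin-St Jeor estimate (which assumes a ‘typical’ build) with the Katch-McArdle one (which works from lean body mass directly) – the figures below are made up purely to show the principle:

def bmr_mifflin_st_jeor(weight_kg, height_cm, age_years, male=True):
    # Standard estimate - assumes a 'typical' body composition.
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_years
    return base + 5 if male else base - 161

def bmr_katch_mcardle(lean_mass_kg):
    # Composition-aware estimate - driven entirely by lean (muscle etc.) mass.
    return 370 + 21.6 * lean_mass_kg

print(bmr_mifflin_st_jeor(90, 180, 55))   # the 'typical build' guess
# A kilogram and a half of extra muscle at the same overall weight nudges the
# composition-aware figure up by roughly 30 kcal/day - the sort of gap I'm seeing.
print(bmr_katch_mcardle(62.0))   # before
print(bmr_katch_mcardle(63.5))   # after a little more muscle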

Still a long way to go, but heading in the correct direction. All I need now is that copy of Tableau Desktop Professional so that I can work out the optimum levels of calorie and carbs intake to maximise the long term, relentless loss – and to ensure I track at those levels. In the meantime, i’ll use the best case I can work out from visual inspection of the scatter plots.

I thoroughly recommend the Barn Fitness Club in Cholsey, use of their Boditrax machine and regular air time with any of their Personal Trainers. The Boditrax is only £5/reading (normally every two weeks) and an excellent aid to help achieve your fitness goals.

Just waiting to hear the final result of the 12 week Boditrax challenge at the Club – and to hope i’ve done enough to avoid getting the wooden spoon!

Boditrax Challenge Home Page

 

In the meantime, it’s notable that my approx nett calorie intake level (calories eaten less exercise calories) to lose 2lbs/week appears to be my current BMR – which sort of suggests the usual routine daily activity I don’t log (walking around the home, work or shops) is sufficient to hit the fat reserves. An hour of time with Tableau on my data should be enough to confirm if that is demonstrably the case, and the level of carbs I need to keep to in order to make 2lbs/week a relentless loss trend again.
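For a rough sanity check on that, here’s the back-of-envelope sum, using the common (and admittedly approximate) 3,500 kcal per pound of body fat rule of thumb:

KCAL_PER_LB_FAT = 3500      # rule-of-thumb energy content of a pound of body fat
daily_deficit = 2 * KCAL_PER_LB_FAT / 7
print(daily_deficit)        # ~1000 kcal/day below total daily burn for 2 lbs/week

# So if net logged intake (food eaten less logged exercise) sits at BMR, the routine
# unlogged activity on top of BMR needs to account for roughly that 1000 kcal/day.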

Announcing DECola (then compare to how you buy Cola)

Digital Equipment Corp Logo

I had another of those days when simple things irritate me – nominally because the designers of some software went off designing something I use with no appreciation of what happens when someone just wants to get something done.

The first was to add some capabilities to a web site so that users could avoid creating yet another identity to log in to one of my customers’ web sites; so, let’s give them the ability to log in using their Facebook, LinkedIn, Google+ or Twitter credentials instead. Simples! I put the add-in into WordPress to enable this, which then left me to register as a developer on each site and retrieve an API key and an API secret (effectively the username and password that identify my login application as the one programmer me wired in). Facebook, LinkedIn, Twitter – easy peasy. Google? I gave up after trying to find the API secret for an hour.
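For anyone wondering what the key and secret actually do, here’s a minimal sketch of a bog-standard OAuth2 authorisation-code exchange – placeholder endpoints and values throughout, since each provider documents its own, and the WordPress add-in handles all of this behind the scenes anyway:

import requests

CLIENT_ID = "my-api-key"          # the "API key" registered with the provider
CLIENT_SECRET = "my-api-secret"   # the "API secret" - identifies my application, not the user
REDIRECT_URI = "https://mysite.example.com/callback"

# Step 1: send the user off to the provider's own login/consent page.
auth_url = ("https://provider.example.com/oauth/authorize"
            f"?client_id={CLIENT_ID}&redirect_uri={REDIRECT_URI}"
            "&response_type=code&scope=profile")

# Step 2: the provider redirects back with a one-time code, which gets swapped for a token.
def exchange_code_for_token(code):
    response = requests.post("https://provider.example.com/oauth/token", data={
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "redirect_uri": REDIRECT_URI,
    })
    return response.json().get("access_token")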

The second one was this morning, with me valiantly searching for a proper recycling bin in Reading to dispose of two spent HP ink cartridges while Jane was shopping elsewhere in the town. I thought she was a long time contacting me, so I fished my Nexus 5 phone out of my pocket, and lo and behold – a bar on my screen indicating a missed call from Jane. I didn’t have my glasses on, so just pressed the notification expecting it to immediately return her call. What did it do? I got a complete (and, to me, fuzzy) mess of a screenful of options, which offered me opportunities to contact her in a wide variety of ways – but no obvious one that suggested it would place a phone call. So, out came my glasses, I looked at the screen, and I still couldn’t work it out. SMS, Hangout (video call!), Email… then in one area was her name (repeated twice), once with her current phone number, once with her old O2 phone number – so I pressed what I thought was her current one. Bingo – up popped her picture, and it duly rang her.

WTF. Isn’t it obvious that if I have a missed call, the thing that 99.9% of users seeing it will do is call right back? After a brief wish that the 20 year old Google employee who wrote the code should be sentenced to wearing glasses that degrade their sight like someone a bit older, and to test the usual usage patterns for a day or two with them on, I thought – this reminds me of some things at Digital.

One personal case was doing the DECdirect Software Catalogue – where we aimed to reduce the time to look up the part number and price of any of the 250+ products we sold (and over 40,000 part numbers!) to something that could be achieved within the normal attention span of a good salesperson (around 30 seconds). We distilled that down until we hit that 30 second goal every time, often faster.

Ken Olsen (CEO at Digital for 16 of the 17 years I worked there, and many years before that) had a habit of issuing long parables, some of which we spent time trying to decode into something applicable to us working at a computer manufacturer. He would decide he needed to dig a short trench in his back garden in Lincoln (Massachusetts in this case), pop into the local Ford tractor dealer, and try to buy something to give him what he wanted. I should probably note here that he was a Main Board Director at Ford at the time as well. The following Monday, out would come a parable about going through a tortuous sales process, where he was expected to know the dimensions of the trench and all sorts of detail about the type of soil – and that was even before he got subjected to all the different tractor models and payment options available. He closed off the text saying that he often sees that type of situation inside our company, and that we need to fix it.

So, off went a round of debate trying to assess what he really meant. In this case, I think he ended up running an offsite (known there as a “Woods Meeting”, due to these often being held in a hut in the New England forests) and getting the assembled VPs to order a Minicomputer from Manufacturing, which was duly delivered to where they were meeting. And then he invited them to build it from the parts shipped, just like a customer would. You didn’t have to wait more than a day before VP Management edicts started raining down across the organisation – to vastly simplify the whole installation process for customers.

Unfortunately, I can’t offer anyone at Google a visit to such a Woods Meeting. All I can do is give one lurid example from someone who got fed up with the complex way we used to tell salespeople how to order a system for their customers. That person wrote a spoof article, styled in exactly the same structure as articles that appeared in the monthly field sales magazine, “Sales Update”. In it, they announced a new bottled cola drink called DECola (you may need to click on the image to make the text big enough to read):

DECola Sales Update Article

DECola Sales Update Page 2 - Ordering Table

If I ever got close to the developers of the Google login API key/secret developer website, or of the “Missed Call” flow on an Android handset, i’d be sorely tempted to send them these two pages, Ken Olsen style.

“OK Google. Where did I park my car?”

Google Now "Where did I Park my Car?" CardThere appears to be a bit of controversy with some commentators learning exactly what “Favorite Locations” are, as stored by every iPhone handset. What happens is that the number of visits to common locations are recorded, from which, based on time spans and days of week, Apple can deduce your “normal” working location and the address at which you sleep most nights. This is currently stored only in your iPhone handset and apparently not yet used; it is designed to enable services to advise you of traffic conditions to and from work, to be used at some point in the future.

The gut reaction is “Whey! They can see exactly where i’m going all the time!”. Well, yes, your handset can; GPS co-ordinates are usually good for an approximate location to a meter or two, you have a compass in there that indicates which way you’re facing, and various accelerometers that can work out the device’s orientation in three dimensions. The only downside is that the full mix tends to be heavy on battery power, and is hence used fairly sparingly by applications on the phone.

Some privacy concerns then started to arise. However, I thought it was fairly common knowledge that mobile phone operators (certainly in the USA) could deduce the locations of spectators as being inside a sports stadium, and tell the stadium owners the basic demographics of the people present and the locations from which they travelled to the event. This sort of capability will extend to low power bluetooth beacons positioned in retail outlets, which, armed with a compatible application (and your permission to share your data), will give retailers analysis gold. Full coverage, 365 days a year, to a level that doesn’t need Paco Underhill class analysis (Paco is the author of the seminal book “Why We Buy: The Science of Shopping”, itself based on years of analysis of customer behaviour in and around retail establishments).

I think i’m fairly cool with it all. Google Android handsets can already sense internally whether you are walking, cycling, on a bus or driving in a car. The whole premise of Google Now is to do searches or to provide a service to you before you have to explicitly ask for it. I got quite used to my Nexus phone routinely volunteering commute traffic conditions before I got in my car, or warning me to leave earlier to make an appointment in time given current driving (or bus service) conditions on the route I usually took. I was also very impressed when I walked past a bus stop in Reading and Google Now flashed up the ETA and destination of the next bus, and a summary of the timetable for buses leaving from that stop.

Google have just released another card on Google Now that automatically notes where you parked your car, and navigates you back to it if you feel the need for it to do so later on.

All of this is done with your explicit permission, and one of the nice things on Android is that if a software vendor’s data policies change in any way, it will not allow the update that enables that functionality through without explicitly asking you for permission first. Hence I knocked LinkedIn off my Nexus 5 when they said an update would enable them to collect data on who I was calling and receiving calls from. I thought that was unnecessary for the service I receive (and pay for) from them.

The location services i’m sharing with a small number of vendors are already returning great benefit to me. If that continues, and service providers are only intrusive enough to help deliver a useful service to me, then i’m happy to share that data. If you don’t want to play, that’s also your call. What’s not to like?

Becoming More Efficient; Moonshot scale ideas available

 

Efficiency Straight Ahead

The statistics below are from an unashamed promotion of a new book, but I thought they were well articulated. The authors cite them as food for thought:

Examples of Energy Inefficiency

  • The average car spends more than 95 percent of its time …. doing nothing.
  • Less than 40 percent of electrical transmission capacity is in use at any given time.
  • A calorie of beef requires 160 times more energy to produce than one of corn—and as the world grows richer, more people eat beef.
  • The cost of bringing an oil well online has more than tripled over the last decade.
  • A Motorway operating at peak throughput is less than 10 percent covered with cars.
  • Phnom Penh has a lower water leakage rate than London.

There used to be a very small detached house just inside Pamber Forest which I used to pass daily, and often wondered whether I could live quite happily in such a small place. Not quite as extreme as the Capsule Hotels you find in some areas of Japan, but a step in that direction nonetheless. This would probably mean quite a ruthless clean of the miscellaneous stuff we have all over the current house, but i’m sure there would be impressive efficiencies if we knuckled down to it.

The good thing about looking at stats like this is that you start having thoughts of what Larry Page (CEO of Google) terms “Moonshots”: what could be done to improve things 10x, 100x or 1000x over what the rest of us consider normal, and what changes would that lead us to?

The authors feel that there’s a lot of waste in the status quo, and thus a great chance to produce and use resources much more effectively. But they don’t think it means that the sky is falling, or that our grandchildren are fated to inherit a poisoned, angry, gloomy planet. That is also the argument of their new book, Resource Revolution: How to Capture the Biggest Business Opportunity in a Century, by McKinsey’s Stefan Heck and Matt Rogers.

My brain starts to wander at this point, and I still have this nagging feeling that all the books in my bookcase could be summarised down to 1-2 pages each if people really tried – or fewer than 30 if examples are cited. One of the neat things about Kindle Books is that Amazon actually allow you to produce and sell stuff at that length; The Bitcoin Primer: Risks, Opportunities, And Possibilities book I purchased was an excellent 27 page read.

In terms of manufacturing (I guess they must be manufacturing consultants by day), they suggest looking at five areas: substitution (replacing costly, clunky, or scarce materials with cheaper, better ones); optimization (using IT to improve the production and use of resources – to order rather than into stock?); virtualization (which must really mean sweating otherwise idle assets?); circularity (finding value in products after their initial use) and waste elimination.

However, they then start citing “having to deal with more complex supply chains”, while integrating “big data” (hmmm – fad alert!) and finding diverse talent with new skills in areas like software and system integration (which I thought were pretty well established!).

They conclude, “any bet that we will succumb to a global economic crisis is a bet against human ingenuity. No such bet has ever paid off.”

It looks an interesting book nonetheless, and i’m sure there are some good nuggets to pick at. Duly added to my Wish List.

Watches? Give me a Hearing Aid that knows when to psst… in my ear

iWatch Concept Devices

Speculation is still rife on the nature of Apple’s upcoming iWatch device, the latest suggesting a $1000 price tag or a positioning against Rolex. If so, I may need quite a bit of advance warning before Jane sends me to collect hers (if indeed Apple release such a device).

Probably the best overview of the watch industry i’ve heard was a Cubed Podcast featuring Bill Geiser, the CEO of MetaWatch, who previously worked for Fossil and, before that, for Sony on their email capable watch ranges. If you have a spare hour on a car or train journey, it’s well worth the listen; it’s Episode 11 of the Cubed Podcast, downloadable from iTunes or listen here.

One of the statistics Bill cites is that the watch market is worth circa $1.2Billion per annum, with 85% of this revenue attributable to watches costing more than $500. He is also at pains to point out that they are a very visible fashion accessory, have many variations and focus on doing just one thing well – which is telling the time. A lot of forays into putting more intelligence into them in the past have failed to make a large impact.

Since the time of that Podcast, Pebble have come out with the second iteration of their popular watch (known as the “Pebble Steel”), Samsung have sprung out two attempts at their Samsung Gear, and Motorola (who are in the middle of transitioning ownership from Google to Lenovo) have “pre-announced” their Moto 360 concept device.

The Motorola concept looks impressive (the core competence of high technology companies is normally far removed from consumer-attractive fashionable design). A few samples are as follows (you’ll need to click on these images to blow them up to full size in order to see them animate properly – or alternatively, see all the related demos at https://moto360.motorola.com/):

Moto 360 Speed Reading

 

Moto 360 Set Alarm

The only gotcha is that space constraints usually limit the size of battery you can install in these devices, and the power required to drive the display and supporting electronics – while doing any of these applications – will empty that capacity in minutes. The acceptable norm would be at least a working day. As someone who’s found their phone running out of power while trying to navigate around unfamiliar streets in Central and West London, this is something of a show stopper. So these Moto 360 concepts appear to be destined for science fiction only, as modern day physics stops them becoming a reality – yet.

So, at face value, we may need new display technologies, and/or new batteries, and/or moving as much as possible away from the wrist and into powered packaging elsewhere on a person. I’m not sure if you can cast the display (like a TV using Google Chromecast, or using Apple Airplay) over low power Bluetooth, or if there are other charging mechanisms that could feed a decent display using the movement of the user, or daylight.

It’ll be interesting to see what Apple come to market with, but we may all have it wrong and find their device is a set of health sensors coupled with a simple notifications system.

While technologists may think a watch spewing the already compelling “Google Now” type notifications would be impressive, many should be reminded that looking at your watch in a meeting is often a social no-no. It’s a sign that the person doing so is uninterested in the subject of conversation and is keen to move on.

Likewise for the current generation of Google Glass: the devices look dorky, and social norms around the presence of sound/picture/video recording have yet to be widely established. Sticking the glasses on top of your head is the one norm if you’re using public conveniences, but usage isn’t wide enough outside San Francisco and various tech conferences yet. And the screen real estate is still too small to carry much data.

My Nexus 5 handset has one colour LED on the front that blinks White if i’ve received an email, Blue for a Facebook update, Yellow for a Snapchat and Green for an SMS. Even a service like IFTTT (“If this then that”) sitting in front of a notifications system could give a richer experience to help prioritise what is allowed to interrupt me, or what notifications get stored for review later.
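As a purely hypothetical sketch of the kind of rule layer I have in mind (neither IFTTT nor Android exposes exactly this), even something this simple would be an improvement:

INTERRUPT_NOW = {"missed_call", "sms", "calendar_alert"}   # my own choices, not anyone's API

def route_notification(kind, review_queue):
    # Decide whether a notification interrupts me now or waits for later review.
    if kind in INTERRUPT_NOW:
        return "buzz"
    review_queue.append(kind)
    return "queued"

later = []
print(route_notification("missed_call", later))   # buzz
print(route_notification("facebook", later))      # parked for later review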

Personally, i’d prefer an intelligent hearing aid type device that could slip the “psst…” into my ear at appropriate times. That would be much more useful to me in meetings and while on the move.

In the interim, the coming wave of intelligent, mobile connected electronics has yet to get evenly distributed across a very, very wide range of fashion accessories of all kinds. From the sound of Google’s work, they are aiming at a large number of fashion OEMs – folks who are primarily fashion providers but who can embed licensed electronics that talk to the hub that is an Internet connected smartphone. I wouldn’t be surprised if Apple’s approach is similar, but allowing such devices to hook into Apple provided app platforms that sit on an iPhone (such as the widely expected HealthBook).

We’ll hopefully have all the answers – and the emergent ecosystems running at full clip – this side of Christmas 2014. Or at least have a good steer following Apple’s WWDC (Worldwide Developers Conference) and Google I/O (the Google equivalent) before mid-year, when developers should be let loose getting their software ready for these new devices (or at least this class of new devices).

Public Clouds, Google Cloud moves and Pricing

Google Cloud Platform Logo

I went to Google’s Cloud Platform Roadshow in London today, nominally to feed my need to try and rationalise the range of their Cloud offerings. This was primarily for my potential future use of their infrastructure, and to learn what I could of any nuances present. Every provider has them, and I really want to do a good job of simplifying the presentation for my own sales materials – but not to oversimplify to the point the advice becomes unusable.

Technically overall, very, very, very impressive.

That said, i’m still in three minds about the way the public cloud vendors price their capacity. Google have gone to great lengths – they assure us – to simplify their pricing structure against industry norms. They were citing industry prices coming down by 6-8% per year, while the underlying hardware cost follows Moore’s law much more closely – falling 20-30% per annum.

With that, Google announced a whole raft of price decreases of between 35-85%, accompanied by simplifications to commit to:

  • No upfront payments
  • No Lock-in or Contracts
  • No Complexity

I think it’s notable that as soon as Google went public with that a few weeks back, they were promptly followed by Amazon Web Services, and more recently by Microsoft with their Azure platform. The outside picture is that they are all in a race, nip and tuck – well, all chasing the volume that is Amazon, but trying to attack from underneath, a usual industry playbook.

One graph came up, showing that when a single virtual instance is fired up, it costs around 7c per hour if used up to 25% of the month – after which the cost straight lines down. If that instance was up all month, then it was suggested that the discount of 30% would apply. That sort of suggests a monthly cost of circa $36.
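My back-of-envelope version of that arithmetic, taking the quoted figures at face value and assuming roughly 730 hours in a month:

HOURS_PER_MONTH = 730           # approximate average month
BASE_RATE_USD = 0.07            # quoted ~7c per instance-hour
SUSTAINED_USE_DISCOUNT = 0.30   # quoted discount for an instance up all month

full_month = HOURS_PER_MONTH * BASE_RATE_USD
print(full_month)                                   # ~$51 at the headline rate
print(full_month * (1 - SUSTAINED_USE_DISCOUNT))    # ~$36 once the discount kicks in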

Meanwhile, the Virtual Instance (aka Droplet) running Ubuntu Linux and my WordPress Network on Digital Ocean, with 30GB of flash storage and 3TB/month of network bandwidth, currently comes out (with weekly backups) at a fixed $12 for me. One third of the apparent Google price.

I’m not going to suggest they are in any way comparable. The Digital Ocean droplet was pretty naked when I ran it up for the first time. I had to very quickly secure it (setting up custom iptables rules to close off the common ports, and ensuring secure shell only worked from my home fixed IP address) and spend quite a time configuring WordPress and associated email infrastructure. But now it’s up, it’s there and the monthly cost is very predictable. I update it regularly and remove comment spam daily (ably assisted by a WordPress add-in). The whole shebang certainly doesn’t have the growth potential that Google’s offerings give me out of the box, but like many developers, it’s good enough for its intended purpose.

I wonder if Google, AWS, Microsoft and folks like Rackspace buy Netcraft’s excellent monthly hosting provider switching analysis. They all appear to be ignoring Digital Ocean (and certainly don’t appear to be watching their own churn rates the way most subscription based businesses watch them like a hawk), while that company is outgrowing everyone else in the industry at the moment. They are the one place absorbing developers, and taking thousands of existing customers away from all the large providers. In doing so, they’ve recently landed a funding round from VC Andreessen Horowitz (aka “A16Z” in the industry) to continue to push that growth. Their key audience, Linux developers, are the seeds from which many valuable companies and services of tomorrow will likely emerge.

I suspect there is still plenty of time for the larger providers to learn from their simplicity – of both pricing, and the way in which pre-configured containers of common Linux-based software stacks (WordPress, Node.js, LAMP, email stacks, etc) can be deployed quickly and inexpensively. If, indeed, they see Digital Ocean as a visible threat yet.

In the meantime, i’m trying to build a simple piece of work that articulates how all the key Public Cloud vendor services are structured, from the point of view of the time-pressured, overly busy IT Manager (the same as I did for the DECdirect Software Catalogue way back when). I’m scheduled to have a review of AWS at the end of April to this end. A few simple spreads of comparative collateral appear to be the missing reference piece in the industry to date.

Great Technology. Where’s the Business Value?

Exponential Growth Bar Graph

It’s a familiar story. Some impressive technical development comes up, and the IT industry adopts what politicians would call a “narrative” to try to push its adoption – and profit. Two that are in the early stages right now are “Wearables” and “Internet of Things”. I’m already seeing some outlandish market size estimates for both, and wondering how these map back to useful applications that people will pay for.

“Internet of Things” is predicated on an assumption that, with low cost sensors and internet connected microcomputers embedded in the world around us, the volume of data thrown onto the Internet will create a ready market needing to consume large gobs of hardware, software and services. One approach to rationalising this is to spot where inefficiencies exist in a value chain, and to see where technology will help remove them.

One of my son’s friends runs a company that has been distributing sensors of all sorts for over 10 years. Thinking there might be an opportunity to build a business on top of a network of these things, I asked him what sort of applications his products were put to. It appears to be mostly monitoring flows in various utilities and environmental assets (water, gas, rivers, traffic) or in industrial process manufacturing. Add some applications of low power bluetooth beacons, and you have some human traffic monitoring in retail environments. I start running out of ideas for potential inefficiencies that these (a) can address and (b) aren’t already being routinely exploited. One example is in water networks, where measuring fluid flows across a pipe network can help quickly isolate leaks, markedly improving supply efficiency. But there are already companies in place that do that, and they have the requisite relationships. No apparent gap there.

One post on Gigaom showed some interesting new flexible electronic materials this week. The gotcha with most such materials is the need for batteries, the presence of which restricts the number of potential applications. One set of switches from Swiss company Algra can emit a 2.4GHz radio signal over a range of 6-10 meters using only the energy from someone depressing a button; the main extra innovations are that the result is very thin and has (unlike predecessors) an extremely long mechanical lifetime. No outside power source required. So, just glue your door bells or light switches where you need them, and voila – done forever.

The other material that caught my eye was a flexible image sensor from ISORG (using Plastic Logic licensed technology). They managed to have a material that you could layer on the surface of a display, and which can read the surface of any object placed against it. No camera needed, and with minimal thickness and weight. Something impossible with a standard CMOS imaging scanner, because that needs a minimum distance to focus on the object above it. So, you could effectively have an inbuilt scanner on the surface of your tablet, not only for monochrome pictures, but even fingerprints and objects in close proximity – for contactless gesture control. Hmmm – smart scanning shelves in retail and logistics – now that may give users some vastly improved efficiencies along the way.

The source article is at: http://gigaom.com/2014/04/07/how-thin-flexible-electronics-will-revolutionize-everything-from-user-interfaces-to-packaging/

A whole field is opening up around collecting data from the Onboard Diagnostics Bus that exists in virtually every modern car now, but i’ve yet to explore that in any depth. I’ve just noticed a trickle of news articles about Phil Windley’s FUSE project on Kickstarter (here) and some emerging work by Google in the same vein (with the Open Automotive Alliance). Albeit, like TVs, vehicle manufacturers have regulatory challenges and/or slow replacement cycles stopping them moving at the same pace as the computer and electronics industries do.

Outside of that, i’m also seeing a procession of potential wearables, from glasses, to watches, to health sensors and to clip-on cameras.

Glasses and Smart Watches in general are another, much longer story (I’ll try to do that justice tomorrow), but these are severely limited by the need for battery power in limited space to do much more than their main application – which is the simple display of time and pertinent notifications.

Health sensors are pretty well established already. I have a FitBit One on me at all times bar when i’m sleeping. However, its main use these days is to map the number of steps I take into an estimated distance I walk daily, which I tap pro-rata into Weight Loss Resources (I know a walk to our nearest paper shop and back is circa 10,000 steps – and 60 mins at a moderate speed – enough to give a good estimate of calories expended). I found the calorie count questionable and the link to MyFitnessPal a source of great frustration for my wife; it routinely swallows her calorie intake and rations out the extra calories earned (for potential increased food consumption) very randomly over 1-3 days. We’ve never been able to correlate its behaviour rationally, so we largely ignore that now.
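The pro-rata sum I do is nothing cleverer than this – the stride length and kcal-per-km figure are my own rough assumptions, not numbers from FitBit or Weight Loss Resources:

STRIDE_M = 0.75           # rough average stride length in metres (an assumption)
KCAL_PER_KM_WALKED = 55   # rough walking burn for my weight (also an assumption)

def estimate_from_steps(steps):
    distance_km = steps * STRIDE_M / 1000
    return distance_km, distance_km * KCAL_PER_KM_WALKED

print(estimate_from_steps(10000))   # the paper shop round trip: ~7.5 km and ~400 kcal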

There’s lots of industry speculation around now that Apple’s upcoming iWatch will provide health related sensors, and send readings into a Passbook-like Health Monitoring application on a user’s iPhone handset. One such report is here. That would probably help my wife, who always appears to suffer a level of anxiety whenever her blood pressure is taken – which worsens her readings (see what happens after 22 days of getting used to taking daily readings – things settle down):

Jane Waring Blood Pressure Readings

I dare say if the reading were always on, she’d soon forget its existence and the readings would reflect reality. In the meantime, there are also suggestions that the same Health monitoring application will be able to take readings from other vendors’ sensors, and that Apple are trying to build an ecosystem of personal health devices that can interface to its iPhone based “hub” – and potentially from there onto Internet based health services. We can but wait until Apple are ready to admit it (or not!) at upcoming product announcement events this year.

The main other wearables today are cameras. I’ve seen some statistics on the effect of Police Officers wearing these in the USA:

US Police Officer with Camera

One of my youngest son’s friends is a serving Police Officer here, and tells us that the wearing of cameras in his police force is encouraged but optional. That said, he says most officers are big fans of using them. When nominally off, they keep a rolling 30 second video buffer, so when first switched on they have a record of what happened up to 30 seconds before the switch was pressed. Similarly, when switched off, they continue filming for a further 30 seconds before returning to that looping state.

Perhaps surprisingly, he says that his interactions are such that he’s inclined to use less force – even though, if you saw the footage, you’d be amazed at his self restraint. In the USA, Police report that when the people they’re engaging know they’re being filmed/recorded, they are far more inclined to behave themselves and not to try to spin “he said that, I said that” yarns.

There are all sorts of privacy implications if everyone starts wearing such devices, and they are getting increasingly smaller. Muvi cameras are one example, able to record 70-90 minutes of hi-res video from their 55mm tall, clip attached enclosure. Someone was recently prosecuted in Seattle for leaving one of these lens-up on a path between buildings frequented by female employees at his company campus (and no, I didn’t see any footage – just news of his arrest!).

We’re moving away from what we thought was going to be a big brother world – to one where the use of such cameras is “democratised” across the whole population.

Muvi Camcorder

 

I don’t think anyone has really comprehended the full effect of this upcoming ubiquity, but I suspect that a norm will be to expect the presence of a working camera to be indicated vividly. I wonder how long it will take for that to become the new normal – and whether there are other business efficiencies that their use – and that of other “Internet of Things” sensors in general – can lay before us all.

That said, industry estimates for “Internet of Things” revenues as they stand today, along with the lack of perceived “must have this” applications, feel hopelessly optimistic to me.

Focus on End Users: a flash of the bleeding obvious

Lightbulb

I’ve been re-reading Terry Leahy’s “Management in 10 Words”; Sir Terry was the CEO of Tesco until recently. I think the piece in the book’s introduction, relating to sitting in front of some Government officials, would be quite funny – if it weren’t a blinding dose of the obvious that most IT organisations miss:

He was asked “What was it that turned Tesco from being a struggling supermarket, number three retail chain in the UK, into the third largest retailer in the World?”. He said: “It’s quite simple. We focussed on delivering for customers. We set ourselves some simple aims, and some basic values to live by. And we then created a process to achieve them, making sure that everyone knew what they were responsible for”.

Silence. Polite coughing. Someone poured out some water. More silence. “Was that it?” an official finally asked. And the answer to that was ‘yes’.

The book is a good read and one we can all learn from. Not least as many vendors in the IT and associated services industry are going in exactly the opposite direction to what he did.

I was listening to a discussion contrasting the different business models of Google, Facebook, Microsoft and Apple a few days back. The piece I hadn’t rationalised before is that, of this list, only Apple have a sole focus on the end user of their products. Google and Facebook’s current revenue streams come from monetising purchase intents to advertisers, while trying not to dissuade end users from feeding them the attention and activity/interest/location signals that feed their business engines. Microsoft’s business volumes are heavily skewed towards selling software to Enterprise IT departments, not the end users of their products.

One side effect of this is an insatiable need to focus on competition rather than on the user of your products or services. In times of old, it became something of a relentless joke that no marketing plan would be complete without the customary “IBM”, “HP” or “Sun” attack campaign in play. And they all did it to each other. You could ask where the users’ needs made it into these efforts, but of the many I saw, I don’t remember a single one featuring them at all. Every IT vendor was playing “follow the leader” (and ignoring the cliffs they might drive over while doing so), when all focus should have been on their customers instead.

The first object lesson I had was with the original IBM PC. One of the biggest assets IBM had was the late Philip “Don” Estridge, who went into the job running IBM’s first foray into selling PCs having had personal experience of running an Apple ][ personal computer at home. The rest of the industry was an outgrowth of a hobbyist movement trying to sell to businesses, and business owners craved “sorting their business problems” simply and without unnecessary surprises. Their use of Charlie Chaplin ads in their early years was a masterstroke. As an example, spot the competitive knockoff in this:

There isn’t one! It’s a focus on the needs of any overworked small business owner, where the precious assets are time and business survival. Trading blows trying to sell one computer over another is completely missing.

I still see this everywhere. I’m a subscriber to “Seeking Alpha”, which has a collection of both buy-side and sell-side analysts commentating on the shares of companies i’ve chosen to watch. More often than not, it’s a bit like sitting in an umpire’s chair during a tennis match; lots of noise, lots of to-and-fro, discussion of each move, and never far away from comparing companies against each other.

One of the most prescient things i’ve heard a technology CEO say was from Steve Jobs, when he told an audience in 1997 that “We have to get away from the notion that for Apple to win, Microsoft have to lose”. Certainly, from the time the first iPhone shipped onwards, Apple have had a relentless focus on the end user of their products.

Enterprise IT is still driven largely by vendor inspired fads, with little reference to end user results (one silly data point I carry in my head: I’m still waiting to hear someone at a Big Data conference mention a compelling business impact from one of their Hadoop deployments that isn’t related to log file or Twitter sentiment analysis. I’ve seen the same software vendor platform folks float into Big Data conferences for around three years now, and have not heard one yet).

One of the best courses I ever went on was given to us by Citrix, specifically on selling to CxO/board level in large organisations. A lot of it is being able to relate small snippets of things you discover around the industry (or in other industries) that may help influence their business success. One example that I unashamedly stole from Martin Clarkson was that of a new Tesco store in South Korea that he once showed to me:

I passed this on to the team in my last company that sold to big retailers. At least four board level teams in large UK retailers got to see that video and to agonise over whether they could replicate Tesco’s work in their own local operations. And I dare say the salespeople bringing it to their attention gained a good reputation for delivering interesting ideas that might help their client organisations’ future. That’s a great position to be in.

With that, i’ve come full circle back to Tesco. Consultative Selling is a good thing to do, and folks like IBM are complete masters at it; if you’re ever in an IBM facility, be sure to steal one of their current “Institute for Business Value” booklets (or visit their associated group on LinkedIn). They are normally brim full of surveys and ideas to stimulate the thought processes of the most senior people running businesses.

We’d do a better job in the IT industry if we could replicate that focus on our end users from top to bottom – rather than spending time elbowing competitors instead. In the meantime, I suspect those rare places that do focus on end users will continue to reap a disproportionate share of the future business out there.

A modern take on peoples valiant attempts to get attention

Facebook Newsfeed Algorithm Equation

A really well written story in Techcrunch today, which relates the ever increasing difficulty of getting a message you publish in front of people you know. Well worth a read if you have a spare 5 minutes: http://techcrunch.com/2014/04/03/the-filtered-feed-problem/

The main surprise for me is that if you “Like” a particular vendor’s Facebook page, the best historical chance (from Feb 2012) of seeing one individual post from them was around 1 in 6 – 16%. With an increase in the potential traffic going into your personal news feed, it is (in March 2014) now down to 1 in 15 – 6.51%. So, businesses are facing the same challenges as the Advertising industry in general, even on these new platforms.

Despite the sheer amount of signal data available to them, even folks like Facebook (and I guess the same is true of Google, Twitter, LinkedIn, Pinterest, etc) have a big challenge separating what we value seeing from what we skip by. Even why we look at these social media sites can be interpreted in many different ways from the get go. One of my ex-work colleagues, on a Senior Management program at Harvard, had a professor saying that males were on Facebook for the eye candy, and females to one-up their looks and social life among their social circle (and had a habit of publishing less flattering pictures of other women in the same!).

The challenge for these sites is one of the few true needs for “big data” analysis that isn’t just IT industry hype to sell more kit. Their own future depends on getting a rich vein of signals from the users they act as a content platform for, while feeding paid content into the stream for advertisers willing to subvert it in their favo(u)r – which is a centuries old pursuit, and nothing remarkable nor new.

Over the past few weeks, i’ve increased the number of times per week I go out for a walk with my wife. This week, Google Now on my Nexus 5 flashed this up:

Google Now Walking Stats Screenshot

 

So, it knows i’m walking, and how far! I guess this isn’t unusual. I know that the complete stock of photographs people upload also contains location data (deduced from GPS or the SSIDs of wireless routers close by) and date/time, and readily admits the make and model of the device each picture was taken on. And if you have a professional DSLR camera, often the serial number of the camera and lens is on board too (hence some organisations offering to trace stolen cameras by looking at the EXIF data in uploaded photographs).

Individually identifiable data like that is not inserted by any of the popular mobile phones (to the best of my knowledge), and besides, most social media sites strip the EXIF data out of pictures they display publicly anyway. You’d need a warrant to request a search of that sort of data from the social media company, case by case. That said, Facebook and their ilk do have access to the data, and also a fair guess at your social circle given who gets tagged in your pictures!

Traditional media will instead trot out statistics on OTS (aka “Opportunities To See” an advert) and be able to supply some basic demographics – gleaned from subscriptions and competition entries – for the typical audience you can pay to address. Getting “likely purchase intent” signals is much, much more difficult.

Love At First Website Demon Ad

When doing advertising for Demon Internet, we used to ask the person calling up for a trial CD some basic questions about where they’d seen the advert that led them to contact us. Knowing the media used, and its placement cost, we could in time measure the cost per customer acquired and work to keep that as low as possible. We routinely shared that data every week with our external media buyers, who used it as part of their advertising space buying negotiation patter, and could relate back which positions and advert sizes in each publication pulled the best response.
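The underlying sum was as simple as this – made-up numbers below, but the shape of the weekly calculation is the one we actually ran:

placements = [
    # (publication, placement cost in GBP, trial customers acquired, customers retained)
    ("Title A", 4000, 300, 180),
    ("Title B", 2500, 120, 50),
]

for title, cost, acquired, retained in placements:
    print(title,
          f"£{cost / acquired:.2f} per acquired customer,",
          f"£{cost / retained:.2f} per retained customer")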

The main gotcha is that if you ask, you may not get an accurate answer from the customer, or you can be undone by your own staff misattributing the call. We noticed this when we were planning a small trial of TV advertising, so had “TV” put on the response system’s menu – as it happens, it appeared as the first option on the list. We were somewhat bemused after a week to find TV was our best source of new customers – before any of our ads had even been aired. So, a little nudge went to our phone staff to please be more accurate, while we changed every ad, for each different media title we used, to a different 0800 number – and could hence take the response readings off the switch, cutting out the question and generally making the initial customer experience a bit more friction free.

With that, our cost per acquired customer stayed around the £20 mark, and our cost per long term retained customer kept to around £30 (we found, along the way, that some publications had high response rates, but high churn rates to go with them).

Demon Trial Postmark

The best response rates of all came from getting the Royal Mail franking machines to cancel stamps on half of all stamped letters in the UK for two two-week periods – which came out at £7 per acquired customer; a great result for Michelle Laufer, who followed the idea up when she noticed letters arriving at home cancelled with “Have a Break, Have a Kit Kat”. Unfortunately, the Royal Mail stopped allowing ads to be done in this way, probably in the knowledge that seeing “Demon Internet” on letters resulted in a few complaints from people and places of a nervous disposition (one mental hospital as a case in point).

The main challenge for people carrying a Marketing job title these days is to be relentless about their testing, so they can measure – with whatever signals they can collect – what works, what doesn’t and which (of two alternative treatments) pulls better. Unfortunately, many such departments are littered with people with no wherewithal beyond “please get this mailer out”. The poorest of amateur behaviour, wasting money unnecessarily for their shareholders.

As in most walks in life, those that try slightly harder get a much greater proportion of the resulting spoils for their organisation. And that is why seminal books like “Commonsense Direct and Digital Marketing“, and indeed folks like Google, Facebook et al, are anal about the thoroughness of testing everything they do.

Tech Careers delivering results, slowed by silly Nuances

Caution: Does Stupid Things

Early in my career, I used to write and debug device drivers, which was a mix of reading code in octal/hex, looking at source listings, poring over crash dumps and trying to work out which code paths executed in parallel – each potentially in conflict with the others if you weren’t extra careful. Doing that for a time gets you used to being able to pattern match the strangest of things. Like being able to tell what website someone is looking at from far across the room, or reading papers on a desk upside down, or being able to scroll through a spreadsheet looking for obvious anomalies at pretty impressive speeds.

The other thing it gives you is “no fear” whenever confronted by something new, or on the bleeding edge. You get a confidence that whatever may get thrown at you, that you can work your way around it. That said, I place great value in stuff that’s well designed, and that has sensible defaults. That speeds your work, as you’re not having to go back and spell out in minute detail what every smallest piece of the action needs to do.

I’ve been really blessed with Analytics products like Tableau Desktop Professional, and indeed more recently with Google Spreadsheets and Google Fusion Tables. These are the sort of tools I use routinely when running any business, so that I can simply, in a data-based way, adjudge what is and isn’t working business-wise.

The common characteristic of these tools is that they all guess what you need to show most of the time, and don’t delay you by making you go through every piece of text, every line, every smallest detail with individual calls for font, font size and colo(u)r. They also know to cut the graph display of a line once the last data point is reached – and not, as one product does, leave all future months stuck on a flat line once the plot runs past the data present.

There have been several times when i’ve wanted to stick that “Does Stupid Things” sign on Microsoft SQL Server Reporting Services.

I diligently prototyped (as part of a business improvement project) a load of daily updated graphs/reports for a team of managers using Tableau Desktop Professional. However, I was told that the company had elected to standardise on a Microsoft reporting product, sitting above a SQL Server based Datamart. In the absence of the company wanting to invest in Tableau Server, I was asked to repurpose the Tableau work into Microsoft SQL Server Reporting Services (aka “SSRS”). So I duly read two books, had a very patient and familiar programmer show me the basics, set me up with Visual Studio and the appropriate Active Directory access permissions, and away I went. I delivered everything before finding I had no line management role to go back to, but spent an inordinate amount of time in between dealing with a few “nuances”.

Consider this. I built a table to show each Sales Team Manager what their unit’s Revenue and Profit was, year to date, by month, customer or vendor. The last column of the table was a percentage profit margin, aka “Profit” divided by “Revenue”. The gotcha with this is that if something is given away for free, (nominally negative) profit over zero revenue throws a divide by zero error. So simple to code around, methinks:

=iif(revenue>0, profit/revenue, 0)

Which, roughly translated, tells the software to calculate the percentage profit if revenue is greater than zero, otherwise just stick zero in as the answer. So, I rerun the view, and get #error in all the same places – the same 13 examples of attempted division by zero as before.

Brain thinks – oh, there must be some minuscule revenue numbers in the thousandths of pence in there, so change the formula to:

=iif(revenue>1,profit/revenue, 0)

so that the denominator is at least one, so the chance of throwing a divide by zero error is extremely remote. The giveaway would need to be mind bogglingly huge to get anything close to a divide by zero exception. Re-run the view. Result: Same 13 divide by zero #error exceptions.

WTF? I look at where the exceptions are being thrown, and the revenue is zero, so the division shouldn’t even be attempted. So off to Google with “SQL Server Reporting Services iif divide by zero” I go. The answer came from a Microsoft engineer who admits that, nominally for performance reasons, both paths of the iif statement get executed at the same time, so that whichever half needs to give its result, it’s already populated and ready to use. So, the literal effect of:

=iif(revenue>0, profit/revenue,0)

works like this:

  • Calculate 0 on the one side.
  • Calculate Profit/Revenue on the other.
  • If Revenue > 0, pick the second option, else pick the first.
  • If either side throws an exception (like divide by zero), blat the answer and substitute “#Error” instead.

The solution is to construct two nested “iif” statements in such a way that the denominator can never actually be zero, even though both branches of the outer comparison still get evaluated.
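The usual shape of that workaround looks something like the line below – the inner iif guarantees the denominator is never zero, so the eagerly evaluated division is always safe, and the outer iif still returns zero for the giveaway rows (treat this as the standard pattern for the problem rather than my exact production expression):

=iif(revenue = 0, 0, profit / iif(revenue = 0, 1, revenue))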

With that, I’m sure wearing underpants on your head has the same sort of perverse logic somewhere. This is simply atrociously bad software engineering.