Officially Certified: AWS Business Professional

AWS Business Professional Certification

That’s another badge added, albeit the primary reason was to understand AWS’s products and services in order to work out how to build volumes for them via resellers – just in case I get the opportunity to be asked how I’d do it. However, looking over the fence at some of the technical accreditation exams, I appear to know around half of the answers already – but I need to work through the material properly and take notes before attempting those.

(One of my old party tricks used to be that I could make it past the entrance exam required for entry into technical streams at Linux related conferences – a rare thing for a senior manager running large Software Business Operations or Product Marketing teams. Being an ex-programmer who occasionally fiddles under the bonnet of modern development tools is a useful thing – not least because it feeds an ability to spot bullshit from quite a distance).

The only AWS module I had any difficulty with was the pricing. One of the things most managers value is simplicity and predictability, but much of the pricing of core services has dependencies where you need to know data sizes, I/O rates or the shape of your demand peaks and troughs in order to arrive at an approximate monthly price. While most of the case studies amply demonstrate that you do make significant savings compared to running workloads on your own in-house infrastructure, I suspect typical values for common use cases would be useful. For example, if I’m running a SAP installation of specific data and access dimensions, what are the typical operational running costs – without needing to insert probes all over a running example to estimate them using the provided calculator?
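To illustrate the kind of arithmetic the calculator is doing under the covers, here’s a minimal Python sketch. Every unit price in it is an illustrative placeholder rather than a real AWS rate – the real calculator layers region, instance type, storage class and tiered data transfer pricing on top:

  # Rough monthly cost model for a single EC2-based workload.
  # All unit prices are illustrative placeholders, not AWS rates.
  HOURS_PER_MONTH = 730

  def monthly_estimate(instance_hourly_rate, ebs_gb, s3_gb,
                       data_out_gb, requests_millions):
      compute = instance_hourly_rate * HOURS_PER_MONTH
      storage = ebs_gb * 0.10 + s3_gb * 0.03      # $/GB-month, made up
      transfer = data_out_gb * 0.09               # $/GB out, made up
      requests = requests_millions * 0.40         # $/million, made up
      return compute + storage + transfer + requests

  # e.g. a medium instance, 100GB EBS, 50GB S3, 200GB out, 5M requests
  print(round(monthly_estimate(0.10, 100, 50, 200, 5), 2))

The point being: you can’t fill in the arguments without knowing your data sizes, I/O rates and demand profile first – which is exactly the complaint.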

I’d come back from a 7am gym session fairly tired and made the mistake of stepping through the pricing slides without making copious notes. I duly redid that whole module properly the next time around – and passed it to complete my certification.

The lego bricks you snap together to design an application infrastructure are simple in principle and loosely coupled, and what Amazon have built is very impressive. The only thing not provided out of the box is the sort of simple developer bundle of an EC2 instance, some S3 and MySQL-based EBS storage, plus some open source AMIs preconfigured to run WordPress, Joomla, Node.js, LAMP or similar – with a simple weekly automatic backup. That’s what Digital Ocean provide for a virtual machine instance, with specific storage and generous Internet Transfer Out limits for a fixed price per month. In the case of the WordPress network on which my customers and this blog run, that’s a 2-CPU server instance, 40GB of disk space and 4TB/month of data traffic for $20/month all in. That sort of simplicity is why many startup developers have exited stage left from Rackspace and their ilk, and moved to Digital Ocean in their thousands; it’s predictable and good enough as an experimental sandpit.

The ceiling at AWS is much higher when the application slips into production – which is probably reason enough to put the development work there in the first place.

I have deployed an Amazon Workspace to complete my 12 years of Nutrition Data Analytics work using the Windows-only Tableau Desktop Professional – in an environment where I have no Windows PCs available to me. I’ve used it on my MacBook Air and on my iPad Mini to good effect. That will cost me just north of £21 ($35) for the month.

I think there’s a lot that can be done to accelerate adoption rates of AWS services in Enterprise IT shops, both in terms of direct engagement and with channels to market properly engaged. My real challenge is getting air time with anyone to show them how – and in the interim, getting some examples ready in case I can make it in to do so.

That said, I recommend the AWS training to anyone. Some of the training is only made available once you’ve applied to join the Amazon Partner Network, but there are equally some great technical courses that anyone can take online. See http://aws.amazon.com/training/ for further details.

12 years of data recording leads to a dose of the obvious

Ian Waring Weight Loss Trend Scatter Graph

As mentioned yesterday, I finally got Tableau Desktop Professional (my favourite Analytics software) running on Amazon WorkSpaces – deployed for all of $35 for the month instead of having to buy my own Windows PC. With that in place, here’s a final set of trends to establish what I do right when I consistently lose 2lbs/week, based on an analysis of my intake (Calories, Protein, Carbs and Fat) and Exercise since June 2002.

I marked out a custom field that reflected the date ranges on my historical weight graph where I appeared to consistently lose, gain or flatline. I then threw together all sorts of scatter plots (like the one above, plotting my intake over long periods where I had consistent weight losses) to ascertain what factors drove the weight changes I’ve seen in the past. This was nominally to settle on a strategy going forward to drop to my target weight as fast as I could in a sustainable fashion. Historically, that has been 2lbs/week.

My protein intake had zero effect. Carbs and Fat did, albeit they tracked the effect of my overall Calorie intake (whether measured by weight or by the number of Calories each contributes – 1g of Carbs = 3.75 Kcals, 1g of Fat = 9 Kcals, and 1g of Protein is circa 4 Kcals). The WeightLossResources recommended split of Kcals to give an optimum balance in their view (they show a daily pie chart of Kcals from each) is 50% Carbs, 30% Fat and 20% Protein.
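Those conversion figures make the recommended split easy to check mechanically. A minimal Python sketch – the gram values in the example are invented to land close to a 2,300 Kcal day at the 50/30/20 split:

  # Convert a day's logged grams into Kcals and the % split,
  # using the conversion figures quoted above.
  KCALS_PER_GRAM = {"carbs": 3.75, "fat": 9.0, "protein": 4.0}

  def kcal_split(carbs_g, fat_g, protein_g):
      kcals = {
          "carbs": carbs_g * KCALS_PER_GRAM["carbs"],
          "fat": fat_g * KCALS_PER_GRAM["fat"],
          "protein": protein_g * KCALS_PER_GRAM["protein"],
      }
      total = sum(kcals.values())
      return {k: round(100 * v / total, 1) for k, v in kcals.items()}

  print(kcal_split(carbs_g=307, fat_g=77, protein_g=115))
  # -> {'carbs': 50.0, 'fat': 30.1, 'protein': 20.0}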

So, what are the take-homes having done all the analysis?

Breathtakingly simple. If I keep my food intake, less exercise calories, at circa 2300-2350 calories per day, I will lose a consistent 2lbs/week. The exact balance between carbs, protein and fat intake doesn’t matter too materially, as long as the total is close, though my best ever long-term loss had me running things close to the recommended balance. All eyes on that pie chart on the WLR web site as I enter my food then!

The stupid thing is that my current BMR (Basal Metabolic Rate – the minimum level of energy your body needs when at rest to function effectively, including your respiratory and circulatory organs, neural system, liver, kidneys and other organs) is 2,364 Kcals, and before the last 12 week Boditrax competition at my gym, it was circa 2,334. Increased muscle from lifting some weights put this up a little.

So, the basic message is to keep what I eat, less the calories from any exercise, at the same level as my BMR – which in turn will track down as my weight goes down. That more or less guarantees that any exercise I take over and above what I log – which is only long walks with Jane and gym exercise – will come off my fat reserves.
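To sanity-check that against the 2lbs/week figure, a minimal Python sketch – using the standard rule of thumb of circa 3,500 Kcals per pound of body fat, with the daily burn figure illustrative rather than measured:

  # Predicted weekly weight change from a steady daily deficit,
  # using the ~3,500 Kcal per lb of fat rule of thumb.
  KCAL_PER_LB_FAT = 3500

  def weekly_loss_lbs(daily_intake, daily_burn):
      return 7 * (daily_burn - daily_intake) / KCAL_PER_LB_FAT

  # Eating at a 2,364 Kcal BMR while actually burning ~3,364 Kcal/day
  # (BMR plus unlogged walking-about activity) gives the observed rate:
  print(weekly_loss_lbs(2364, 3364))  # -> 2.0

In other words, a consistent 2lbs/week implies a deficit of circa 1,000 Kcals/day over and above what gets logged.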

Simple. So, all else being equal, put less food in my mouth, and I’ll lose weight. The main benefit of 12 years of logging my intake is that I can say authoritatively – for me – the levels at which this is demonstrably true. And that should speed my arrival at my optimum weight.

Fixed! Tableau on my Mac using Amazon WorkSpaces

AWS Logo

I found out today that we may need to wait another month for Tableau Desktop Professional for the Mac to be released, and I’ve been eager to finish off my statistical analysis project. I’ve collected 12 years’ worth of daily food intake courtesy of WeightLossResources, which splits out to calories, carbs, protein, fat and exercise calories – and is tabulated against weekly weight readings.

Google Fusion Tables – in which I did a short online course – can do most things except calculate and draw a straight line, or exponential equivalent, through a scatter plot. This is meat and drink to Tableau, which unfortunately (for Mac, Chromebook and iPad user me) runs only on Microsoft Windows.

I got a notification this morning that Amazon Web Services – as promised at their AWS Summit 2014 in London last week – had released Amazon WorkSpaces hosted within Europe. This provisions quite a meaty PC for you, which you can operate through provided client software on your local PC, Mac, Android tablet or iPad. There is also a free add-on to sync the contents of a local Windows or Mac directory with the virtual storage on the hosted PC, so you can hook in access to files on your local device if needed. There are more advanced options for corporate users, including Active Directory support and the ability to use that to sideload apps for a user community – though that is way in advance of what I’m doing here.

There are a number of options, from the “Basic” single CPU, 3.75GB memory, 50GB disk PC up to one with 2 CPUs, 7GB of memory, 100GB of disk and the complete Microsoft Office Professional Suite on board. More here. Prices from $35 to $75/PC per month.
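As an aside, the same provisioning can now be scripted; a minimal sketch against today’s boto3 SDK (everything below – directory ID, user name, bundle ID – is a placeholder, and at the time I did all of this through the console):

  # List the available bundles, then request one WorkSpace.
  import boto3

  ws = boto3.client("workspaces", region_name="eu-west-1")  # Ireland

  for bundle in ws.describe_workspace_bundles(Owner="AMAZON")["Bundles"]:
      print(bundle["BundleId"], bundle["Name"])

  ws.create_workspaces(Workspaces=[{
      "DirectoryId": "d-1234567890",  # placeholder directory
      "UserName": "ian",              # placeholder user
      "BundleId": "wsb-xxxxxxxxx",    # pick one from the listing above
  }])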

I thought I’d have a crack at provisioning one for the month, giving me 2 weeks to play with a trial copy of Tableau Desktop Professional (I’ve not used it since V7; the current release is 8.1). Within 20 minutes of requesting it from my AWS console, I received an email saying it had been provisioned and was ready to go. So…

WorkSpaces Set Up


You tell it what you want, and it goes away for 20 minutes provisioning your request (I managed to accidentally do this for a US region, but deleted that and selected Ireland instead – it provisioned just the one in the Ireland datacentre). Once done, it sent me an email with a URL and a registration code for my PC (it will do this for each user if you provision several at once):

AWS WorkSpaces Registration


Tap in the registration code from the email received, and it does the initial piece of the client-end configuration, then asks me to log in:

AWS Workspaces Login


Once I’d done that, it invited me to install the client software, which I did for Mac OS X locally, and it emailed the links for Android and iOS to my email address to pick up on those devices. For what it’s worth, the Android version said my Nexus 5 wasn’t a supported device (I guess it needs a tablet), but the iOS version installed fine on my iPad Mini.

AWS Workspaces Client Setup


And in I went. A Windows PC. Surprisingly nippy – I felt no real difference between this and what I remember of the local Windows 7 laptop I had at Computacenter, some 18 months ago now:

AWS Workspaces Microsoft Windows


The main need then was to drop a few files onto the hard disk, so I revisited the Amazon WorkSpaces web site and downloaded the Sync package for Mac OS X. Once installed on my Mac, it asked me for my PC’s registration code again (it wouldn’t accept it copy/pasted on that one screen, so I had to carefully re-enter a short string), asked which local Mac directory I wanted to sync with the hosted PC, and off it went. It syncs just like Dropbox, and took a few minutes to populate the PC with quite a few files I had sitting there already. Once up, I used the provided Firefox to download Tableau Desktop Professional and the Excel driver I needed (as I don’t have Microsoft Office on my basic version here) and – voilà. Tableau running fine on AWS WorkSpaces, on my MacBook Air:

Tableau Desktop Professional Running


Very snappy too, and I’m now back at home with my favourite Analytics software of all time – on my Mac, and directly on my iPad Mini too. The latter has impressive keyboard and mouse support, just a two-finger gesture (not that one) away at all times.

So, I now have the tools to complete the statistical analysis storyboard of my 12 years of nutrition and weight data – and to set specific calorie and carb targets to hit my 2lbs/week downward goal again (I’ve been tracking at only half that rate in the last 6 months).

In the meantime, I’ve been really impressed with Amazon WorkSpaces. Fast, simple and inexpensive – and probably of wide applicability to lots of Enterprise customers I know. A Windows PC that I can dispose of again as soon as I’ve finished with it, for a grand sum of less than £21 for my month’s use. Tremendous!

New Learnings, 12 week Boditrax Challenge, still need Tableau

The Barn Fitness Club Cholsey

One of the wonderful assets at my excellent local gym – The Barn Fitness Club in Cholsey – is that they have a Boditrax machine. This looks like a pair of bathroom scales with metal plates where you put your feet, hooked up to a PC. It bounces a small charge through one foot and measures the signal that comes back through the other. Measuring your weight at the same time and having previously been told your age, it can then work out the composition of your body in terms of fat, muscle, water and bone. The results are dropped on the Boditrax web site, where you can monitor your progress.

For the last 12 weeks, the gym has run a Boditrax challenge. Fortunately, I pay James Fletcher for a monthly Personal Training session there, where he takes readings using this machine and adjusts my 3x-per-week gym routine accordingly. The end results after 12 weeks are below (top graph: my weight progress; bottom: my composition changes):

Boditrax Challenge Ian W Weight Tracking

Boditrax Challenge Ian W Final Results

The one difference from previous weight loss programmes I’ve followed is the amount of weights work I’d been given this time around. I always used to be warned that muscle weighs considerably more than fat, and so to stick to cardio work to minimise both. The thinking these days appears to be to increase your muscle mass a little, which increases your metabolic rate – burning more calories, even at standstill.

The one thing I’ve done since June 3rd 2002 is to tap my food intake and exercise daily into the excellent Weight Loss Resources web site. Hence I have a 12 year history of exact figures for fat, carbs and protein intake, weight and the corresponding weight changes throughout. I used these in a recent Google course on “Making Sense of Data”, which used Google Fusion Tables, trying to spot what factors led to a consistent 2lbs/week weight loss.

There are still elements of the storyboard I need to fit in to complete the picture, as Fusion Tables can draw a scatter plot okay, but can’t throw a weighted trend line through that cloud of data points. This would give me a set of definitive stories to recite; what appears so far is that I make sustainable 2lbs/week losses below a specific daily calorie value, provided I keep my carbs intake down at a specific level at the same time. At the moment, I’m tracking at around 1lb/week, which is half the rate I managed back in 2002-3 – so I’m keen to expose the exact numbers I need to follow. Too much, no loss; too little, the body goes into a siege mentality – hence the need for a happy medium.

I tried to get a final fix on the exact net intake and carb levels in Google Spreadsheets, which isn’t so adept at picking data subsets with filters – you end up having to create a spreadsheet for each “I wonder if” question. So I’ve largely given up on that until I can get my mitts on a Mac version of Tableau Desktop Professional, or can rent a Windows Virtual Desktop on AWS for $30 for 30 days to do the same on its Windows version. Until then, I can see the general picture, but there are probably many data points from my 3,800+ days of sampled data that plot on top of each other – hence the need for the weighted trend line to expose the definitive truth.
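For what it’s worth, the missing trend line is only a few lines of Python once the data is out of the spreadsheet – a sketch with invented sample numbers, weighting each point by how many samples sit on top of it:

  # Weighted straight-line fit through (net intake, weekly change)
  # points; all the numbers below are invented for illustration.
  import numpy as np

  intake = np.array([2100, 2300, 2500, 2700, 2900])  # net Kcals/day
  change = np.array([-2.4, -2.0, -1.1, -0.3, 0.5])   # lbs/week
  counts = np.array([12, 40, 25, 10, 5])             # samples per point

  slope, intercept = np.polyfit(intake, change, 1, w=counts)
  target = (-2.0 - intercept) / slope
  print(f"intake for -2.0 lbs/week ~= {target:.0f} Kcals/day")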

The nice thing about the Boditrax machine is that it knows your muscle and fat composition, so it can give you an accurate reading for your BMR – your Basal Metabolic Rate. This is the minimum level of energy your body needs when at rest to function effectively, including your respiratory and circulatory organs, neural system, liver, kidneys and other organs. It is typically circa 70% of your daily calorie intake, the balance being used to power you along.

My BMR according to the standard calculation method (which assumes a ‘typical’ percentage muscle content) runs about 30 Kcals under what Boditrax says it actually is. So, I burn an additional 30 Kcals/day due to my increased muscle composition since James Fletcher’s training went into place.
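One widely used standard formula is Mifflin-St Jeor (whether that’s the exact method behind my figures is an assumption on my part); it only knows weight, height, age and sex – not actual muscle composition – which is exactly why the Boditrax reading can differ. A minimal Python sketch, with placeholder argument values:

  # Mifflin-St Jeor BMR estimate; argument values are placeholders.
  def bmr_mifflin_st_jeor(weight_kg, height_cm, age, male=True):
      base = 10 * weight_kg + 6.25 * height_cm - 5 * age
      return base + 5 if male else base - 161

  print(round(bmr_mifflin_st_jeor(weight_kg=100, height_cm=183, age=55)))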

Still a long way to go, but heading in the right direction. All I need now is that copy of Tableau Desktop Professional so that I can work out the optimum levels of calorie and carb intake to maximise the long term, relentless loss – and ensure I track at those levels. In the meantime, I’ll use the best case I can work out from visual inspection of the scatter plots.

I thoroughly recommend the Barn Fitness Club in Cholsey, use of their Boditrax machine and regular air time with any of their Personal Trainers. The Boditrax is only £5/reading (normally every two weeks) and an excellent aid to help achieve your fitness goals.

Just waiting to hear the final result of the 12 week Boditrax challenge at the Club – and hoping I’ve done enough to avoid getting the wooden spoon!

Boditrax Challenge Home Page


In the meantime, it’s notable that my approximate net calorie intake level (calories eaten less exercise calories) needed to lose 2lbs/week appears to be my current BMR – which sort of suggests the usual routine daily activity I don’t log (walking around the home, work or shops) is sufficient to hit the fat reserves. An hour with Tableau on my data should be enough to confirm whether that is demonstrably the case, and the level of carbs I need to keep to in order to make 2lbs/week a relentless loss trend again.

Tech Careers delivering results, slowed by silly Nuances

Caution: Does Stupid Things

Early in my career, I used to write and debug device drivers: a mix of reading code in octal/hex, looking at source listings, poring over crash dumps and trying to work out which code paths executed in parallel – each potentially in conflict with the others if you weren’t extra careful. Doing that for a time gets you used to being able to pattern match the strangest of things. Like being able to tell what website someone is looking at from far across the room, or reading papers on a desk upside down, or scrolling through a spreadsheet looking for obvious anomalies at pretty impressive speeds.

The other thing it gives you is “no fear” whenever confronted by something new, or on the bleeding edge. You get a confidence that whatever may get thrown at you, that you can work your way around it. That said, I place great value in stuff that’s well designed, and that has sensible defaults. That speeds your work, as you’re not having to go back and spell out in minute detail what every smallest piece of the action needs to do.

I’ve been really blessed with Analytics products like Tableau Desktop Professional, and indeed more recently with Google Spreadsheets and Google Fusion Tables. These are the sort of tools I use routinely when running any business, so that I can judge, simply and in a data-based way, what is and isn’t working business-wise.

The common characteristic of these tools is that they all guess what you need to show most of the time, and don’t delay you by making you go through every piece of text, every line, every smallest detail with individual calls for font, font size and colo(u)r – or by needing to be told to cut a line’s graph at its last data point, rather than (as one product does) plotting all future months as a flat line once the plot runs past the data present so far.

There have been several times when i’ve wanted to stick that “Does Stupid Things” sign on Microsoft SQL Server Reporting Services.

I diligently prototyped (as part of a business improvement project) a load of daily updated graphs/reports for a team of managers using Tableau Desktop Professional. However, I was told that the company had elected to standardise on a Microsoft reporting product, sitting above a SQL Server based datamart. In the absence of the company wanting to invest in Tableau Server, I was asked to repurpose the Tableau work into Microsoft SQL Server Reporting Services (aka “SSRS”). So I duly read two books, had a very patient and familiar programmer show me the basics, set me up with Visual Studio and the appropriate Active Directory access permissions, and away I went. I delivered everything before finding there was no line management role for me to go back to, but spent some inordinate time in between dealing with a few “nuances”.

Consider this. I built a table to show each Sales Team Manager what their unit’s Revenue and Profit were, year to date, by month, customer or vendor. The last column of the table was a percentage profit margin, aka “Profit” divided by “Revenue”. The gotcha is that if something is given away for free, revenue is zero, so dividing the (nominally negative) profit by it throws a divide-by-zero error. So simple to code around, methinks:

=iif(revenue>0, profit/revenue, 0)

Which, roughly translated, tells the software to calculate the percentage profit if revenue is greater than zero, otherwise just stick zero in as the answer. So, I rerun the view, and get #error in all the same places – the same 13 examples of attempted divides by zero as before.

Brain thinks – oh, there must be some minuscule revenue numbers in the thousandths of pence in there, so change the formula to:

=iif(revenue>1, profit/revenue, 0)

so that the denominator is at least one, making the chance of throwing a divide-by-zero error extremely remote. The giveaway would need to be mind-bogglingly huge to get anything close to a divide-by-zero exception. Re-run the view. Result: the same 13 divide-by-zero #error exceptions.

WTF? Look at where the exceptions are being thrown, and the revenue is zero, so the division shouldn’t even be being attempted. So off to Google with “SQL Server Reporting Services iif divide by zero” I go. The answer came from a Microsoft engineer who admits that, nominally for performance reasons, both paths of the iif statement get executed at the same time as a shortcut, so that whichever half needs to give its result, it’s already populated and ready to use. So, the literal effect of:

=iif(revenue>0, profit/revenue,0)

works like this:

  • Calculate 0 on the one side.
  • Calculate Profit/Revenue on the other.
  • If Revenue > 0, pick the second option, else pick the first.
  • If either side throws an exception (like divide by zero), blat the answer, substitute “#Error” instead.

The solution is to construct two nested “iif” statements in such a way that the optimiser doesn’t execute the division before the comparison with zero is made.
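A minimal sketch of that workaround, keeping the simplified field names used above – the inner iif guarantees a safe divisor on the branch that never gets chosen:

=iif(revenue > 0, profit / iif(revenue > 0, revenue, 1), 0)

Both sides still evaluate eagerly, but the denominator is now 1 whenever revenue is zero, so the division can never throw – and the outer iif still returns 0 for the giveaway rows.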

With that, I’m sure wearing underpants on your head has the same sort of perverse logic somewhere. This is simply atrociously bad software engineering.

Office for the iPad; has the train already left the station?

Meeting notes by @Jargonautical

One asset I greatly admire (and crave!) is the ability to communicate simply, but with panache, speed and reasoned authority. That’s one characteristic of compelling journalism, of good writing, and indeed of some of the excellent software products I’ve used. Not throwing in the kitchen sink, but being succinct, widening focus only to give useful context supporting the central brass tacks.

I’ve now gone 15 months without using a single Microsoft product. I spend circa £3.30/month for my Google Apps for Business account, and have generally been very impressed with Google Spreadsheets and Google Docs. The only temporary irritant along the way was the inability of Google Docs to put page numbers in the Table of Contents of one 30 page document I wrote, offering only html links to jump to the content – which, while okay for a web document, was as much use as a cow on stilts for the printed version. But it keeps improving by leaps and bounds every month. That issue is now solved, and there’s now a wide array of free add-ons to do online review sign-offs, add bibliographies and more.

This week, I’ve completed all the lessons on a neat piece of Analytics software called Google Fusion Tables, produced by Google Research and available as a free Google Drive add-on. To date, it appears to do almost everything most people would use Tableau Desktop for, including map-based displays, but with a much simpler user interface. I’m throwing some more heavyweight lifting at it during the next couple of days, including a peek at its Python-accessible API – which nominally allows you to daisy-chain it in as part of an end-to-end business process. The sort of thing Microsoft had Enterprises doing with VBA customisations way back when.
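As a taster, the Fusion Tables v1 API takes SQL-style queries over plain HTTP. A minimal Python sketch – the table ID, API key and column names are all placeholders, and it assumes a table that’s been shared publicly:

  # Query a (public) Fusion Table via the v1 SQL endpoint.
  import json
  import urllib.parse
  import urllib.request

  TABLE_ID = "1aBcD-placeholder"   # placeholder table ID
  API_KEY = "my-api-key"           # placeholder API key

  sql = f"SELECT Date, NetKcals, WeightLbs FROM {TABLE_ID}"
  url = ("https://www.googleapis.com/fusiontables/v1/query?"
         + urllib.parse.urlencode({"sql": sql, "key": API_KEY}))

  with urllib.request.urlopen(url) as resp:
      result = json.load(resp)

  for row in result.get("rows", []):
      print(row)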

My reading is also getting more focussed. I’ve not read a newspaper regularly for years, and dip into the Economist only once or twice every three months; instead I go to other sources online. The behaviour is to sample fewer than 10 podcasts every week, some online newsletters from authoritative sources, and some of what appears in Medium, but otherwise to venture further afield only when something swims past in my Twitter stream.

This morning, this caught my eye, as posted by @MMaryMcKenna. Lucy Knight (@Jargonautical) had posted her notes made during a presentation Mary had made very recently. Looking at Lucy’s Twitter feed, there were some other samples of her meeting note taking:

Meeting Notes: Minimal Viable Product

Meeting Notes Cashflow Modelling in Excel

Meeting Notes: Customer Service

Aren’t they beautiful?

Lucy mentions in her recent tweets that she does these on an iPad Mini using an application called GoodNotes, which is available for the princely sum of £3.99 here (she also notes that she uses a Wacom Bamboo stylus – though a friend of hers manages with a finger alone). Short demo here. I suspect my attempts using the same tool, especially in the middle of a running commentary, would pale in comparison to her examples here.

With that, there are reports circulating today that the new Microsoft CEO, Satya Nadella, will announce Microsoft Office for iOS this very afternoon. I doubt that any of the Office components will produce work of the quality of Lucy’s iPad meeting notes anytime soon, but I’m open to being surprised.

Given we’ve had over three years of getting used to having no useful Microsoft product (outside of Skype) on the volume phone or tablet devices here, I wonder if this is a route back to making money on selling software again, or to supporting Office 365 subscriptions – or a damp squib waiting to happen.

My bet’s on the middle of those three by virtue of Microsoft’s base in Large Enterprise accounts, but like many, I otherwise feel it’s largely academic now. The Desktop software market has been fairly well bombed (by Apple and Google) into being a low cost conduit to a Services equivalent instead. The Server software market will, I suspect, head the same way within 2 years.

Shareholder Positions and Profits YTD

Pie Chart of Ian Waring Stock Profits YTD

At the start of last year, I started to run my pension funds personally. In line with normal advice, I put circa 70% of the available funds into Vanguard LifeStrategy 100% Equity accumulating index trackers, and I’ve used the balance to bet on a few companies I believe have a rosy long term future – while being fully aware of the likely swings of the US Dollar vs GB Pound. I’ve lived long enough to see that cycle between $1.40 and $2.20 to the pound – currently nicely in the middle, and with holdings I can leave in situ waiting for the right position to enact an exit – albeit that’s not for many years into the future.

At the moment, my total fund is up 9.05% in the year I’ve been running it. The shares portion has come in at +17.68% year to date (touch wood – a profit of £21,871.99). The sources that contributed to the increase in the fund are as above.

In the early days, I did bet on a few others, but kept each shareholding small until I had visibility of what they were doing. When they weren’t giving meaningful returns, or where I felt I didn’t understand the company’s direction (strategically) well enough – or learnt of their susceptibility to unfounded competitor rumours – I unloaded them. That applied (for various reasons) to ARM, Baidu, Red Hat, Salesforce.com, Splunk and Netflix.

At this stage, I’ve retained positions in Google, Amazon, Tableau Software, Apple and… Facebook. I did buy some Facebook shares in the early days after its IPO, but unloaded them at a slight loss while they were in the $20s. I’ve now bought back in at $69 – largely because of Mark Zuckerberg’s purchase of WhatsApp, and the fact he’s put the WhatsApp CEO – who is vehemently against advertising – onto the Facebook Board. I thought this was brilliant: advertising is the major current source of Facebook’s income, yet there was a willingness to put someone up there who will push an alternative, subscription based model. A good sign that Facebook are willing to be radical with their business models, and not follow the normal high technology malaise of clinging to a failing business model into oblivion.

I always think the Advertising Industry is naive to see its next frontier as the screen of people’s mobile phones; it’s a bit like having a kiddie jumping up and down in front of the TV when you’re trying to watch something. WhatsApp currently charge $1/year for their instant messaging service, and at that level, there is even scope for friends (or vendors) to offer to pay the subscriptions of large numbers of users.

Liquidmetal and GT Advanced Technologies were a small punt based on hearing about various Apple licensing agreements two years ago, and then seeing Apple employees start filing patents on the associated materials just ahead of contract renewal due dates. Liquidmetal alloy is likely to be used for the carcass of new iPhones (without the need to mill aluminium as at present), and GT Advanced Technologies supply very resilient sapphire screens large enough for the display surface of same. Those shares turned out to be quite volatile, so I exited stage left on profits of 30.9% and 24.2% respectively – within 3 weeks of the original purchases.

Google shares will split in April, improving their liquidity. Amazon have had a recent fall, but I’m confident they’ll recover 20-25% in the next 2-3 months. Tableau Software are just about to dilute things a bit with a new share offering, but my returns are still very good (not far off 100% on the 218 shares I still have). Apple are a blog post all by themselves: cursed by analyst expectations of slowing growth (despite ratcheting up their market share relentlessly, and earning 70% of the mobile industry’s profits), and discounted on the likelihood of laying another category of Golden Egg, as they’ve already done with the iPod, iPhone and iPad. Quite funny, when Amazon trade at huge multiples on the suspicion that their conveyor belt will magic up Golden Egg league profits as soon as Jeff Bezos decides that’s what he wants. And Facebook is a wait and see.

There are three potential IPOs I’m looking out for, but that apart, the strategy is “Long Term Buy and Hold”. Working well so far, touch wood.

“Big Data” is really (not so big) Data-based story telling

Aircraft Cockpit

I’m me. My key skill is splicing together data from disparate sources into a compelling, graphical and actionable story that prioritises the way(s) to improve a business. When can I start? Eh, Hello, is anyone there??

One characteristic of the IT industry is its penchant for picking snappy sounding themes, usually illustrative of a future perceived need that their customers may wish to aspire to – and to keep buying stuff toward that destination. Two of the terms de rigueur at the moment are “Big Data” and “Analytics”. They are attached to many (vendor) job adverts and (vendor) materials, though many are still searching for the first green shoots of demand from most commercial organisations – or at least taking a leap of faith that their technology will smooth the path to a future quantifiable outcome.

I’m sure there will be applications aplenty in the future. There are plenty of use cases where sensors will start dribbling out what becomes a tidal wave of raw information, be it on you personally, in your mobile handset, in low energy Bluetooth beacons, or indeed plugged into the On-Board Diagnostics bus in your car – and aggregated up from there. Or in the rare case that a company has enough data locked down in one place to get some useful insights already, and has the IT hardware to crack the nut.

I often see desired needs for “Hadoop”, but know of few companies who have the hardware to run it, let alone the Java software smarts to MapReduce anything effectively on a business problem with it. If you do press a vendor, you often end up with a use case of “Twitter sentiment analysis” (which, for most B2B and B2C companies, covers a small single digit percentage of their customers), or of consolidating and analysing machine-generated log files (which is what Splunk does, out of the box).

Historically, the real problem is data sitting in silos and the inability (for a largely non-IT-literate user) to do efficient cross tabulations to eke a useful story out. Where they can, the normal result is locking in on a small number of priorities to make a fundamental difference to a business. Fortunately for me, that’s a thread that runs through a lot of the work I’ve done down the years – usually in an environment where all hell is breaking loose, everyone is working long hours, and high priority CEO or Customer initiated “fire drill” interruptions are legion. Excel, text, SQL Server, MySQL or MongoDB resident data – no problem here. A few samples, mostly done using Tableau Desktop Professional (a sketch of the sort of cross-tab behind the first example follows the list):

  1. Mixing a year’s worth of Complex Quotes data with a Customer Sales database. Finding that one Sales Region was consuming 60% of the team’s Cisco Configuration resources, while at the same time selling 10% of the associated products. Digging deeper, finding that one customer was routinely asking our experts to configure their needs, but their purchasing department was buying all the products elsewhere. The Account Manager was duly equipped to have a discussion and initiate corrective actions. Whichever way that went, we made more money and/or gained better efficiency.
  2. Joining data from Sales Transactions and from Accounts Receivable query logs, producing daily updated graphs of Days Sales Outstanding (DSO) debt for each sales region, by customer, by vendor product, and by invoices in priority order. The target was to reduce DSO from over 60 days to 30; each Internal Sales Manager had the data at their fingertips to prioritise their daily actions for maximum reduction – and to know when key potential icebergs were floating towards key due dates. Along the way, we also identified one customer who had instituted a policy of querying every single invoice, raising our cost to serve and extending DSO artificially. Again, the Account Manager was equipped to address this.
  3. I was given the Microsoft business to manage at Metrologie, where we were transacting £1 million per month, not growing, with 60% of the business through one retail customer, and overall margins of 1%. There are two key things you do in a price war (as learnt when I’d done John Winkler Pricing Strategy training back in 1992), which need a quick run around customer and per-product analyses. Having instituted staff licensing training, we made the appropriate adjustments to our go-to-market based on the Winkler work. Within four months, we were trading at £5 million/month and had doubled gross margins at the same time, without any growth from that largest customer.
  4. In several instances, demonstrating 7/8-figure Software revenue and profit growth, using a model to identify the key challenges (or reasons for exceptional performance) in the business. Every product and subscription business has four key components that, mapped over time, expose what is working and where corrections are needed. You then have the tools to ask the right questions, assign the right priorities and ensure that the business delivers its objectives. This worked from my time in DECdirect (0-$100m in 18 months), through Computacenter’s Software Business Unit’s growth from £80m to £250m in 3 years, to being asked to manage a team of 4 working with products from 1,072 different vendors (and delivering our profit goals consistently every quarter). In the latter case, our UK market share with the largest of those 1,072 vendors went from 7% to 21% in 2 years, winning their Worldwide Solution Provider of the Year Award.
  5. Correlating Subscription Data at Demon against the list of people we’d sent Internet trial CDs to, per advertisement. Having found that the inbound phone people were randomly picking the first “this is where I saw the advert” choice on their logging system, we started using different 0800 numbers for each advert placement, and took the readings off the switch instead. Given that, we could track customer acquisition cost per publication and spot trends; one was that ads in “The Sun” gave nominally low acquisition costs per customer up front, but churned very heavily within 3 months. By regularly looking at this data – and feeding results to our external media buyers weekly to help their price negotiations – we managed to keep per-retained-customer landing costs at £30 each, versus £180 for our main competitor at the time.
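As flagged above, here’s the sort of cross-tab sitting behind the first example, sketched in Python/pandas – the file and column names are invented for illustration:

  # Quote effort vs product revenue by sales region, joined from
  # two separate silos; filenames and columns are made up.
  import pandas as pd

  quotes = pd.read_csv("config_quotes.csv")  # region, hours_spent
  sales = pd.read_csv("sales.csv")           # region, vendor, revenue

  effort = quotes.groupby("region")["hours_spent"].sum()
  revenue = (sales[sales.vendor == "Cisco"]
             .groupby("region")["revenue"].sum())

  summary = pd.DataFrame({
      "pct_of_effort": 100 * effort / effort.sum(),
      "pct_of_revenue": 100 * revenue / revenue.sum(),
  }).round(1)
  print(summary)  # the 60%-effort/10%-revenue region stands out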

I have many other examples. Mostly simple, and not in the same league as the Hans Rosling or Edward Tufte examples I’ve seen. That said, the analysis and graphing was largely done out of hours, during days filled with more customer-focussed and internal management actions – ensuring our customer experience was as simple and consistent as possible, that the personal aspirations of the team members were fulfilled, and that we delivered all our revenue and profit objectives. I’m good at that stuff, too (ask any previous employer or employee).

With that, I’m off to write some Python code to extract some data ahead of my Google “Making Sense of Data” course next week. That’s to extend my 5 years of Tableau Desktop experience with the use of some excellent looking Google hosted tools. And to agonise over how to get to someone who’ll employ me to help them, without HR dissing my chances of interview airtime for my lack of practical Hadoop or MapR experience.
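The extraction itself is nothing fancy – a minimal sketch of the sort of thing I mean, with the file and column names invented:

  # Pull the course-relevant fields out of a raw export, keeping
  # only the weeks that have a weigh-in recorded.
  import csv

  FIELDS = ["date", "kcals", "carbs_g", "exercise_kcals", "weight_lbs"]

  with open("wlr_export.csv", newline="") as src, \
       open("course_data.csv", "w", newline="") as dst:
      reader = csv.DictReader(src)
      writer = csv.DictWriter(dst, fieldnames=FIELDS)
      writer.writeheader()
      for row in reader:
          if row["weight_lbs"]:
              writer.writerow({k: row[k] for k in FIELDS})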

The related Business and People Management smarts don’t appear on most “Requirements” sheets. Yet. A savvy Manager is all I need air time with…

Google Shares: Stick or Twist?

Danger - Will Robinson

A fairly quiet Sunday. A trip down to the gym for the last of my three weekly visits, finding they were finishing their 24 hour Charity Spinathon – £20 duly donated. En route, I listened to the last of a two hour John Gruber podcast talking at length about cryptocurrencies, which was fascinating. Then back home for a walk to the local shops with Jane to pick up some milk, and back to catching up on my various high technology news feeds.

I reflect on robots getting more and more impressive. I saw a video of a guy in Germany debugging a table tennis playing robot, which is already showing promise (3 minute video here). Then I saw that on Tuesday there is a match planned between Timo Boll, the #1 German professional table tennis player (currently #8 in the world), and a KUKA industrial robot (preview here). Robots are one feature that keeps hitting news headlines concerning Google, who have made many related investments recently.

On a related thought, one thing that has started to bug me a bit is the so far excellent performance of my Self Invested Pension, which over the last 10 months has grown 10.68%. Given 64% of it is in an index tracker, the performance of the various stocks I’ve traded (normally on a long term buy and hold basis) has been over 24% to date. The nagging feeling is always whether I’m carrying too much in too few companies, albeit I tend to focus on ones that I feel have high market shares and future growth potential.

I’ve made good returns trading in and later out of Netflix, Splunk and Salesforce.com. I took slight losses from the early days in Red Hat, ARM, Baidu and Facebook, so reversed out with minimal damage. I made returns of over 20% on GT Advanced Technologies (GTAT) and 49% on Liquidmetal (LQMT) in 3 weeks by joining the dots on some future Apple investments from patent filings. Those apart, I wound up with investments in Google, Amazon, Apple and (having IPO’d) Tableau. I have more recently taken a position in Facebook again (it was in the $20s when I left it; I bought back in recently at $69).

One personal irritation about the Advertising Industry is its relentless pursuit of advertising revenue on mobile phones. This is a practice I hold akin to having a kiddie jumping up and down in front of the telly when you’re trying to watch something; something to be actively discouraged. The one concern I have is that Google are my biggest shareholding (at the time of writing, they represent just under 50% of my non-index-tracker stock investments), and derive almost all their revenue today from monetising purchase intentions – read: targeted advertising. Likewise Facebook.

Something that impressed me greatly with Facebook was CEO Mark Zuckerberg buying messaging app WhatsApp – which has over 400 million users (70% using the service daily) – for a jaw dropping $19 billion. The ethos of WhatsApp is never to let advertising interfere with the user experience, instead relying on a nominal $1/year subscription for their service. Despite that being the antithesis of Facebook’s current business model, they put the WhatsApp CEO straight onto the Facebook Board. While it may sound a very basic simplification, their willingness to make this sort of “eat our children” move gives me confidence that they are aiming for the long term – and not clamouring to keep hold of a business model that may go stale.

With that, I turn to Google. I put £40,834 into them and have so far seen that go up to £54,508 – an approximate return of 33%. Around 98% of their current income is tied to advertising revenues. I have quite a wide view of the various initiatives they are undertaking, which, while mind-blowing, don’t translate into a likely future revenue/profit stream for the next two years or so. Maybe Chrome tablets will arrive. Maybe they’ll cotton on that it may be a good idea to sell their excellent Chromecast outside the USA. One thing I don’t yet understand well is their fixation – and many investments – in both Deep Thinking technologies and in robotics.

The fact that they may pull a rabbit out of the hat on one of a wide range of initiatives means I’ll leave my shareholding where it is. Likewise the shareholdings held by my three grandchildren (Ellie, age 12, Charlie, age 9, and Ruby, age 2, all have shares in Google, Amazon and Tableau alongside their index trackers). Google shares will split in April, so I expect a likely increase as the shares become more liquid. After that, we’ll see if their value continues its relentless march northward.

I’m also confident Amazon will bounce back – I reckon up another 20% in the next quarter to recover from their recent downturn. Apple and Facebook will soldier on. Tableau Software will continue to impress (they are my highest returns to date – over 67% at the time of writing). Those apart, I’m keeping my eye out for signs of three potential IPOs that I think will become very valuable – when they’re ready. But that’s a story for another day.