Public Clouds, Google Cloud moves and Pricing

Google Cloud Platform Logo

I went to Google’s Cloud Platform Roadshow in London today, nominally to feed my need to rationalise the range of their Cloud offerings. This was primarily with my potential future use of their infrastructure in mind, and to learn what I could of any nuances present. Every provider has them, and I really want to do a good job of simplifying the presentation for my own sales materials – but not to oversimplify to the point of making the advice unusable.

Technically overall, very, very, very impressive.

That said, I’m still in three minds about the way the public cloud vendors price their capacity. Google have gone to great lengths – they assure us – to simplify their pricing structure against industry norms. They cited industry prices coming down by 6-8% per year, while the underlying hardware costs, following Moore’s Law much more closely, fall by 20-30% per annum.

With that, Google announced a whole raft of price decreases of between 35% and 85%, accompanied by a simplified commitment to:

  • No upfront payments
  • No Lock-in or Contracts
  • No Complexity

I think it’s notable that as soon as Google went public with that a few weeks back, they were promptly followed by Amazon Web Services, and more recently by Microsoft with their Azure platform. The outside picture is that they are all in a race, nip and tuck – well, all chasing the volume that is Amazon, but trying to attack from underneath, the usual industry playbook.

One graph came up showing that when a single virtual instance is fired up, it costs around 7c per hour if used for up to 25% of the month – after which the cost straight-lines down. If the instance was up all month, it was suggested that a discount of 30% would apply. That sort of suggests a monthly cost of circa $36.
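
As a sanity check on those numbers, assuming a month of roughly 730 hours, the arithmetic presumably behind that works out something like this:

hourly_rate = 0.07                           # $/hour quoted for a single instance
hours_in_month = 730                         # roughly 24 x 365 / 12
full_month = hourly_rate * hours_in_month    # about $51 at the list rate
sustained_use = full_month * (1 - 0.30)      # about $36 once the 30% discount applies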

Meanwhile, the Virtual Instance (aka Droplet) running Ubuntu Linux and my WordPress Network on Digital Ocean, with 30GB of flash storage and 3TB/month of network bandwidth, currently comes out (with weekly backups) at a fixed $12/month for me. One third the apparent Google price.

I’m not going to suggest they are in any way comparable. The Digital Ocean droplet was pretty naked when I ran it up for the first time. I had to secure it very quickly (setting up custom iptables rules to close off the common ports, and ensuring secure shell only worked from my home’s fixed IP address) and spend quite a time configuring WordPress and the associated email infrastructure. But now it’s up, it’s there and the monthly cost is very predictable. I update it regularly and remove comment spam daily (ably assisted by a WordPress add-in). The whole shebang certainly doesn’t have the growth potential that Google’s offerings give me out of the box, but like many developers, it’s good enough for its intended purpose.

I wonder if Google, AWS, Microsoft and folks like Rackspace buy Netcraft’s excellent monthly hosting provider switching analysis. They all appear to be ignoring Digital Ocean (and certainly don’t appear to be watching their churn rates towards them to the extent most subscription-based businesses watch such numbers like a hawk), while that company is outgrowing everyone in the industry at the moment. They are the one place that is absorbing developers, and taking thousands of existing customers away from all the large providers. In doing so, they’ve recently landed a funding round from VC Andreessen Horowitz (aka “A16Z” in the industry) to continue to push that growth. Their key audience – Linux developers – is the seedbed from which many valuable companies and services of tomorrow will likely emerge.

I suspect there is still plenty of time for the larger providers to learn from their simplicity – of both pricing, and the way in which pre-configured containers of common Linux-based software stacks (WordPress, Node.js, LAMP, email stacks, etc) can be deployed quickly and inexpensively. If, indeed, they see Digital Ocean as a visible threat yet.

In the meantime, I’m trying to build a simple piece of work that articulates how each of the key Public Cloud vendors’ services is structured, from the point of view of the time-pressured, overly busy IT Manager (much as I did for the DECdirect Software catalogue way back when). I’m scheduled to have a review of AWS at the end of April to this end. A simple few spreads of comparative collateral appears to be the missing reference piece in the industry to date.

Great Technology. Where’s the Business Value?

Exponential Growth Bar Graph

It’s a familiar story. Some impressive technical development comes up, and the IT industry adopts what politicians would call a “narrative” to try to push its adoption – and profit. Two that are in the early stages right now are “Wearables” and the “Internet of Things”. I’m already seeing some outlandish market size estimates for both, and wondering how these map back to useful applications that people will pay for.

“Internet of Things” is predicated on an assumption that, with low cost sensors and internet-connected microcomputers embedded in the world around us, the volume of data thrown onto the Internet will create a ready market needing to consume large gobs of hardware, software and services. One approach to rationalising this is to spot where inefficiencies in a value chain exist, and to see where technology will help remove them.

One of my son’s friends runs a company that has been distributing sensors of all sorts for over 10 years. Thinking there may be an opportunity to build a business on top of a network of these things, I asked him what sort of applications his products were put to. It appears to come down to monitoring networks of flows in various utilities and environmental assets (water, gas, rivers, traffic) or in industrial process manufacturing. Add some applications of low-power Bluetooth beacons, and you have some human traffic monitoring in retail environments. I start running out of ideas for potential inefficiencies that these (a) can address and (b) aren’t already being routinely exploited. One example is in water networks, where measuring fluid flows across a pipe network can help quickly isolate leaks, markedly improving supply efficiency. But there are already companies in place that do that, and they have the requisite relationships. No apparent gap there.

One post on Gigaom showed some interesting new flexible electronic materials this week. The gotcha with most such materials is the need for batteries, the presence of which restricts the number of potential applications. One set of switches from Swiss company Algra can emit a 2.4GHz radio signal over a range of 6-10 metres using only the energy from someone depressing a button; the main extra innovations are that the result is very thin and has (unlike its predecessors) an extremely long mechanical lifetime. No outside power source required. So, just glue your door bells or light switches where you need them, and voila – done forever.

The other material that caught my eye was a flexible image sensor from ISORG (using Plastic Logic licensed technology). They have managed to produce a material that you can layer on the surface of a display, and which can read the surface of any object placed against it. No camera needed, and with minimal thickness and weight. That’s something impossible with a standard CMOS imaging scanner, because that needs a minimum distance to focus on the object above it. So, you could effectively have an inbuilt scanner on the surface of your tablet, not only for monochrome pictures, but even for fingerprints and objects in close proximity – for contactless gesture control. Hmmm – smart scanning shelves in retail and logistics – now that may give users some vastly improved efficiencies along the way.

The source article is at: http://gigaom.com/2014/04/07/how-thin-flexible-electronics-will-revolutionize-everything-from-user-interfaces-to-packaging/

A whole field is opening up around collecting data from the On-Board Diagnostics bus that exists in virtually every modern car now, but I’ve yet to explore that in any depth so far. I’ve just noticed a trickle of news articles about Phil Windley’s FUSE project on Kickstarter (here) and some emerging work by Google in the same vein (with the Open Automotive Alliance). Albeit, like TVs, vehicle manufacturers have regulatory challenges and/or slow replacement cycles stopping them moving at the same pace as the computer and electronics industries do.

Outside of that, I’m also seeing a procession of potential wearables, from glasses, to watches, to health sensors and to clip-on cameras.

Glasses and Smart Watches in general are another, much longer story (I will try and do that justice tomorrow), but these are severely limited by the need for battery power in a limited space to do much more than their main application – which is the simple display of time and pertinent notifications.

Health sensors are pretty well established already. I have a FitBit One on me at all times bar when I’m sleeping. However, its main use these days is to map the number of steps I take into an estimated distance walked daily, which I tap pro-rata into Weight Loss Resources (I know a walk to our nearest paper shop and back is circa 10,000 steps – and 60 minutes at a moderate pace – enough to give a good estimate of calories expended). I found the calorie count questionable, and the link to MyFitnessPal a source of great frustration for my wife; it routinely swallows her calorie intake and rations out the extra calories earnt (for potential increased food consumption) very randomly over 1-3 days. We’ve never been able to correlate its behaviour rationally, so we largely ignore that now.

There’s lots of industry speculation around now that Apple’s upcoming iWatch will provide health related sensors, and send readings into a Passbook-like Health Monitoring application on a user’s iPhone handset. One such report is here. That would probably help my wife, who always appears to suffer a level of anxiety whenever her blood pressure is taken – which worsens her readings (see what happens after 22 days of getting used to taking daily readings – things settle down):

Jane Waring Blood Pressure Readings

I dare say if the reading was always on, she’d soon forget its existence and the readings would reflect a truer reality. In the meantime, there are also suggestions that the same Health monitoring application will be able to take readings from other vendors’ sensors, and that Apple are trying to build an ecosystem of personal health devices that can interface to its iPhone-based “hub” – and potentially from there onto Internet-based health services. We can but wait until Apple are ready to admit it (or not!) at upcoming product announcement events this year.

The main other wearables today are cameras. I’ve seen some statistics on the effect of Police Officers wearing these in the USA:

US Police Officer with Camera

One of my youngest son’s friends is a serving Police Officer here, and tells us that the wearing of cameras in his police force is encouraged but optional. That said, he says most officers are big fans of using them. When turned off, they keep a rolling 30-second video buffer, so when first switched on, they have a record of what happened up to 30 seconds before the switch was pressed. Similarly, when turned off, they continue filming for a further 30 seconds before returning to their looping state.
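
A toy sketch of how such a rolling pre-record buffer might work (the frame rate and structure here are my own assumptions, not the camera vendor’s):

from collections import deque

FPS = 25                              # assumed frame rate
pre_roll = deque(maxlen=30 * FPS)     # only the most recent 30 seconds survive
recording = []

def on_frame(frame, switched_on):
    if switched_on:
        recording.append(frame)       # recording proper
    else:
        pre_roll.append(frame)        # oldest frames silently drop off the back

def switch_on():
    recording.extend(pre_roll)        # the 30 seconds before the button press come along too
    pre_roll.clear()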

Perhaps surprisingly, he says that wearing one inclines him to use less force in his interactions – even though, if you saw the footage, you’d be amazed at his self restraint. In the USA, Police report that when the people they’re engaging know they’re being filmed/recorded, they are far more inclined to behave themselves and not to try to spin “he said that, I said that” yarns.

There are all sorts of privacy implications if everyone starts wearing such devices, and they are getting ever smaller. Muvi cameras are one example, able to record 70-90 minutes of hi-res video from their 55mm-tall, clip-attached enclosure. Someone was recently prosecuted in Seattle for leaving one of these lens-up on a path between buildings frequented by female employees at his company campus (and no, I didn’t see any footage – just news of his arrest!).

We’re moving away from what we thought was going to be a big brother world – to one where the use of such cameras is “democratised” across the whole population.

Muvi Camcorder

I don’t think anyone has really comprehended the full effect of this upcoming ubiquity, but I suspect that one norm will be an expectation that the presence of a working camera is indicated vividly. I wonder how long it will take for that to become the new normal – and if there are other business efficiencies that their use – and that of other “Internet of Things” sensors in general – can lay before us all.

That said, industry estimates for “Internet of Things” revenues, as they stand today, together with the lack of perceived “must have this” applications, feel hopelessly optimistic to me.

Apple, Sapphire, Liquid Metal & a relentlessly stationary Stock Price

Old Apple Rainbow Logo

I started running my own SIPP around one year ago, put 70% of the funds in an accumulating 100% Equity Index Tracker, and bet the rest on various US high technology stocks. So far, I’m quite happy with overall progress. I’m just waiting for the recent split in Google shares to work through, but I’m beating 10% returns overall, which is tremendous.

The one bizarre one is Apple. I bought 30 shares at $523.47 (cost of around £9,593) and they are, after a year, sitting at a 1.43% loss – £137.09 overall – not big, but not showing signs of progressing either. That despite hoovering up over 70% of the mobile phone industry profits worldwide, and having an iPhone business that in itself is bigger than virtually every other company in the world. 2013 revenues of $170.91Bn.

The stock market seems to think Apple will no longer be able to continue its last 5 years’ bottom line compound annual growth rate of 39%, so is treating it as a slow and steady cash generative company, rather than one whose growth rate will continue undiminished. Hence a Price/Earnings multiple of 13x, way lower than that of Google (30x) and that of Amazon (580x) – both of which have given me healthy returns.

Long Memory sometimes helps

My one respite has been in remembering, at the turn of the year, that Apple had signed a 2-year licensing agreement with a company called Liquid Metal (LQMT) nearly 2 years ago, nominally to reserve an exclusive option to use their impressive alloys. They’d also made an equity investment in GT Advanced Technologies (GTAT), who were to use it to build a production facility in Arizona capable of manufacturing phone-size displays made from Sapphire. Then a small smoke signal appeared in my Twitter feed, where someone said that Apple had filed a patent related to Liquid Metal at the US Patent Office in November 2013.

Hmmmm. I then started to get emails from the Motley Fool, saying that they knew of a recent patent filing, and that it was going to be a good opportunity to “get in early” on some secret stock – revealed only on the other side of a sales pitch for one of their investment programs. I’m already a subscriber to one of their other initiatives, so I sat that out, but tried to stay very tuned to LQMT (trading at 20c/share) and GTAT (who were around the $9 mark).

Buy buy buy

With a lack of signals arriving, I thought something would have to come out ahead of the February end date of the Liquid Metal license, given that Apple had already put a patent in place using that technology. So, with around £4,700 cash in my SIPP, I elected to buy 22,000 shares of LQMT at $0.20 and 478 shares of GTAT at $9.22 on the 22nd January.

A couple of days later, news broke that employees of Apple had filed at least 7 more patents related to Liquid Metal alloys, and that GT Advanced Technology were putting pressure on US regulators to speed up the building of their new Arizona facility.

One blog I found along the way examined the Liquid Metal patents in great detail and was a fantastic, well researched read from a German expert. See: Philip Guehlke’s blog at http://www.techinsighter.com/

The Liquid Metal shares went off like a Roman Candle, zapping up from $0.20 to $0.39 in a week – and then meandered back down, sticking around the $0.29 mark. GT Advanced Technology were also climbing, but much less aggressively – though still over 20% in the first week.

I’m not used to holding shares that turn out to be quite that volatile, so I did an exit stage left on profits of 30.9% and 24.2% respectively – within 3 weeks of their original purchase. £1,450 profit. Well, that’s more than paid off the lack of Apple ROI ten times over. That said, LQMT shares are back at $0.22 now, and GTAT up at $16.00 or so, but overall happy with my lot.

Out of daily wild swings

I’m now back to my traditional position in long term buy and hold stocks only. That said, I do keep thinking about a few things as I see the buy side and sell side analysts batting Apple, Amazon and Google over tennis nets every day.

The way the market treats Apple is pretty unique. They’ve seen new product categories roll down the Apple conveyor belt to unprecedented worldwide success: iPod, iPhone, iPad… and they’re all staring down, looking for evidence of yet another golden egg on its way. Contrast that to Amazon, who reinvest every penny of potential profit into building out their worldwide e-commerce infrastructure – to such an extent that the analysts think the first golden egg will arrive as soon as Jeff Bezos thinks it’s time to let it out.

New Product Categories

Apple management have indicated that multiple new product categories will arrive in 2014. They have been going around the world trademarking the word “iWatch” and appear to have a wearable device in the works, reputed to have health monitoring related features. The main gotcha is that the worldwide market for watches at this stage is circa $1.2B/year, so it would be difficult for one to make any noticeable contribution to a $170B/year Apple. More an accessory to the information hub that is the user’s iPhone.

They were also reputed to be trying out a new set-top TV box that they were trialling with Time Warner Cable in the USA – at least until Comcast initiated their still ongoing attempt to buy that company. Comcast already have their own X1 box in several markets in the USA. And again, the current Apple TV box – $99 each, and of which some 13 million have been sold to date – is unlikely to make a marked difference to Apple’s huge turnover. More an accessory to the main iPhone and iPad show.

All eyes on the next set of announcements

With that, it looks like iPhone, and to a lesser extent iPad, will continue to be their revenue and profit drivers, with new product categories likely to be ancillary accessories that support user engagement with those devices. Having seen the analyses of the Liquid Metal patents Apple have submitted, I’m fairly convinced the next models will be encased in Liquid Metal (instead of the milled Aluminium of the current high end models) and have scratch-resistant Sapphire screens. That may also give them lower manufacturing costs, allowing them to fill some of the other price bands that are growing most aggressively for their Android competitors. Or not!

Time will tell. The only thing I’m sure of is that, if they produce a health related iWatch, my wife will expect me to be first in the queue to buy one for her. And indeed a new handset, if it’s an improvement on her much loved iPhone 5S. I certainly won’t let her down.

Focus on End Users: a flash of the bleeding obvious

Lightbulb

I’ve been re-reading Terry Leahy’s “Management in 10 Words”; Sir Terry was the CEO of Tesco until recently. I think the piece in the book’s introduction, relating to him sitting in front of some Government officials, is quite funny – or would be, if it weren’t a blinding dose of the obvious that most IT organisations miss:

He was asked “What was it that turned Tesco from being a struggling supermarket, number three retail chain in the UK, into the third largest retailer in the World?”. He said: “It’s quite simple. We focussed on delivering for customers. We set ourselves some simple aims, and some basic values to live by. And we then created a process to achieve them, making sure that everyone knew what they were responsible for”.

Silence. Polite coughing. Someone poured out some water. More silence. “Was that it?” an official finally asked. And the answer to that was ‘yes’.

The book is a good read and one we can all learn from. Not least as many vendors in the IT and associated services industry are going in exactly the opposite direction to the one he took.

I was listening to a discussion contrasting the different business models of Google, Facebook, Microsoft and Apple a few days back. The piece I hadn’t rationalised before is that, of this list, only Apple have a sole focus on the end user of their products. Google and Facebook’s current revenue streams lie in monetising purchase intents to advertisers, while trying not to dissuade end users from feeding them the attention and activity/interest/location signals that feed their business engines. Microsoft’s business volumes are heavily skewed towards selling software to Enterprise IT departments, and not to the end users of their products.

One side effect of this is an insatiable need to focus on the competition rather than on the user of your products or services. In times of old, it became something of a relentless joke that no marketing plan would be complete without the customary “IBM”, “HP” or “Sun” attack campaign in play. And they all did it to each other. You could ask where the users’ needs made it into these efforts, but of the many I saw, I don’t remember a single one featuring them at all. Every IT vendor was playing “follow the leader” (and ignoring the cliffs they might drive over while doing so), when all focus should have been on their customers instead.

The first object lesson I had was with the original IBM PC. One of the biggest assets IBM had was the late Philip “Don” Estridge, who went into the job running IBM’s first foray into selling PCs having had personal experience of running an Apple ][ personal computer at home. The rest of the industry was an outgrowth of a hobbyist movement trying to sell to businesses, and business owners craved “sorting their business problems” simply and without unnecessary surprises. IBM’s use of Charlie Chaplin ads in those early years was a masterstroke. As an example, spot the competitive knockoff in this:

There isn’t one! It’s a focus on the needs of any overworked small business owner, for whom the precious assets are time and business survival. Trading blows trying to sell one computer over another is completely missing.

I still see this everywhere. I’m a subscriber to “Seeking Alpha”, which has a collection of both buy-side and sell-side analysts commentating on the shares of companies I’ve chosen to watch. More often than not, it’s a bit like sitting in an umpire’s chair during a tennis match; lots of noise, lots of to-and-fro, discussions on each move, and never far away from comparing companies against each other.

One of the most prescient things I’ve heard a technology CEO say was from Steve Jobs, when he told an audience in 1997 that “We have to get away from the notion that for Apple to win, Microsoft have to lose”. Certainly, from the time the first iPhone shipped onwards, Apple have had a relentless focus on the end user of their products.

Enterprise IT is still driven largely by vendor-inspired fads, with little reference to end user results (one silly data point I carry in my head is that I’m still waiting to hear someone at a Big Data conference mention a compelling business impact of one of their Hadoop deployments that isn’t related to log file or Twitter sentiment analyses. I’ve seen the same software vendor platform folks float into Big Data conferences for around 3 years now, and have not heard one yet).

One of the best courses I ever went on was given to us by Citrix, specifically on selling at CxO/board level in large organisations. A lot of it is being able to relate small snippets of things you discover around the industry (or in other industries) that may help influence their business success. One example that I unashamedly stole from Martin Clarkson was that of a new Tesco store in South Korea that he once showed me:

I passed this on to the team in my last company that sold to big retailers. At least four board level teams in large UK retailers got to see that video and to agonise over whether they could replicate Tesco’s work in their own local operations. And I dare say the salespeople bringing it to their attention gained a good reputation for delivering interesting ideas that may help their client organisations’ futures. That’s a great position to be in.

With that, I’ve come full circle from and back to Tesco. Consultative Selling is a good thing to do, and folks like IBM are complete masters at it; if you’re ever in an IBM facility, be sure to steal one of their current “Institute for Business Value” booklets (or visit their associated group on LinkedIn). They’re normally brim-full of surveys and ideas to stimulate the thought processes of the most senior people running businesses.

We’d do a better job in the IT industry if we could replicate that focus on our end users from top to bottom – and not spend time elbowing competitors instead. In the meantime, I suspect those rare places that do focus on end users will continue to reap a disproportionate share of the future business out there.

12 years, Google Fusion Tables then Gold Nuggets

Making Sense of Data Course Logo

I’ve had a secret project going since June 2002, religiously entering every component and portion size of my food intake – and exercise – into the web site www.weightlossresources.co.uk. Hence when Google decided to run an online course on “Making Sense of Data”, I asked Rebecca Walton at the company if she would be able to get a daily intake summary for me in machine readable form: Calorie Intake, Carbs, Protein and Fat weights in grams, plus Exercise calories, for every day since I started. Some 3,500 days’ worth of data. She emailed the spreadsheet to me less than two hours later – brilliant service.

WLR Food Diary

Over that time, I’ve also religiously weighed myself almost every Monday morning, and entered that into the site too. I managed to scrape those readings off the site and, after a few hours’ work, combined all the data into a single Google Spreadsheet; that’s a free product these days, and has come on in leaps and bounds in the last year (I’d not used Excel in anger at all since late 2012).

Google Spreadsheets Example Sheet - Ian's Weight Loss Stats

With that, I then used the data for the final project of the course, loading it into Google’s new Fusion Tables analytics tool, on which the course was based.

I’m currently in a 12 week competition at my local gym, based on a course of personal training and bi-weekly progress measures on a Boditrax machine – effectively a special set of bathroom scales that can shoot electrical signals up one foot and down the other, to indicate your fat, muscle and water content. The one thing I’ve found strange is that a lot of the work I’m given is on weights, resulting in a muscle build up and a drop in fat – but at the same time, minimal weight loss. I’m usually reminded that muscle weighs much more than fat; my trainer tells me that the muscle will up my metabolism and contribute to more effective weight loss in future weeks.

Nevertheless, I wanted to analyse all my data and see if I could draw any historical inferences from it that could assist my mission to lose weight this side of the end of the competition (at the end of April). My main questions were:

  1. Is my weekly weight loss directly proportional to the number of calories I consume?
  2. Does the level of exercise I undertake likewise have a direct effect on my weight loss?
  3. Are there any other (nutritional) factors that directly influence my weekly weight loss?

Using the techniques taught in this course, I managed to work out answers to these. I ended up throwing scatter plots like this:

Ian Intake vs Weight Change Scatter Plot

Looking at it, you could infer there was a trend. Sticking a ruler on it sort of suggests that I should be keeping my nett calorie consumption around the 2,300 mark to achieve a 2lb/week loss, which is some 200 calories under what I’d been running at with the www.weightlossresources.co.uk site. So, one change to make.

Unlike Tableau Desktop Professional, the current iteration of Google Fusion Tables can’t throw a straight trend line through a scatter chart. You instead have to do a bit of a hop, skip and jump in the spreadsheet you feed in first, using the Google Spreadsheet trend() function – and then you end up with something that looks like this:

Nett Calorie Intake vs Weight Change Chart

The main gotcha there is that every data element in the source data has to be used to draw the trend line. In my case, there were some days when I’d recorded my breakfast food intake and then been maxed out with work all day – and hence created some outliers I needed to filter out before throwing the trend line. Having the outliers present made the line much shallower than it should have been. Hence one enhancement request for Fusion Tables – please add a “draw a trend line” option that I can invoke to draw a straight line through after filtering out unwanted data. That said, the ability of Fusion Tables to draw data using Maps is fantastic – just not applicable in this, my first, use case.

Some kinks, but a fantastic, easy to use analytics tool – and available as a free add-on to anyone using Google Drive. But the real kudos has to go to Google Spreadsheets; it’s come on in leaps and bounds, I no longer routinely need Excel at all – and in places it already does a lot more. It simply rocks.

The end results of the exercise were:

  1. I need to drop my daily nett calorie intake from 2,511 to 2,300 or so to maintain a 2lb/week loss.
  2. Exercise cals by themselves do not directly influence weight loss performance; there is no direct correlation here at all.
  3. Protein and Fat intake from food have no discernible effect on changes to my weight. However, the level of Carbs I consume has a very material effect; fewer carbs really help. Reducing the proportion of my carbs intake from the recommended 50% (vs Protein at 20% and Fat at 30%) correlates directly with more consistent 2lb/week losses.

One other learning (from reverse engineering the pie charts on the www.weightlossresources.co.uk web site) was that 1g of carbs contains approx 3.75 cals, 1g of Protein maps to 4.0 cals, and 1g of Fat to 9.0 cals – which is why the 30% of a balanced diet attributable to fat consumption is, at face value, quite high.
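
To put that in gram terms, a quick back-of-envelope calculation (using a round 2,500 kcal/day figure, close to my own daily intake) shows how few grams of fat it takes to supply that 30% of calories:

daily_cals = 2500
carbs_g   = daily_cals * 0.50 / 3.75   # ~333 g of carbs supply 50% of the calories
protein_g = daily_cals * 0.20 / 4.0    # ~125 g of protein supply 20%
fat_g     = daily_cals * 0.30 / 9.0    # ~83 g of fat supply the remaining 30%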

And then I got certified:

Google Making Sense of Data Course Completion Certificate

So, job done. One more little exercise remains, to test a theory that dieting one week most often gives the most solid results a week or so later, but that can wait for another day (it has no material effect if I’m being good every week!). Overall, I’m happy that I can use Google’s tools to do ad-hoc data analysis whenever useful in the future. And a big thank you to Rebecca Walton and her staff at www.weightlossresources.co.uk, and to Amit, Max and the rest of the staff at Google for an excellent course. Thank you.

Now, back to learning the structure and nuances of Amazon and Google public Cloud services – a completely different personal simplification project.

-ends-

Footnote: If you ever need to throw a trend line in Google Spreadsheets – at least until that one missing capability makes it into the core product – the process using a simplified sheet is as follows:

Trend Line through Scatter Plot Step 1

Scatter plot initially looks like this:

Trend Line through Scatter Plot Step 2

Add an “=trend()” function to the top empty cell only:

Trend Line through Scatter Plot Step 3

That then writes all the trendline y positions for all x co-ordinates, right down all entries in one take:

Trend Line through Scatter Plot Step 4

which then, when replotted, looks like this. The red dots represent the trend line:

Trend Line through Scatter Plot Step 5

Done!
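
For reference, the formula in that top cell takes a form along these lines; the column letters and row counts here are purely illustrative and will vary with your own sheet:

=TREND(C2:C170, B2:B170, B2:B170)

i.e. the known y values, the known x values, and then the x values you want trend-line y positions calculated for.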

Police, Metrics and the missing comedy of the Red Beads

Deming Red Bead Experiment

I heard a report on Friday about the Metropolitan Police possessing an internal “culture of fear” because of a “draconian” use of performance targets, based on interviews and a survey of 250 police officers. The report author went on to say that officers who missed targets were put on a “hit list”, with some facing potential misconduct action. Some of the targets were:

  • 20% arrest rate for stop and searches
  • 20% of stop and searches should be for weapons
  • 40% for neighbourhood property crime
  • 40% for drugs

and some for one policing team in 2011:

  • PCs to make one arrest and five stop and searches per shift
  • Special Constabulary officers to make one arrest per month and perform 5 stop and searches per shift
  • Police Community Support Officers (PCSOs) to make five stop-and-accounts per shift, and two criminal reports per shift

But the Metropolitan Police Assistant Commissioner accused the report’s authors of “sensationalising” the issue. He also said something that threw the red flag up in my simple brain – that “it was the Met’s job to bring down crime”. He then said that since it had a “more accountable way of doing things”, rates were down by nearly 10%.

One officer told the report: “Every month we are named and shamed with a league table by our supervisors, which does seem very bullying/overbearing.”

Another officer refers to a “bullying-type culture”.

The report says: “There is evidence of a persistent and growing culture of fear spawned by the vigorous and often draconian application of performance targets, with many officers reporting that they feel almost constantly under threat of being blamed and subsequently punished for failing to hit targets.”

But Scotland Yard denied officers were being unfairly pressurised. In a statement, the force said it was faced with many challenges, but insisted it did not have a bullying culture. “We make no excuses for having a culture that values performance,” it said.

“We have pledged to reduce crime, increase confidence and cut costs. It’s a big task and we have a robust framework in place to ensure we achieve this. The public expects no less.”

A source of confusion here

I thought that the “it was the Met’s job to bring down crime” comment was a very curious thing to say, not least because I traced its origin to his ultimate boss, the Home Secretary, who also said the only Police metric important to her was that of reducing crime.

Think about that for a moment. Do the Police have total control over the crime rate? I wouldn’t dispute they have some behavioural, presence and advisory influences, but in the final analysis, there are many external influences (outside their control) that I’d suspect have a much greater impact on that measure. With that, you’re entering a world where the main control at your disposal – that of diligently recording the statistics to back up a political narrative – is wide open to wholesale abuse.

Meanwhile in Bristol

The private sector is far from immune either. At one stage earlier in my career, I worked out of a company branch office in Bristol, serving IT customers in the South West of the UK. For the most part, we were very matter of fact, honest and straightforward with customers. And then came the annual customer satisfaction survey, a multiple choice questionnaire sent to the IT Managers at most of the key customers we dealt with in our work.

I remember being in an office with the IT Manager at Camborne School of Mines (we had a big VAX doing scientific work, supporting their drilling for warm underground water as a potential future energy source). The customer satisfaction survey was sitting open on his desk, at the page showing his yet-to-be-filled-in customer satisfaction measure for the quality of Field Service Maintenance. In walked the Field Service engineer who’d just arrived, said “Hello, I’m here” around the door, and was called back by the IT Manager. The Manager then held the tip of his pen over the 1-10 rating boxes on the survey, and said “When can we have the new disk drive that arrived yesterday installed?”. The Field Service engineer said “Is next Wednesday okay?”. The pen moves over to the 1/10 Customer Sat box. “Eh, I can probably do it just after lunchtime today!”. The pen moves over the 10/10 box. “Yes, you’ll have everything working this afternoon”. With that, the 10/10 box was ticked. A wry smile from everyone, and a thought that if genuine feedback was sent back by customers in general, it would result in service improvements that benefitted the company.

As it turns out, very naive on our part.

A missive rolled down from the European HQ in Geneva saying that our office was the 3rd worst in Europe for customer satisfaction, and hence someone in the office would be nominated to enact changes to improve performance for the next year – with serious consequences if big improvements weren’t delivered. And with that, the European President said – to all 30,000 staff in Europe – that the minimum acceptable performance next year would be an overall 8/10.

So, what happened? The guy in the office nominated to manage the transition to high quality (wry smile here) was the same guy who ran the large scale benchmarking exercises for prospective customers against competitors of the time. The main skill there was politically getting things coded into the customer’s benchmarking spec – the one handed out to every vendor – that suited the performance characteristics of our own machines, and generally playing whatever games he could to win on the key measures on which the bidding competition would be judged.

Customers known to be unhappy magically disappeared from the survey mailing list. Anyone routinely visiting customers in their working week was trained on how to set customer expectations that anything under 9/10 was deemed a failure, and that 10/10 was the norm. And everyone knew who was going to get a survey, and worked doubly hard to ensure those customers were as happy as we could make them – with the minimum marking scores in mind. Several thought of it as no more than the one week of the year when there was more blackmail capital in play than at any other time, but otherwise complied with the expressed wishes.

End result: Top office in customer satisfaction in the country, and only 3rd among all the branches in Europe (1 and 2 in Austria – suspicious that, but hey).

Were customers any happier? No. Was the survey a useful improvement device? No. Did it suit the back story for the political narrative? You bet! And with that the years continued to roll on.

My own Lightbulb moment

Somewhere along the line between Bristol and more senior roles in the same company, I came upon one W Edwards Deming, and one thing he routinely did to managers to fix this sort of malaise. But a slight detour first (based on what I did after experiencing one of his lessons).

Doing things right (I think)

When I was Director of Merchandising and Operations at Computacenter’s Software Business Unit, the internal Licensing Desk reported into me; a team of five people who dispensed advice about how to buy software in the most cost effective way possible, without unwanted surprises, and who administered all the large license orders with vendors in support of this. A super team, managed by Claire Hallissey.

Claire had one member of her team consolidating data on the number of calls coming into the team and how long each enquiry was taking to handle; not something I’d imposed on the team at all, but something I suspect she collected for her own management use. It became pretty obvious from the graphs that growth in demand on her team was far outpacing the revenue growth of the Business Unit, at a time when we were likely to be under pressure not to increase headcount.

So, what did we do? I indicated that the data collection was brilliant, and that I didn’t want to see the effort or accuracy of it compromised in any way. However, if they managed to work out any way of reducing the volume and length of calls into her team by 15% by the next quarter end, I’d put a £150 bonus in each of their pay packets. The thinking here was that they were the folks who could ask “why” most effectively and enact changes – be it training new sales support hires in local offices, simplifying documentation, or generally tracing back why people were calling in the first place – and then relentlessly putting their corrective actions into play.

In the event, they got overall call volume down by 25%, the source data quality stood up to my light scrutiny, and all duly got their £150 bonus – plus senior accolades for the achievement. One of the innovations was adding a sentence or two to the standard template response emails they’d built, to answer common secondary questions too – and hence take out repeat calls by putting better content in the first email answer sent back. With that, work volume growth trailed the sales volume increases, and the group became more productive – and less bored by the same repeat questions, ad nauseam.

Then in Southend

Likewise on day 2 of my job at Demon Internet, a group of us walked into the Southend Tech Support Centre to see a maxed out floor of people on the phones to customers, and a classroom with 10 new recruits being trained. The Support Centre manager, looking very harassed, just said “that’s this week’s intake. We’ve got another 10 next week, and another 10 the week after that”. I think I completely threw him when I said nonchalantly “But why are customers calling in?”. He just looked at me as if I’d asked a very stupid question, and replied “We just haven’t got enough staff to handle the phone calls”.

Fortunately, his deputy was able to give us a dump of their Remedy system, so a couple of us could sample the call reasons and what specifically was requiring technical assistance. In the event, 27% of the calls related to setting up the various TCP/IP settings; we then changed the product and simplified its supporting documentation to work those issues away. At least some respite, until Microsoft shipped Internet Explorer 6 – which the Customer Services Director later admitted had “fundamentally broken my call centre”. But that’s another story.

W Edwards Deming

W Edwards Deming Quote

But back to metrics. The one thing in all my career that made my light bulb go on regarding measures and metrics was an experiment conducted by W Edwards Deming. Deming was an American statistician who was sent to Japan after World War 2 to assist in its reconstruction, and found himself teaching motorcycle and car manufacturers how to improve the quality of their products. As quality improved, they also found prices went down, and companies like Honda, Suzuki, Kawasaki, Datsun (now Nissan) and Toyota went from local to worldwide attention with motorcycles, then cars. Their products, unlike their western counterparts, rarely broke down and remained inexpensive – so much so that western governments instituted quotas to arrest the siege on their own manufacturing industries. To this day, the highest accolade for excellence of quality in Japan remains “The Deming Prize”. It was only much later that the work of Deming was widely acknowledged, and then used, by western manufacturers as well.

During his training seminars, Deming conducted what is known as “The Red Bead” experiment. Unfortunately, the comedy of promoting good workers, firing underperformers, and urging improved performance when the workers have no control over the components of the process is largely lost in the videos of him running this himself, given that he was well into his 90s when they were recorded. His dry humour is a bit harder to spot than it would have been earlier in his career – when he openly acknowledged that some Japanese managers routinely imposed the same class of bad metrics on their staff as the worst examples he found in the West.
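
If you’ve never seen it, the gist is easy to simulate. This little sketch is my own (the bead counts are from memory, so treat them as illustrative): each “willing worker” scoops 50 beads from a drum that is 20% red, red beads count as defects, and any difference between workers is pure chance, yet league tables, praise and blame get handed out on the back of those numbers anyway.

import random

random.seed(1)
drum = ["red"] * 800 + ["white"] * 3200      # 20% "defective" red beads

for worker in ["Ann", "Bob", "Cat", "Dan", "Eve", "Fred"]:
    scoop = random.sample(drum, 50)          # a paddle of 50 beads, drawn at random
    print(worker, "produced", scoop.count("red"), "defects")

Run it a few times and the “best” and “worst” performers shuffle around for no reason other than luck, which is exactly Deming’s point.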

If you can buy a copy of his seminal book Out of the Crisis, you can see the full description between pages 109-112, in Chapter 3, “Diseases and Obstacles”, following the subtitle “Fair Rating is impossible”. It is something the Home Secretary, and all echelons of Managers in the Public Sector, should read and internalise. If they did, I think the general public would be pleased with the changes I’m sure they’d enact based on his wise knowledge.

In the absence of an original Deming version, a more basic version of the same “your job security depends on things outside your control” sentiment can be found in this (it’s around 2 minutes long):

or a longer 24 minute version, truer to the original real McCoy:

A modern take on people’s valiant attempts to get attention

Facebook Newsfeed Algorithm Equation

A really well written story on Techcrunch today relates the ever increasing difficulty of getting a message you publish in front of people you know. Well worth a read if you have a spare 5 minutes: http://techcrunch.com/2014/04/03/the-filtered-feed-problem/

The main surprise for me is that if you “Like” a particular vendor’s Facebook page, the best historical chance (from Feb 2012) of seeing any one individual post from them was around 1 in 6 – 16%. With the increase in potential traffic going into your personal news feed, it is (as of March 2014) now down to 1 in 15 – 6.51%. So, businesses are facing the same challenges as the Advertising industry in general, even on these new platforms.

Despite the sheer amount of signal data available to them, even folks like Facebook (and I guess the same is true of Google, Twitter, LinkedIn, Pinterest, etc) have a big challenge separating what we value seeing from what we skip by. Even why we look at these social media sites can be interpreted in many different ways from the get go. One of my ex-work colleagues, on a Senior Management program at Harvard, had a professor saying that males were on Facebook for the eye candy, and females to one-up their looks and social life among their social circle (and had a habit of publishing less flattering pictures of other women in the same!).

The challenge these sites face is one of the few true needs for “big data” analyses that isn’t just IT industry hype to sell more kit. Their own future depends on getting a rich vein of signals from the users they act as a content platform for, while feeding into the stream the paid content advertisers use to subvert it in their favo(u)r – which is a centuries-old pursuit and nothing remarkable, nor new.

Over the past few weeks, I’ve increased the number of times per week I go out for a walk with my wife. This week, Google Now on my Nexus 5 flashed this up:

Google Now Walking Stats Screenshot

So, it knows I’m walking, and how far! I guess this isn’t unusual. I know that the photographs people upload also routinely contain location data (deduced from GPS or the SSIDs of wireless routers close by) and a date/time, and readily admit the make and model of the device they were taken on. And if you have a professional DSLR camera, often the serial numbers of the camera and lens are on board too (hence some organisations offering to trace stolen cameras by looking at the EXIF data in uploaded photographs).

Individually identifiable data like that is not inserted by any of the popular mobile phones (to the best of my knowledge), and besides, most social media sites strip the EXIF data out of pictures they display publicly anyway. You’d need a warrant to request a search of that sort of data from the social media company, case by case. That said, Facebook and their ilk do have access to the data, and also a fair guess at your social circle given who gets tagged in your pictures!
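
If you’re curious what your own camera or phone leaves behind in a picture, a few lines of Python with the Pillow library will dump it for you (the file name here is just a placeholder):

from PIL import Image, ExifTags

exif = Image.open("holiday_snap.jpg").getexif()   # EXIF metadata embedded in the file
for tag_id, value in exif.items():
    print(ExifTags.TAGS.get(tag_id, tag_id), value)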

Traditional media will instead trot out statistics on OTS (aka “Opportunities To See” an advert) and be able to supply some basic data – gleaned from subscriptions and competition entries – to work out the typical demographics of the audience you can pay to address. Getting “likely purchase intent” signals is much, much more difficult.

Love At First Website Demon Ad

When doing advertising for Demon Internet, we used to ask the person calling up for a trial CD some basic questions about where they’d seen the advert that led them to contact us. Knowing the media used, and its placement cost, we could in time measure the cost per customer acquired and work to keep that as low as possible. We routinely shared that data every week with our external media buyers, who used it as part of their advertising space buying negotiation patter, and could relate back which positions and advert sizes in each publication pulled the best response.

The main gotcha is that if you ask, you may not get an accurate answer from the customer, or you can be undone by your own staff misattributing the call. We noticed this when we were planning a small trial of some TV advertising, so had “TV” put on the response system’s menu – and as it happens, it appeared as the first option on the list. We were somewhat bemused to find after a week that TV was our best source of new customers – before any of our ads had even been aired. So, a little nudge went to our phone staff to please be more accurate, while we changed every ad, for each different media title we used, to a different 0800 number – and could hence take the response readings straight off the switch, cutting out the question and generally making the initial customer experience a bit more friction free.

With that, our cost per acquired customer stayed around the £20 mark, and our cost per long term retained customer at around £30 (we found, along the way, that some publications had high response rates, but high churn rates to go with them).

Demon Trial Postmark

The best response rate of all came from getting the Royal Mail franking machines to cancel stamps on half of all stamped letters in the UK for two two-week periods – which came out at £7 per acquired customer; a great result for Michelle Laufer, who followed up when she noticed letters arriving at home cancelled with “Have a Break, Have a Kit Kat”. Unfortunately, the Royal Mail stopped allowing ads to be done in this way, probably in the knowledge that seeing “Demon Internet” on letters resulted in a few complaints from people and places of a nervous disposition (one Mental Hospital as a case in point).

The main challenge for people carrying a Marketing job title these days is to be relentless in their testing, so they can measure – with whatever signals they can collect – what works, what doesn’t, and which of two alternative treatments pulls better. Unfortunately, many such departments are littered with people with no wherewithal beyond “please get this mailer out”. The poorest of amateur behaviour, and a waste of their shareholders’ money.

As in most walks of life, those that try slightly harder get a much greater proportion of the resulting spoils for their organisation. And that is why seminal books like “Commonsense Direct and Digital Marketing” exist – and why folks like Google, Facebook et al are anal about the thoroughness of testing everything they do.

Tech Careers delivering results, slowed by silly Nuances

Caution: Does Stupid Things

Early in my career, I used to write and debug device drivers, which was a mix of reading code in octal/hex, looking at source listings, poring over crash dumps and trying to work out which code paths executed in parallel – each potentially in conflict with the others if you weren’t extra careful. Doing that for a time gets you used to being able to pattern match the strangest of things. Like being able to tell what website someone is looking at from far across the room, or reading papers on a desk upside down, or being able to scroll through a spreadsheet looking for obvious anomalies at pretty impressive speeds.

The other thing it gives you is “no fear” whenever confronted by something new, or on the bleeding edge. You get a confidence that whatever may get thrown at you, that you can work your way around it. That said, I place great value in stuff that’s well designed, and that has sensible defaults. That speeds your work, as you’re not having to go back and spell out in minute detail what every smallest piece of the action needs to do.

I’ve been really blessed with Analytics products like Tableau Desktop Professional, and indeed more recently with Google Spreadsheets and Google Fusion Tables. These are the sort of tools I use routinely when running any business, so that I can simply, in a data-based way, adjudge what is and isn’t working business-wise.

The common characteristic of these tools is that they all guess what you need to show most of the time, and don’t delay you by making you go through every piece of text, every line, every smallest detail with individual calls for font, font size and colo(u)r. They also know to cut the graph display of a line once the last data point is reached – and not, as one product does, leave future months stuck on a flat line once the plot runs past the data present.

There have been several times when I’ve wanted to stick that “Does Stupid Things” sign on Microsoft SQL Server Reporting Services.

I had diligently prototyped (as part of a business improvement project) a load of daily updated graphs/reports for a team of managers using Tableau Desktop Professional. However, I was told that the company had elected to standardise on a Microsoft reporting product, sitting above a SQL Server-based Datamart. In the absence of the company wanting to invest in Tableau Server, I was asked to repurpose the Tableau work into Microsoft SQL Server Reporting Services (aka “SSRS”). So I duly read two books, had a very patient and familiar programmer show me the basics, set me up with Visual Studio and get me the appropriate Active Directory access permissions, and away I went. I delivered everything before finding there was no line Management role to go back to, but spent an inordinate amount of time in between dealing with a few “nuances”.

Consider this. I built a table to show each Sales Team Manager what their unit’s Revenue and Profit was, year to date, by month, or customer, or vendor. The last column of the table was a percentage profit margin, aka “Profit” divided by “Revenue”. The gotcha with this is that if something is given away for free, the (nominally negative) profit over zero revenue throws a divide by zero error. So simple to code around, methinks:

=iif(revenue>0, profit/revenue, 0)

Which, roughly translated, tells the software to calculate the percentage profit if revenue is greater than zero, and otherwise just stick zero in as the answer. So, I rerun the view, and get #error in all the same places – the same 13 examples of an attempted divide by zero as before.

Brain thinks – oh, there must be some minuscule revenue numbers in the thousandths of pence in there, so change the formula to:

=iif(revenue>1, profit/revenue, 0)

so that the denominator is at least one, and the chance of throwing a divide by zero error is extremely remote. The giveaway would need to be mind-bogglingly huge to get anything close to a divide by zero exception. Re-run the view. Result: the same 13 divide by zero #error exceptions.

WTF? Look at where the exceptions are being thrown, and the revenue is zero, so the division shouldn’t even be being attempted. So off to Google with “SQL Server Reporting Services iif divide by zero” I go. The answer came from a Microsoft engineer who admits that, as a performance shortcut, both paths of the iif statement get evaluated up front, so that whichever half needs to supply its result, it’s already populated and ready to use. So, the literal effect of:

=iif(revenue>0, profit/revenue,0)

works like this:

  • Calculate 0 on the one side.
  • Calculate Profit/Revenue on the other.
  • If Revenue > 0, pick the second option, else pick the first.
  • If either side throws an exception (like divide by zero), blat the answer and substitute “#Error” instead.

The solution is to construct two nested “iif” statements in such a way that the division can never see a zero denominator, whichever branch the optimiser chooses to evaluate – along the lines of the expression below.
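
The usual workaround takes a form something like this (field names as before); the inner iif guarantees the divisor is never zero, even though both branches still get evaluated:

=iif(revenue>0, profit/iif(revenue>0, revenue, 1), 0)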

With that, I’m sure wearing underpants on your head has the same sort of perverse logic somewhere. This is simply atrociously bad software engineering.

Collaborating with Chinese Copycats – the Open Source Way

3D Robotics Iris Drone Copter

Last year, I bought the book Makers: The New Industrial Revolution by Chris Anderson. Previously the Editor-in-Chief of “Wired” magazine, he set up his own company making model flying drones, each containing a mobile phone “system on a chip” and, most often these days, a camera. I think what happened when he found out someone in China was cloning his designs and translating his user manual into Chinese was very instructive.

Some of the community members were shocked at this “blatant piracy” and asked Chris what he was going to do about it. His answer: Nothing. Instead of pointing legal guns at the person doing this, Chris engaged him instead – human being to human being.

A member called “Hazy” said he’d been working with some Chinese hardware cloning folks, and was the person doing the translation of the documentation into Chinese. Chris complimented him on the speed with which it had been done, and asked if he’d consider bringing the translation into their official manual. He agreed, so Chris gave him edit access to the project Wiki (a shared, public document editing space), and set things up so that people could switch over from English to the parallel Chinese translation if preferred.

Hazy proceeded to integrate the Chinese version of the manual seamlessly. Then he started correcting errors in the English version too. Chris could see all the commits flowing by and approved them all: they were smart, correct and written in perfect English. Then it got interesting.

Hazy started fixing bugs in the drone’s software code. At first, Chris thought he’d published documentation changes in the wrong folder; but when he checked, it was code – and the fix was not only correct, but properly documented. Chris thanked him for the fix, and thought little more about it.

But then the code commits kept coming. Hazy was working his way through the issues list, picking off bugs one after another that the Development team had been too busy to handle themselves. Today, Chris considers him to be one of their best Dev team members.

He turned out to be a PhD student at Peking University who, as a kid, was fascinated by radio control models and always wanted his own RC plane. When he could afford one, he and his friends learnt about Chris’s work, but found it inconvenient as it was all documented in English. So he translated it so Chinese fans could also build on the work. He signs off saying “Thank you for the great work of DIY Drones (Chris Anderson’s company), and I hope it will help more people make their dreams come true”.

The DIY Drones industry has come on leaps and bounds since. I notice many of the units you can buy ready-assembled (like this Parrot one) can be operated via WiFi using an iPad, which can show the view from the onboard camera as it flies. More advanced models can, if they lose communication with the user – or are running low on fuel or charge – return automatically to the location they originally took off from.

That said, the strategy that Chris followed was “Open Source” done properly. Open things up, let everyone learn from the work, and then stand on the shoulders of giants.

The rise & rise of A1 (internet fuelled) Journalism

Newspaper Industry RIP

There’s been a bit of to and fro about the future of Newspapers and Journalism in the last week, where the bundling of advertising and editorial content is being disaggregated by Internet dynamics. Readership of newspapers is increasingly a preserve of the old. Like many other folks I know, we increasingly derive a lot of our inbound content from online newsletters, blogs, podcasts and social media feeds – usually in much smaller chunks than we’d find in the mainstream media of old.

Ben Thompson (@monkbent) wrote a great series of pieces on Journalist “winner takes all” dynamics, where people tend to hook primarily onto personalities or journalists they respect:

I think he’s absolutely correct, but the gotcha is that they all publish in different places and among different colleagues, so it’s difficult (or at the very least time consuming) for a lot of us to pick them out systematically. A few examples of the ones I think are brilliant are folks like:

  • John Lanchester – usually on the London Review of Books and talking about the state of the UK economy (“Let’s Call it Failure”), the behaviour of our post-crash Banking Industry (“Let’s consider Kate”), and about the PPI scandal (“Are we having fun yet?”)
  • Douglas Adams – now RIP – on how people always resist new things as they age or where things work differently to what they’re used to – in “Stop worrying and Learn to Love the Internet”
  • Tim Harford – mainly in books, but also this corker of an article about “Big Data: are we making a big mistake”. There is a hidden elephant in the room, given “Big Data” is one of the keystone fads driving equipment sales in the IT Industry right now. Most companies have a Timely Data Presentation problem in most scenarios I’ve seen; there’s only so much you can derive from Twitter Sentiment Analysis (which typically only derives stats from a single-digit percentage of your customer/prospect base), or from working out how to throw log file data at a Hadoop cluster (where Splunk can do a “good enough” job already).
  • The occasional random article on Medium, such as one likely to prove emotive given the usual cries of the UK press: “How we were fooled into thinking that sexual predators lurk everywhere” – suggesting that creating a moral panic about social media didn’t protect teens, it left them vulnerable. There are many other very readable articles on there every week, across a whole spectrum of subjects.
  • The Monday Note (www.mondaynote.com), edited by Frederic Filloux and Jean-Louis Gassee (JLG used to be CTO of Apple). The neat thing here is that Jean-Louis Gassee never shirks from putting some numbers up on the wall before framing his opinions – a characteristic common to many senior managers I’ve had the privilege to work for.
  • There’s a variety of other newsletter sources I feed from, but subject for another day!

The common thread appears to be that each of them can speak authoritatively, backed by statistically valid proof points, rather than making fast trips to the areas of Maslow’s Hierarchy that are unduly influenced by fear alone. I know from reading Dan Ariely’s Predictably Irrational: The Hidden Forces that Shape Our Decisions that folks will, to a greater or lesser extent, listen to what they want to hear, but I nevertheless value opinions with some statistically valid meat behind them.

There was another piece by Ken McCarthy (@kenmccarthy), shovelling doubt on the existence of Journalism as a historical trade – suggesting it arose more as a side effect of needing to keep printing presses occupied – here. He cites:

Frank Luther Mott, who won the 1939 Pulitzer Prize for “A History of American Magazines”, described the content of the newspapers from this era thusly:

  1. Scare headlines in huge print, often of minor news
  2. Lavish use of pictures, or imaginary drawings
  3. Use of faked interviews, misleading headlines, pseudoscience, and a parade of false learning from so-called experts
  4. Emphasis on full-color Sunday supplements, usually with comic strips
  5. Dramatic sympathy with the “underdog” against the system

Besides the fact that this sounds an awful lot like TV news today, where in this listing of the characteristics of turn-of-the-last-century newspapers is there any mention of journalism? There isn’t, because there wasn’t any.

I’d probably add a sixth: a platform to push a political agenda at the more gullible souls in the population – most of whom are opinionated, loud and/or old, or all three – and who tend not to spend time fact checking. This is also the section of the population that still buys printed newspapers and turns out on election day to vote in large numbers, an ever-ageing phenomenon. Very susceptible to “Don’t let facts get in the way of a good story”, unlike the younger audience that relies instead on a more varied news feed from the Internet at large.

We were treated to a classic example last year. The Sun reported news of the latest “Eastern European Benefits Scrounger”, milking the UK economy for all it’s worth while those who’ve worked hard for years suffer. The responsible government minister, Iain Duncan Smith, weighed in with a paragraph expressing how appalled he was at the injustice. This was followed by over 800 replies, the tone of which (post moderation) was heavily “Nationalistic”:

The Sun - Headline "You're a soft touch"

So, who is this single Mum from the Baltics? She was, in fact, a Russian model hired for the role:

Natalia - Russian Model for Hire

Meanwhile, all comments on the associated forums pointing out the paper’s hypocrisy, or filling in the missing facts, got conveniently deleted. Got to stir things up to sell the papers, and to provide a commentary that victimises a large swathe of the population while greater wrongs elsewhere are shovelled under the carpet.

At some point, the main UK newspaper titles, owned as they are by six organisations, will ebb away into obscurity as their readership progressively dies off.

I sincerely hope we can find some way of monetising good quality journalists – skilled in fact finding, in conveying meaningful statistics and in telling it like it is without side – and then give them the reach and exposure to fill the void. A little difficult, but eminently possible in a world where you don’t have to fill a fixed number of pages, or minutes of TV news, with superfluous “filler”.

A consolidated result, tuned to your interest areas (personal, local, national and beyond) would probably be the greatest gift to the UK population at large. I wonder if Facebook will be the first to get there.