AWS Summit 2014, London. Impressed.

Amazon Web Services Logo

Having been to the Google equivalent a few weeks ago, I went to the 2014 AWS Summit in London today. Around 2,000 of us managed to steer around the RMT tube strike, and overall I was very impressed.

AWS have a “Windows Desktop as a Service” offering arriving real soon now, giving you a Windows Server 2008 R2 based desktop plus client software (for Windows, Mac, iOS and Android) for circa $30/user/month. That increases to $50–$70/user/month with Windows and Office in place. I can see major opportunities for them at that pricing, not least as they appear to have solved the issues around driving high performance graphics remotely, and have things like keypads available on the tablet implementations of the client. You can side load apps into the mix either directly or using Active Directory.

So, I will shortly have the ability to run up a PC and do a 30 day trial of the current Windows-only Tableau Desktop Professional for around £20 – so I can at last finish off the storytelling end piece of my 12 year long weight/nutrition analysis, without having to buy a Windows PC. I just need to be able to throw trend lines onto a few different filtered scatter plots now (something I couldn’t do with Google Fusion Tables).

There are also several traditional Licensed Software providers offering server implementations of their products as instances you only pay for when active, and with no long term commitments. Jaspersoft and Tableau Server are two such examples (there are many more). Amazon are also offering assistance to other software providers to bring more products to market on this basis, including helping to drive free 30 day trials.

Much else to be very impressed by, and the differences between AWS, Digital Ocean and Google Cloud Platform are fairly stand-out – to me at least. I think I’d know what I’d do to fire up Enterprise volumes on either AWS or Google, but the things I’d do are very different based on what I’ve now learnt.

The busiest stand appeared to be that of Splunk, who were one of my 3-4 “bets for the future” when I was at Computacenter. Talking to them, it looks like IT Security is now their biggest application area, followed by e-commerce infrastructure flows, and lastly by their traditional log file (and associated performance) analysis business. The product now appears to have plugins for virtually every data centre, storage and network device vendor’s log files, and relationships in place with all the key large brand vendors – and of course links into AWS infrastructure as well now.

I didn’t win a Kindle HDX, or the iPhone 5S Telecity were raffling, nor either of the two drones. But I learnt a lot, and will be applying the learnings over the next few weeks.

Help available to keep malicious users away from your good work

Picture of a Stack of Tins of Spam Meat

One thing that still routinely shocks me is the sheer quantity of malicious activity that goes on behind the scenes of any web site I’ve put up. When we were building Internet Vulnerability Testing Services at BT, around 7 new exploits or attack vectors were emerging every 24 hours. Fortunately, for those of us who use Open Source software, the protections have usually been inherent in the good design of the code, and most exploits (OpenSSL Heartbleed excepted) have had no real impact with good planning. It all starts with closing off ports, and restricting access to some key ones to known fixed IP addresses only (that’s the first thing I did when I first provisioned our servers in Digital Ocean Amsterdam – I’m just surprised they don’t give you a template to work from; fortunately I keep my own default rules to apply immediately).
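For illustration, that kind of default-deny ruleset can be sketched in a few lines. The administrative IP address below is a placeholder from the TEST-NET documentation range, not a real one, and the rule set is a minimal example rather than my actual configuration.

```python
# Sketch of a default-deny firewall bootstrap: drop inbound traffic by
# default, then allow loopback, established connections, web ports, and
# SSH only from one fixed administrative address.

ADMIN_IP = "203.0.113.10"  # hypothetical fixed home IP (TEST-NET range)

def default_rules(admin_ip):
    """Build the iptables commands for a minimal default ruleset."""
    return [
        "iptables -P INPUT DROP",
        "iptables -A INPUT -i lo -j ACCEPT",
        "iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT",
        "iptables -A INPUT -p tcp --dport 80 -j ACCEPT",
        "iptables -A INPUT -p tcp --dport 443 -j ACCEPT",
        f"iptables -A INPUT -p tcp --dport 22 -s {admin_ip} -j ACCEPT",
    ]

if __name__ == "__main__":
    for rule in default_rules(ADMIN_IP):
        print(rule)
```

Keeping the rules in a script like this means a fresh droplet can be locked down within minutes of first boot.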

With WordPress, it’s required an investment in a number of plugins to stem the tide. Basic ones like Comment Control, which can lock down pages, posts, images and attachments from having comments added to them (by default, a spammer’s paradise). Where you do allow comments, you install the WordPress provided Akismet, which classifies at least 99% of the SPAM attempts and sticks them in the spam folder straight away. For my part, I choose to moderate any comment from someone I’ve not approved content from before, and am totally ruthless with any attempt at social engineering; if someone posts successfully with approval a couple of times, their later comment spam with unwanted links gets onto the web site immediately, until I notice and take it down. I prefer to never let them get to that stage in the first place.

I’ve been setting up a web site in our network for my daughter-in-law to allow her to blog about Mental Health issues for Children, including ADHD, Aspergers and related conditions. For that, I installed BuddyPress to give her user community a discussion forum, and went to bed knowing I hadn’t even put her domain name up – it was just another set of deep links into my WordPress network at the time.

By the morning: 4 user registrations, 3 of them with spoof addresses. Duly removed, and the ability to register usernames turned off completely while I fix things. I’m going on to install WP-FB-Connect to allow Facebook users to work on the site with their Facebook login credentials, and to install WangGuard to stop the “Splogger” bots. That is free for the volume of usage we expect (and the commercial dimensions of the site – namely non-profit and charitable), and appears to do a great job sharing data on who and where these attempts come from. I just have to check that turning these on doesn’t throw up a login request if users touch any of the other sites in the WordPress network we run on our servers, whose user communities don’t need to log on at any time, at all.

Unfortunately, progress was rather slowed down over the weekend by a reviewer from Kenya who published a list of the 10 best add-ins to BuddyPress, #1 of which was a Social Network login product that could authenticate with Facebook or Twitter. Lots of “Great Article, thanks” replies. In reality, it didn’t work with BuddyPress at all! I duly posted back to warn others – if indeed he lets that news through to his readers.

As it is, a lot of WordPress Plugins (there are circa 157 to do social site authentication alone) are of variable quality. I tend to judge them by the number of support requests resolved quickly in the previous few weeks – one nice feature of the plugin listings provided. I also have formal support contracts in place with Cyberchimps (for some of their themes) and with WPMU Dev (for some of their excellent Multisite add-ons).

That aside, we now have the network running with all the right tools, and things seem to be working reliably. I’ve just added all the page hooks for Google Analytics and Bing Webmaster Tools to feed from, and all is okay at this stage. The only thing I’d like to invest in is something to watch all the various log files on the server and to give me notifications if anything awry is happening (like MySQL claiming an inability to connect to the WordPress database, or Apache spawning multiple instances and running out of memory – something I saw in the early days when the Google bot was touching specific web pages, since fixed).
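A minimal sketch of the log watcher I have in mind might look like this; the trouble signatures below are illustrative examples of the two failure modes mentioned above, not an exhaustive list.

```python
# Sketch of a simple log watcher: scan recent log lines for known trouble
# signatures and collect alerts. The patterns here are examples only.

TROUBLE_SIGNS = {
    "Error establishing a database connection": "MySQL unreachable from WordPress",
    "Cannot allocate memory": "Apache likely spawning too many children",
}

def scan_log_lines(lines):
    """Return (line_number, diagnosis) for each line matching a known signature."""
    alerts = []
    for n, line in enumerate(lines, start=1):
        for pattern, diagnosis in TROUBLE_SIGNS.items():
            if pattern in line:
                alerts.append((n, diagnosis))
    return alerts

sample = [
    "127.0.0.1 - - GET /index.php 200",
    "[error] Error establishing a database connection",
]
print(scan_log_lines(sample))  # one MySQL alert, at line 2
```

Run from cron against the tail of each log, with alerts pushed out by email, something like this would cover the cases above without a heavyweight monitoring stack.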

Just a shame that there are still so many malicious link spammers out there; they waste 30 minutes of my day, every day, just clearing their useless gunk out. But thank god that Google are now penalising these very effectively; long may that continue, and hopefully the realisation of the error of their ways will lead to them becoming more useful members of the worldwide community going forward.

What do IT Vendors/Distributors/Resellers want?

What do you want? Poster

Off the top of my head, what are the expectations of the various folks along the flow from vendor to end user of a typical IT Product or Service? I’m sure I’ve probably missed some nuances – if so, what is missing?

Vendors

  • Provide Product and/or Services for Resale
  • Accountable for Demand Creation
  • Minimise costs at scale by compensating channels for:
    • Customer Sales Coverage and Regular Engagement of each
    • Deal Pipeline, and associated activity to increase:
      • Number of Customers
      • Range of Vendor Products/Services Sold
      • Customer Purchase Frequency
      • Product/Service Mix in line with Vendor objectives
    • Investment in skills in Vendor Products/Services
    • Associated Technical/Industry Skills useful to close vendor sales
    • Activity to ensure continued Customer Success and Service Renewals
    • Engagement in Multivendor components to round out offering
  • Establish clear objectives for Direct/Channel engagements
    • Direct Sales have place in Demand Creation, esp emerging technologies
    • Direct Sales working with Channel Partner Resources heavily encouraged
    • Direct Sales Fulfilment a no-no unless clear guidelines upfront, well understood by all
    • Avoid unnecessary channel conflict; actively discourage sharing results of reseller end user engagement history unless presence/relationship/history of third party reseller with end user decision makers (not just purchasing!) is compelling and equitable

Distributors

  • Map vendor single contracts/support terms to thousands of downstream resellers
  • Ensure the spirit and letter of Vendor trading/marketing terms are delivered downstream
  • Break Bulk (physical logistics, purchase, storage, delivery, rotation, returns)
  • Offer Credit to resellers (mindful that typically <25% of trading debt is insurable)
  • Centralised Configuration, Quotation and associated Tech Support used by resellers
  • Interface into Vendor Deal Registration Process, assist vendor forecasting
  • Assistance to vendor in provision of Accreditation Training

Resellers

  • Have Fun, Deliver Good Value to Customers, Make Predictable Profit, Survive
  • Financial Return for time invested in Customer Relationships, Staff knowledge, Skills Accreditations, own Services and institutional/process knowledge
  • Trading terms in place with vendor(s) represented and/or distributor(s) of same
  • Manage own Deal Pipeline, and associated activity to increase one or more of:
    • Number of Customers
    • Range of Vendor Products/Services Sold
    • Customer Purchase Frequency
    • Product/Service Mix in line with Vendor objectives
    • Margins
  • Assistance as needed from Vendor and/or Distributor staff
  • No financial surprises

So, what have I missed?

I do remember, in my relative youth, that as a vendor we used to work out what our own staffing needs were based on the amount of B2B revenue we wanted to achieve in each of catalogue/web sales, direct sales, VARs and through IT Distribution. If you plug in the revenue needs at the top, it gives the number of sales staff needed, then the number of support resources for every n folks at the layer before – and then the total advertising/promotion needed in each channel. It looked exactly like this:

1991 Channel Mix Ready Reckoner

Looking back at this and comparing to today, the whole IT Industry has gotten radically more efficient as time has gone by. That said, a good ready reckoner is to map in the structure/numbers of whoever you feel are the industry leader(s) in your market today, do an analogue of the channel mix they use, and see how that pans out. It will give you a basis from which to assess the sizes and productivity of your own resources – as a vendor at least!
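The core arithmetic of that reckoner can be sketched as follows; the productivity figures below are hypothetical stand-ins, not the 1991 originals.

```python
# Sketch of a channel-mix ready reckoner: plug revenue targets in at the
# top, get sales headcount out, then support resources per n salespeople.
import math

def staffing(revenue_target, revenue_per_salesperson, sales_per_support):
    """Headcount needed in one channel: enough sales staff to hit the
    revenue target, plus one support resource per n salespeople."""
    sales = math.ceil(revenue_target / revenue_per_salesperson)
    support = math.ceil(sales / sales_per_support)
    return {"sales": sales, "support": support}

# Hypothetical plan: £20m via direct sales at £2m per head,
# with one support resource per 4 salespeople.
print(staffing(20_000_000, 2_000_000, 4))  # {'sales': 10, 'support': 3}
```

Running the same function per channel (catalogue/web, direct, VARs, distribution) and summing gives the total structure, to which the advertising/promotion budget per channel is then attached.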

Avoiding the strangling of your best future prospects

Escape Velocity Book Cover

I’m a big fan of the work of Geoffrey Moore, whose seminal work “Crossing the Chasm” I’ve cited before (in fact, the one page version is the #1 download from this blog). However, one of his other books is excellent if you’re faced with a very common issue in High Technology companies: having successful, large product lines that suck all the life out of new, emerging businesses in the same enterprise. The book is “Escape Velocity”:

Unlike Crossing the Chasm, i’ve not yet summarised it on one sheet of A4, but have outlined the major steps on 14 slides. It sort of works like this:

The main revenue/profit engines in most organisations occur between the early and late majority consumers of the product or services; that can last a long time, denoted by the Elastic Middle:

Product Lifecycle

That said, there are normally products that sales will focus on to drive the current year’s Revenue and Profit targets; these routinely consume the majority of the resources available. Given a fair crack of the whip, there are normally emergent products that, while not material in size today, are showing good signs of growth, and which may generate significant revenue and profits in a 1 to 3 year future. There are also likely to be some longer term punts which have yet to show promise, but which may do so in a 3 to 6 year timeframe:

3 Horizons

The chief way to categorise products/services against the relevant Product Horizon is to graph a scatter plot of revenue or profit for each line on one axis, against growth on the other (10% growth is a typical divider between the High and Low growth Quadrants):

3 Horizons to Category Power
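A rough sketch of that categorisation, assuming an illustrative $1m materiality threshold alongside the 10% growth divider (both thresholds are placeholders you would tune to your own business):

```python
# Sketch of the revenue-vs-growth quadrant categorisation: classify each
# product line into a Horizon by today's size and its growth rate.

def horizon(revenue, growth, material_revenue=1_000_000, growth_divider=0.10):
    """Classify a product line by current revenue and growth rate."""
    if revenue >= material_revenue and growth >= growth_divider:
        return "Horizon 1"   # big and growing: this year's engine
    if revenue < material_revenue and growth >= growth_divider:
        return "Horizon 2"   # small but fast-growing: the 1-3 year bets
    if revenue < material_revenue:
        return "Horizon 3"   # longer term punts, yet to show promise
    return "Horizon 0"       # big but stalled: optimise for cash

print(horizon(200_000, 0.40))  # Horizon 2 - a small, fast-growing line
```

Plotting every line through a function like this is a quick way to see whether your resourcing matches where the future revenue actually sits.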

Any products or services on Horizon 0 need to be shielded from core resources and optimised to be cash generative while they last. The other product/service horizons are segregated, and typically have a different go-to-market team (with appropriate Key Performance Indicators) assigned to each:

Focus Areas

The development pattern for Horizon 2 products is typical of the transition from the “Chasm” into the “Tornado” stage on the normal Chasm lifecycle diagram. It’s a relentless learning experience, ruthlessly designing out custom services to form a standard offering for the market segments you target:

Free Resources to Context

As you execute through the various sales teams and move between financial years, there’s a lot of introspection to ensure that the focus on likely winners continues to be appropriately ruthless:

Action

The sales teams driving Horizon 2 offerings should be seeking to aim high in customer organisations, and drive strategies to establish a beachhead in, then dominate, specific focus segments. In doing so, be mindful that a small supporting community tends to cross reference each other. Good salespeople get to know the people networks that do so, and work diligently to connect across them with their colleagues.

Trusted Advisor

The positioning of your Horizon 2 offerings tends to vary depending on price and benefit; this in turn looks much like the findings from another seminal work, “The Discipline of Market Leaders”. That book suggested that really successful companies put their relentless effort into only one of three possible core competences: to be the Product Innovator, to be Customer Intimate, or to be Operationally Excellent:

Benefit Sensitivity

Once you have the positioning, the Horizon 2 sales team relentlessly focus on the key people or organisations that make up their target market segment(s):

Drive to Share of Segment

The number of organisations they engage differ markedly between Enterprise (Complex) and Consumer (Volume) markets:

Target Customers

So the engagement checklist needs to address all these areas:

Target Market Initiatives

The sales team need to be able to articulate “What makes their offer different”:

Differentials

Then pick their targets:

Growing Horizon 2

Above all, be conscious who your competitors are and where you’re positioned against them:

From Whom

That’s largely it. Just a process to keep assessing the source of future revenue and profits, ensuring you segment your sales teams to drive both this year’s business and, separately, the green shoots that will provide your future. And avoiding what often happens: the existing high revenue or high profit lines demanding so many resources that they suffocate your future.

You can probably name a few companies that have done exactly that. Yours doesn’t need to be the next one now!

The Jelly Effect and the importance of focus on AFTERS

Jelly Effect by Andy Bounds

I have a few books in my bookcase that I keep for a specific reason. Normally that they are succinct enough to say the obvious things that most people miss. One of these is The Jelly Effect: How to Make Your Communication Stick by Andy Bounds.

His insight is that most people want problem solvers, not technicians. They typically don’t care two hoots about the history of your company, or all the detailed features of your products or services. What they do care about is what will have changed for them AFTER your assignment or project with them has been completed. Focussing on that is normally sufficient to be succinct, to the point, and framed around delivering the goals that the customer feels are important – all without throwing large volumes of superfluous information at your prospect along the way. Summarised:

“Customers don’t care what you do. They only care what they’re left with AFTER you’ve done it”.

The end results of taking the deeper advice in the book include:

  • One bank, who won business from 18 pitches out of 18 after having implemented AFTERs
  • Another bank increased weekly sales by 47% based on focus on AFTERs
  • A PR and Marketing Company that have won every single sales pitch they have made since, after having previously won far fewer sales than their available skills deserved
  • The author suggests it’s worked for every single company he has worked with, from multinational blue-chips, to small local businesses, to charities looking to win National accounts, to family run businesses.

He describes the process outlined in the book in a short 5 minute video here.

I was once asked to write out the 10 reasons why customers should buy Software from my then Company – Computacenter, widely considered to be the largest IT reseller in Europe. Using the principles of “The Jelly Effect”, I duly wrote them out for use by our Marketing Team (they could choose which one of the 11 reasons to drop):

10 Reasons to buy Software from Computacenter

  1. Reducing your Costs. We compensate our folks on good advice, not sales or profits. We would rather show you ways to lower or eliminate your software spend than stuff you to the gills with products or services that you don’t need. Putting a commission-hungry software salesperson in front of you rarely delivers cost savings in a tough economic environment; we think being synonymous with “help” is a better long term business strategy.
  2. Improving Service Levels. Your Internal Account Manager or Sales Support contact is the central hub through which we bring all our software skills to bear to help you. We expect to be able to answer any question, on any software or licensing related query, within four working hours.
  3. Access to Skills. Computacenter staff have top flight accreditation levels with almost all of the key infrastructure and PC software vendors, and a track record of doing the right thing, first time, to deliver their customers’ business objectives cost effectively and without surprises. Whether it’s the latest Microsoft technologies, virtualising your data centre, securing your network/data or weighing up the possible deployment of Open Source software, we have impartial experts available to assist.
  4. Freeing up your time. Computacenter has trading agreements in place with over 1,150 different software vendors and their local distribution channels, almost all signed up to advantageous commercial terms we make available to you. We can find most software quickly and buy it for you immediately on very cost effective commercial terms, and with minimal administration. Chances are we’re buying the same thing for many of your industry peers already.
  5. Reducing Invoice Volumes and associated costs. We’re happy to consolidate your spend so you receive just one invoice per month from us across all your hardware, software and services purchases from Computacenter. We often hear of costs-to-handle of £50 per invoice, on top of the time your staff take to process each one. Let us help you save both the money and the time.
  6. Renewals without surprises. We can give you full visibility of your software renewals, enabling more effective budgeting, timely end user notifications, simpler co-termed plus consolidated contracts, and lower support costs. Scheduled reporting makes late penalty fees and interrupted support a thing of the past. Reduced management burden, and more time to focus on your key management challenges.
  7. Self Service without maverick buying. We work with IT and Purchasing Managers to make only their approved software products, at their most cost effective licensing levels, available using our CC Connect online purchasing service. This can often halve the spend that users would otherwise spend themselves on retail boxed versions.
  8. Purchase Power. Computacenter customers together account for the largest spend of any reseller on almost all of the major Software vendors we trade with. In the final analysis, you get the best prices and access to the best vendor, distributor and Computacenter skills to help achieve your business objectives.
  9. Spend Reporting. Knowing what license assets you have is the first step to ensuring you’re not inadvertently duplicating purchases; we’ve been known to deliver 23%+ savings on new software spend by giving IT Managers the ability to “farm” their existing license assets when staff leave or systems evolve in different parts of their organisation. Reporting on your historical purchase volumes via Computacenter is available without charge.
  10. Managed Procurement. We’re fairly adept at managing relationships for new and renewal purchases across 80-120 different software vendors on behalf of IT and Purchasing staff. If you’d like to delegate that to us, we’d be delighted to assist.
  11. Services. If you’ve not got time to work out what you’ve purchased down the years, and wish to consolidate this into a single “bank statement” of what your current and upgrade entitlements are, we can do this for you for a nominal cost (we use our own internal tools to do this fast and accurately for the major PC software vendors, independent of the mix of routes you used to procure your software assets). When times are tough, many vendors think “time to audit our software users”; your knowledge is your power, and even if you find there is some degree of non-compliance, we work to minimise the financial exposure and protect your reputation. We’ve been known to strip 75% off a vendors proposed multi million pound compliance bill using our licensing experts and some thorough research.

So can we help you?

I think that summarised things pretty well (my boss thought so too). Not least as the company was surrounded at the time by competitors that had a tendency to put software sales foxes straight into customer chicken coops. We always deliberately maintained what media outlets consider a divide between advertising and editorial, or between church and state; we physically kept consultants measured on customer satisfaction and not on sales revenue. Computacenter are still pretty unique in that regard.

They still do that to this day, a long time after my involvement there as the Director of Merchandising and Operations of the Software Business Unit finished.

I don’t think Andy Bounds has overhyped his own book at all. Its lessons still work impeccably to this day.

 

Public Clouds, Google Cloud moves and Pricing

Google Cloud Platform Logo

I went to Google’s Cloud Platform Roadshow in London today, nominally to feed my need to rationalise the range of their Cloud offerings. This was primarily for my potential future use of their infrastructure, and to learn what I could of any nuances present. Every provider has them, and I really want to do a good job of simplifying the presentation for my own sales materials – but not to oversimplify and make the advice unusable.

Technically overall, very, very, very impressive.

That said, I’m still in three minds about the way the public cloud vendors price their capacity. Google have gone to great lengths – they assure us – to simplify their pricing structure against industry norms. They cited industry prices coming down by 6-8% per year, while the underlying hardware follows Moore’s Law much more closely – at 20-30% per annum lower.

With that, Google announced a whole raft of price decreases of between 35-85%, accompanied by simplifications and a commitment to:

  • No upfront payments
  • No Lock-in or Contracts
  • No Complexity

I think it’s notable that as soon as Google went public with that a few weeks back, they were promptly followed by Amazon Web Services, and more recently by Microsoft with their Azure platform. The outside picture is that they are all in a race, nip and tuck – well, all chasing the volume that is Amazon, but trying to attack from underneath, a usual industry playbook.

One graph came up showing that when a single virtual instance is fired up, it costs around 7c per hour if used for up to 25% of the month – after which the cost straight-lines down. If that instance was up all month, a discount of around 30% would apply. That suggests a monthly cost of circa $36.
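Sanity-checking that arithmetic, using the figures quoted in the talk:

```python
# Rough check of the sustained-use arithmetic: 7c/hour, an average of
# ~730 hours in a month, and a ~30% discount for running all month.

HOURS_PER_MONTH = 730          # average month (8760 hours / 12)
RATE_PER_HOUR = 0.07           # dollars per hour, from the talk
SUSTAINED_DISCOUNT = 0.30      # full-month discount cited

full_month = HOURS_PER_MONTH * RATE_PER_HOUR * (1 - SUSTAINED_DISCOUNT)
print(round(full_month, 2))  # ~35.77, i.e. the circa $36 quoted
```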

Meanwhile, the Virtual Instance (aka Droplet) running Ubuntu Linux and my WordPress Network on Digital Ocean, with 30GB flash storage and 3TB/month of network bandwidth, currently comes out (with weekly backups) at a fixed $12 for me. One third of the apparent Google price.

I’m not going to suggest they are in any way comparable. The Digital Ocean droplet was pretty naked when I ran it up for the first time. I had to secure it very quickly (setting up custom iptables rules to close off the common ports, and ensuring secure shell only worked from my home fixed IP address) and spend quite a time configuring WordPress and the associated email infrastructure. But now it’s up, it’s there, and the monthly cost is very predictable. I update it regularly and remove comment spam daily (ably assisted by a WordPress add-in). The whole shebang certainly doesn’t have the growth potential that Google’s offerings give me out of the box, but like many developers, I find it good enough for its intended purpose.

I wonder if Google, AWS, Microsoft and folks like Rackspace buy Netcraft’s excellent monthly hosting provider switching analysis. They all appear to be ignoring Digital Ocean (and certainly don’t appear to be watching their churn rates to the extent most subscription based businesses watch like a hawk), while that company is outgrowing everyone in the industry at the moment. They are the one place absorbing developers, and taking thousands of existing customers away from all the large providers. In doing so, they’ve recently landed a funding round from VC Andreessen Horowitz (aka “A16Z” in the industry) to continue to push that growth. Their key audience, Linux developers, are the seeds from which many valuable companies and services of tomorrow will likely emerge.

I suspect there is still plenty of time for the larger providers to learn from their simplicity – of both pricing, and of the way in which pre-configured containers of common Linux-based software stacks (WordPress, Node.js, LAMP, email stacks, etc) can be deployed quickly and inexpensively. If, indeed, they see Digital Ocean as a visible threat yet.

In the meantime, I’m trying to build a simple piece of work that articulates how each of the key Public Cloud vendors’ services is structured, from the point of view of the time-pressured, overly busy IT Manager (the same as I did for the DECdirect Software catalogue way back when). I’m scheduled to have a review of AWS at the end of April to this end. A simple few spreads of comparative collateral appears to be the missing reference piece in the Industry to date.

Great Technology. Where’s the Business Value?

Exponential Growth Bar Graph

It’s a familiar story. Some impressive technical development comes up, and the IT industry adopts what politicians would call a “narrative” to try to push its adoption – and profit. Two that are in the early stages right now are “Wearables” and the “Internet of Things”. I’m already seeing some outlandish market size estimates for both, and wondering how these map back to useful applications that people will pay for.

The “Internet of Things” is predicated on the assumption that, with low cost sensors and internet connected microcomputers embedded in the world around us, the volume of data thrown onto the Internet will create a ready market needing to consume large gobs of hardware, software and services. One approach to rationalising this is to spot where inefficiencies exist in a value chain, and to see where technology will help remove them.

One of my son’s friends runs a company that has been distributing sensors of all sorts for over 10 years. Thinking there may be an opportunity to build a business on top of a network of these things, I asked him what sort of applications his products were put to. It appears to come down to monitoring networks of flows in various utilities and environmental assets (water, gas, rivers, traffic) or in industrial process manufacturing. Add some applications of low power Bluetooth beacons, and you have some human traffic monitoring in retail environments. I soon ran out of ideas for potential inefficiencies that these (a) can address and (b) aren’t already being routinely exploited. One example is in water networks, where measuring fluid flows across a pipe network can help quickly isolate leaks, markedly improving supply efficiency. But there are already companies in place that do that, and they have the requisite relationships. No gap apparent there.

One post on Gigaom showed some interesting new flexible electronic materials this week. The gotcha with most such materials is the need for batteries, the presence of which restricts the number of potential applications. One set of switches from Swiss company Algra can emit a 2.4GHz radio signal over 6-10 metres using only the energy from someone depressing a button; the main extra innovations are that the result is very thin and has (unlike its predecessors) an extremely long mechanical lifetime. No outside power source required. So, just glue your door bells or light switches where you need them, and voila – done forever.

The other material that caught my eye was a flexible image sensor from ISORG (using Plastic Logic licensed technology). They have managed to make a material that you can layer on the surface of a display, and which can read the surface of any object placed against it. No camera needed, and with minimal thickness and weight. That is impossible with a standard CMOS imaging scanner, because it needs a minimum distance to focus on the object above it. So, you could effectively have an inbuilt scanner on the surface of your tablet, not only for monochrome pictures, but even fingerprints and objects in close proximity – for contactless gesture control. Hmmm – smart scanning shelves in retail and logistics; now that may give users some vastly improved efficiencies along the way.

The source article is at: http://gigaom.com/2014/04/07/how-thin-flexible-electronics-will-revolutionize-everything-from-user-interfaces-to-packaging/

A whole field is opening up around collecting data from the On-Board Diagnostics Bus that exists in virtually every modern car, but I’ve yet to explore that in any depth so far. I’ve just noticed a trickle of news articles about Phil Windley’s FUSE project on Kickstarter (here) and some emerging work by Google in the same vein (with the Open Automotive Alliance). Albeit, like TVs, vehicle manufacturers have regulatory challenges and/or slow replacement cycles stopping them moving at the same pace as the computer and electronics industries do.

Outside of that, I'm also seeing a procession of potential wearables, from glasses, to watches, to health sensors and clip-on cameras.

Glasses and smart watches in general are another, much longer story (I'll try to do that justice tomorrow), but these are severely limited by the need for battery power in a limited space to do so much more than their main application – the simple display of time and pertinent notifications.

Health sensors are pretty well established already. I have a FitBit One on me at all times bar when I'm sleeping. However, its main use these days is to map the number of steps I take into an estimated distance walked daily, which I tap pro-rata into Weight Loss Resources (I know a walk to our nearest paper shop and back is circa 10,000 steps – and 60 minutes at moderate speed – enough to give a good estimate of calories expended). I found the calorie count questionable, and the link to MyFitnessPal a source of great frustration for my wife; it routinely swallows her calorie intake and rations out the extra calories earnt (for potential increased food consumption) very randomly over 1-3 days. We've never been able to correlate its behaviour rationally, so we largely ignore that now.
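The pro-rata arithmetic is simple enough to sketch. The calibration figures below (stride length, calories burned on the reference walk) are illustrative assumptions of mine, not FitBit's or Weight Loss Resources' actual numbers:

```python
# Rough pro-rata estimate of daily distance, time and calories from a step
# count, calibrated against one known walk. All constants are illustrative.
CAL_STEPS = 10_000      # steps for the reference walk (paper shop and back)
CAL_MINUTES = 60        # minutes of moderate-speed walking for that walk
CAL_KCAL = 300          # assumed calories burned on that walk (a guess)
STRIDE_M = 0.75         # assumed average stride length in metres

def estimate(steps: int) -> dict:
    """Scale distance, time and calories pro-rata from the reference walk."""
    factor = steps / CAL_STEPS
    return {
        "distance_km": round(steps * STRIDE_M / 1000, 2),
        "minutes": round(factor * CAL_MINUTES),
        "kcal": round(factor * CAL_KCAL),
    }

print(estimate(12_500))
```

Crude, but that's essentially all a pro-rata entry into a food diary needs.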

There's lots of industry speculation now that Apple's upcoming iWatch will provide health-related sensors, and will send readings into a Passbook-like health monitoring application on a user's iPhone handset. One such report is here. That would probably help my wife, who always appears to suffer a level of anxiety whenever her blood pressure is taken – which worsens her readings (see what happens after 22 days of getting used to taking daily readings – things settle down):

Jane Waring Blood Pressure Readings

I dare say if the reading was always on, she'd soon forget its existence and the readings would reflect reality. In the meantime, there are also suggestions that the same health monitoring application will be able to take readings from other vendors' sensors, and that Apple are trying to build an ecosystem of personal health devices that can interface to its iPhone-based "hub" – and potentially from there onto Internet-based health services. We can but wait until Apple are ready to admit it (or not!) at upcoming product announcement events this year.

The main other wearables today are cameras. I’ve seen some statistics on the effect of Police Officers wearing these in the USA:

US Police Officer with Camera

One of my youngest son's friends is a serving Police Officer here, and tells us that the wearing of cameras in his police force is encouraged but optional. That said, he says most officers are big fans of using them. When turned off, the cameras maintain a rolling 30-second video buffer, so when first switched on, they have a record of what happened in the 30 seconds before the switch was pressed. Similarly, when turned off, they continue filming for a further 30 seconds before returning to their looping state.
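That pre-record behaviour is essentially a fixed-length ring buffer. A minimal sketch in Python, assuming 30 frames per second and a 30-second window (illustrative figures, not the real camera firmware):

```python
from collections import deque

FPS = 30          # assumed frame rate
PRE_SECONDS = 30  # length of the rolling pre-record window

class BodyCam:
    """Toy model of a body camera's pre-record buffer: while 'off', frames
    cycle through a fixed-length ring buffer; switching on flushes the last
    30 seconds into the recording, so footage starts before the button press."""

    def __init__(self):
        self.buffer = deque(maxlen=FPS * PRE_SECONDS)  # oldest frames fall off
        self.recording = []
        self.on = False

    def capture(self, frame):
        if self.on:
            self.recording.append(frame)
        else:
            self.buffer.append(frame)  # looping state: keep only the last 30s

    def switch_on(self):
        self.recording.extend(self.buffer)  # retain what led up to the event
        self.buffer.clear()
        self.on = True
```

The `deque(maxlen=…)` does the work: appends beyond the limit silently evict the oldest frames, which is exactly the looping behaviour described above.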

Perhaps surprisingly, he says that his interactions are such that he's inclined to use less force, even though, if you saw the footage, you'd be amazed at his self-restraint. In the USA, Police report that when the people they're engaging know they're being filmed/recorded, they are far more inclined to behave themselves and not to try to spin "he said this, I said that" yarns.

There are all sorts of privacy implications if everyone starts wearing such devices, and they are getting ever smaller. Muvi cameras are one example, able to record 70-90 minutes of hi-res video from their 55mm-tall, clip-attached enclosure. Someone was recently prosecuted in Seattle for leaving one of these lens-up on a path between buildings frequented by female employees at his company campus (and no, I didn't see any footage – just news of his arrest!).

We're moving away from what we thought was going to be a big brother world – to one where the use of such cameras is "democratised" across the whole population.

Muvi Camcorder


I don't think anyone has really comprehended the full effect of this upcoming ubiquity, but I suspect a norm will be to expect the presence of a working camera to be indicated vividly. I wonder how long it will take for that to become the new normal – and if there are other business efficiencies that their use – and that of other "Internet of Things" sensors in general – can lay before us all.

That said, industry estimates for "Internet of Things" revenues, as they stand today, along with a lack of perceived "must have this" applications, feel hopelessly optimistic to me.

Focus on End Users: a flash of the bleeding obvious

Lightbulb

I've been re-reading Terry Leahy's "Management in 10 Words"; Sir Terry was the CEO of Tesco until recently. The piece in the book's introduction, relating to sitting in front of some Government officials, would be quite funny if it weren't such a blinding dose of the obvious that most IT organisations miss:

He was asked “What was it that turned Tesco from being a struggling supermarket, number three retail chain in the UK, into the third largest retailer in the World?”. He said: “It’s quite simple. We focussed on delivering for customers. We set ourselves some simple aims, and some basic values to live by. And we then created a process to achieve them, making sure that everyone knew what they were responsible for”.

Silence. Polite coughing. Someone poured out some water. More silence. “Was that it?” an official finally asked. And the answer to that was ‘yes’.

The book is a good read and one we can all learn from. Not least as many vendors in the IT and associated services industry are going in exactly the opposite direction to the one he took.

I was listening to a discussion contrasting the different business models of Google, Facebook, Microsoft and Apple a few days back. The piece I hadn't rationalised before is that, of this list, only Apple have a sole focus on the end user of their products. Google's and Facebook's current revenue streams lie in monetising purchase intents to advertisers, while trying not to dissuade end users from feeding them the attention and the activity/interest/location signals that power their business engines. Microsoft's business volumes are heavily skewed towards selling software to Enterprise IT departments, not the end users of their products.

One side effect of this is an insatiable focus on the competition rather than on the users of your products or services. In times of old, it became something of a relentless joke that no marketing plan would be complete without the customary "IBM", "HP" or "Sun" attack campaign in play. And they all did it to each other. You could ask where the users' needs made it into these efforts, but of the many I saw, I don't remember a single one featuring them at all. Every IT vendor was playing "follow the leader" (and ignoring the cliffs they might drive over while doing so), when all focus should have been on their customers instead.

The first object lesson I had was with the original IBM PC. One of the biggest assets IBM had was the late Philip "Don" Estridge, who went into the job of running IBM's first foray into selling PCs having had personal experience of running an Apple ][ personal computer at home. The rest of the industry was an outgrowth of a hobbyist movement trying to sell to businesses, yet business owners craved "sorting their business problems" simply and without unnecessary surprises. IBM's use of Charlie Chaplin ads in those early years was a masterstroke. As an example, spot the competitive knockoff in this:

There isn't one! It's a focus on the needs of the overworked small business owner, whose precious assets are time and business survival. Trading blows trying to sell one computer over another is completely missing.

I still see this everywhere. I'm a subscriber to "Seeking Alpha", which has a collection of both buy-side and sell-side analysts commentating on the shares of companies I've chosen to watch. More often than not, it's a bit like sitting in an umpire's chair during a tennis match; lots of noise, lots of to-and-fro, discussion of each move, and never far away from comparing companies against each other.

One of the most prescient things I've heard a technology CEO say came from Steve Jobs, when he told an audience in 1997 that "we have to get away from the notion that for Apple to win, Microsoft has to lose". Certainly, from the time the first iPhone shipped onwards, Apple have had a relentless focus on the end user of their products.

Enterprise IT is still driven largely by vendor-inspired fads, with little reference to end-user results (one silly data point I carry in my head: I'm still waiting to hear someone at a Big Data conference mention a compelling business impact from one of their Hadoop deployments that isn't related to log file or Twitter sentiment analysis. I've seen the same software vendor platform folks float into Big Data conferences for around 3 years now, and have not heard one yet).

One of the best courses I ever attended was given to us by Citrix, specifically on selling at CxO/board level in large organisations. A lot of it is being able to relate small snippets of things you discover around the industry (or in other industries) that may help influence their business success. One example that I unashamedly stole from Martin Clarkson was that of a new Tesco store in South Korea, which he once showed to me:

I passed this on to the team in my last company that sold to big retailers. At least four board-level teams in large UK retailers got to see that video and to agonise over whether they could replicate Tesco's work in their own local operations. And I dare say the salespeople bringing it to their attention gained a good reputation for delivering interesting ideas that might help their client organisation's future. That's a great position to be in.

With that, I've come full circle back to Tesco. Consultative selling is a good thing to do, and folks like IBM are complete masters at it; if you're ever in an IBM facility, be sure to steal one of their current "Institute for Business Value" booklets (or visit their associated group on LinkedIn). They're normally brim-full of surveys and ideas to stimulate the thought processes of the most senior users running businesses.

We'd do a better job in the IT industry if we could replicate that focus on our end users from top to bottom – and not spend time elbowing competitors instead. In the meantime, I suspect those rare places that do focus on end users will continue to reap a disproportionate share of the future business out there.

Office for the iPad; has the train already left the station?

Meeting notes by @Jargonautical

One asset I greatly admire (and crave!) is the ability to communicate simply, but with panache, speed and reasoned authority. That's one characteristic of compelling journalism, of good writing, and indeed of some of the excellent software products I've used. Not to throw in the kitchen sink, but to be succinct, widening focus only to give useful context supporting the central brass tacks.

I've now gone 15 months without using a single Microsoft product. I spend circa £3.30/month for my Google Apps for Business account, and have generally been very impressed with Google Spreadsheet and with Google Docs in there. The only temporary irritant along the way was the inability of Google Docs to put page numbers in the Table of Contents of one 30-page document I wrote, offering only HTML links to jump to the content – which, while okay for a web document, was as much use as a cow on stilts for the printed version. But it keeps improving by leaps and bounds every month. That issue is now solved, and there's a wide array of free add-ons to handle online review sign-offs, bibliographies and more.

This week, I've completed all the lessons on a neat piece of analytics software called Google Fusion Tables, produced by Google Research and available as a free Google Drive add-on. To date, it appears to do almost everything most people would use Tableau Desktop for, including map-based displays, but with a much simpler user interface. I'm throwing some more heavyweight lifting at it over the next couple of days, including a peek at its Python-accessible API – which nominally allows you to daisy-chain it in as part of an end-to-end business process. The sort of thing Microsoft had Enterprises doing with VBA customisations way back when.
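The Fusion Tables API is reachable as a simple SQL-over-HTTP endpoint, so daisy-chaining it into a process needs little more than the standard library. A minimal sketch, assuming the v1 REST endpoint; the table id and API key shown are placeholders, not real credentials:

```python
# Minimal sketch of querying Fusion Tables via its SQL-style REST API.
import json
import urllib.parse
import urllib.request

API_BASE = "https://www.googleapis.com/fusiontables/v1/query"

def build_query_url(sql: str, api_key: str) -> str:
    """Encode a SQL-like query into the GET form the API expects."""
    return API_BASE + "?" + urllib.parse.urlencode({"sql": sql, "key": api_key})

def run_query(sql: str, api_key: str) -> dict:
    """Fetch and decode the JSON result (a dict with 'columns' and 'rows')."""
    with urllib.request.urlopen(build_query_url(sql, api_key)) as resp:
        return json.load(resp)

# Usage (placeholders, requires a live table and key):
# result = run_query("SELECT * FROM YOUR_TABLE_ID LIMIT 10", "YOUR_API_KEY")
# print(result["columns"], len(result["rows"]))
```

From there, the decoded rows can be fed straight into the next step of whatever business process you're chaining together.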

My reading is also getting more focussed. I've not read a newspaper regularly for years, and dip into the Economist only once or twice every three months; instead I go to other sources online. The pattern is to sample fewer than 10 podcasts every week, some online newsletters from authoritative sources, and some pieces that appear on Medium, but otherwise to venture further afield only when something swims past in my Twitter stream.

This morning, this caught my eye, as posted by @MMaryMcKenna. Lucy Knight (@Jargonautical) had posted her notes made during a presentation Mary had made very recently. Looking at Lucy’s Twitter feed, there were some other samples of her meeting note taking:

Meeting Notes: Minimal Viable Product

Meeting Notes Cashflow Modelling in Excel

Meeting Notes: Customer Service

Aren’t they beautiful?

Lucy mentions in her recent tweets that she does these on an iPad Mini using an application called GoodNotes, which is available for the princely sum of £3.99 here (she also notes that she uses a Wacom Bamboo stylus – though a friend of hers manages with a finger alone). There's a short demo here. I suspect my attempts using the same tool, especially in the middle of a running commentary, would pale in comparison to her examples.

With that, there are reports circulating today that the new Microsoft CEO, Satya Nadella, will announce Microsoft Office for iOS this very afternoon. I doubt that any of the Office components will put out work of the quality of Lucy's iPad meeting notes anytime soon, but I'm open to being surprised.

Given we've had over three years of getting used to having no useful Microsoft product (outside of Skype) on the volume phone or tablet devices here, I wonder if that's a route back to making money on selling software again, a way of supporting Office 365 subscriptions, or a damp squib waiting to happen.

My bet's on the middle of those three, by virtue of Microsoft's base in Large Enterprise accounts, but like many, I otherwise feel it's largely academic now. The desktop software market has been fairly well bombed (by Apple and Google) into being a low-cost conduit to a services equivalent instead. The server software market will, I suspect, head the same way within 2 years.

Cutting Software Spend: a Checklist

Arrow going down

No real rocket science here, but if you've been put in a position to try to make savings on your software spend, this is the sort of checklist I'd run down. It's straight off the top of my head, so if there are nuggets you know that I've missed, please throw a comment in at the end, and I'll improve it. The list applies whether you are looking at a single organisation's spend, or are trying to reconcile the combined assets from a company merger or acquisition.

General rules:

  1. Don’t buy new when you have redundant assets already
  2. Be mindful that committing to buy in volume gives a lower unit cost than buying individually
  3. Beware of committing to spend over several years where the vendor prices the agreement assuming straight-line deployment toward your total user base by the end of the term. Assume most of the deployment will happen much faster – and that your projected spend will front-load, with large true-up costs at annual contract anniversaries.
  4. Don’t pay extra for software updates where no updates are planned in the license term
  5. Don’t pay for software you’re not using!
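Rule 3 is easiest to see with some toy numbers. A sketch, assuming an illustrative £100/user/year price and a 3,000-user target over a 3-year term (all figures are made up for the example):

```python
# Toy model of rule 3: an agreement priced against straight-line deployment,
# versus a rollout that actually lands much faster. The difference is invoiced
# as a "true-up" at each contract anniversary. All figures are illustrative.
UNIT_PRICE = 100       # annual per-user licence cost (GBP)
TERM_YEARS = 3
TARGET_USERS = 3000

def true_up_invoices(actual_users_by_year):
    """Extra cost at each anniversary when actual deployment outruns the
    straight-line schedule the agreement was priced against."""
    invoices = []
    for year, actual in enumerate(actual_users_by_year, start=1):
        assumed = TARGET_USERS * year // TERM_YEARS  # straight-line assumption
        extra = max(0, actual - assumed)             # users deployed early
        invoices.append(extra * UNIT_PRICE)
    return invoices

# Fast rollout: most users deployed in year one, so the true-ups front-load.
print(true_up_invoices([2500, 2900, 3000]))  # → [150000, 90000, 0]
```

The point: a deployment that races ahead of the pricing assumption lands its biggest unbudgeted invoice at the first anniversary, not spread evenly over the term.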

So, the checklist:

  1. If there is a recommended software list to be deployed for a new employee, be sure to engage HR with a weekly list of leavers, and ensure their license assets are returned to a central pool. Licenses in that central pool should be reallocated out of that pool before electing to go forward with any new purchase. I’ve seen one company save 23% of their total desktop software spend just by implementing this one process.
  2. Draw up a master list of all boxed software (termed “Fully Packaged Product” or “FPP”) that appears to have been historically purchased by the organisation. The associated licenses are normally invisible to the software vendor from a purchase history point of view. Two main uses: (a) it forms a list of what should or could be purchased at more favourable terms in the future using an appropriate volume licensing agreement and (b) it’s a useful defence if your CFO receives a spurious “demand for unpaid licenses” from a vendor. I’ve seen one case of a subsequent reconciliation of previous purchases result in an unsolicited £6m invoice being settled for £1.8m instead.
  3. Likewise, compile a list of the various software licenses purchased, per vendor. This is often complicated, because a single vendor's products can be purchased from multiple sources, and every vendor runs several licensing programs. You will often find purchases made for a specific project, where an organisation-wide reconciliation can take overall licensing and support prices down – but only if centralising the negotiation supports each project's goals. I have seen one such reconciliation of a vendor's licenses in one large multinational company run to 80 pages (and a huge discount to bring in an end-of-financial-year renewal), though most result in a 1-2 page reconciliation. You then have the data to explore available change options with a vendor or reseller of your choice.
  4. Ensure that the support levels purchased are appropriate for the use of the products. There is no point paying “Software Assurance” for the remainder of a 3 year term if no new version is scheduled to be released in that timeframe (most effective resellers will have visibility of these release pipelines if you can’t get them directly from the vendor). Likewise, you probably don’t need 24/7/365 support on an asset that is used casually.
  5. Finally, don’t buy support on products that you’re no longer using. While this sounds like a flash of the obvious, knowing what is and isn’t being used is often a lengthy consolidation exercise. There are a variety of companies that sell software that can reconcile server based software use, and likewise others (like Camwood) that do an excellent job in reconciling what is present, and used/unused, across a population of Windows PCs. Doing this step is usually a major undertaking and will involve some consultancy spend.
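The leaver/joiner process in item 1 can be sketched as a simple central pool. The product names and data structure below are illustrative only:

```python
# Sketch of checklist item 1: reclaim licences from leavers into a central
# pool, and satisfy new-joiner requests from that pool before buying new.
from collections import Counter

pool = Counter()  # central pool: product name -> reclaimed licence count

def process_leaver(assigned_licences):
    """Return a leaver's licences to the central pool (fed weekly from HR)."""
    pool.update(assigned_licences)

def provision_joiner(required_licences):
    """Allocate from the pool where possible; return what still needs buying."""
    to_buy = []
    for product in required_licences:
        if pool[product] > 0:
            pool[product] -= 1      # reallocated from the pool: no new spend
        else:
            to_buy.append(product)  # only these trigger a new purchase
    return to_buy
```

The whole saving comes from the ordering: purchases are only raised for what the pool cannot satisfy.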

If the level of your buying activity is large enough to be likely to attract the attention of a vendor or reseller salesperson visiting you in person, a few extra considerations:

  1. Be conscious of their business model; it differs between PC software vendors, Enterprise software vendors, and vendors predominantly selling "Software as a Service" or Open Source-based subscriptions. Likewise for the channels of distribution they employ between themselves and your organisation – including the elements of the sales process a reseller is financially incented to follow. Probably a subject for another day, but let me know if that's of any interest.
  2. Know a reseller's and vendor's fiscal quarter and, in particular, financial year-end date boundaries. The extent to which prices will flex in your favour will blossom at no other time like these. The quid pro quo is that you need to return the favour by committing your approved order to be placed before their order cut-off schedule.
  3. Beware getting locked into products with data formats exclusive to or controlled by one supplier; an escape route with your data assets (and associated processes) intact ensures you don't get held to ransom in future.
  4. Consider "Software as a Service" subscriptions wherever possible, i.e. pay in line with the user population or data sizes actually employed, and flex with any changes up or down. The price normally absolves your IT dept from having to update software releases and do backups for you, and you should get scale advantages to keep that price low. That said, (3) still applies – being able to retrieve your data assets is key to keeping pricing honest.
  5. Always be conscious of substitutable products. Nothing oils the wheels of a larger-than-expected discount from a vendor like the presence of a hated competitor. If it's Microsoft, that's Google!
  6. Benchmark. If you're trading with a reseller with many customers, they have an unparalleled view of previous deals of similar dimensions to your own – including past discounts offered, special deal allowances, and all the components needed to lower a price. At the very least, you get an assurance that you're "getting a good deal". I have seen one example of a project deferred when it became apparent that the vendor was giving a hitherto good customer a comparatively poor deal that time around.
  7. For multinational companies, explore the cost differences in different territories you buy through and use the software in. I did one exercise for a well known bank that resulted in a 30% drop in their unit costs with one specific vendor – two years running.

So, what nuggets have I missed? Comments most welcome.