Reinventing Healthcare

Comparison of US and UK healthcare costs per capita

A lot of the political effort in the UK appears to centre on the government justifying, and handing off, parts of our NHS delivery assets to private enterprises, despite the ultimate model (that of the USA healthcare industry) costing significantly more per capita. Aside from politicians lining their own pockets in the future, it would be easy to conclude that few would benefit from such changes; the moves appear both economically farcical and firmly against the public interest. I’ve not yet heard any articulation of a view that indicates otherwise. Less well discussed, though, are the changes that are coming, and where the NHS is uniquely positioned to pivot into the future.

There is significant work underway to capture the DNA of individuals, but a genome is fairly static over time. It is estimated that there are 10^9 data points per individual; there are many other data points – which change over a long timeline – that could be even more significant in helping to diagnose unwanted conditions in a timely fashion. The opportunity is to flip the industry to work almost exclusively on preventative, rather than symptom-based, healthcare.

I think I was on the right track with an interest in Microbiome testing services. The gotcha is that commercial services like uBiome, and public research like the American (and British) Gut Project, are one-shot affairs. Processing a stool, skin or other location sample takes circa 6,000 hours of CPU wall time to reconstruct the 16S rRNA gene sequences of a statistically valid population profile. I thought I could get that to a super fast turnaround using excess capacity (spot instances – spare compute power you can bid to consume when available) at one or more of the large cloud vendors, and then build a data asset that could use machine learning techniques to spot patterns in people who later get afflicted by an undesirable or life-threatening medical condition.
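To put rough numbers on the spot-instance idea: the saving and the turnaround both fall out of simple arithmetic. All prices here are illustrative assumptions, not real cloud quotes:

```python
# Rough cost sketch for one sample's 16S rRNA reconstruction on spot
# instances. Prices and the spot discount are assumptions for illustration.

CPU_HOURS_PER_SAMPLE = 6_000      # from the estimate above
ON_DEMAND_PER_CPU_HOUR = 0.05     # hypothetical on-demand $/vCPU-hour
SPOT_DISCOUNT = 0.70              # spot capacity is often ~70% cheaper

def sample_cost(cpu_hours, rate, discount=0.0):
    """Cost of a run at a given $/vCPU-hour rate, less any discount."""
    return cpu_hours * rate * (1.0 - discount)

def turnaround_hours(cpu_hours, vcpus):
    """Wall-clock time if the work parallelises cleanly across vCPUs."""
    return cpu_hours / vcpus

on_demand = sample_cost(CPU_HOURS_PER_SAMPLE, ON_DEMAND_PER_CPU_HOUR)
spot = sample_cost(CPU_HOURS_PER_SAMPLE, ON_DEMAND_PER_CPU_HOUR, SPOT_DISCOUNT)
print(f"on-demand: ${on_demand:.0f}, spot: ${spot:.0f}")
print(f"on 1,000 vCPUs: {turnaround_hours(CPU_HOURS_PER_SAMPLE, 1000):.0f} h")
```

The point of the sketch is that 6,000 CPU-hours is only expensive if run serially; spread across enough bid-priced instances, it becomes both cheap and fast.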

The primary weakness in the plan is that you can’t suss the way a train is travelling by examining a photograph taken looking down at a static railway line. You need to keep the source sample data (not just a summary) and measure at regular intervals; an incidence of salmonella can routinely knock out 30% of your Microbiome population inside 3 days before it recovers. The profile also flexes wildly based on what you eat and other physiological factors.

The other weakness is that your DNA and your Microbiome samples are not the full story. There are many other potential leading indicators of a propensity to become ill that we’re not even sampling. Which of our 10^18 different data points are significant over time, and how regularly we should be sampled, remain open questions.

Experience in the USA is that in the one environment where regular preventative check-ups of otherwise healthy individuals already take place – dentistry – the cost of service delivery has been lowered by 10% at a time when the rest of the health industry has seen 30-40% cost increases.

So, what are the measures that should be taken, how regularly and how can we keep the source data in a way that allows researchers to employ machine learning techniques to expose the patterns toward future ill-health? There was a good discussion this week on the A16Z Podcast on this very subject with Jeffrey Kaditz of Q Bio. If you have a spare 30 minutes, I thoroughly recommend a listen: https://soundcloud.com/a16z/health-data-feedback-loop-q-bio-kaditz.

That said, my own savings are such that I have to refocus my efforts elsewhere, back in the IT industry, and my Microbiome testing service Business Plan is mothballed. The technology to sample a big enough population regularly is not yet deployable in a cost-effective fashion, but it will come. When it does, the NHS will be uniquely positioned to pivot into the sampling and preventative future of healthcare unhindered.

Politicians and the NHS: the missing question


The inevitable electioneering has begun, with all the political soundbites simplified into headline spend on the NHS. That is probably the most gross injustice of all.

This is an industry lined up for the most fundamental seeds of change. Genomics, Microbiomes, ubiquitous connected sensors, and a growing realisation that the human body is already the most sophisticated of survival machines. There is also the realisation that weight and overeating are a root cause of downstream problems, with a food industry getting a free ride to pump unsuitable chemicals into the food chain without suffering financial consequences for the damage caused – especially at the “low cost” end of the dietary spectrum.

Politicians, pharma and food lobbyists are not our friends. In the final analysis, we’re all being done a disservice because those leading us are not asking the fundamental question about health service delivery, and working back from there.

That question is: “What business are we in?”.

As a starter for 10, I recommend this excellent post on Medium: here.

Hooked, health markets, but the mind is wandering… to poo and data privacy

Hooked by Nir Eyal

One of the things I learnt many years ago was that there are four fundamental levers for increasing profits in any business. You sell:

  • More Products (or Services)
  • to More People
  • More Often
  • At higher unit profit (which is higher price, lower cost, or both)

and with that, four simple Tableau graphs against a timeline could expose the business fundamentals explaining good growth, or the core reason for declining revenue. They could also expose early warning signs, where a small number of large transactions hid an evolving surprise – like the volume of buying customers trending relentlessly down while the revenue numbers appeared to be flying okay.
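The four levers map directly onto data you can chart. A minimal sketch of computing each series per period from a transaction log (the field names and sample figures are illustrative):

```python
# Compute the four profit levers per period from a transaction log, so
# each can be charted against a timeline. Field names are illustrative.
from collections import defaultdict

def lever_series(transactions):
    """transactions: iterable of (period, customer_id, units, unit_profit).
    Returns {period: (customers, transactions, units, profit)}."""
    by_period = defaultdict(lambda: [set(), 0, 0, 0.0])
    for period, customer, units, unit_profit in transactions:
        row = by_period[period]
        row[0].add(customer)           # more people
        row[1] += 1                    # more often
        row[2] += units                # more products
        row[3] += units * unit_profit  # at higher unit profit
    return {p: (len(r[0]), r[1], r[2], r[3]) for p, r in by_period.items()}

log = [
    ("2014-Q1", "alice", 10, 2.0), ("2014-Q1", "bob", 5, 2.0),
    ("2014-Q2", "alice", 40, 2.0),  # revenue up, but buyer count halved
]
series = lever_series(log)
```

Plotting each tuple element over time surfaces exactly the evolving surprise described above: in the sample log, Q2 profit more than doubles while the count of buying customers halves.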

Another dimension is that a Brand equates to trust, and that consistency and predictability of the product or service play a big part in retaining that trust.

Later on, a more controversial view I came to hold was that there are two fundamental business models for any business: that of a healer or a dealer. One sells an effective one-shot fix to a customer need, while the other survives by engineering a customer’s dependency to keep them returning.

With that, I sometimes agonise over what the future of health services delivery is. On the one hand, politicians joust verbally over funding and try to punt services over to private enterprise – in several cases to providers following the economic rent (dealer) model found in the American market, which, at face value, requires a per capita expense that no sane person would want to replicate given the efficiency we already have. On the other hand, there is a realisation that the market is subject to radical disruption, through a combination of:

  • An ever better informed, educated customer base
  • A realisation that just being overweight is a root cause of many adverse trends
  • Genomics
  • Microbiome Analysis
  • The upcoming ubiquity of sensors that can monitor all our vitals

With that, I’ve started to read “Hooked” by Nir Eyal, which is all about the psychology of engineering habit-forming products (and services). The thing in the back of my mind is how the owner (like me) of a smart watch, fitness device or glucose monitor could fundamentally remove the need to enter food intake every day – a habit I’ve maintained for 12.5 years so far.

The primary challenge is that, for most people, little newsworthy data comes out of this exercise most of the time. The habit is difficult to reinforce without useful news or actionable data. Some of the current gadget vendors try to drive use through step-count competition league tables you can share with family and friends (I’ve done this with relatives in West London, Southampton, Tucson Arizona and Melbourne Australia; that challenge finished after a week and has yet to be repeated).

My mind started to wander back to the challenge of disrupting the health market, and how a watch could form a part. Could its sensors measure my fat, protein and carb intake (which is the end result of my food diary data collection, along with weekly weight measures)? Could I build a service that would be a data asset to help disrupt health service delivery? How do I suss Microbiome changes – which normally require analysis of a stool sample?

With that, I start to think I’m analysing this the wrong way around. I remember an analysis some time back in which a researcher assessed the extent of drug (mis)use in specific neighbourhoods by monitoring the make-up of chemical flows in networks of sewers. So, rather than put sensors on people’s wrists (and only see a subset of data), is there a place for technology in sewer pipes instead? If Microbiomes and the genetic make-up of our output survive relatively intact, then sampling at strategic points of the distribution network would give us a pretty good dataset. Not least as DNA sequencing could allow the original owner (source) of output to connect back to any pearls of wisdom that could be analysed or inferred from their contributions, even if the drop-off points happened at home, work or elsewhere.

Hmmm. Water companies and Big Data.

Think I’ll park that and get on with the book.

Ian’s Brain goes all Economics on him

A couple of unconnected events in the last week. One was an article by Scott Adams of Dilbert Fame, with some observations about how Silicon Valley was really one big Psychological Experiment (see his blog post: http://dilbert.com/blog/entry/the_pivot/).

It’s a further extension of a comment I once read by Max Schireson, CEO of MongoDB, reflecting on how Salespeople’s compensation works – very much like paying in lottery tickets: http://maxschireson.com/2013/02/02/sales-compensation-and-lottery-tickets/.

The main connection is that Salespeople tend to get paid in lottery tickets in Max’s case, whereas Scott thinks the same is an industry-wide phenomenon – for hundreds of startup companies in one part of California just south of San Francisco. Both hence dispute a central ethos of the American Dream – that he who works hard gets the (financial) spoils.

Today, there was a piece on BBC Radio 2 about books that people never get to finish reading. This was based on some analysis of progress of many people reading Kindle books; this being useful because researchers can see where people stop reading as they progress through each book. By far the worst case example turned out to be “Capital in the Twenty-First Century” by Thomas Piketty, where people tended to stop around Page 26 of a 700-page book.

The executive summary of this book is in fact quite pithy; it predicts that the (asset) rich will continue to get richer, at the expense of the rest of the population whose survival depends on receiving an income flow. Full review here. It didn’t happen last century due to two world wars and the 1930’s depression, something we’ve not experienced this century. So far. The book just went into great detail, chapter by chapter, to demonstrate the connections leading to the author’s thesis, and people abandoned it early en masse.

However, it sounds plausible to me; assets tend to hold their relative “value”, whereas money typically loses value over time (through inflation, and through devaluation by printing money no longer anchored to a specific value of gold assets). Even the UK Government factors that devaluation in when calculating its future debt repayment commitments. I’m just hoping this doesn’t send us too far towards repeating what happened to Rome a couple of thousand years ago or so (as cited in one of my previous blog posts here).
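As a worked illustration of that asymmetry, here is the effect of steady inflation on idle cash; the 3% rate and the figures are assumptions for illustration, not a forecast:

```python
# Purchasing power of a fixed sum of cash after compounding inflation,
# versus an asset that merely holds its real value. The 3% rate is an
# illustrative assumption.

def real_value(cash, inflation_rate, years):
    """What the cash can buy, in today's terms, after `years` of inflation."""
    return cash / ((1 + inflation_rate) ** years)

# At 3% a year, idle cash loses roughly half its purchasing power in
# about 24 years, while the asset holder keeps their relative position.
print(round(real_value(100_000, 0.03, 24), 2))
```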

Stand back – intellectual deep thought follows:

The place where my brain shorted out was the thought that, if that trend continued, at some point our tax regime would need to switch from being based on monetary income flows to being based on assets owned instead. The implications of this would be very far-reaching.

That’ll be a tough sell – at least until everyone thinks we’ve returned to a feudal system and the crowds with pitchforks appear on the scene.

European Courts have been great; just one fumble to correct

Delete Spoof Logo

We have an outstanding parliament that works in the Public Interest. One where mobile roaming charges are being eroded into oblivion, where there is tacit support in law for the principles of Net Neutrality, and where the responsible Commissioner is fully supportive of a forward-looking (for consumers) Digital future. That is the European Parliament, and the excellent work of Neelie Kroes and her staff.

The one blight on the EC’s otherwise excellent work has been the decision to enact – then outsource – a “Right to be Forgotten” process to a commercial third party. The car started skidding off the road of sensibility very early in the process, albeit underpinned by one valid core assumption.

Fundamentally, there are protections in place where a personal financial misfortune, or a criminal offence in a person’s formative years, has occurred, to have a public disclosure time limit enshrined in law. This is to prevent undue prejudice after an agreed time, and to allow the afflicted to carry on their affairs without penalty or undue suffering once lessons have been internalised and not repeated.

There are public data maintenance and reporting limits in some cases – data on a criminal reference database, or on financial conduct databases – that are mandated to be erased from the public record a specific number of years after first being placed there. This was the case with the Spanish gentleman who believed his privacy was being violated by the publication of a bankruptcy asset sale well past this statutory public financial reporting boundary, in a newspaper that attributed that sale to him personally.

In my humble opinion, the resolution of the court should have been to (quietly) order the Newspaper to remove (or obfuscate) his name in that article at source. Job done; this would then have formally disassociated his name from the event, and all downstream (searchable) references to it likewise, so aligning his privacy with the usual public record financial reporting limits in law.

Leaving the source in place, and merely telling search engine providers to enact processes that allow individuals to request removal of unwanted facts from the search indexes only, opens the door to a litany of undesirable consequences – and indeed leaves the original article on the newspaper’s web site untouched and in direct violation of the subject’s right to privacy over 7 years after his bankruptcy; this association should now have no place on the public record.

Besides the timescales coded into law for how long certain classes of personal data can remain on the public record, there are also ample remedies at law for enforcing removal of (and seeking compensation for) the publication of libellous or slanderous material – or indeed the refusal to take down such material in a timely manner with, or without, a corresponding written apology where this is judged appropriate. No new laws are needed; factual content then has its status in history reinforced.

In the event, we’re now subject to a morass of take-down requests that have no legal basis for support. Of the initial volume (tens of thousands of removal requests):

  • 31 percent of requests from the UK and Ireland related to frauds or scams
  • 20 percent to arrests or convictions for violent or serious crimes
  • 12 percent to child pornography arrests
  • 5 percent to the government and police
  • 2 percent related to celebrities

That is demonstrably not serving the public interest.

I do sincerely hope the European Justices who enacted the current process will reflect on the monster they have created, and instead change the focus to enacting the privacy of individuals in line with the financial and criminal record-keeping limits on publicly accessible data already coded in law. In that way, justice will be served, and we will no longer be subjected to a process outsourced to a third party who should never have been put in the position of judge and jury.

That is what the courts are for, where the laws are very specific, and in which the public has full confidence.

Start with the needs of the end user, and work back from there…

Great Customer Service

A bit of a random day. I learnt something about the scale of construction taking place in China; not just the factoid that they’re building 70 airports at the moment, but a much more stunning one. That, in the last 3 years, the Chinese have used more cement than the USA did in the 100 years between 1900 and 2000. The very time when all the Interstate and Road networks were built, in addition to construction in virtually every major city.

5 of the top 10 mobile phone vendors are Chinese (it’s not just an Apple vs Samsung battle now), and one appears to be breaking from the pack in emerging markets – Xiaomi (pronounced show – as in shower – and me). Their business model is to offer Apple-class high-end phones at around cost, target them at 18-30 year old “fans” through direct sales (normally flash sales after production runs of several hundred thousand units), and to make money from ROM customisations and add-on cloud services. I’ve started hearing Silicon Valley based market watchers cite Xiaomi’s presence in their analyses, not least as, in China, they are taking market share from Samsung – the first alternative Android vendor to consistently do so. I know their handsets, and their new tablet, do look very nice and very cost-effective.

That apart, I have tonight read a fantastic blog post from Neelie Kroes, Vice President of the European Commission and responsible for the Digital Agenda for Europe – talking specifically about Uber and this weeks strikes by Taxi drivers in major cities across Europe. Well worth a read in full here.

Summarised:

  • Let me respond to the news of widespread strikes and numerous attempts to limit or ban taxi app services across Europe. The debate about taxi apps is really a debate about the wider sharing economy.
  • It is right that we feel sympathy for people who face big changes in their lives.
  • Whether it is about cabs, accommodation, music, flights, the news or whatever.  The fact is that digital technology is changing many aspects of our lives. We cannot address these challenges by ignoring them, by going on strike, or by trying to ban these innovations out of existence
  • a strike won’t work: rather than “downing tools” what we need is a real dialogue
  • We also need services that are designed around consumers.
  • People in the sharing economy like drivers, accommodation hosts, equipment owners and artisans – these people all need to pay their taxes and play by the rules.  And it’s the job of national and local authorities to make sure that happens.
  • But the rest of us cannot hide in a cave. 
  • Taxis can take advantage of these new innovations in ways consumers like – they can arrive more quickly, they could serve big events better, there could be more of them, their working hours could be more flexible and suited to driver needs – and apps can help achieve that.
  • More generally, the job of the law is not to lie to you and tell you that everything will always be comfortable or that tomorrow will be the same as today.  It won’t. Not only that, it will be worse for you and your children if we pretend we don’t have to change. If we don’t think together about how to benefit from these changes and these new technologies, we will all suffer.
  • If I have learnt anything from the recent European elections it is that we get nowhere in Europe by running away from hard truths. It’s time to face facts:  digital innovations like taxi apps are here to stay. We need to work with them not against them.

It is absolutely refreshing to have elected representatives working for us all who “get it”. Focus on consumers, being respectful of those afflicted by changes, but driving for the collective common good that Digital innovations provide to society. Kudos to Neelie Kroes; a focus on users, not entrenched producers – a stance I’ve only really heard with absolute clarity before from Jeff Bezos, CEO of Amazon. It really does work.


Uber in London: The Streisand Effect keeps on giving

Uber Logo

With the same overall theme as yesterday, if you’re looking at your future, step one is to look at what your customers would value, then to work back to the service components to deliver it.

I’ve followed Uber since I first discovered them in San Francisco, and it looks a simple model – to the user. You want to go from where you are to another local destination. You typically see where the closest driver is to you on your smartphone. You ask your handset for a price to go to a specific destination. It tells you. If you accept, the car is ordered and comes to pick you up. When you get dropped off, your credit card is charged, and both you and the taxi driver get the opportunity to rate each other. Job done.

Behind that facade is a model of supply and demand. Taxi drivers can clock on and off at will. At times of high demand and dwindling available ride capacity, prices are ramped up (to “surge” pricing) to encourage more drivers onto the road. Drivers and customers with consistently bad ratings are removed. Drivers are paid well enough to make more money than those in most taxi firms ($80-90,000/year in New York), or have the freedom to work part time – even down to a level where your reward is to pay for your car with a few hours per week of work, and have free use of it at other times.
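That surge mechanic can be sketched as a simple function of the demand/supply ratio. The thresholds, ramp and 3x cap below are illustrative assumptions, not Uber’s actual parameters:

```python
# Illustrative surge pricing: when ride requests outstrip available
# drivers, a multiplier ramps prices to pull more drivers onto the road.
# Thresholds and the 3x cap are assumptions, not Uber's real values.

def surge_multiplier(ride_requests, available_drivers):
    if available_drivers == 0:
        return 3.0                 # no supply at all: capped maximum
    ratio = ride_requests / available_drivers
    if ratio <= 1.0:
        return 1.0                 # supply covers demand: normal pricing
    return min(3.0, 1.0 + (ratio - 1.0) * 0.5)  # ramp with excess demand
```

Usage: `surge_multiplier(30, 10)` gives 2.0 under these assumptions; as more drivers clock on, the ratio falls and the multiplier decays back to 1.0, which is the self-balancing point of the model.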

The service is simple and compelling enough that I’d have thought taxi firms would have cottoned on to how it works, and replicated it before Uber ever appeared on these shores. But, after a wasted five years, Uber has appeared – and Taxi drivers all over Europe decided to run the most effective advertising campaign for an upstart competitor in their history. A one-day 850% subscriber growth; that really takes some doing, even if you were on the same side.

I’m just surprised that whoever called the go-slows all over Europe didn’t take the time to study what we in the tech industry know as “The Streisand Effect” – Wikipedia reference here. BBC Radio 2 even ran a segment on Uber at lunchtime today, followed by every TV News Bulletin I’ve heard since. I downloaded the app as a result of hearing it on that lunchtime slot, as I guess many others did too (albeit with no coverage in my area 50 miles West of London – yet). Given the five years of missed prep time, I think they’ve now lost – or find themselves in fast-follower mode, incorporating similar technology into their service before they suffer a mass exodus to Uber (of customers, then drivers).

London Cabbies do know all the practical rat runs that SatNav systems are still learning, but even that is a matter of time now. I suspect appealing for regulation will, at best, only delay the inevitable.

The safest option – given users love the simplicity and lack of surprises in the service – is to get busy quickly. There is plenty of mobile phone app prototyping help available on the very patch that London Black Cab drivers serve.

Starting with the end in mind: IT Management Heat vs Light

A very good place to start

One source of constant bemusement to me is the habit of intelligent people to pee in the industry market research bathwater, and then to pay handsomely to drink a hybrid mix of the result collected across their peers.

This is perhaps betrayed by an early experience of one research company coming in to present to the management of the vendor I was working at, and our finding, in the rehearsal, their conjecture that sales of specific machine sizes had badly dipped in the preceding quarter. Except they hadn’t; we’d had the biggest growth in sales of the highlighted machines in our history in that timeframe. When I mentioned my concern, the appropriate slides were corrected in short order, and the receiving audience no doubt impressed by the skill of an analysis that built a forecast starting with an amazingly accurate, perceptive (and otherwise publicly unreported) recent history.

I’ve been doubly nervous ever since – always relating back to the old “Deep Throat” hint given in “All the President’s Men”: in every case, “follow the money”.

Earlier today, I was having some banter on one of the boards of “The Motley Fool” which referenced the ways certain institutions impose measures on staff – well away from any useful business purpose that positively supports better results for their customers. Well, except for providing sound bites to politicians. I can sense that in Education, in some elements of Health provision, and rather fundamentally in the Police service. I even did a drains-up some time ago that reflected on the way UK Police are measured, and tried to trace the rationale back to source – which was a senior politician imploring them to reduce crime; blog post here. The subtlety of this was rather lost; the only control placed in their hands was that of compiling the associated statistics, and making their behaviours on the ground align with that data collection, rather than going back to core principles of why they were there, and what their customers wanted of them.

Jeff Bezos (CEO of Amazon) has the right idea; everything they do aligns with the ultimate end customer, and everything else works back from there. Competition is something to be conscious of, but only to the extent of understanding how you can serve your own customers better. Something that’s also the central model that W. Edwards Deming used to help transform Japanese Industry, and in being disciplined to methodically improve “the system” without unnecessary distractions. Distractions which are extremely apparent to anyone who’s been subjected to his “Red Beads” experiment. But the central task is always “To start with the end in mind”.

With that, I saw a post by Simon Wardley today where Gartner released the results of a survey on “Top 10 Challenges for I&O Leaders”, which I guess is some analogue of what used to be referred to as “CIOs”. Most of it felt to me like a herd mentality – and divorced from the sort of issues I’d have expected to be present. In fact, a complete reenactment of the sort of dialogue Simon had mentioned before.

Simon then cited the first 5 things he thought they should be focussed on (around Corrective Action), leaving the remaining “Positive Action” points to be mapped onto the foundation that exercise creates – the assumption being that those actions would likely be unique to each organisation performing the initial framing exercise.

Simon’s excellent blog posts are: My list vs Gartner, shortly followed by On Capabilities. I think they’re a great read. My only regret is that, while I understand his model (I think!), I’ve not had to work on the final piece between his final strategic map (for any business I’m active in) and articulating a pithy, prioritised list of actions based on the diagram created. And I wish he’d get the bandwidth to turn his Wardley Maps into a Book.

Until then, I recommend his Bits & Pieces Blog; it’s a quality read that deserves good prominence in every IT Manager’s (and IT vendor’s!) RSS feed.

CloudKit – now that’s how to do a secure Database for users

Data Breach Hand Brick Wall Computer

One of the big controversies here relates to the appetite of the current UK government to release personal data with only the most basic understanding of what constitutes personally identifiable information. The lessons are there in history, but I fear that, without knowing the context of the infamous AOL Data Leak, we are destined to repeat it. With it goes personal information that we typically hold close to our chests, and which may otherwise cause personal, social or (in the final analysis) financial prejudice.

When plans were first announced to release NHS records to third parties, and in the absence of what I thought were appropriate controls, I sought (with a heavy heart) to opt out of sharing my medical history with any third party – and instructed my GP accordingly. I’d gladly share everything with satisfactory controls in place (medical research is really important and should be encouraged), but I felt that insufficient care was being exercised. That said, we’re more than happy for my wife’s Genome to be stored in the USA by 23andMe – a company that demonstrably satisfied our privacy concerns.

It therefore came as quite a shock to find a report highlighting which third parties had already been granted access to health data with Government-mandated approval: a total of 459 data releases to 160 organisations (last time I looked, that was 47 pages of PDF). See this and the associated PDFs on that page. Given the level of controls, I felt this was outrageous. Likewise the plans to release HMRC-related personal financial data, again with soothing words from ministers who, given the NHS data implications, appear to have no empathy for the gross injustices likely to result from their actions.

The simple fact is that what constitutes individually identifiable information needs to be framed not only by what data fields are shared with a third party, but by the resulting application of that data by the processing party. Not least if there is any suggestion that the data is to be combined with other data sources, which could in turn triangulate back to make seemingly “anonymous” records traceable to a specific individual. Which is precisely what happened in the AOL Data Leak example cited.

With that, and on a somewhat unrelated technical/programmer-orientated journey, I set out to learn how Apple had architected its new CloudKit API, announced this last week. This articulates the way in which applications running on your iPhone handset, iPad or Mac have a trusted way of accessing personal data stored (and synchronised between all of a user’s Apple devices) “in the Cloud”.

The central identifier that Apple associates with you, as a customer, is your Apple ID – typically an email address. In the Cloud, they give you access to two databases on their cloud infrastructure: one public, the other private. However, the second you try to create or access a table in either, the API accepts your iCloud identity and spits back a hash unique to the combination of your identity and the application on the iPhone asking to process that data. Different application, different hash. And although everyone’s data is in there, the scheme makes it impossible to triangulate disparate data in a way that traces back to uniquely identify a single user.
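The per-application hash idea can be sketched as a keyed hash over the user identity plus the requesting app’s identifier. This is an illustrative reconstruction of the scheme as described, not Apple’s actual implementation; the secret, names and bundle IDs are all assumptions:

```python
# Derive a per-application record identifier from the user identity plus
# the app's identifier: the same user maps to different, unlinkable IDs
# in different apps. Illustrative only - not Apple's real scheme.
import hashlib
import hmac

SERVER_SECRET = b"server-side secret never shared with apps"  # assumption

def scoped_user_id(apple_id: str, app_bundle_id: str) -> str:
    """Stable for one (user, app) pair; unlinkable across apps."""
    msg = f"{apple_id}:{app_bundle_id}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

a = scoped_user_id("user@example.com", "com.example.health")
b = scoped_user_id("user@example.com", "com.example.todo")
# Same user, different apps -> different identifiers, so records from
# two apps cannot be joined on a common user key.
```

The keyed (HMAC) construction matters: without a server-side secret, anyone who knew a user’s email address could recompute the hash and re-link the records.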

Apple take this one stage further, in that any application that asks for any personally identifiable data (like an email address, age, postcode, etc.) from any table has to have access to that information specifically approved by the handset’s owner; no explicit permission (on a per-application basis), no data.

The data maintained by Apple – besides personal information, health data (with HealthKit), details of home automation kit in your house (with HomeKit), and not least the credit card data stored to buy Music, Books and Apps – makes full use of this security model. And they’ve dogfooded it so that third party application providers use exactly the same model, and the same back-end infrastructure. Which is also very, very inexpensive (data volumes go into Petabytes before you spend much money).

There are still some nuances I need to work out. I’m used to SQL databases and some NoSQL database structures (I’m MongoDB certified), but it’s not clear, from looking at the way the database works, which engine is being used behind the scenes. It appears to be a key:value store with some garbage collection mechanics that look like a hybrid file system. It also has the capability to store “subscriptions”, so if specific criteria appear in the data store, specific messages can be dispatched to the user’s devices over the network automatically. Hence things like new diary appointments in a calendar can be synced across a user’s iPhone, iPad and Mac transparently, without the need for each to waste battery power polling the large database on the server for events that are likely to arrive infrequently.
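The subscription mechanic can be sketched as a server-side check of each newly saved record against stored predicates, pushing only on a match so devices never poll. The record and predicate shapes here are illustrative, not CloudKit’s actual API:

```python
# Server-side subscription matching: each saved record is checked against
# stored predicates, and a push is sent only when one matches, so client
# devices never have to poll. Shapes and names are illustrative.

def matches(record: dict, predicate: dict) -> bool:
    """A predicate is a dict of field -> required value."""
    return all(record.get(k) == v for k, v in predicate.items())

def on_record_saved(record, subscriptions, push):
    """subscriptions: list of (device_id, predicate). push: callable."""
    for device_id, predicate in subscriptions:
        if matches(record, predicate):
            push(device_id, record)

sent = []
subs = [("ipad-1", {"type": "appointment"}), ("mac-1", {"type": "note"})]
on_record_saved({"type": "appointment", "title": "Dentist"}, subs,
                lambda device, record: sent.append(device))
```

The battery saving described falls out naturally: the cost of detecting an infrequent event is paid once, server-side, at write time, rather than continuously by every listening device.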

The final piece of the puzzle I've not worked out yet is, if you already have a large database (say of the calories, carbs, protein, fat and weights of thousands of foods in a nutrition database), how you'd get that loaded into an instance of the public database in Apple's Cloud. Other than writing custom loading code, of course!
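The "custom loading code" route would presumably look something like the following sketch: read the local nutrition data and push it up in batches. The CSV columns mirror the fields mentioned above, but the batch size and the upload step are placeholders (here just collected into a list) for whatever server call the real loader would make.

```python
import csv
import io

def batches(rows, size):
    """Yield lists of at most `size` rows at a time."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Stand-in for reading a real nutrition CSV file from disk.
sample = ("name,calories,carbs,protein,fat,weight\n"
          "apple,52,14,0.3,0.2,100\n"
          "egg,155,1.1,13,11,100\n")
rows = list(csv.DictReader(io.StringIO(sample)))

uploaded = []
for batch in batches(rows, 200):
    uploaded.extend(batch)   # stand-in for an upload_batch(batch) call

assert len(uploaded) == 2 and uploaded[0]["name"] == "apple"
```

Batching matters for a dataset of thousands of foods: one record per request would be slow and rate-limit-prone, while modest batches keep each request small enough to retry cleanly on failure.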

That apart, I'm really impressed by how Apple have designed the datastore to ensure the security of users' personal data, and to ensure an inability to triangulate between information stored by different applications. If any personally identifiable data is requested by an application, the user of the handset has to specifically authorise its disclosure, for that application only. And the app is unable to sense whether the data is even present ahead of that release permission. So, for example, if a health app wants access to your blood sampling data, it cannot tell whether that data exists before permission is given; hence it can't infer that you probably have diabetes, which would be possible if it could deduce that you were recording glucose readings at all.
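That last property is subtle enough to be worth a toy model. In this sketch (my own illustration of the behaviour described above; the class and app identifiers are invented), an unauthorised query returns exactly the same result as a query against a user who has no such data, so the app gains zero information from being refused:

```python
class HealthStore:
    """Per-user store that hides data presence from unauthorised apps."""

    def __init__(self, glucose_readings, permitted_apps):
        self.glucose_readings = glucose_readings
        self.permitted_apps = set(permitted_apps)

    def query_glucose(self, app_id):
        if app_id not in self.permitted_apps:
            return []               # indistinguishable from "no data"
        return list(self.glucose_readings)

# A diabetic user who has NOT granted permission to the app...
user_with_data = HealthStore([5.4, 6.1], permitted_apps=[])
# ...versus a user with no readings who HAS granted permission.
user_without_data = HealthStore([], permitted_apps=["com.example.fit"])

# The app sees an identical (empty) answer in both cases, so it cannot
# conclude anything about whether glucose readings are being recorded.
assert user_with_data.query_glucose("com.example.fit") == []
assert user_without_data.query_glucose("com.example.fit") == []
```

Returning an error such as "permission denied" instead of an empty result would leak exactly the inference the design is trying to prevent.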

In summary, an impressive design and a model that deserves our total respect. The more difficult job will be to instil the same mindset in the folks looking to release the most personal data that we have shared privately with our public sector servants. They owe us nothing less.

Am I the only one shaking my head at US Net Neutrality?

Internet Open Sign

I’ve always had the view that:

  1. ISPs receive a monthly payment for the speed of connection I have to the Internet
  2. Economics are such that I expect this to be effectively uncapped for almost all "normal" use, though the few edge cases of excessive use would be subject to a speed reduction to ration use of resources for the good of the ISP's user base as a whole (to avoid a tragedy of the commons)
  3. A proportion of my monthly costs would track the investments needed to ensure that peering equipment and the ISP's own infrastructure delivered service to me at the capacity needed for (1) and (2), without any discrimination based on traffic type or its content.

Living in Europe, I've been listening to lots of commentary in the USA about both the proposed merger between Comcast and Time Warner Cable on one hand, and the various ebbs and flows surrounding "Net Neutrality" and the FCC on the other. It's probably really surprising to learn that broadband speeds in the USA are at best mid-table on the world stage, and that Comcast and Time Warner have some of the worst customer satisfaction scores in their respective service areas. There is also the spectacle of the widespread funding of politicians there by industry, and the presence of a far from independent chairman of the FCC (the regulator) whose likely next move is back through the revolving door to the very industry he is currently charged with regulating, and from whence he came.

I've read "Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age" by Susan Crawford, which logged what happened as the Bell Telephone monopoly was deregulated, and the result the US consumer was left with. Mindful of this, there was an excellent blog post that amply demonstrates what happens when the FCC lets go of the steering wheel and refuses to classify Internet provision as subject to "common carrier" status. Dancing around this serves no true political purpose, other than to encourage the receipt of economic rent well in excess of the cost of service provision in areas of mandated exclusivity of provision.

It appears that 5 of the 6 major "last mile" ISPs in the USA (while unnamed, folks on various forums suspect that Verizon are the only ones not cited) are not investing in equipment at their peering points, leading to an inference that they are double dipping; i.e. asking the source of traffic (like Netflix, YouTube, etc.) to pay transit costs to reach their "last mile" customers. Equipment costs reckoned to be marginal (fractions of a cent per customer served) to correct. There is also one European ISP implicated, though comments I've seen around the USA suggest this is most likely to be in Germany.

The blog post is by Mark Taylor, an executive at Level 3 (who provide a lot of the long-distance bandwidth in the USA). Entitled "Observations of an Internet Middleman", it is well worth a read here.

I just thank god we're in Europe, where we have politicians like Neelie Kroes who work relentlessly, and effectively, to look after the interests of their constituents above all else. With that comes a commitment to Net Neutrality, the dropping of roaming charges for mobile telcos, no software patents, and investments consistent with the long-term interests of the population of the EC.

We do have our own challenges in the UK. Some organisations still profit handsomely from scientific research we pay for. We fund efforts by organisations to deliver hammer blows to frustrated consumers, rather than encouraging producers to make their content accessible in a timely and cost-effective fashion. And we have one of the worst cases of a misdirected campaign – with no factual basis, playing on media-fanned fear – to promote government-mandated censorship (there are fascinating, and horrific, parallels in US history in "The Men Who Open Your Mail" here; it'll take around 7 minutes to read). All of which conveniently avoids the fact that wholesale games of "whack-a-mole" have demonstrably never worked.

That all said, our problems will probably tend to disappear, be it with the passing of the current government or with longer-term trends in media readership (the Internet-native young rarely read newspapers – largely a preserve of the nett-expiring old).

While we have our own problems, I still don't envy the scale of the task ahead of consumers in the USA to unpick their current challenges with Internet access. I sincerely hope the right result makes it in the end.