Mueller Report

US media sources are very divided on partisan lines, so I thought I’d read the whole Mueller report. Accordingly, I’ve just finished reading the redacted version.

The first 20% is all about the relentless Russian campaigns to insert divisive ads into social media. That included spear phishing Democratic National Committee accounts, downloading sensitive documents and releasing them in batches timed to blunt news coverage critical of Trump.

The next 30% catalogues Russian attempts to engage Trump campaign staff (and those of most other candidates) in the run-up to, and just after, the election. Some staff lied about approaches seeking support for Russian action in Crimea and for a UN resolution on Israeli settlements in the West Bank, and were duly prosecuted for lying to investigators. That said, the Russians had no real success, and no real effect.

The last half of the work relates to Trump becoming completely unhinged and making relentless attempts to meddle with the work of the investigation. Given the relatively clean bill of health from the main inquiry, it's difficult to rationalise why. Trump was largely saved by his staff refusing to carry out his more contentious directives. You're left wondering why his reaction was so intense, given the relatively benign nature of the allegations.

Nigel Farage is mentioned as having useful folks "in his orbit" in London during approaches to Wikileaks. I'm still curious why he visited Assange in the Ecuadorian Embassy at the same time Russia was seeking support for its annexation of Crimea. There were separate threads about large influxes of data onto Assange's servers at various times that didn't appear to come over a network, the implication being that data was being carried in by hand. But more questions than answers.

The first 70 pages of the work are the most chilling, and the same behaviours on social media appear to be happening in the UK right now. I'm still wondering who is bankrolling Farage's campaign and his plane travel – but hopefully some good journalism will provide answers in time.

Overall, Mueller did a quality job in the most difficult of circumstances. I hope that we’ll get a similar exercise asking similar questions this side of the Atlantic. There are lessons to learn here too. In the interim, the same behaviours continue unchecked…

Quality Journalism – UK Oxymoron?

I'm writing this the day that John McCain died in the USA – and the most compelling eulogy came from Barack Obama. It's a rare day right now when people can disagree fervently with each other's views, but still hold each other in the greatest respect.

In reading "The Secret Barrister", you come away with a data-filled summary of the continued poor state of Westminster politics, and of successive abuses of the justice system by politicians of all colours – prioritising "PR" above everything to mask poor financial choices with sound bites, while quietly robbing us all blind of values we hold dear. And I'm sure Chris Grayling will receive few Christmas cards from members of the judiciary, based on their experience of him documented in this book's pages.

Politics is only half the story here. I often wonder where quality journalism disappeared to. There are good pockets in the London Review of Books, and in the work on the Panama Papers by the ICIJ – but where else is the catalogue of abuses systematically documented in a data-based, consumable way? Where is the media with the same bite as "World in Action" back in the day? It appears completely AWOL.

One of the really curious things about Westminster is that MPs are required to abide by the terms of "The Code of Conduct for Members of Parliament". If you go down to item 6, it reads "Members have a general duty to act in the interests of the nation as a whole; and a special duty to their constituents". Now, tell me how the whip system squares with that. On the face of it, it is profoundly against the very code in which our democracy is enshrined.

There appears to be no published data source on the number of votes taken, and whether they were "free" votes or subject to one-, two- or three-line instructions from each whips' office. Fundamentally, how many votes were allowed to rest on the conscience of MPs exercised freely, and to what extent were members compelled like sheep through the voting lobbies?

My gut suggests our current government is probably inflicting more divisive whips more often than any UK government in our history, not least as the future interests of our country appear to be driven by a very small proportion of the representatives there. This should be easily apparent from the numbers and some simple comparative graphs – so, who's keeping count?

Democracy this isn't. And the lack of quality journalism in the UK is heavily complicit in its disappearance.

Ian's Brain goes all Economics on him

A couple of unconnected events happened in the last week. One was an article by Scott Adams of Dilbert fame, with some observations about how Silicon Valley is really one big psychological experiment (see his blog post: http://dilbert.com/blog/entry/the_pivot/).

It's a further extension of a comment I once read by Max Schireson, CEO of MongoDB, reflecting on how salespeople's compensation works – very much like paying in lottery tickets: http://maxschireson.com/2013/02/02/sales-compensation-and-lottery-tickets/.

The main connection is that salespeople tend to get paid in lottery tickets in Max's case, whereas Scott thinks the same is an industry-wide phenomenon – for hundreds of startup companies in one part of California just south of San Francisco. Both hence dispute a central ethos of the American Dream – that he who works hard gets the (financial) spoils.

Today, there was a piece on BBC Radio 2 about books that people never get to finish reading. This was based on some analysis of progress of many people reading Kindle books; this being useful because researchers can see where people stop reading as they progress through each book. By far the worst case example turned out to be “Capital in the Twenty-First Century” by Thomas Piketty, where people tended to stop around Page 26 of a 700-page book.

The executive summary of the book is in fact quite pithy; it predicts that the (asset) rich will continue to get richer, at the expense of the rest of the population, whose survival depends on receiving an income flow. Full review here. It also argues that this didn't happen last century due to two world wars and the 1930s depression – something we've not experienced this century, so far. The book just goes into great detail, chapter by chapter, to demonstrate the connections leading to the author's thesis, and people abandoned it early en masse.

However, it sounds plausible to me; assets tend to hold their relative "value", whereas money typically loses value over time (through inflation, and through devaluation by printing money no longer anchored to a specific value of gold). Even the UK Government factors this devaluation in when calculating its future debt repayment commitments. I'm just hoping this doesn't send us too far towards repeating what happened to Rome a couple of thousand years ago (as cited in one of my previous blog posts here).
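The compounding effect is easy to illustrate. Here's a quick sketch, assuming an illustrative 3% annual inflation rate over 30 years (my numbers, not Piketty's):

```python
# Illustrative only: erosion of money's real value vs. an asset that tracks
# inflation. The 3% rate and 30-year horizon are assumed figures.

def real_value(amount, inflation_rate, years):
    """Purchasing power of a fixed sum of money after compound inflation."""
    return amount / ((1 + inflation_rate) ** years)

cash = 100_000   # a fixed sum held as money
rate = 0.03      # assumed annual inflation
years = 30

print(f"£{cash:,} in cash buys the equivalent of "
      f"£{real_value(cash, rate, years):,.0f} in today's money "
      f"after {years} years at {rate:.0%} inflation")
# An asset that merely keeps pace with inflation retains the full
# £100,000 of purchasing power over the same period.
```

Roughly 60% of the purchasing power evaporates under those assumptions, which is the gap between asset holders and income earners the book describes.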

Stand back – intellectual deep thought follows:

The place where my brain shorted out was the thought that, if that trend continued, at some point our tax regime would need to switch from being based on monetary income flows to being based on assets owned instead. The implications of this would be very far-reaching.

That’ll be a tough sell – at least until everyone thinks we’ve returned to a feudal system and the crowds with pitchforks appear on the scene.

European Courts have been great; just one fumble to correct

Delete Spoof Logo

We have an outstanding parliament that works in the public interest – where mobile roaming charges are being eroded into oblivion, where there is tacit support in law for the principles of Net Neutrality, and where the Commissioner is fully supportive of a forward-looking (for consumers) digital future. That is the European Parliament, and the excellent work of Neelie Kroes and her staff.

The one blight on the EC’s otherwise excellent work has been the decision to enact – then outsource – a “Right to be Forgotten” process to a commercial third party. The car started skidding off the road of sensibility very early in the process, albeit underpinned by one valid core assumption.

Fundamentally, there are protections in place – where a personal financial misfortune or a criminal offence in a person's formative years has occurred – to have a public disclosure time limit enshrined in law. This is to prevent undue prejudice after an agreed time, and to allow the afflicted to carry on their affairs without penalty or undue suffering, once lessons have been internalised and not repeated.

There are public data maintenance and reporting limits in some cases – data on a criminal reference database, or on financial conduct databases, mandated to be erased from the public record a specific number of years after first being placed there. This was the case with the Spanish gentleman who believed his privacy was being violated by a newspaper's publication of a bankruptcy asset sale, attributed to him personally, well past this statutory public financial reporting boundary.

In my humble opinion, the resolution of the court should have been to (quietly) order the newspaper to remove (or obfuscate) his name in that article at source. Job done; this would formally disassociate his name from the event, and all downstream (searchable) references to it likewise, so aligning his privacy with the usual public record financial reporting acts in law.

Leaving the source in place, and merely telling search engine providers to enact processes allowing individuals to request removal of unwanted facts from the search indexes alone, opens the door to a litany of undesirable consequences – and indeed leaves the original article on the newspaper's web site untouched, in direct violation of the subject's right to privacy over 7 years after his bankruptcy; this association should now have no place on the public record.

Besides the timescales coded into law for how long certain classes of personal data can remain on the public record, there are also ample remedies in law for enforcing the removal of (and seeking compensation for) the publication of libellous or slanderous material – or indeed for the refusal to take down such material in a timely manner, with or without a corresponding written apology where this is judged appropriate. No new laws are needed; factual content then has its status in the historical record reinforced.

In the event, we're now subject to a morass of take-down requests that have no legal basis for support. Of the initial volume (tens of thousands of removal requests):

  • 31 percent of requests from the UK and Ireland related to frauds or scams
  • 20 percent to arrests or convictions for violent or serious crimes
  • 12 percent to child pornography arrests
  • 5 percent to the government and police
  • 2 percent related to celebrities

That is demonstrably not serving the public interest.

I do sincerely hope the European justices who enacted the current process will reflect on the monster they have created, and instead change focus to enact the privacy of individuals in line with the financial and criminal record-keeping limits already coded in law for publicly accessible data. In that way, justice will be served, and we will no longer be subjected to a process outsourced to a third party who should never have been put in the position of judge and jury.

That is what the courts are for, where the laws are very specific, and in which the public has full confidence.

Facebook Mood Research: who’s really not thinking this through?

Facebook Logo

Must admit, I've been totally bemused by the reaction of many folks and media outlets I usually respect to this "incident". As you may recall from other news sources, Facebook did some research to see if posts they deemed "happier" (or the opposite) had a corresponding effect on the mood of friends seeing those status posts. From what I can make out, Facebook didn't inject any changes into any text; they merely prioritised specific posts in the feed based on a sentiment analysis of the words in them. With that came cries of outrage that Facebook should not be meddling with the moods of its users.

The piece folks miss is that, due to the volume of status updates – and the limited capacity of your friends to consume that flow of information – an average of only 16% of your status posts get seen by folks in your network (the spread, depending on various other factors, is from 2% to 47%, but the mean is 16% – 1 in 6). This has been progressively stepping down; two years ago, the same average was 25% or so. Facebook's algorithms make a judgement on how pertinent any status update is to each of your friends, and selectively place it in (or omit it from) their feed at the time they read their wall.

As an advertiser with Facebook, you can add weight to a post's exposure, showing it in the wall of people with specific demographics or declared interests (aka "likes"). That can be a specific advert, or an invite to "like" a specific interest area or brand – and hence to be more likely to see that content in your wall alongside other posts from friends.

So, Facebook changed their algorithm, based on text sentiment analysis, to slightly prioritise updates with a seemingly positive (or negative) disposition – and to see if that disposition found its way downstream into your friends' own status updates. In something like 1 in 1,000 cases, it did have an influence.
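The mechanics described above can be sketched crudely. This is a toy illustration only – the word lists, weights and posts are all invented, and Facebook's real ranking system is vastly more sophisticated:

```python
# Toy sketch of sentiment-weighted feed ranking, as described above.
POSITIVE = {"great", "happy", "love", "wonderful", "amazing"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "angry"}

def sentiment(text):
    """Crude score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def rank_feed(posts, bias=1.0):
    """Order posts by base relevance plus a sentiment nudge.

    bias > 0 boosts happier posts; bias < 0 boosts gloomier ones -
    the experimental knob the research reportedly turned."""
    return sorted(posts,
                  key=lambda p: p["relevance"] + bias * sentiment(p["text"]),
                  reverse=True)

posts = [
    {"text": "What a wonderful happy day", "relevance": 1.0},
    {"text": "Feeling sad and angry today", "relevance": 1.5},
    {"text": "Lunch was fine", "relevance": 1.2},
]
for p in rank_feed(posts, bias=1.0):
    print(p["text"])
```

Note that no post text is altered; only the ordering (and hence the chance of being among the ~16% actually seen) changes – which is the whole point of the argument above.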

Bang! Reports everywhere of "How dare Facebook cross the line and start to meddle with the mood swings of their audience". My initial reaction, and one I still hold, is surprise at the naivety of that point of view, which is totally out of touch with:

  1. the physics of how many people see your Facebook updates
  2. the fact that Facebook did not inject anything into the text – they just prioritised based on an automated sentiment analysis of what was written – and, above all:
  3. have people been living under a rock, that they don't know how editorial decisions get prioritised by *every* media outlet known to man?

There are six newspaper proprietors in the UK who control virtually all the national newsprint output, albeit a business that will continue to erode with an ever-ageing readership demographic. Are people so naive that they don't think tabloid headlines, articles and limited rights of reply follow a carefully orchestrated interest of their owners and associated funding sources? Likewise the television and radio networks.

The full horror is seeing a newspaper, relaying stories about foreign benefit cheats, hire a Russian model to act as a Latvian immigrant, inject alleged comments from her to incite a "how dare you" reaction, add the text of a ministerial condemnation, and then heavily moderate the resulting forum posts to keep a sense of "nationalistic" outrage at the manufactured fiction. That I find appalling and beneath any sense of moral decency. That is the land of the tabloid press: never let facts get in the way of a good story. That is a part of society actively fiddling with the mood swings of its customers. By any measure, Facebook doesn't even get on the same playing field.

In that context, folks getting their knickers in a twist about this Facebook research are, I fear, losing all sense of perspective. Time to engage brain, and think things through, before imitating Mr Angry. They should know better.

What if Quality Journalism isn’t?

Read all about it

Carrying on with the same theme as yesterday's post – the fact that content is becoming disaggregated from a web site's home page – I read an excellent blog post today: What if Quality Journalism isn't? In it, the author looks at the seemingly divergent claims from the New York Times, who claim:

  • They are “winning” at Journalism
  • Readership is falling, both on web and mobile platforms
  • Therefore they need to pursue strategies to grow their audience

The author asks: "If its product is 'the world's best journalism', why does it have a problem growing its audience?" You can't be the world's best and fail at the same time. Indeed. He then goes into a deeper analysis.

I like the analogy of the supermarket of intent (Amazon) versus the supermarket of interest (social) versus the niche. The central issue is how to curate articles of interest to a specific subscriber without filling their delivery with superfluous (to the reader) content. This is where newspapers (in the author's case) typically contain 70% or more content that is wasted on any specific reader.

One comment under the article suggests one approach: existence of an open source aggregation model for the municipal bond market on Twitter via #muniland… journos from 20+ pubs, think tanks, govts, law firms, market commentators hash their story and all share.

Deep linking to useful, pertinent and interesting content is probably a big potential area if alternative approaches can crack it. Until then, I'm having to rely on RSS feeds of known authors I respect, on common watering holes, or on the occasional flash of brilliance that crosses my Twitter stream at times I'm watching it.

I just need to update Aaron Swartz's code to spot water-cooler conversations on Twitter among specific people or sources I respect. That would probably do most of the legwork to enlighten me more productively, and without subjecting myself to pages of search engine discovery.
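A minimal sketch of what such a filter might look like, assuming a hypothetical list of trusted authors and tweets already fetched into simple dictionaries (a real version would pull these from the Twitter API, and is not Aaron's actual code):

```python
# Surface reply threads in which at least two authors from a trusted
# list are talking to each other - the "water-cooler" conversations.
TRUSTED = {"alice", "bob"}  # hypothetical trusted sources

def water_cooler_threads(tweets):
    """Return root tweet ids of threads involving 2+ trusted authors."""
    by_id = {t["id"]: t for t in tweets}

    def root(t):
        # Walk up the reply chain to the thread's root tweet.
        while t.get("in_reply_to") in by_id:
            t = by_id[t["in_reply_to"]]
        return t["id"]

    threads = {}
    for t in tweets:
        threads.setdefault(root(t), set()).add(t["author"])
    return [rid for rid, authors in threads.items()
            if len(authors & TRUSTED) >= 2]

tweets = [
    {"id": 1, "author": "alice", "in_reply_to": None},
    {"id": 2, "author": "bob", "in_reply_to": 1},
    {"id": 3, "author": "carol", "in_reply_to": None},
]
print(water_cooler_threads(tweets))  # thread rooted at tweet 1
```

That alone would cut a raw firehose down to the handful of exchanges worth reading.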

Death of the Web Home Page. What replaces it?

Go Back You Are Going Wrong Way Sign

One of the gold nuggets on the "This Week in Google" podcast this week was that some US news sites historically had 20% of their web traffic coming in through their front-door home page; 80% arrived from links elsewhere that landed on individual articles deep inside the site. More recently, that 20% has dropped to 10%.

If they’re anything like my site, only a small proportion of these “deep links” will come from search engine traffic (for me, search sources account for around 20% of traffic most days). Of those that do, many arrive searching for something more basic than what I have for them here. By far my most popular “accident” is my post about “Google: where did I park my car?”. This is a feature of Google Now on my Nexus 5 handset, but I guess many folks are just tapping that query into Google’s search box absolutely raw (and raw Google will be clueless – you need a handset reporting your GPS location and the fact it sensed your transition from driving to walking for this to work). My second common one is people trying to see if Tesco sell the Google Chromecast, which invariably lands on me giving a demo of Chromecast working with a Tesco Hudl tablet.

My major boosts in traffic come when someone famous spots a suitably tagged Twitter or LinkedIn article that appears topical. My biggest surge ever was when Geoffrey Moore, author of "Crossing the Chasm", mentioned on LinkedIn my one-page PDF summarising his whole book. The second largest was when my post congratulating Apple for the security depth in their CloudKit API – a fresh change from the sort of shenanigans seen in several UK public sector data releases – appeared on the O'Reilly Radar blog. Outside of those two, I bump along at between 50-200 reads per day, driven primarily by my (in)ability to tag posts on social networks well enough to get flashes of attention.

10% coming through home pages, though; that haunts me a bit. Is that indicative of a sea change to single, simple task completion in a mobile app? Or is content being scattered around in small, single-article chunks, much as the music industry is seeing a transition from album compilations to singles? One example is this week's purchase of Songza by Google – and indeed of Beats by Apple – giving both companies access to curated playlists. Medium is one literary equivalent, as is Longreads. However, I can't imagine their existence explains the delta between searches and traffic landing directly inside your web site.

So, if a home page is no longer a valid thing to have, what takes its place? Ideas or answers on a postcard (or in a comment here) please!

Email: is 20% getting through really a success?

Baseball Throw

Over the weekend, I sent an email out to a lot of my contacts on LinkedIn. Because of the number of folks I'm connected to, I elected to subscribe to Mailchimp, the email distribution service recommended by most of the experts I engage with in the WordPress community. I might be sad, but it's been fascinating to watch the stats roll in after sending that email.

In terms of proportion of all my emails successfully delivered, that looks fine:

Emails Delivered to LinkedIn Contacts

However, 2 days after the email was sent – with the subject line including the recipient's Christian name, to avoid one of the main traps that spam gets caught in – readership of my message is:

Emails Seen and UnOpened

Eh, pardon? Only 47.4% of the emails I sent out were read at all? At first blush, that sounds really low to an amateur like me. I would have expected it for folks on annual leave, but still not less than half of all messages sent. In terms of device types used to read the email:

Desktop vs Mobile Email Receipt

which I guess isn't surprising, given the big volume of readers who looked at the email in the first hour after it was sent (at around 9:00pm on Saturday night). There was another smaller peak between 7am-8am on Sunday morning, and then fairly level tides with small surges around working-day arrival, lunch and departure times. In terms of devices used:

Devices used to read Email

However, Mailchimp inserts a health warning, saying that iOS devices handshake the email comms reliably, whereas other services are a lot more fickle – so the number of Apple devices may tend to over-report. That said, it reinforces the point I made in a post a few days ago about the importance of keeping your email subject line down to 35 characters – to ensure it's fully displayed on an iPhone.

All in, I was still shocked by the apparent number of emails successfully delivered but never opened. Thinking it was bad, I checked and found that Mailchimp reckons the average open rate for email aimed at folks in Computers and Electronics (my main industry), who voluntarily opted in, is 17.8%, with click-throughs to provided content around the 1.9% mark. My email's click-through rate is running at 2.9%. So, my email was over 2.5x the industry norm for readership and roughly 50% above normal click-through rates – though these are predominantly people I've enjoyed working with in the past, and who voluntarily connected to me down the years.
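Working the comparison through with the figures quoted above:

```python
# Comparing my campaign against Mailchimp's industry averages,
# using the figures from this post.
my_open_rate = 47.4        # % of delivered emails opened
industry_open_rate = 17.8  # Mailchimp average for Computers & Electronics

my_ctr = 2.9               # % click-through to provided content
industry_ctr = 1.9         # Mailchimp industry average

print(f"Open rate:  {my_open_rate / industry_open_rate:.1f}x the industry average")
print(f"Click rate: {my_ctr / industry_ctr - 1:.0%} above the industry average")
```

So "over 2.5x" and "roughly 50% above" are, if anything, slightly understated.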

So, sending an email looks to be as unreliable at getting through as expecting to see tweets from a specific person in your Twitter stream. I know some of my SMS traffic to my wife goes awry occasionally, and I'm told Snapchat is one of the few messaging services that routinely gives you an indication that your message got through and was viewed.

Getting guaranteed attention for a communication is hence a much longer journey than I expected, probably (like newspaper ads of old) relying on repeat "opportunities to see". But don't panic – I'm not sending the email again to that list; it was a one-time exercise.

This is probably a dose of the obvious to most people, but the proportion of emails lost in action – when I always thought email a reliable distribution mechanism – remains a big learning for me.

Am I the only one shaking my head at US Net Neutrality?

Internet Open Sign

I’ve always had the view that:

  1. ISPs receive a monthly payment for the speed of connection I have to the Internet
  2. Economics are such that I expect this to be effectively uncapped for almost all "normal" use, though the few edge cases of excessive use would be subject to a speed reduction, rationing resources for the good of the ISP's user base as a whole (avoiding a tragedy of the commons)
  3. That a proportion of my monthly costs would track the investments needed to ensure peering equipment and the ISP's own infrastructure deliver service to me at the capacity needed for (1) and (2), without any discrimination based on traffic or its content.

Living in Europe, I've been listening to lots of commentary in the USA about the proposed merger between Comcast and Time Warner Cable on one hand, and the various ebbs and flows surrounding "Net Neutrality" and the FCC on the other. It's probably surprising to learn that broadband speeds in the USA are at best mid-table on the world stage, and that Comcast and Time Warner have some of the worst customer satisfaction scores in their respective service areas. There is also the spectacle of the widespread industry funding of politicians there, and a far-from-independent chairman of the FCC (the regulator) who came from the very industry he is currently charged with regulating, and is likely to head back through the revolving door to it at the end of his term.

I've read "Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age" by Susan Crawford, which logs what happened as the Bell telephone monopoly was deregulated, and the result the US consumer was left with. Mindful of this, there was an excellent blog post that amply demonstrates what happens when the FCC lets go of the steering wheel, and refuses to classify Internet provision as subject to "common carrier" status. Dancing around this serves no true political purpose, other than to encourage the receipt of economic rent well in excess of the cost of service provision in areas of mandated exclusivity.

It appears that 5 of the 6 major "last mile" ISPs in the USA (while unnamed, folks on various forums suspect Verizon is the only one not cited) are not investing in equipment at their peering points, leading to an inference that they are double dipping – ie asking the source of traffic (like Netflix, YouTube, etc) to pay transit costs to reach their customers over the "last mile". The equipment costs to correct this are reckoned to be marginal (fractions of a cent per customer served). There is one European ISP implicated too; comments I've seen around the USA suggest it is most likely in Germany.

The blog post is by Mark Taylor, an executive at Level 3 (who provide a lot of the long-distance bandwidth in the USA). Entitled "Observations of an Internet Middleman", it is well worth a read here.

I just thank God we're in Europe, where we have politicians like Neelie Kroes, who works relentlessly, and effectively, to look after the interests of her constituents above all else. With that comes a commitment to Net Neutrality, dropping roaming charges for mobile telcos, no software patents, and pushing investments consistent with the long-term interests of the population of the EC.

We do have our own challenges in the UK. Some organisations still profit handsomely from scientific research we pay for. We fund efforts by organisations to deliver hammer blows to frustrated consumers, rather than encouraging producers to make their content accessible in a timely and cost-effective fashion. And we have one of the worst cases of misdirected campaigns – with no factual basis, and playing on media-fanned fear – to promote government-mandated censorship (there are fascinating parallels in US history in "The Men Who Open Your Mail" here – it'll take around 7 minutes to read), conveniently avoiding the fact that wholesale games of whack-a-mole have demonstrably never worked.

That all said, our problems will probably tend to disappear, be it with the passing of the current government or with longer-term trends in media readership (the Internet-native young rarely read newspapers – largely a preserve of the net expiring old).

While we have our own problems, I still don’t envy the scale of task ahead of consumers in the USA to unpick their current challenges with Internet access. I sincerely hope the right result makes it in the end.

Simple words often work better than neat adverts

Love at First Website Advert

An example advert from the time I led the Marketing Services team at Demon Internet. It was a dumb-sounding advert, but it pulled response like crazy. Some of the responses we received back in the mail (asking for trial CDs) contained nice poems, so it appeared to strike a healthy connection.

When we first entertained bids for a new agency, we had super-looking, consistent, nicely branded advert samples from one company, and these tongue-in-cheek worded ones from another. Cliff Stanford (owner of Demon Internet) liked the worded ones, while I thought he was nuts – but he agreed to run some tests to see who was correct. He was absolutely right; the worded ads pulled much more effectively. Lesson learnt!

The Valentine's Day advert was done in a rush a week beforehand, and Les Hewitt (media buyer extraordinaire) got it into most target newspapers near the back. Once in, he phoned them hourly to twist their arms relentlessly, getting it shifted page by page towards the front. The advert made it to the dating page of the Times on Valentine's Day, I believe, where we got fantastic response levels.

We ran quite a few variations of the theme in over 40 different publications:

Thick as two short planks advert

piece at cake advert

We also tried cross-track and 40-sheet poster treatments of the piece@cake advert, but had a bit of a mishap on the approach to Wembley Stadium on the evening the Spice Girls were giving a concert. Hence thousands of young fans, driven in by their parents to see the concert, were greeted with:

Piece @ Cake Advert, dropped E

We had them paste the ‘e’ panel back on the next day.

The average cost to land a £10/month paying customer was £30, around 1/6 that of competitive ISPs at the time (this was 1998-9). We tested everything, and knew the landed cost of a customer for every ad we placed. We even knew which ones gave us high response and then heavy churn 3 months later (waves hello to the Sun and Mirror). The most effective medium one of my folks tried gave us acquisition costs of £4 per landed customer, but many oddball complaints. That's another story, described near the end of an older post here.
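The payback arithmetic is worth working through from the figures in this post (a simplification: it ignores churn and the cost of serving each customer):

```python
# Customer acquisition economics from the Demon Internet figures above.
monthly_revenue = 10          # £/month per customer
our_cac = 30                  # our cost to acquire a customer
competitor_cac = our_cac * 6  # "around 1/6 that of competitive ISPs"
best_cac = 4                  # the most effective medium we tried

for label, cac in [("Demon", our_cac),
                   ("Competitors", competitor_cac),
                   ("Best medium", best_cac)]:
    print(f"{label}: £{cac} to acquire, "
          f"{cac / monthly_revenue:.1f} months of revenue to pay back")
```

Three months of revenue to recoup a customer versus eighteen for the competition is the whole story of why testing every ad mattered.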

Class work, well executed and full of personality. In my humble opinion, of course.