In the world of open data there is lots of it. At the last count, 5,600 data sets were available through data.gov.uk. Much of this, and more besides, gets usefully highlighted by the expert “brokers” (such as the Guardian). Some great apps emerge, and some lovely visualisations. But the depth of impact can be limited and the longevity short-lived. As someone who has been a deep user of data for strategic purposes – academic, policy, regulatory, operational – this can all feel a little superficial.

In short there’s plenty to publish, publicise and visualise…

1. Published. Not just through the main portals such as data.gov.uk and the Office for National Statistics: in light of the UK Statistics Authority Code of Practice, there is now greater consistency in the publication of statistical releases at the departmental level.

2. Publicised. Some sets of data get the big headlines, such as the police crime maps. But of course there is also the rolling highlight from media champions such as the Guardian and Straight Statistics.

3. Visualised. For some of this data we see typically intricate, elaborate and attractive visualisations. See, for example, the Guardian’s Flickr pages, with nearly 500 visualisations to date and nearly 1,000 members.

So we can get to something which is “pretty nice”; the question that then begs is whether it is also “pretty useful”. So I’ve developed the data “Impactometer” as a simple tool to evaluate effectiveness. Let’s keep the published, publicised and visualised stages. Then add analysis (whether hypothesis-driven or data mining), synthesis of that analysis (to distil the key messages for insight and understanding), and then subsequent impact. It’s that impact which can lead to improvement, change and even transformation.
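To make the idea concrete, the six stages could be sketched as a simple checklist that scores a data set on how far along the chain it has travelled. This is purely a hypothetical illustration – the class name, stage names and example below are my own, not an actual implementation of the Impactometer:

```python
from dataclasses import dataclass, fields

@dataclass
class Impactometer:
    """Hypothetical sketch: one flag per stage of the data impact chain."""
    published: bool = False
    publicised: bool = False
    visualised: bool = False
    analysed: bool = False
    synthesised: bool = False
    impact: bool = False

    def score(self) -> int:
        # Count how many of the six stages the data set has reached.
        return sum(getattr(self, f.name) for f in fields(self))

# Illustrative only: a data set that is "pretty nice" but not yet "pretty useful"
crime_maps = Impactometer(published=True, publicised=True, visualised=True)
print(crime_maps.score())  # 3 of 6 stages reached
```

The point of the sketch is the gap it exposes: most open data sets stall at three out of six, never reaching analysis, synthesis or impact.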

It can seem like there’s lots of publish, publicise, visualise – and then it’s straight on to the next item to publish, publicise and visualise. So this can feel quite perishable. And while these data sets are publicised, in some cases the comments section gets closed after a week or so. Given that knowledge builds over time, there’s no easy way to consolidate it at present.

So while we have a holding framework for the data, perhaps we need a holding framework for the analysis that has flowed from that data… analysis.gov.uk? A bit like the way academia broadly ‘stores’ its analysis through the peer-reviewed journal process, so the new researcher can stand on the shoulders of the giants that have gone before. Certainly it should provide a home for more than the visualisations, and include the basic descriptive analysis and the basic messages – at least the more straightforward ones.

After all, any more advanced analysis will need to go through that basic descriptive stage anyway – important context for the deeper analysis – so maybe there is an opportunity for a head start, or at least some early validation of the basics.

So: a home for more analysis, including the audit trail for the methodology of that analysis. I see all this in the form of a recipe card… the specific data ingredients, the techniques applied, the tools used, and the output. Plus some lessons learned for next time… perhaps a little too sweet this time…
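A recipe card like that could be as simple as a small record with one field per ingredient of the audit trail. Again, a sketch under my own assumptions – the field names and the example data are illustrative, not a proposed standard:

```python
from dataclasses import dataclass, field

@dataclass
class RecipeCard:
    """Hypothetical audit-trail record for one piece of analysis."""
    data_ingredients: list[str]   # the specific data sets used
    techniques: list[str]         # analytical methods applied
    tools: list[str]              # software used
    output: str                   # what the analysis produced
    lessons_learned: list[str] = field(default_factory=list)

# Illustrative example card for a crime-mapping analysis
card = RecipeCard(
    data_ingredients=["police.uk street-level crime data"],
    techniques=["descriptive statistics", "choropleth mapping"],
    tools=["R"],
    output="Monthly crime-rate map by ward",
    lessons_learned=["Normalise by population next time"],
)
```

Even something this lightweight would let the next analyst see what went into a result, reproduce it, and avoid the same pitfalls.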