Using Foursquare in Local Government

| Comments

Foursquare. It’s just a vanity publishing tool that takes over your Twitter stream and Facebook feed, right? OK, I’m as guilty as the next Foursquare user of checking in to the pub at lunchtime and telling all my Twitter followers, but Foursquare can actually be a pretty useful tool for giving visitors and residents useful information about their surroundings.

As well as being able to check into places and see where your friends are, Foursquare also allows you to leave ‘tips’ about venues for your friends and followers, so when they check into that venue (or one nearby), they’ll see this tip. I’ve left a couple of tips on my personal Foursquare account (mainly plugging nights I play at, or calling out bad service at restaurants), but I’ve always thought Foursquare tips could be really useful for our tourism team, highlighting special offers and also giving historical titbits of information about places in the district.

I never really knew how to get started with a Foursquare page until I read Joseph Stashko’s post on how he added a Foursquare page for Blog Preston. I’ll leave you to read Joseph’s explanation on how he got set up, but needless to say, the sign up process is a little bit convoluted (To be fair, Foursquare say they will be changing this soon).

That said, once I’d filled everything out, got our designer to design a header and sent it all off to Foursquare, I was up and running within a few hours (together with a nice personal response from a Foursquare staffer).

You can now see the results on the Visit Lichfield Foursquare page. I’ve added a few tips in and around Lichfield City with historical information that I know about personally, together with a few special offers, and I’m looking at getting a few more tips added after talking to our Green Badge guides (who know much more about Lichfield’s history than me) before launching the page officially (bizarrely, we already seem to have 600+ followers without doing any promotion, mainly people from Indonesia and New York).

It’s not just tourism that Foursquare can help with, either. I’d love to develop something like, which uses public data on food safety inspections in New York to warn you if you’re about to eat at a place with a poor hygiene report, and if you’re a council with a lot of venues, you could leave tips about upcoming events or special offers. The only limit, as they say, is your imagination!

If anyone else has any useful tips on local gov using Foursquare, I’d love to hear them - drop me a line in the comments.

A Beginner’s Guide to SPARQLing Linked Data (Part 1)

| Comments

Regular readers of this blog will have seen that, over the past 12 months or so, I’ve been banging on about linked data and SPARQL quite a bit, posting up example queries and the like, but with not much explanation about why I’m doing what I’m doing.

Thanks to the good folks at Talis and their offer of a free linked data store for spending data, I’ve also got a nice little store of my own, and I thought it was high time I went back to basics and passed on what I’ve learnt to other people.

First, a bit of background

The web, as most of us know it, is a network of documents, with each document having an individual URI to show its location on the web. With linked data, the web becomes a network of things, with each thing having an individual URI to tell us more about it.

When talking about linked data, we talk about URIs rather than URLs, as URLs refer to the location of a file, whereas URIs are identifiers for things, which could return any format.

An example of a URI is - when you access this via a web browser, the web browser asks for HTML and a web page giving information about this particular cost centre is returned. If you request this in any other way (say via the command line) and ask for RDF, you will get an RDF representation of information about this cost centre (web browsers filter out some of the XML, so it might be worth right clicking and selecting “view source” to see everything).

As well as identifying things, URIs also represent categories of things (called properties). The previous example is therefore an expenditure category, represented by its own URI. To get all the properties being used for our spending dataset, we can do the following SPARQL query:


See the results of this query

SPARQL queries themselves

SPARQL stands for SPARQL Protocol and RDF Query Language, and is a similar sort of idea to SQL (which is used in most databases both on and off the web). The main difference is that while SQL generally allows us to read from and write to a database (and is therefore not particularly safe to open up to everyone), SPARQL is built for public access, so is non-destructive in nature. SPARQL is, of course, still intended for developers, and is just as powerful as SQL for querying datasets.

RDF datastores have a SPARQL endpoint, which is basically a fancy name for a place where you can make SPARQL queries. Our SPARQL endpoint is located at, and we make SPARQL queries by sending a GET request to this URI in the format{SPARQL query goes here}, or by entering our query on a web-based form that posts to our endpoint. The form we will be using is located here.
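To make that concrete, here’s a minimal Python sketch of building that kind of GET request URL. The endpoint address below is a stand-in for illustration, not our real endpoint:

```python
from urllib.parse import urlencode

# Stand-in endpoint address for illustration only
ENDPOINT = "https://example.org/store/services/sparql"

def build_query_url(sparql):
    """URL-encode a SPARQL query into a GET request against the endpoint."""
    return ENDPOINT + "?" + urlencode({"query": sparql})

url = build_query_url("SELECT ?s WHERE { ?s ?p ?o } LIMIT 10")
# Spaces, braces and question marks in the query get percent-encoded,
# so the result is safe to paste into a browser's address bar
```

The web-based form does exactly the same thing for you behind the scenes.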

The first thing we do when writing a SPARQL query is define our prefixes. If you look at a snapshot of the data we’ll be querying, you’ll see that most of the XML tags are written in the format prefix:type (RDF can be written in various formats, but I’m using XML, which is not necessarily the best format, but that’s a discussion for another day!). If you look at the top of the document (you might need to view source in your web browser), you’ll see that all the prefixes are defined as XML namespaces - for example, ‘xmlns:payment=””’ means that the prefix “payment” will be shorthand for the full namespace URI, so “payment:reference” actually expands to a full URI. We can do the same thing in SPARQL like so:

PREFIX payment:

There may be other prefixes too, and we do these a line at a time, for example:

PREFIX payment:
PREFIX rdfs:
PREFIX xsd:

The next thing to do is choose what data we want to return (in a similar way to an SQL SELECT query)

SELECT ?Payment WHERE {

Here we’re asking for one part of the dataset, which will be the URI which represents a particular payment. At this point, it doesn’t matter what we call it, as we need to define this in the next bit of our query:

?Payment a payment:Payment .

Here we’re saying “You know that thing I asked for in the SELECT part of the statement? That needs to be a payment:Payment”.

We could stop here, but we’ll end up getting all the data back, which would take a long time, and probably not be very useful, so let’s filter this by only asking for payments made to a particular supplier:

?Payment payment:payee

Let’s put it all together:

PREFIX payment:

SELECT ?Payment WHERE {
?Payment a payment:Payment .
?Payment payment:payee
}

See the results of the query.

The query returns a list of URIs; if you copy and paste these into your browser, you’ll see a web page about that payment. The list isn’t very useful in itself, so let’s ask for a bit more data.

We’ll first modify the prefixes:

PREFIX payment:
PREFIX rdfs:

Then the rest of the query:

SELECT ?Payment ?label ?date {
?Payment a payment:Payment .
?Payment payment:payee .
?Payment rdfs:label ?label .
?Payment payment:date ?date
}

See the results of this query.

Hopefully you’ll be able to see what’s going on here: we’re now asking for the label of the payment (rdfs:label) and also the date (payment:date). In this dataset, the date is returned as a URI (i.e., but we can also return the date’s label in text form by adding the following:

SELECT ?Payment ?label ?date **?datelabel** {


?date rdfs:label ?datelabel .

Here we’re saying “You know that date I asked for? I also want to see its label, and I want to call it datelabel”. The query now looks like this:

PREFIX payment:
PREFIX rdfs:

SELECT ?Payment ?label ?date ?datelabel {
?Payment a payment:Payment .
?Payment payment:payee .
?Payment rdfs:label ?label .
?Payment payment:date ?date .
?date rdfs:label ?datelabel
}

See the results of this query.

This is all well and good, but we don’t see any actual figures yet, which isn’t very useful. Going back to the snapshot of data I showed earlier, you can see that each payment has one or more expenditureLines, expressed as:

<payment:expenditureLine rdf:resource=""/>

If you look below each Payment, you’ll see the expenditureLine(s) for the payment, e.g:

<payment:ExpenditureLine rdf:about="">
  <rdfs:label>Payment number 9178363</rdfs:label>
  <payment:expenditureCategory rdf:resource=""/>
  <payment:expenditureCategory rdf:resource=""/>
  <payment:expenditureCategory rdf:resource=""/>
  <payment:payment rdf:resource=""/>
  <payment:netAmount rdf:datatype="">1855.88</payment:netAmount>
  <qb:dataSet rdf:resource=""/>
</payment:ExpenditureLine>

We can get this information in a similar way to how we got the date labels, so if we want to get a net amount for each payment we can add:

SELECT ?Payment **?line** ?label ?date ?datelabel **?amount** {


?line payment:payment ?Payment .
?line payment:netAmount ?amount

Here we’re now asking for each line that has a payment that is equal to “?Payment” (i.e. the Payments we’ve originally requested) and asking for their net amounts. The query now looks like this:

PREFIX payment:
PREFIX rdfs:

SELECT ?Payment ?line ?label ?date ?datelabel ?amount {
?Payment a payment:Payment .
?Payment payment:payee .
?Payment rdfs:label ?label .
?Payment payment:date ?date .
?date rdfs:label ?datelabel .
?line payment:payment ?Payment .
?line payment:netAmount ?amount
}

See the results of this query.
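A quick note on what comes back: if you ask the endpoint for JSON instead of a web page, the results arrive in the standard SPARQL results layout - a head listing the variables, and one binding per row. This little Python sketch (using a made-up response, with the net amount borrowed from the snapshot earlier) shows how you’d pull the figures out:

```python
import json

# A made-up response in the standard SPARQL JSON results layout
raw = """{
  "head": {"vars": ["Payment", "amount"]},
  "results": {"bindings": [
    {"Payment": {"type": "uri", "value": "http://example.org/payment/9178363"},
     "amount": {"type": "typed-literal", "value": "1855.88"}}
  ]}
}"""

data = json.loads(raw)

# Each binding maps variable names to {"type": ..., "value": ...} pairs
rows = [
    {var: b[var]["value"] for var in data["head"]["vars"] if var in b}
    for b in data["results"]["bindings"]
]
total = sum(float(r["amount"]) for r in rows)
```

Every value comes back as a string, so you cast the amounts to numbers yourself before adding them up.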

Which seems like a pretty neat place to leave it for now. Feel free to have a play with the stuff I’ve gone through so far, and ask any questions in the comments. I don’t claim to have all the answers, and I may have made some incorrect assumptions, so please feel free to put me right too!

Stay tuned for part 2, where I’ll cover filters and other cool things.

Spending Data - Beyond the CSV

| Comments

It’s been a while since my last post about local spending data, and, since then, our finance team have been busy beavering away behind the scenes to get our own spending data out of the council’s finance systems.

However, from the outset, I was really pleased that both our finance people and our chief exec wanted to go beyond just whacking a CSV (or worse, a PDF) on the internet and show our commitment to transparency by putting the data on the web in an easy, human readable format.

Armed with a few CSVs, a MacBook Pro and some server space, I was put to task, and the results can be seen here. Users can navigate by the various parts of the cost code (called “Department”, “Type” and “Subject”), as well as by month and supplier.

As well as human readable formats, almost every view of the data has an XML, JSON, CSV and RDF equivalent, achieved either by putting the relevant extension on the end, or through content negotiation.
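For the curious, content negotiation just means sending an Accept header and letting the server choose the representation for the same URL. A quick Python sketch (the URL is a stand-in for illustration):

```python
import urllib.request

# Stand-in URL for illustration only
url = "https://example.org/spending/payments/9178363"

# Same URL, different Accept header: the server can hand back RDF
# instead of the default HTML page
req = urllib.request.Request(url, headers={"Accept": "application/rdf+xml"})
# urllib.request.urlopen(req) would then fetch the RDF representation
```

Adding the extension to the URL gets you the same result without any header fiddling.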

The user interface has been heavily influenced by OpenlyLocal (after all, if it ain’t broke, why fix it?) and, as well as feeding our data in to OpenlyLocal, we also take a little bit away. When data is imported into OpenlyLocal, the system does some clever data matching to match company names to real-life entities on OpenCorporates. We have a batch job that takes that information and puts it on our spending system - you can see an example here. This is an example of how open data cuts both ways: by sharing our data, we can improve it.

Also, thanks to the wonderful people at Talis, we have our RDF data in a triple store, so it can be queried, both through SPARQL queries and a free text search, making the data even more useful from the off. I’m hoping to put together a tutorial in SPARQL soon, both here and on the Lichfield District Council website, which should hopefully demystify the process a bit more.

Finally, I’m hoping to open source the code (once it’s tidied up a bit), so if you’re from a local authority and want to make your spending data a little more clear to Joe Public, then you can benefit from my work too!

Grit Bins - Your Help Needed!

| Comments

During the Lichfield Hacks and Hackers day yesterday (bigger blog post to come on that later), I was gifted a list of grit bins in Lichfield District. I got very excited about this, but my excitement was quickly dampened when I realised that there was no geographical data in the list, just road names and a vague location.

However, I then thought ‘not to worry, I can just ask people who live nearby to geolocate them on their phones and add them to a Google Doc’. I asked @robthedog (who lives in a village in the north of the district) if he could do a few when he got home, and his reply was ‘I can do one now’:

Grit Bin on Google Street View

Of course! Google Street View! This is where you lot come in. I’m going to crowdsource the location of these grit bins (I’ve already done a few myself) - to help, all you need to do is follow a few simple steps:

  1. Open up this Google document

  2. Drag this bookmarklet to your browser’s toolbar.

  3. Open up Google Maps and search for a road name and town / village which doesn’t have a lat/lng yet

  4. Drag the little yellow street view man (this guy) to the map and have a look round Street View for the grit bin

  5. Once you’re right by it - zoom out of Street View and click the bookmarklet - a window will pop up with the lat/lng, which you can then copy and paste into the Google doc!

It’s that simple! If you could do just a couple, I’ll be forever in your debt (I might buy you a pint if I see you in the flesh too!)

Once it’s done, I’ll be able to build a handy map of grit bins in the area and hopefully save a few slips and falls (as well as a few costly calls to our contact centre!). There are 87 of them and, as I’ve already done 10, it shouldn’t be long before I’ve got them all.

Thanks again, you lovely people you!

First Steps to Councils Publishing Their Own Linked Data

| Comments

On Friday, I toddled along to London for a bit of a chat about local spending data, organised by LeGSB, the local government e-Standards body. In particular we looked at how councils can publish their spending data in a linked data format. It was an interesting day, and much of the work seems to have already been done by folks much cleverer than myself.

However, at the end of the day, Paul Davidson (the Director of Standards at LeGSB) raised an interesting topic. At the moment, much of the describing of councils in the linked data world is done by external bodies, such as the Office for National Statistics and Ordnance Survey, and often this isn’t the best fit, with the data referring to geographical regions rather than the councils themselves.

Paul then made the point that really, it should be the councils themselves that describe who they are, rather than some arbitrary body; after all, there is no one better placed to know about a council than the council itself! We then talked very briefly about the possibility of councils minting their own URIs and publishing some RDF with information about themselves (such as contact details etc). Other organisations and people could then use these URIs to refer to the councils in their data.

I’ve been mulling this over over the weekend, and I fired off an email this morning suggesting that, rather than publishing a page of RDF, people simply add data to their council homepages, either as RDFa or metadata (although I’m not sure how you’d do the latter!). People could then use their council homepage URLs as URIs (perhaps with #id at the end of the URI, to make the URI for the council different to the one for the page).

With this in mind I’ve knocked up a sample dataset for Lichfield District Council using various ontologies from around the linked data web (including FOAF, vCard, Chris Taggart’s Openly Local ontology and a bit of the Organizational (sic) Ontology). I’ve also sameAsed to the National Statistics Dataset, Openly Local and DBpedia:

<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="" xmlns:skos="" xmlns:rdfs="" xmlns:owl="" xmlns:foaf="" xmlns:dct="" xmlns:vcard="" xmlns:administrative-geography="" xmlns:openlylocal="" xmlns:org="">
  <rdf:Description rdf:about="">
    <rdfs:label>Lichfield District Council</rdfs:label>
    <vcard:organisation-name>Lichfield District Council</vcard:organisation-name>
    <rdf:type rdf:resource=""/>
    <rdf:type rdf:resource=""/>
    <skos:notation rdf:datatype="">41UD</skos:notation>
    <foaf:homepage rdf:resource=""/>
    <foaf:phone rdf:resource="tel:+44-1543-308999"/>
    <foaf:OnlineAccount rdf:about="">
      <foaf:accountServiceHomepage rdf:resource=""/>
    </foaf:OnlineAccount>
    <foaf:OnlineAccount rdf:about="">
      <foaf:accountServiceHomepage rdf:resource=""/>
      <foaf:accountName>Lichfield District Council</foaf:accountName>
    </foaf:OnlineAccount>
    <vcard:ADR rdf:parseType="Resource">
      <vcard:Extadd>District Council House, Frog Lane, Lichfield WS13 6YY</vcard:Extadd>
      <vcard:postal-code rdf:resource=""/>
    </vcard:ADR>
    <administrative-geography:coverage rdf:resource=""/>
    <owl:sameAs rdf:resource=""/>
    <owl:sameAs rdf:resource=""/>
    <owl:sameAs rdf:resource=""/>
  </rdf:Description>
</rdf:RDF>

It’s not perfect, but it’s a start; there are 24 triples there, and I’m sure there could be more. Only problem is, I’m not a linked data expert, so I’m looking for a bit of feedback, both on what I’ve done and on what I could add - for example, I could add the chief executive and leader of the council, but I’m not sure how I’d do it!

Feedback from non-linked data experts is welcome too - especially those at the coal face of local gov - if there was a step by step guide to doing this, could you do it? (i.e. have you got access to add metadata to your homepage) and, more importantly, would you do it?

UPDATE I’ve now published the final(ish) version at, with a rel=’alternate’ meta link on the Lichfield homepage. I’ve also reused the ontology to show the council’s CEO, leader etc, together with a bit of FOAF for their contact details.

Thanks to Jeni Tennison, Leigh Dodds and Dave Reynolds for helping me to get this far, and hopefully we can encourage more councils to do the same!

A Local Spending Data Brain Splurge

| Comments

Today I was at a ‘quick and dirty’ local spending data workshop at Birmingham City Council’s newly refurbished offices on Lancaster Circus. There’ll be more detailed info to come on the Local Open Data Community (login required), but I just wanted to blog a few of my thoughts post the meeting.

There was a lot of talk about what other councils have done (such as Windsor and Maidenhead, Barnet, Islington etc), and it was agreed that there was a lot of difference in how the data was presented. Paul Davidson also talked about some of the work that the Local eGovernment Standards Body has done on getting spending data out there in a linked data format. There did seem to be a bit of resistance to the linked data approach, mainly because agreeing standards seems to be a long, drawn-out process, which is counter to the JFDI approach of publishing local data.

However, while I am a fan of the possibilities of linked data, I also recognise that there are difficulties in both publishing the data and also working with it. For example, I think it’s unrealistic to expect every local authority to maintain a triple store to publish their spending data. As we learned from the local elections project, often local authorities don’t even have people who are competent in HTML, let alone RDF, SPARQL etc.

Therefore, I think the way forward is a centralised approach, with authorities publishing CSVs in a standard format on their website and some kind of system picking up these CSVs (say, on a monthly basis) and converting this data to a linked data format (as well as publishing in vanilla XML, JSON and CSV format).
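The conversion itself is the easy part. As a rough Python sketch (the column names here are invented for illustration, not a proposed standard):

```python
import csv
import io
import json

# Invented column names for illustration; the real ones would need agreeing centrally
incoming = io.StringIO(
    "supplier,date,net_amount\n"
    "Example Supplier Ltd,2010-11-01,1855.88\n"
)

# Re-serialise each CSV row as JSON, casting the amount to a number
rows = [
    dict(row, net_amount=float(row["net_amount"]))
    for row in csv.DictReader(incoming)
]
as_json = json.dumps(rows)
```

The hard part is agreeing the column format in the first place, which is exactly why a central picker-upper makes sense.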

The great thing about the linked data approach is it will mean that each item of spending can have its own URI - e.g.

(The first part of the URI would be the SNAC code for the authority, and then the second part of the URI would be the internal reference number)

As well as having a human-readable summary of the data (together with links to the actual data in RDF, XML, CSV and JSON), there would be a comments box (similar to Adrian Short’s fantastic Armchair Auditor), as well as the ability to ask any questions about an item of expenditure - the answers to these questions would then be automatically published next to the item of spend (hopefully helping to cut down on multiple FOI requests).

While this may be a bit of a pie in the sky idea, I do feel that there does need to be some kind of effort on the part of central government to help move this project along, as we’ve seen already (naming no names!) some authorities have got it drastically wrong, and while there is definitely mileage in the ‘just get it out there’ approach, I think if we’re going to end up with something really useful (for both members of the public and local authorities), we need to get the data in one place.


Further Adventures in SPARQL

| Comments

It’s been a while since my last blog post, and after a request from Twitter, I thought it might be time to dust off Wordpress and do a quick blog post. Since we last spoke, UK Postcodes (the postcodes webservice I blogged about a few blog posts back) has been plodding along nicely and, as a natural tinkerer, I’ve been tinkering ever since.

I’ve sped up request times considerably, and wanted to add the ability to get parliamentary constituencies from a postcode. Originally, I was using the TheyWorkForYou API, which, while useful, didn’t return a National Statistics or Ordnance Survey URI / ID, so I got to thinking about how I could SPARQL this using the National Statistics endpoint. After some exploration, I noticed that district wards always sit inside a single parliamentary constituency, so thanks to some linked data knowledge (helpfully pumped into my brain at a two-day training course at Talis a few weeks back), I got to writing a SPARQL query, which you can see below (using the SNAC ID for Coleshill South Ward, where I live):

PREFIX administrative-geography:
PREFIX rdfs:
PREFIX skos:
PREFIX electoral-geography:

SELECT ?ward ?label ?constituency ?constituencyname WHERE {
?ward a .
?ward skos:notation "44UBGC"^^administrative-geography:StandardCode .
?ward rdfs:label ?label .
?ward electoral-geography:parliamentaryConstituency ?constituency .
?constituency skos:prefLabel ?constituencyname
}

This query basically asks for the URI and name of a ward with a specific SNAC ID, as well as the constituency’s URI and name. I’ve now added this to UK Postcodes.
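Since the only thing that changes between lookups is the SNAC ID, the query can be templated. A hedged Python sketch (it assumes the PREFIX declarations above get prepended before the query is sent, and drops the class triple for brevity):

```python
QUERY_TEMPLATE = """SELECT ?ward ?label ?constituency ?constituencyname WHERE {{
  ?ward skos:notation "{snac}"^^administrative-geography:StandardCode .
  ?ward rdfs:label ?label .
  ?ward electoral-geography:parliamentaryConstituency ?constituency .
  ?constituency skos:prefLabel ?constituencyname
}}"""

def ward_to_constituency_query(snac):
    """Build the ward-to-constituency lookup for a given SNAC ID."""
    # SNAC IDs are alphanumeric; refuse anything else rather than
    # interpolating untrusted text into the query
    if not snac.isalnum():
        raise ValueError("unexpected characters in SNAC ID")
    return QUERY_TEMPLATE.format(snac=snac)

query = ward_to_constituency_query("44UBGC")
```

The alphanumeric check matters because anything you splice into a query string is effectively injection territory, even in a read-only language like SPARQL.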

But I haven’t stopped there, oh no! The next step is to get parishes. Again, ward and parish boundaries don’t overlap, so I’ve used another (slightly more complex) SPARQL query to find out what parish a ward is inside, this time using the Ordnance Survey Endpoint:

PREFIX foaf:
PREFIX spatialrelations:
PREFIX admingeo:

SELECT ?parish ?parishcode ?ward ?label WHERE {
?ward a .
?ward admingeo:hasCensusCode "44UBGB" .
?parish a .
?parish spatialrelations:contains ?ward .
?parish admingeo:hasCensusCode ?parishcode .
?parish foaf:name ?label
}

This fella here is asking for a parish which contains a given ward - at the moment I’m using the OS URI, but I’m hoping to figure out a way of using SNAC IDs before I put it onto the live UK Postcodes system.

Hope this gives people a bit of an insight, and if anyone who is cleverer than me wants to point out any problems, or make any suggestions, feel free! :)

Ringing Mapping the Changes

| Comments

Some people would say we’re mad, but last Saturday, on the hottest day of the year so far, a group of people eschewed barbecues and beer in the garden for a whole day spent working with mapping data and other bits of open data to help people understand the benefits of open data at Mapitude, an event aimed at developing understanding and practical collaboration between web developers and mappers.

As well as unconference-style talks and discussions, there was also a hack day, where a group of us got together to try and solve practical problems using open data in just one day. The brief we decided on was to map a council ward, add some statistical information to it, and then compare it with a neighbouring ward.

As Dan Slee from Walsall Council was in attendance, he was keen that we map St. Matthew’s ward in Walsall, a deprived ward in the centre of Walsall, and compare it with Aldridge Central and South, a neighbouring, but significantly less deprived area.

Once the brief was written up, I got together with Chris Taggart of OpenlyLocal to identify some data sources. The first thing we noticed was that OpenlyLocal didn’t have much information about the wards and councillors in Walsall - however, after a bit of trawling Walsall’s website and adapting an existing screen scraper, OpenlyLocal had Walsall’s data up, which made it easier to get the councillors for each ward.

The next challenge was getting the ward boundaries. Now, a few months ago, this would have been nigh on impossible without either breaking the law, or physically walking the boundaries with a GPS tracker (which, even then, might have been dodgy). However, thanks to the release of Ordnance Survey data, this is now significantly easier.

However, even with open data, this was still not an easy task. The boundary data comes in ESRI shapefile format, which, on its own, isn’t a particularly friendly format for web developers, as it’s designed for desktop GIS software, so the first task was converting the shapefiles to something we can easily work with online.

Thankfully, I’d been doing some research a few days prior, and chanced upon this bit of work, which includes a program that converts ESRI Shapefiles to MySQL tables, a format that is much kinder to web developers. Once that was done, we then had to convert the boundaries themselves from Ordnance Survey National Grid References to latitude and longitude, which was easily done by this PHP script.

We now had councillors and shapefiles, and already time was against us. The group who put together the brief had decided on population, families claiming child benefit, average income, and number of unpaid carers. This could be retrieved from the National Statistics Data Exchange, but we ran out of time, so ended up simply copying and pasting the data. However, given enough time, we knew we could’ve done it.

You can see the results of our work here, and some of the comparisons are quite stark. We would’ve liked to add crime data from the Police API, but again, pressures of time meant we didn’t get round to it. However, if you look at the crime stats for St Matthew’s and Aldridge Central and South, you can see there’s quite a difference, with crime in St Matthew’s being much much higher. However, this may be slightly skewed by the fact that St Matthew’s takes in much of Walsall town centre where there are a lot of pubs and bars, so fights etc at chucking out time are, sadly, common.

Where next?

So, was it all worth it? Well, I know Chris Taggart will be adding similar functionality to OpenlyLocal in the future, so in that respect it was useful. It also helped me get a handle on how to work with the OS boundary data, and yesterday, I released the Lichfield portion in a much more usable format on the Lichfield District Council open data section.

The most important aspect though, I think, is serving as an example of what can be done with open data. For too long, us open data folks have been banging on about how great open data is and everyone should do it. However, the people who have the power to really open stuff up aren’t always convinced, they might be wary, or not understand the issues. With examples like this, we can start with a problem, and in less than a day, have a solution. It’s these solutions which open people’s eyes to the power of data, and convince them that this is the future.

The Postcode - Freed!

| Comments

Well, it’s finally happened: Ordnance Survey have gone from being the bad guy of the internet - a big, bumbling behemoth, squishing innovation wherever it goes - to being the darling of the internet, throwing out free data wherever it goes (well, sort of).

As well as lots of mapping data, Ordnance Survey also released Code-Point Open, which matches every postcode to a geographical location. It’s what I, along with the oh-so-clever uber-trolling project Ernest Marples, have been banging on about for ages. Now, finally, developers can get a user’s location based on the one thing that almost everyone in the UK is guaranteed to know - their postcode - and this can then be used to get all sorts of useful information, like their nearest school, fire station, police station, cafe or tanning salon (or whatever).

As well as matching postcodes to a physical location, Code-Point Open also matches postcodes to local authority information, something which, in the past, I’ve used the NESS Data Exchange for, but I’ll probably migrate to this, ‘cos it’s easier.

Anyway, although the data has been released and is out there, it’s in a whopping 250MB CSV file, so anyone who wants to use it needs to download a local copy and import it into a database for their own use. I wanted to do this, but thought, instead of everyone doing this, why don’t I make it easier for everyone (myself included) and wrap it in a web service?

That is wot I dun - enter UK Postcodes. Initially I built this over the Easter break, in a few snatched moments between eating Easter eggs and making sure my new puppy didn’t wee anywhere. On the surface it’s very simple - a request like this:

will give you all the information about WS13 6YY in XML format, including latitude/longitude, easting/northing, a Geo Hash URI, as well as the county council (if applicable), district council and ward the postcode is in. You can also have JSON, CSV and RDF just by changing the extension (Thanks to Jeni Tennison for her help with the latter format!).

After posting the results of my fiddling on the UK Government Data Developers mailing list, I got a lot of helpful people suggesting other ways I could interact with the data, so I now have methods for finding the nearest postcodes to a point, as well as reverse geocoding (going from a lat/lng point to a postcode) - there’s more information on the API page.

That’s basically it, I’m always happy to hear feedback, so if you’ve got any suggestions or feedback (good or bad!), please feel free to comment below!

(Oh, and a massive thanks goes out to Adrian Short, who created the torrent of the data - proving that file sharing isn’t always used for nefarious purposes :cough:#debill:cough: - and to Matthew Somerville, who translated all those eastings and northings in the original dataset to latitude and longitude - you can download his version from the MySociety mirror.)

Postcode to Councillor in One Easy Step!

| Comments

Finding out who your councillor is isn’t always easy, is it? Even if you know which council provides your services, wards often have odd-sounding names, and council websites don’t always have an easy postcode search. With this in mind, and now that the Office for National Statistics have released a brand new API, and because I’ve got a day off looking after my new puppy, I decided to have a play and see if I could make it easier.

First I trawled through the documentation to find the best API methods for my needs; it turns out that this URL did what I needed:;=**{POSTCODE}**

This basically asks the NeSS Data Exchange to return the details of a ward (which in Office for National Statistics language is level type ID 14) which a particular postcode is in. Unhelpfully, I don’t get the SNAC ID straight away (you’ll see why I need this later), so I have to get the ONS internal ID and make another API call:**{AREA ID}**

This gives a whole bunch of extra info about the area, including the SNAC ID (which is called the ExtCode). Once I have that, I can then move across to OpenlyLocal, where I can get the councillor’s information:**SNAC ID**.xml

Once I’ve got this info, I can then display it all, like so! Obviously if the council isn’t on OpenlyLocal, then we can’t show their details, but we can say who their council is and what ward they’re in, and then, with a bit more jiggery pokery on the OpenlyLocal side, we can direct them to their local authority’s website, like so!

Give it a try yourself, and let me know any feedback in the ol’ comments below. Obviously, this only shows your district council if you’re in a two tier council, but I’m working on getting county council info later :)

Oooh, it’s worth pointing out that you need a PSI click-use licence to publish the Data Exchange data on your own website, but it’s a doddle to apply for and goes live instantly!

Update: I’ve published the code on Github in case you want to see how I did it!