Librarian Jayne and Young Robert recently grabbed the opportunity to take well-deserved vacations. Which not only made something of a dent in our productivity charts but also caused a considerable spike in Michael’s craving for company. Or desire for attention, some might say. Nevertheless, we ploughed on.
Our efforts have been firmly focussed on general election planning. As our regular reader will know, back in the day, Library staff managed election-related data in a database built and maintained by our Oli, ex of this parish. The plan for the next election is to use Democracy Club as our data provider. If we were in the Wardley mapping business, we’d be throwing in references to commoditisation round about now. Unfortunately, we’re too busy trying to map identifiers.
Our first problem was identifiers for people. Or candidates if you will. Luckily Sym and co already have Wikidata IDs and Wikidata has a well-populated property linking to MNIS people IDs. So it’s perfectly possible to take a Wikidata ID, query Wikidata and get back a MNIS ID. At least in theory. Unfortunately, practice proved slightly harder when Librarian Ned discovered that some Wikidata people had more than one MNIS ID. Not a thing that should happen. Librarian Phil investigated. We had suspected for some time that our MNIS database was haunted - we’ve seen the odd ghost on our travels, one record for a Member fully populated, and somehow another phantom, blank profile for the same Member. The spectres slip out of the database, into the API, on to the website. It turns out our ghosts had escaped further into the wild, materialising in Wikidata too. After a chat with Andrew on deprecation and deletion, links from Wikidata to the ghost records have been cast out and Sym is free to go ahead and import MNIS IDs. And we have an inventory of our spooks. Attempts to purge the poltergeists from the database using the admin interface have sadly failed. An exorcism is foretold for mid-April.
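For the curious, the Wikidata lookup boils down to a one-triple SPARQL query. A minimal sketch in Python, using only the standard library - note that the property ID below is a placeholder, not the real MNIS property, so do look it up before borrowing this - which also happens to surface any haunted records carrying more than one MNIS ID:

```python
import json
import urllib.parse
import urllib.request

WIKIDATA_SPARQL = "https://query.wikidata.org/sparql"
# Placeholder: substitute the actual Wikidata property for MNIS person IDs.
MNIS_PROPERTY = "P0000"

def mnis_query(wikidata_id: str) -> str:
    """Build a SPARQL query returning every MNIS ID attached to one person.

    A healthy record yields exactly one row; a haunted one yields several."""
    return (
        "SELECT ?mnisId WHERE { "
        f"wd:{wikidata_id} wdt:{MNIS_PROPERTY} ?mnisId . "
        "}"
    )

def fetch_mnis_ids(wikidata_id: str) -> list[str]:
    """Run the query against the public Wikidata endpoint."""
    url = WIKIDATA_SPARQL + "?" + urllib.parse.urlencode(
        {"query": mnis_query(wikidata_id), "format": "json"}
    )
    request = urllib.request.Request(
        url, headers={"User-Agent": "mnis-ghost-hunter/0.1"}
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        results = json.load(response)
    return [row["mnisId"]["value"] for row in results["results"]["bindings"]]
```

Anything that comes back with more than one ID goes straight on the inventory of spooks.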
Our second problem was identifiers for places. Or constituencies if you will. All made much easier by both Democracy Club and MNIS records having ONS code entries. That said, the upload script expects XML carrying a MNIS constituency ID rather than an ONS code and we’re told the scripts are unlikely to change before the next general election. Data scientist Louie is gonna have to grab the data from Democracy Club, match against ONS codes in a spreadsheet, grab the MNIS ID from said spreadsheet, munge it together and upload. The MNIS constituency table is something of an identifier minefield, having fields like PConName, PCACode and OldDisId. All of which the librarians’ MNIS manual was silent on. Most of which got solved when Librarians Anna, Emily, Phil and honorary librarian Michael met with Library tech manager Jeremy and statistician Carl on Wednesday. There’s really no substitute for experience. We now have a much better idea of what we need to populate and what we can happily ignore, if and when there’s a boundary change. Thanks Jeremy. Thanks Carl. We hereby promise to update our manual.
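The munge itself is little more than a dictionary lookup. A sketch under assumptions - the lookup spreadsheet saved as CSV, and hypothetical ons_code and mnis_id column names standing in for whatever the real sheet uses:

```python
import csv

def load_ons_to_mnis(lookup_path: str) -> dict:
    """Read the ONS-code-to-MNIS-constituency-ID lookup sheet (saved as CSV)."""
    with open(lookup_path, newline="") as f:
        return {row["ons_code"]: row["mnis_id"] for row in csv.DictReader(f)}

def attach_mnis_ids(candidates: list[dict], ons_to_mnis: dict) -> tuple:
    """Decorate Democracy Club rows with MNIS constituency IDs.

    Rows whose ONS code has no match are kept to one side for a human
    to stare at, rather than silently dropped."""
    matched, unmatched = [], []
    for row in candidates:
        mnis_id = ons_to_mnis.get(row["ons_code"])
        if mnis_id is None:
            unmatched.append(row)
        else:
            matched.append({**row, "mnis_constituency_id": mnis_id})
    return matched, unmatched
```

The upload script gets the matched rows; the unmatched pile gets a librarian.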
Of the three identifier alignment problems, the last one’s probably the worst one. Democracy Club have Electoral Commission IDs for parties. MNIS does not. Librarian effort is currently focussed on de-duping the MNIS party table which is something of a muddle. Once that’s done, we need to generate another lookup spreadsheet matching MNIS party IDs to Electoral Commission IDs. At which point, data scientist Louie will have more side-loading work cut out for him.
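A first pass at the de-duping can at least be automated: normalise the party names and flag any rows that collapse to the same form. A rough sketch, with hypothetical name and mnis_party_id fields - it finds candidate duplicates, it does not decide which record survives:

```python
from collections import defaultdict

def normalise(name: str) -> str:
    """Lowercase, collapse whitespace and drop a leading 'the'."""
    collapsed = " ".join(name.lower().split())
    return collapsed.removeprefix("the ")

def duplicate_groups(parties: list[dict]) -> dict:
    """Group MNIS party rows whose names normalise to the same string."""
    groups = defaultdict(list)
    for party in parties:
        groups[normalise(party["name"])].append(party["mnis_party_id"])
    return {name: ids for name, ids in groups.items() if len(ids) > 1}
```

Deciding which of the flagged rows is canonical remains a job for librarians, not scripts.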
Holding a licked finger in the air, cocking heads to one side and looking pensively to the horizon, we speculate ourselves to be two thirds of the way towards general election data success. Roughly speaking. Not to be sniffed at.
In other general election news, Librarian Emily has made magnificent progress investigating membership ‘end reasons’ and how they’re applied at dissolution and more generally. A meeting has been pencilled in to chat through her findings. More work will, inevitably, arise.
With Azure announcing their deprecation of Ruby, our flight back to Heroku continues. Last time we spoke, we’d managed to port across the code for our written answer bots, the old MNIS prodder and our beloved egg timer. Since then we’ve finished the job; both our historical UK general election application - thanks for the lovely new data Resul - and our attempt to turn the FCDO treaty website into something that actually uses HTTP have now been lifted. And shifted. As we computational types like to say.
In the lockdown years, Young Robert and Michael spent many a Teams call taking our procedure maps and writing code to parse them. What they like to call their “masterpiece”. Nobody having ever forced a computer to parse a parliamentary procedure before. To the best of our knowledge. Our Jianhan took their childlike scribbles and turned them into production ready C# code. Nevertheless, having a playground for Robert and Michael to test new procedure-related functionality remains desirable, so that code is also now reborn. If you’ve ever wondered how a computer might draw the journey of an instrument subject to the made affirmative procedure, now’s your chance to find out.
Our reader will be delighted to learn that computational life is not all lifting. Or indeed shifting. Sometimes we make new things. Our latest effort was commissioned by Librarian Anya whose crack team have had a long-standing difficulty getting their hands on bill-adjacent papers. The system that manages such things not being kind enough to spit out feeds. Librarian Jason has the task of indexing impact assessments for bills, but - with no feed to rely on - feared he might well be missing one or two.
In January, Anya asked Michael if it might be possible to use the bills API to somehow construct an RSS feed of incoming bill papers of a given type. At that point, Michael’s computer was dysfunctional and his usual “magick” quite beyond him. Last week, his computer was given a new lease of life, gaining a script that allows him to install things in a 30 minute window. After which it turns back into a pumpkin.
Pumpkin or otherwise, this window of opportunity is not nothing. We’re now in proud possession of a new bill papers website providing RSS - and CSV - outputs for papers of a given type and papers for a given bill. If you’re a dedicated follower of the Illegal Migration Bill or of delegated powers memoranda or of human rights memoranda or - like Jason - of impact assessments, just plug the RSS into your favourite email client and you’ll get an email every time a new paper drops. Other bills and publication types are available.
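Generating such a feed needs no special machinery - RSS 2.0 is just XML. A minimal sketch using the Python standard library, with made-up paper fields standing in for whatever the bills API actually returns:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime

def papers_to_rss(title: str, link: str, papers: list[dict]) -> str:
    """Render bill-paper records as a minimal RSS 2.0 feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    ET.SubElement(channel, "description").text = title
    for paper in papers:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = paper["title"]
        ET.SubElement(item, "link").text = paper["url"]
        # RSS dates use the RFC 822 format, which email.utils handles.
        ET.SubElement(item, "pubDate").text = format_datetime(paper["published"])
    return ET.tostring(rss, encoding="unicode")
```

Point an RSS-aware email client at the output and every new paper becomes an email, as described above.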
Away from our client base of crack librarians, we’re happy to report that we already have one satisfied customer. Thanks David. Your feedback makes it all worthwhile.
Paper modelling work has moved off the whiteboard and into the much less fun realm of documentation. We now have brand spanking new models for the presentation of papers and the reporting of committee things. Both models specialise our earlier efforts on making things available and should be read in conjunction. They won’t make much sense otherwise.
Changes have also been made to our legislation model - the addition of FRBRisation had complicated things somewhat, leaving us with something resembling an overcooked bowl of linguine, rather than our preferred fag packet domain model. The model has now been cleaved in two. There is a delegation model and a delegated legislation model. More concise and true to the old bounded contexts. As Silver might say. The comments still need work, so if you do click, don’t judge us.
Off the back of our “what the hell is a parliamentary paper?” debacle, we sat down with Journal Office Eve and talked through a model or two, to check we’d not disappeared down any more dangerous alleyways. A pixel-based meeting turned into an in-person meeting when Eve, Anya and Michael realised they were all in the office. A relief - Teams being a terrible medium for drawing out the intersections of seven different models. Gathered before a whiteboard - god, we miss whiteboards - they took a whistle-stop tour through the making available, laying, depositing, presentation, reporting, paper and bill models. Eve expressed doubts that linking to the incumbency of the person making the thing available was of much interest to many people, most punters being more interested in the organisation / department. She also wondered whether describing procedure as data might be akin to nailing butterflies to wheels. Which at least gave Michael the opportunity to draw out the Cynefin framework and rattle on about the difference between complicated and complex. A thing he always enjoys. That said, nothing we drew appeared to scare any Journal Office horses. So that’s good. Thanks for your time Eve. Lovely to see you.
Before Librarian Jayne headed off to the land of Tubbs and Crockett, she put in one last map making shift. Our remedial order maps now feature both an inquiry and a call for evidence from the JCHR. How that ever got missed, we’ll never know.
Jayne and her computational comrade Michael also put the finishing touches to their cheat sheets for motions and the cascading effects of withdrawals. So when Jayne returns - should she choose to return - the usual suspects may well be on the receiving end of an idiot-checking telephone call. At this point we break off to send a jaunty wave in the direction of Mr Hennessy and Mr Korris.
Also in map land, our Jianhan made a most welcome change to our procedure dot files. Previously, if a business step took place in a House, the name of that House was appended to the step label. If a business step took place in both Houses - a joint committee report, for example - the names of both Houses were appended to the step label. And if a business step took place outside Parliament, the step label remained untouched. Obviously. This worked fine for UK Parliament procedures but came a cropper when dealing with legislative consent motions in the devolved legislatures. Which is now fixed and looking lovely. Thanks Jianhan.
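The labelling rule reduces to a few lines. A sketch - the function name and the bracketed format are our own invention, not Jianhan’s actual code:

```python
def step_label(name: str, houses: list[str]) -> str:
    """Append house names to a procedure step label.

    An empty list means the step happens outside Parliament,
    so the label is left untouched."""
    if not houses:
        return name
    return f"{name} ({' and '.join(houses)})"
```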
In other Jianhan news, he’s imported all of the Act data - so lovingly polished by Librarian Jayne - into our data platform. A nice circle to square. Jayne has also contacted Andrew to let him know that all her hard work is now available and free to use in Wikidataland.
Our regular reader will, of course, be aware of - and we hope subscribed to - both made-n-laid and tweaty twacker. What they might not know is we also make written answers available as part of our cross-cutting, omni-channel, content and service delivery strategy. As Young Robert might say if he had the misfortune to swallow an MBA sideways. Should an answering body provide a written answer - or a correction to a written answer - to a parliamentary question, bot subscribers will be notified.
In her constant efforts to increase the reach and engagement of our cross-cutting, omni-channel, content and service delivery strategy going forward, Librarian Anna has been expanding our family of busy bots, registering a hell of a lot of new Mastodon accounts. 30 of them in fact. She’s then been grabbing their bearer tokens and plugging them into Heroku. One for each current answering body from the Attorney General to Women and Equalities. Which means we now have full coverage of a bot per answering body. Lovely stuff. No machinery of government changes for a spell, please.
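Under the hood, each bot needs nothing more than its bearer token and a POST to its instance’s statuses endpoint. A sketch using the standard library - instance name and token below are, of course, placeholders:

```python
import json
import urllib.parse
import urllib.request

def build_status_request(instance: str, bearer_token: str, text: str) -> urllib.request.Request:
    """Build an authenticated POST to Mastodon's /api/v1/statuses endpoint."""
    return urllib.request.Request(
        f"https://{instance}/api/v1/statuses",
        data=urllib.parse.urlencode({"status": text}).encode(),
        headers={"Authorization": f"Bearer {bearer_token}"},
        method="POST",
    )

def post_status(instance: str, bearer_token: str, text: str) -> dict:
    """Send the status and return the decoded JSON response."""
    request = build_status_request(instance, bearer_token, text)
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response)
```

Thirty bearer tokens in Heroku config, one function, and every answering body gets its own voice.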
If you’re looking for a full list of all our bot accounts, we also have you covered. Our eagle-eyed reader may notice that many of the Twitter-hosted written answer bot accounts are marked ‘coming soon’. This is because Twitter’s new ownership makes it increasingly difficult to do … well, anything really. We do hope to add these accounts at some point soon. It really depends on what Twitter’s “strategy” might be next week.
As we’re often at pains to point out, not all data is stats. Most of our models deal with words, not numbers. But a recent Teams standup that was supposed to take 15 minutes continued into chat and turned into an all-day modelling marathon. The subject of library researchers cropped up and boss boss ‘brarian Bryn pointed out that much of the data they rely on has little to do with parliamentary procedure and more to do with the sort of statistical data that the ONS publish. Which triggered a memory in Michael. Way back in data platform mk 3 days, we made a sort of start on modelling statistical information. Not being stats experts, it was a bit rubbish. Discretion being the better part of valour, we retired from the battlefield and contacted Leigh. Who has the advantage of actually knowing about statistics. Leigh wrote a report suggesting we use Data Cube for our more statistical information and, rather handily, prepared some sample RDF. Which Michael promptly lost. Hard drive and GitHub were both searched, to no avail. Why don’t you try your deleted items, asked Anya. Which is where the report turned up. Obviously. All the good stuff’s always in deleted items. Thanks Leigh. We’re hoping we can finally put your work to some use in the not too distant future.
Week 10 saw Anya, Vanda, Young Robert and Michael escape the confines of Westminster for a trip to far-flung UCL. There to see Dave Snowden, creator of the aforementioned Cynefin framework and all round sense maker. All assembled were delighted to watch Dave draw out the thinking behind his estuarine framework. Our brains may be slowing with age, but it’s always so much better to see a thing being drawn than to see a drawing. We emerged wondering if our liminal line might be closer to the origin than would be considered ideal. Not to be bumptious, but we’d heartily encourage any of our workplace elders and betters tasked with designing “operating models” and whatnot to click here. Read, click and repeat. Bear in mind there are no prizes for finishing the internet.
And finally, not news from us but rather from friend of the family Andy, over at Full Fact. Given our reader’s obvious interest in accurate information conveyed via well-modelled, well-managed data - why else would you be here? - we’re working on the assumption that our dear reader will be as thrilled as we are by the recent minting of a Full Fact Wikidata property. It means that anything in Wikidata with a corresponding record in Full Fact can now be linked in a way that’s amenable to both people and machines. For reasons we won’t go into, Parliament already has three Wikidata properties - one for Members, one for data platform mk 3 and one for our own treasured thesaurus. Which - given the three things overlap - is probably one to two too many. But it does mean that our dear reader can polish their hacking skills and jump between parliamentary records and Full Fact records with something approaching ease. Applause Andy and fellow Fullfactarians.