Librarian Jayne and Michael had a fairly typical week, shouting random letters at one another as they set about checking the data for the newly remapped made affirmative procedure. With only occasional bursts of swearing. And even then, only mild. Several hours of squinting turned up a single route entered in error. Which, for a procedure with 1,231 routes, is not too shabby. Not too shabby at all.
Made affirmative checked, we now have all statutory instrument procedures remapped, with data entered and verified. Which means the next stop is the hot mess that makes up the map for treaties laid under the Constitutional Reform and Governance Act 2010. This is made slightly easier by our late-dawning realisation that we’d designed a model for componentised procedures. And promptly forgotten about it. Lords’ committees - defunct since the arrival of the International Agreements Committee, but still present in our data - have been mapped. More data has been added. And, as ever, more checking needs to happen.
Last week we mentioned that young Robert and Michael have been hard at work refactoring and commenting our procedure parsing code. And that we had a suspicion that the whole thing would run much faster with fewer queries. Parsing a route requires knowledge of the status of inbound routes to the source step of the route being parsed. And, having parsed a route, another query was triggered to get outbound routes from the target step of the route just parsed. All of which was rather expensive with the cost growing in proportion to the number of routes in the procedure. n+1 being never nice. All of this led to a work package subject to the made negative procedure taking over 15 seconds to parse. Robert and Michael both felt things would be much improved if they could run a couple of queries up front and wrangle the whole thing into a roughly graph shaped data structure. But neither had a clue how to do it. The revelation came when Michael called an early night and had one of his all too rare, dairy product powered brainwaves. Which means we now have a handful of queries running at the start of the request and four hashes: one to store the routes and their parsing status, one to store the steps, one to store routes to a step and one to store routes from a step. The made negative procedure now parses in roughly two seconds. Which is still probably one second more than it would ideally take. But a time span that we assume won’t increase too much should the procedure grow. As procedures tend to do. The code is in a number of pieces and needs a proper refactor. And yet more comments. More work for next week.
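The prefetching idea above can be sketched roughly as follows. This is a minimal illustration, not the actual parsing code (which is in Ruby elsewhere): the tuples standing in for database rows, the hash names and the 'UNPARSED' status value are all hypothetical. The point is simply that a couple of up-front queries populate four in-memory hashes, after which parsing a route needs no further queries.

```python
from collections import defaultdict

def build_route_maps(route_rows):
    """Run once up front: wrangle query results into four hashes,
    replacing the two extra queries previously issued per route (the
    N+1 problem). route_rows stands in for the real query results as
    (route_id, source_step_id, target_step_id) tuples."""
    route_status = {}                     # route id -> parsing status
    steps = {}                            # step id -> step
    routes_to_step = defaultdict(list)    # target step -> inbound route ids
    routes_from_step = defaultdict(list)  # source step -> outbound route ids

    for route_id, source_id, target_id in route_rows:
        route_status[route_id] = 'UNPARSED'   # hypothetical initial status
        steps[source_id] = source_id
        steps[target_id] = target_id
        routes_to_step[target_id].append(route_id)
        routes_from_step[source_id].append(route_id)

    return route_status, steps, routes_to_step, routes_from_step

# Toy data: three routes over four steps.
rows = [(1, 'a', 'b'), (2, 'b', 'c'), (3, 'b', 'd')]
status, steps, to_step, from_step = build_route_maps(rows)

# Parsing route 2 can now look up the status of routes inbound to its
# source step ('b') in memory, with no further queries.
inbound_to_source = [status[r] for r in to_step['b']]
```

The cost of the lookups no longer grows with the number of routes in the procedure, which is presumably why the made negative parse time dropped from over 15 seconds to roughly two.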
With the end of the current session upon us and a whole new session to look forward to, Robert and Michael made a few tweaks to our beloved egg timer to close off the old session and start off the new. Previous instructions were ignored as a new, more efficient method was found. And promptly written up.
They also took the opportunity to add a calculation for the brand spanking new published drafts under the European Union (Withdrawal) Act 2018. As ever, comments citing legislation were added as Markdown and parsed into HTML. It is important to show one’s working.
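For flavour, the Markdown-comment-to-HTML step might look something like this. A toy sketch only: it handles just `[label](url)` links with a single regex, where the real code presumably uses a proper Markdown parser, and the example comment text is made up.

```python
import re

def comment_to_html(markdown_comment):
    """Convert [label](url) Markdown links in a comment into HTML
    anchors. Illustrative one-rule parser, not the real thing."""
    return re.sub(
        r'\[([^\]]+)\]\(([^)]+)\)',   # [label](url)
        r'<a href="\2">\1</a>',
        markdown_comment,
    )

# Hypothetical comment citing legislation.
comment = 'See [EU (Withdrawal) Act 2018](https://www.legislation.gov.uk/ukpga/2018/16) schedule 7.'
html = comment_to_html(comment)
```

Keeping the citations in the comments, and rendering them out, is what lets one show one's working.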
On the subject of paragraph 14 of schedule 8 of the European Union (Withdrawal) Act 2018, Jayne and Michael have made a small start on mapping the teeniest of tiniest procedures. For now just using route types; a more logical model to follow. A number of things are still unclear, not least whether such things can be withdrawn and, if so, how that might happen. But at least we have a straw man to poke at.
End of session also brought a flurry of tweets from our SI and treaty tracking Twitter bots. Or at least from the team of crack librarians that sit behind them. Credit to Anya for the original idea and her determination to prove that Sessional Returns would ideally be formed from a set of reusable, repurposable queries. And why not, we all say.
Librarian Ned joined Robert and Michael to make a start on a more ontological model for peerages based on everything we’ve learned from David and Grant. For now, it’s just a picture. And will probably change. They’re hoping to make a start on transforming pixels to Turtle next week, though comments might take a wee while longer.
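Once pixels do become Turtle, the output might look vaguely like the sketch below. A purely illustrative guess: the namespace, class and property names are all hypothetical, since the model is currently just a picture and will probably change.

```turtle
# Hypothetical shape of a peerage record; every name here is a guess.
@prefix : <https://example.com/peerage-ontology/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

:peerage1 a :Peerage ;
    :peerageName "Baron Example" ;
    :peerageCreated "1876-01-01"^^xsd:date ;
    :peerageHeldBy :person1 .
```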
Anya, Robert, Michael and open data boss bloke Ian met with Oli to chat through his general election database and the assorted work and data flows that surround it. Which, for the lower House and for around two weeks every four or five years, is pretty much all of the data flows. Oli has sent over the schema set-up files and Anya, Robert and Michael have been poring over them and working out how they fit with our own election model. They’ve also started to add a few bits and bobs to our more relational efforts. Though, for now, only in picture format.