ontologies

2022 - Week 10

A relatively quiet week, all told. We had hoped to achieve more. There was the Scottish Parliament legislative consent motion map to add to the machines. But that didn’t happen. And a newly discovered bug in our procedure parsing code to investigate and fix. That didn’t happen either. Productivity was undermined by a newly discovered bug in Michael, our computational ‘expert’, who had picked up a bad case of the ‘rona. Last we heard, he was face down on his day bed, whimpering and indeed whining to himself. We gather he’s lost his sense of taste - at least, his most recent message included the phrase “rough as a badger’s bottom”. Or words to that effect. Luckily he has a strong constitution. We’re pretty sure he’ll pull through.

Tidying up behind ourselves

Back in week 9, our Jianhan tidied up the code in our staging environment to reflect the fact that our procedure maps no longer use typed routes. We awaited the return of Librarian Jayne from her well-deserved vacation, her eagle-like eyes being necessary to check whether Jianhan’s interventions had made any discernible difference to the test website. Reader, they had not. Or not much.

A while ago, Jianhan wrote a new SPARQL query for our colleagues in Software Engineering which spanned the procedural tree to better order steps in a work package which take place on the same day. There are two points about that code which are pertinent to this tale:

First, having stripped route types from application, database, orchestration and triple store, we rather imagined the existing queries would explode. But no trace of an explosion has been spotted to date.

Second, the ordering of steps is slightly better in some cases, slightly worse in others. Next week, we plan to add the website queries to the rest of the queries in our library and try to work out what’s actually happening.
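We can’t reproduce Jianhan’s actual SPARQL here, but the general trick can be sketched in Python with an invented fragment of a procedure: when several steps in a work package are actualised on the same day, use each step’s distance along the procedural tree to break the tie. Step names, dates and the shape of the data are all our own invention, for illustration only.

```python
from datetime import date

# Toy procedural tree: each step points to the step(s) it can lead to.
# Step names are invented, not real procedure data.
LEADS_TO = {
    "instrument laid": ["motion tabled"],
    "motion tabled": ["motion debated"],
    "motion debated": ["motion approved"],
    "motion approved": [],
}

def depth(step, graph, root="instrument laid"):
    """Distance of a step from the root of the procedural tree (BFS)."""
    frontier, d = [root], 0
    while frontier:
        if step in frontier:
            return d
        frontier = [nxt for s in frontier for nxt in graph.get(s, [])]
        d += 1
    raise ValueError(f"step not reachable: {step}")

# Business items actualising steps, two of them on the same day.
items = [
    {"step": "motion debated", "date": date(2022, 3, 7)},
    {"step": "instrument laid", "date": date(2022, 3, 1)},
    {"step": "motion tabled", "date": date(2022, 3, 7)},
]

# Sort by date first, then by procedural depth to order same-day steps.
ordered = sorted(items, key=lambda i: (i["date"], depth(i["step"], LEADS_TO)))
```

In SPARQL the equivalent spanning is done with property paths over the step graph rather than a breadth-first walk, but the tie-breaking idea is the same.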

Remediating remedial orders

Having completed his housekeeping duties, Jianhan has been popping in the plumbing needed to go live with remedial orders. That work is now complete in staging. Once again, the test website has failed to explode. It only remains to do the same work in live, and we’ll be ready and willing to add remedial orders to our statutory instruments website. With any made ones we come across also appearing on our lovely little made’n’laid Twitter bot.

One does not simply pray against an affirmative instrument

For reasons that escape our somewhat compromised short-term memories, we had incorrectly labelled all House of Lords fatal amendments to approval motions in affirmative procedures as ‘prayers’. Journal Office Jane pointed out our error. Table Office Matt pointed out our error. But whilst we had route-typed maps in staging and step-typed maps in live there were just too many plates spinning to correct this error. Now that at least some plates have ceased spinning, that small snafu is fixed, business items re-actualised and pertinent queries updated.

On the laying of papers

This week saw the third - but by no means final - attempt at a domain model for the laying of papers and the papers thereby laid. This week we were delighted to be joined by TNA’s very own Helen and Catherine for what we believe to have been an extremely successful pixel-based session. What Helen doesn’t know about laid papers would fit on the back of a fag packet. This may well be misplaced optimism - if there’s one thing we’re guilty of, it’s our boundless optimism - but it does start to feel like some sort of shape is beginning to emerge. Our latest picture breaks down into four main parts:

If you’re an expert in papers laid and see some failure on our part to quite grasp reality, please do get in touch. More voices make work lighter.

As pleased as we are with current progress, we still feel our style is somewhat cramped whilst operating purely in pixels. It’s almost impossible to clock that someone wants to chip in with a correction or clarification when you’re sharing your screen, can’t see anyone and are occupied with trying to attach an arrowhead to a blob in OmniGraffle. Which is why next week - COVID permitting - crack Librarians Anya and Jayne and computational experts young Robert and Michael are off to TNA for an actual in-person, face to face, whiteboard-equipped meeting with Helen and Catherine. And hopefully a pint with John.

One last rant about cardinality?

As our regular reader will know, young Robert and Michael have been hard at work attempting to extricate data from the Foreign, Commonwealth & Development Office treaty database. A job not made any easier by what appears to be a misunderstanding of HTTP at the FCDO end. We now think we have a model that works, code that reshapes and tidies - most of - the data, and a website that at least approximates browsability. Not bad going for a few hours’ work. We’re not ones to boast, but we feel pretty confident in saying our limited efforts are better than what the FCDO bought. And probably cheaper too - Robert and Michael might not be monkeys, but they do get paid peanuts. Unfortunately, you’ll need to take our word on the website because - once again - our Heroku account has run short on tokens. We await the end of the month, at which point our tokens should top up and we should be able to at least link to the thing.

Until then, young Robert continues to occupy most of Battersea’s broadband in continuing attempts to scrape several gigabytes of JSON. And Michael continues to battle the signing location data. Which is, quite frankly, appalling. It would, you might think, have been a fairly simple job for a few developers to sit down with a few international law experts and a whiteboard and ask questions such as, “and can a treaty be signed in many locations?” They might have felt a little daft asking such a question but, when it comes to cardinality, there are no daft questions. The answer would have come back, “why, yes, a treaty may be signed in any number of locations.” At which point, the developers would have been free to head back to their laptops and make - yes, you’ve guessed it - a join table. Reader, this conversation does not appear to have taken place. Instead, they came up with a single text field. The problem is compounded by a lack of any discernible information management principles, different locations being separated for different treaties by any combination of hyphens, slashes, semi-colons, ampersands, commas and the word ‘and’. We’re going to stop typing now, before Mr Downey kills a cat.
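For what it’s worth, the separator soup described above can at least be mechanically unpicked into the rows a join table would have held in the first place. A rough sketch - with invented treaty identifiers and location strings, and no claim to match the FCDO’s actual schema:

```python
import re

# Split a free-text signing-location field on the separators described
# above: semi-colons, commas, slashes, ampersands, spaced hyphens and
# the word 'and'. Example values below are invented.
SEPARATORS = re.compile(r"\s*(?:;|,|/|&|\s-\s|\band\b)\s*")

def signing_locations(field: str) -> list[str]:
    """Normalise a messy single-text-field value into one entry per location."""
    return [loc.strip() for loc in SEPARATORS.split(field) if loc.strip()]

# The rows a join table would have stored from the start:
rows = [(treaty, loc)
        for treaty, field in [("T1", "London and Paris"),
                              ("T2", "Vienna; Geneva / New York")]
        for loc in signing_locations(field)]
```

Real data would need rather more care than this: place names containing ‘and’ (Trinidad and Tobago, say) or internal hyphens would trip a naive splitter, which is precisely why a join table beats string surgery.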

How hard can things be?

In news of a more offstage nature, librarians Anya, Silver and Ned continue to chip away at their ‘single-subject view of the Library’ work. Early explorations around adding subject indexing to Library enquiries remain somewhat unresolved. The intent remains but the pipework doesn’t yet appear to be quite in place. Or even drawn out. In the meantime, attention has turned once again to the Subject Specialist Directory. This is a printed publication that goes out to Commons Members, their staff and front-of-house Library staff, listing the Library research specialists and their areas of specialism. Sort of like a Checkatrade if you’re only in the market for people with PhDs. Like all printed publications, it tends to date quicker than the publication cycle, and, like all directories, its subject headings have evolved somewhat organically. Which means our crack team of librarians are currently in search of the hard-won middle ground in the eternal battle between precision and recall.

The current plan is to ‘index’ specialists with both low-level and - where appropriate - higher-level concepts from the Parliament Thesaurus, then apply a layer of transitivity to point synonymous queries to the appropriate person. Ned has been busy compiling spreadsheets mapping specialisms to concepts, Silver has been working with colleagues at Data Language to turn assorted datasets into something we can click around and poor Anya has been in meetings. Endless meetings.
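We haven’t seen the shape of the thesaurus data, but the transitivity idea can be sketched with invented concepts and specialists: walk up the broader-than chain so a query on a narrow concept also reaches someone indexed at a higher level.

```python
# Invented fragment of a thesaurus: each concept points to its broader concept.
BROADER = {
    "renewable energy": "energy",
    "wind power": "renewable energy",
    "solar power": "renewable energy",
}

# Specialists indexed with both low-level and higher-level concepts (invented).
SPECIALISTS = {
    "energy": {"Specialist A"},
    "wind power": {"Specialist B"},
}

def who_covers(concept: str) -> set[str]:
    """Walk up the broader-than chain so a query on a narrow concept
    also finds specialists indexed at a higher level."""
    people = set()
    while concept is not None:
        people |= SPECIALISTS.get(concept, set())
        concept = BROADER.get(concept)
    return people
```

A query on ‘solar power’ here finds nobody indexed at that level, but the walk up to ‘energy’ still surfaces Specialist A - which is the recall half of the precision-and-recall battle mentioned above.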

Thanks be to Andrew! Praise be to Wikidata!

This week - it could have been last week actually, we’ve quite lost track - our team of crack librarians were on the receiving end of a Member enquiry. The enquiry seemed simple enough: a count of Members since a given date meeting a given set of criteria. Unfortunately, the data Parliament keeps in the Members’ Names Information Service does not go back nearly far enough to fit the bill. And the data in the Rush database still requires a fair bit of tidying before it can respond accurately to queries with said criteria.

Luckily we know Andrew and Andrew knows Wikidata. A quick DM later and, within the hour, Andrew had written a new Wikidata query and returned with an answer. And not only this. Andrew also documented his work, including how he went about building queries of increasing complexity by layering facet over facet. It is both a Wikidata tutorial and a SPARQL tutorial. And if you scroll down far enough you even get a bit of a history tutorial. Work in the open, they say. And you should. You really should. No matter where you work, there are always more experts out there than in here.

Thanks Andrew. Again.