SPARQL

Last month in Populating a Schema.org dataset from Wikidata I talked about pulling data out of Wikidata and using it to create Schema.org triples, and I hinted at the possibility of updating Wikidata data directly. The SPARQL fun of this is to then perform queries against Wikidata and see your data edits reflected within a few minutes. I was pleasantly surprised at how quickly edits showed up in query results, so I thought I would demo it with a little video.
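
As a minimal sketch of the kind of query involved (Q42, the Wikidata item for Douglas Adams, stands in here for whichever item you edited), this asks the endpoint at https://query.wikidata.org/sparql for an item's direct claims so you can watch for your change to show up:

    PREFIX wd:  <http://www.wikidata.org/entity/>
    PREFIX wdt: <http://www.wikidata.org/prop/direct/>

    # List one item's direct claims; re-run after an edit to see whether
    # the change has propagated to the query service yet.
    SELECT ?property ?value
    WHERE {
      wd:Q42 ?property ?value .
      FILTER(STRSTARTS(STR(?property), STR(wdt:)))
    }
    LIMIT 100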

One-click replacement of an IMDb page with the corresponding Wikipedia page

With some Python, JavaScript, and of course, SPARQL.

I recently tweeted “I find that @imdb is so crowded with ads that it’s easier to use Wikipedia to look up movies and actors and directors and their careers. And then there’s that Wikidata SPARQL endpoint!” Instead of just cursing the darkness, I decided to light a little SPARQL-Python-JavaScript candle, and it was remarkably easy.
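
The Wikidata side of a trick like this is a very short query. As a sketch (the IMDb ID string below is just a placeholder for the one a script would pull from the IMDb page's URL; P345 is Wikidata's IMDb ID property), it asks for the English Wikipedia article about the item with that IMDb ID:

    PREFIX wdt:    <http://www.wikidata.org/prop/direct/>
    PREFIX schema: <http://schema.org/>

    # Find the English Wikipedia article about the Wikidata item whose
    # IMDb ID (property P345) matches the one taken from the IMDb URL.
    SELECT ?article
    WHERE {
      ?item    wdt:P345 "tt0050083" .   # placeholder IMDb ID
      ?article schema:about ?item ;
               schema:isPartOf <https://en.wikipedia.org/> .
    }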

Avoiding accidental cross products in SPARQL queries

Because one can sneak into your query when you don't want it.

Have you ever written a SPARQL query that returned a suspiciously large number of results, especially results with too many combinations of values? You may have accidentally requested a cross product. I have spent too much time debugging queries where this turned out to be the problem, so I wanted to talk about how to avoid it.
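
To sketch the problem with some made-up example data (the ex: names are hypothetical): when two triple patterns share no variables, every solution to one gets paired with every solution to the other.

    PREFIX ex: <http://example.org/>

    # Accidental cross product: nothing connects ?author to ?editor, so
    # every author value gets paired with every editor value.
    SELECT ?author ?editor
    WHERE {
      ?book1 ex:author ?author .
      ?book2 ex:editor ?editor .
    }

    # What was probably meant: both properties describe the same ?book,
    # so each result row is one book's author and editor.
    SELECT ?author ?editor
    WHERE {
      ?book ex:author ?author ;
            ex:editor ?editor .
    }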

Custom HTML form front end, SPARQL endpoint back end

Your website's users sending SPARQL queries, even if they haven't heard of SPARQL.

In a recent Twitter exchange, Dr Joanne Paul asked “Does/can this exist? A website where I enter a title (eg. ’earl of pembroke’) and a year (eg. 1553) and it spits out who held that title in that year (in this case, William Herbert).” Michelle Watson replied “I bet you could probably write SPARQL query to Wikipedia that would come close to doing that. Not sure how you’d embed that into a webpage though.” I replied to that: “Have an HTML form that…
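
A Wikidata query along those lines might look something like the following sketch. It assumes the title is modeled with the “position held” property (P39) and its “start time” (P580) and “end time” (P582) qualifiers, which is a modeling choice you would want to verify for any given title; the label and the year are the two values that an HTML form would plug in:

    PREFIX p:    <http://www.wikidata.org/prop/>
    PREFIX ps:   <http://www.wikidata.org/prop/statement/>
    PREFIX pq:   <http://www.wikidata.org/prop/qualifier/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

    # Who held a given title in a given year?
    SELECT ?person ?personLabel
    WHERE {
      ?title  rdfs:label "Earl of Pembroke"@en .
      ?person p:P39 ?statement .              # position held
      ?statement ps:P39 ?title ;
                 pq:P580 ?start .             # start time qualifier
      OPTIONAL { ?statement pq:P582 ?end . }  # end time qualifier
      FILTER(YEAR(?start) <= 1553 &&
             (!BOUND(?end) || YEAR(?end) >= 1553))
      ?person rdfs:label ?personLabel .
      FILTER(LANG(?personLabel) = "en")
    }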

OpenStreetMap, or “OSM” to geospatial folk, is a crowd-sourced online map that has accomplished a tremendous amount in its role as the Wikipedia of geospatial data. (The Wikipedia page for OpenStreetMap is really worth a skim to learn more about its impressive history.) OSM offers a free alternative to the commercial mapping systems out there, and you'd better believe that those commercial systems are reading that great free data into their own databases.

Converting JSON-LD schema.org RDF to other vocabularies

So that we can use tools designed around those vocabularies.

Last month I wrote about how we can treat the growing amount of JSON-LD in the world as RDF. By “treat” I mean “query it with SPARQL and use it with the wide choice of RDF application development tools out there”. While I did demonstrate that JSON-LD does just fine with URIs from outside of the schema.org vocabulary, the vast majority of JSON-LD out there uses schema.org.
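
The heart of a conversion like that is a SPARQL CONSTRUCT query that reads triples stated with one vocabulary's terms and emits triples using another's. A small sketch, with Dublin Core and FOAF chosen here purely for illustration rather than as any official mapping:

    PREFIX schema: <http://schema.org/>
    PREFIX dct:    <http://purl.org/dc/terms/>
    PREFIX foaf:   <http://xmlns.com/foaf/0.1/>

    # Re-express a few common schema.org properties with Dublin Core and
    # FOAF terms so tools built around those vocabularies can use the data.
    CONSTRUCT {
      ?work   dct:title   ?name .
      ?work   dct:creator ?author .
      ?person foaf:name   ?personName .
    }
    WHERE {
      { ?work schema:name   ?name }
      UNION
      { ?work schema:author ?author }
      UNION
      { ?person a schema:Person ;
                schema:name ?personName }
    }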

Exploring JSON-LD

And of course, querying it with SPARQL.

I paid little attention to JSON-LD until recently. I just thought of it as another RDF serialization format that, because it’s valid JSON, had more appeal to people normally uninterested in RDF. Dan Brickley’s December tweet that “JSON-LD is much more widely used than Turtle” inspired me to look a little harder at the JSON-LD ecosystem, and I found a lot of great things. To summarize: the amount of JSON-LD data out there is exploding, and we can query it with SPARQL, so…
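
Once an RDF toolkit that understands JSON-LD has parsed a page's embedded markup into triples, querying it works like querying any other RDF. A minimal sketch, assuming the data uses schema.org terms the way most embedded JSON-LD does:

    PREFIX schema: <http://schema.org/>

    # After parsing a page's JSON-LD into an RDF dataset, list the typed
    # things it describes along with their names, where present.
    SELECT ?thing ?type ?name
    WHERE {
      ?thing a ?type .
      OPTIONAL { ?thing schema:name ?name . }
    }
    ORDER BY ?type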