Hi, this is a shortened and updated version of our Finnish-language blog post about our first week of WLM, in which I explained some of the tech behind the lists. The most important thing first: we have received over 1,500 photos so far and they are magnificent. If you want to participate, you can do it here.
So now about the tech.
Wikidata and SPARQL
In practice the whole system runs, one way or another, on top of Wikidata, which we used to record information about the monuments. We then created lists of the monuments in Wikipedia by using SPARQL to select those items that have a National Board of Antiquities id or are part of a Wikidata item that has one.
The records look like this:
The items are fetched from Wikidata using this SPARQL query:
#Wiki Loves Monuments Finland
SELECT ?item ?itemLabel ?rkyid ?mjid ?coord ?image WHERE {
  { ?item wdt:P361 ?rky . ?rky wdt:P4009 ?rkyid . }
  UNION { ?item wdt:P4009 ?rkyid . }
  UNION { ?item wdt:P4106 ?mjid . }
  UNION { ?item wdt:P361 ?mj . ?mj wdt:P4106 ?mjid . }
  OPTIONAL { ?item wdt:P625 ?coord }
  OPTIONAL { ?item wdt:P18 ?image }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fi". }
}
You can try out the query here. Run it by clicking the play button on the lower left-hand side of the window. Once the results have been processed, you can choose how they are displayed from the menu above them; “Map” and “Image grid” are useful for this query.
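If you want the same data outside the browser, the Wikidata Query Service also answers plain HTTP requests. Here is a minimal sketch of that in Python using the requests library against the public query.wikidata.org endpoint; the query is a shortened version of the one above, limited to ten rows, and the user agent string is just an illustrative placeholder rather than part of our actual setup.

import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# A shortened version of the list query above, limited to ten rows.
QUERY = """
SELECT ?item ?itemLabel ?coord ?image WHERE {
  ?item wdt:P4009 ?rkyid .
  OPTIONAL { ?item wdt:P625 ?coord }
  OPTIONAL { ?item wdt:P18 ?image }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fi". }
}
LIMIT 10
"""

response = requests.get(
    WDQS_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "wlm-fi-example/0.1 (illustrative user agent)"},
)
response.raise_for_status()

# Each result binding maps a variable name to a {"type": ..., "value": ...} dict.
for row in response.json()["results"]["bindings"]:
    print(row["item"]["value"], row.get("itemLabel", {}).get("value", ""))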
Wikipedia lists
The data gets into the Wikipedia lists via ListeriaBot, which saves the lists once or twice a day. As far as Wikipedia is concerned, the list is ordinary wikicode, and only small parts of it are updated dynamically. Here’s an example of a list; you can see the whole SPARQL query in its wikicode. The row template used in the example is implemented by this module.
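To make the flow concrete, here is a rough sketch of what that kind of bot does with one query result row: it turns the bindings into a row-template call that gets saved as part of the list page. The template and parameter names below are made up for the example; this is not ListeriaBot’s actual code or our actual row template.

def render_row(binding):
    """Render one SPARQL result binding as a hypothetical {{WLM-row}} template call."""
    qid = binding["item"]["value"].rsplit("/", 1)[-1]  # e.g. "Q123456" from the full entity URI
    parts = ["{{WLM-row", f"| item = {qid}"]
    if "itemLabel" in binding:
        parts.append(f"| name = {binding['itemLabel']['value']}")
    if "coord" in binding:
        parts.append(f"| coordinates = {binding['coord']['value']}")
    if "image" in binding:
        parts.append(f"| image = {binding['image']['value']}")
    parts.append("}}")
    return "\n".join(parts)

# One row rendered from a hand-written binding, just to show the output shape:
example = {
    "item": {"value": "http://www.wikidata.org/entity/Q123456"},
    "itemLabel": {"value": "Example monument"},
}
print(render_row(example))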
We have naturally encountered unforeseen problems because we’re doing this for the first time.
One of them is that although it’s possible to use Wikipedia modules for dynamic searches, our lists were either larger than MediaWiki allows or took too long to update; in both cases MediaWiki left out parts of the page. As a temporary solution we had to save as much as we could pre-formatted. We also simplified our maps and, in the case of Helsinki, replaced them with links.
We also didn’t prepare for the fact that it isn’t enough to just add data to Wikidata. If you want to show the borders of an item on a map, the corresponding OpenStreetMap object has to be tagged with its Wikidata id. Mostly this didn’t matter, but it meant that we weren’t able to add rivers and roads to our Wikipedia maps, and we will need to do that in the future.
Mobile map
In addition to the lists we also used a separate mobile map based on WikiShootMe. WikiShootMe is a mobile-friendly map made by Magnus Manske using the Leaflet map library and OpenStreetMap. The items visible on the map are fetched with this SPARQL query, and links that lead to the map include the query as a URL parameter.
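As a sketch of what such a link can look like, the query is URL-encoded and appended to the map’s address. The base URL and the parameter name below are assumptions from memory rather than something taken from our actual links, so check the tool’s documentation before relying on them.

from urllib.parse import quote

# Illustrative values; the base URL and the parameter name are assumptions.
BASE_URL = "https://wikishootme.toolforge.org/"
SPARQL = (
    "SELECT ?q ?location WHERE { "
    "?q wdt:P4009 ?rkyid . "
    "?q wdt:P625 ?location . }"
)

link = BASE_URL + "#sparql_filter=" + quote(SPARQL, safe="")
print(link)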
Saving photos to Wikimedia Commons
Both the Wikipedia lists and the mobile map save photos through Wikimedia Commons’ Upload Wizard campaigns, using suitable URL parameters. We used these parameters: campaign, description, coordinates, Wikidata id, and Wikimedia Commons categories. After the photo has been saved, the Wikidata id is used to match it with the data from the National Board of Antiquities.
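Put together, such an upload link can look roughly like the sketch below. The campaign name, the parameter names, and the values are illustrative placeholders, not the exact parameters we generate; the UploadWizard documentation lists the URL arguments it actually supports.

from urllib.parse import urlencode

params = {
    "campaign": "wlm-fi",               # illustrative campaign name
    "description": "Example monument",  # prefilled file description
    "lat": "60.17",                     # prefilled coordinates (placeholders)
    "lon": "24.94",
    "categories": "Example category",   # Commons category to prefill
    "id": "Q1234567",                   # placeholder Wikidata id, used for matching afterwards
}

upload_url = (
    "https://commons.wikimedia.org/wiki/Special:UploadWizard?" + urlencode(params)
)
print(upload_url)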
Next step: Monumental
When we looked at our map options in June, the candidates were Monumental and WikiShootMe. We selected the latter because it worked nicely on mobile phones and we could add our own SPARQL queries.
The WLM beta map (maps.wikilovesmonuments.org) is built on top of Monumental. It uses P1435 (heritage designation) values to place items with a direct designation on the map. However, items that are only part of those monuments aren’t currently included.
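As a sketch, and not the actual Monumental code, a query along these lines could cover both cases in the same way as our list query above: items with a direct P1435 designation plus items that are part of such a monument via P361.

# Illustrative query only; paste it into https://query.wikidata.org/ to try it out.
QUERY = """
SELECT ?item ?itemLabel WHERE {
  { ?item wdt:P1435 ?designation . }
  UNION { ?item wdt:P361 ?monument . ?monument wdt:P1435 ?designation . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fi". }
}
LIMIT 10
"""
print(QUERY)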
Is there any photo of the monuments of Finland?
Here are some photos from 2017:
https://commons.wikimedia.org/wiki/User:Zache/Suomi