
A Digital Day of Archaeology

Wooston Castle Local Relief Model draped over a 3D Digital Terrain Model, all based on LiDAR data and available on Sketchfab

As is usual for me, my day comprises working on digital heritage projects, as in my previous Days of Archaeology (2011a, 2011b, 2012, 2013 and 2014). So no archaeological features were harmed in the making of this post!

Although on one current project, my GSTAR doctoral research, I am indeed working with archaeological excavation data from the archives of Wessex Archaeology, combined with museum collections data from Wiltshire Museum and heritage inventory data from the Wiltshire Historic Environment Record. This project is nearing completion (thesis due for submission April-ish next year!). Having already shown that geospatial information can be published and used in Semantic Web / Linked Data contexts through the integration of ontologies, I’m currently building demonstrators to show how that data can then be used to undertake archaeological research, by framing fairly complex archaeological research questions as spatial queries asked across the range of resources I’ve included.
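
To give a flavour of what I mean by framing research questions as spatial queries, the sketch below shows roughly how such a query might be fired at a SPARQL endpoint from Python. To be clear, the endpoint URL, monument URI, ontology terms and the 500 m criterion here are illustrative placeholders of my own choosing, not the actual GSTAR resources.

```python
# Very rough sketch only: the endpoint URL, monument URI, ontology terms and
# the 500 m criterion are illustrative placeholders, not the actual GSTAR data.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/gstar/sparql")  # hypothetical endpoint
endpoint.setQuery("""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX crm:  <http://www.cidoc-crm.org/cidoc-crm/>
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

# "Which museum objects were recorded within 500 m of this monument?"
SELECT ?find ?label WHERE {
    ?find a crm:E22_Man-Made_Object ;
          rdfs:label ?label ;
          geo:hasGeometry/geo:asWKT ?findWKT .
    <http://example.org/monument/123> geo:hasGeometry/geo:asWKT ?monWKT .
    FILTER (geof:distance(?findWKT, ?monWKT,
            <http://www.opengis.net/def/uom/OGC/1.0/metre>) < 500)
}
""")
endpoint.setReturnFormat(JSON)

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["label"]["value"])
```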

Today, however, I’m working mainly on Archaeogeomancy commercial projects, as I do one day a week. And thanks to the wonders of digital technologies, I’m working out of Bristol for a change; my first Day of Archaeology away from Salisbury. It’s been a busy week, clocking up quite a few miles, as Monday and Tuesday were spent at the Pelagios Linked Pasts event held at King’s College London, where a diverse group from across the world spent a very productive couple of days talking about Linked Data, with particular emphasis on people, places, space and time.

This morning’s tasks focussed on an automation project involving planning applications. I’m building a system which consumes planning data collated by Glenigan, classifies it according to type of project (as defined by the client) and then pushes out regional and property-specific maps and summaries on a weekly/monthly basis for a list of properties which may be affected by these planning applications. This allows specialists in each region to assess each planning application and make recommendations regarding any responses needed. So whilst not the shiniest or most academically interesting of projects, it is the kind of GIS-based systems development and automation that can really make a difference, freeing up staff time from the mundane production of such maps and reports.
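
Stripped right down, the classify-and-filter heart of the system amounts to something like the sketch below. I’ve used geopandas here, and the file names, column names, keywords and 1 km buffer are stand-ins rather than the client’s actual scheme.

```python
# Minimal sketch of the classify-and-filter step; file names, column names,
# keywords and the 1 km buffer are illustrative stand-ins only.
import geopandas as gpd

KEYWORDS = {
    "residential": ["dwelling", "housing", "apartments"],
    "infrastructure": ["road", "rail", "pipeline"],
}

def classify(description):
    """Assign a project type based on keywords in the application description."""
    text = description.lower()
    for project_type, words in KEYWORDS.items():
        if any(word in text for word in words):
            return project_type
    return "other"

# Planning applications (points) and monitored properties (polygons), both in
# a projected CRS measured in metres (e.g. British National Grid)
applications = gpd.read_file("planning_applications.shp")
properties = gpd.read_file("properties.shp")

applications["project_type"] = applications["description"].apply(classify)

# Keep only applications falling within 1 km of any monitored property
search_areas = properties.copy()
search_areas["geometry"] = search_areas.geometry.buffer(1000)
nearby = gpd.sjoin(applications, search_areas, how="inner")

print(nearby["project_type"].value_counts())
```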

This afternoon’s tasks will focus on another system I’m developing, this time to assist with the analysis and interpretation of LiDAR data. I’m building a toolkit which incorporates a select range of visualisation techniques requested by the client, including Local Relief Models, Principal Components Analysis and the usual hillshades, slope, etc. The toolkit is to be deployed to users who are not necessarily experts in the analysis and interpretation of LiDAR data or GIS, so it needs to be simple to use, with many variables preset, and it needs to be integrated within their corporate GIS solution rather than being a standalone application. The first batch of tools mentioned above is complete and working nicely; this afternoon’s mission is to wrap up the Openness and Sky View Factor visualisations.
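
For anyone curious about what a Local Relief Model actually involves, the simplified core of the technique is just the DTM minus a low-pass filtered copy of itself, which strips away the broad topography and leaves the subtle local bumps and hollows. A bare-bones sketch of that step might look something like this (rasterio, the file names and the 25-cell window are my illustrative choices here, and the full published method adds further refinement steps):

```python
# Bare-bones simplified Local Relief Model: subtract a low-pass (smoothed) copy
# of the DTM from the DTM itself, leaving only the subtle local relief.
# File names and the 25-cell window are illustrative choices.
import rasterio
from scipy.ndimage import uniform_filter

with rasterio.open("dtm.tif") as src:
    dtm = src.read(1).astype("float32")
    profile = src.profile

smoothed = uniform_filter(dtm, size=25)   # low-pass filter: ~25-cell moving window
lrm = dtm - smoothed                      # positive = local highs, negative = local lows

profile.update(dtype="float32")
with rasterio.open("lrm.tif", "w", **profile) as dst:
    dst.write(lrm, 1)
```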

Indeed, it’s been great working with LiDAR data again lately. When thinking of a suitable image for this year’s Day of Archaeology post, the one shown above immediately leapt to mind. It shows a screenshot of the output of the Local Relief Model (LRM) tool I built draped over the Digital Terrain Model (DTM) for a rather lovely hillfort as viewed on Sketchfab. I mention this because disseminating informative views of LiDAR data has long been problematic, but platforms such as Sketchfab allow us to composite 3D and 2D products and then share them in an interactive way with anyone who has a web browser and an internet connection without the need for any specialist software at all. Nice.

Wrestling Pythons, Blending Grass and Proofing Papers

Today has been a pretty normal day in my current archaeological life. I am in the final year of my PhD and so have been battling away in front of a laptop (like many others), trying to make sense of archaeological data and say something new and interesting about the past.

I am lucky in that I live in Cambridge, and so had a lovely cycle ride this morning across the meadows, past the cows, to install myself in the Cambridge University Library (UL). This is one of the joys of being a student in the UK: even though I am doing my PhD at UCL in London, I am more than welcome to come and use the library in Cambridge for free, which is not only great for books – it also has an excellent tea room.

Bronze Age Huts in QGIS

My PhD is on the Bronze Age hut settlements on Bodmin Moor; I am using Augmented Reality to examine the locations of the huts and how they fit into the landscape. This involves a lot of GIS work and also some 3D modelling. I have a lovely GIS dataset of the Bronze Age hut locations and a pretty decent elevation model. When out in the field archaeologists use quite a few tools, but the trowel is probably the most useful. In front of the computer archaeologists also use a lot of tools – today I was using Python to script a way to get GRASS data into Blender, so that I could load virtual models of the huts into Unity3D, view them in my ARK database and then finally use Vuforia and Unity3D to display them in the real world. Today my most useful tool is TextMate.
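
The GRASS end of that pipeline boils down to exporting the elevation model with its real-world coordinates intact so that Blender can rebuild it as a mesh. Stripped of all my project-specific bits, that step looks roughly like the sketch below (map names, paths and the separator are placeholders for my actual setup):

```python
# Stripped-down sketch of the GRASS export step; map names and paths are
# placeholders for my actual setup. Run inside a GRASS 7 session so that
# grass.script can see the current location and mapset.
import grass.script as gscript

DEM = "bodmin_dtm"            # elevation raster already imported into GRASS
OUT = "/tmp/bodmin_dtm.xyz"   # x|y|z text file for the Blender import script

# Make sure the computational region matches the DEM before exporting
gscript.run_command("g.region", raster=DEM)

# Write each cell out as "x|y|z" in real-world (OS grid) coordinates
gscript.run_command("r.out.xyz", input=DEM, output=OUT, separator="|")
```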

Bodmin Moor in Blender

Basically, what I am trying to do is import 2D GIS data into a 3D gaming engine that I can then use to explore the data and (using Augmented Reality) ‘overlay’ it onto the real world. The important thing is to ensure the spatial coordinates are preserved when the data is imported into the gaming engine – otherwise the on-site GPS location won’t work during the Augmented Reality stage. So the distances, heights and topography seen in the gaming engine representation are as close to the real world as possible (at least the real world as modelled in the GIS!).

To keep track of the huts and their associated data I have been using the ARK database system (created by Day of Archaeology sponsors L – P : Archaeology). ARK brings all of the various bits together – data from the literature, basic dimensions of the huts, spatial data and also the 3D representation. I’ve been getting some pretty good results from my experiments and seem to have cracked the workflow – I’ll put up a proper walkthrough on my blog once the script is all sorted out, as I think it will probably be pretty useful for others to see and use. In the meantime I have made a very small screencast to show the huts within ARK and Unity – which I think is pretty cool. For those of a techy bent, ARK is sending the Unity3D plugin the id of the hut currently being viewed and Unity is then figuring out where that hut is in the virtual world and placing the ‘player’ inside it.
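
For the truly techy, the coordinate-preservation trick is simply to pick a local origin, subtract it from every vertex on the way into Blender (neither Blender nor Unity is happy with six-figure OS grid coordinates), and remember that offset so it can be applied to the GPS positions later on. In Blender Python the import side looks roughly like the sketch below – the file path and origin values are placeholders, and the real script goes on to build faces and export for Unity:

```python
# Rough sketch of the Blender side: read the x|y|z export and build a mesh,
# shifting everything to a local origin so that six-figure OS grid coordinates
# don't wreck Blender/Unity floating point precision. The path and origin
# values are placeholders; record the offset so GPS positions can be shifted
# by the same amount later.
import bpy

XYZ_FILE = "/tmp/bodmin_dtm.xyz"
ORIGIN = (220000.0, 75000.0, 0.0)   # hypothetical local origin in OSGB36 metres

verts = []
with open(XYZ_FILE) as f:
    for line in f:
        x, y, z = (float(v) for v in line.split("|"))
        verts.append((x - ORIGIN[0], y - ORIGIN[1], z - ORIGIN[2]))

mesh = bpy.data.meshes.new("bodmin_terrain")
mesh.from_pydata(verts, [], [])      # vertices only here; faces get built later
mesh.update()

obj = bpy.data.objects.new("bodmin_terrain", mesh)
bpy.context.scene.collection.objects.link(obj)   # Blender 2.8+ linking call
```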

Wow that was all a bit techy – sorry about that!

So as promised in the title of the post then – here is a link to some wrestling pythons…

and someone blending grass..

and the paper proofing is a bit more boring…

Today I also approved the final author proofs of an article on my research that is going to be published in the Journal of Archaeological Method and Theory. Apparently, once they have made my suggested corrections (c. 1 week), it should be available at http://dx.doi.org/10.1007/s10816-012-9142-7 for people who have personal or institutional subscriptions to the journal – very exciting!

Right, back to the coding… only an hour before I get chucked out of the library.