Category Archives: open source

Wilderness 4: Wilderness degradation analysis

Area with wilderness and encroachment. Visualization of change of wilderness status.


Wilderness degradation happens when new encroachments change the wilderness status of an area. It is a complex issue which does not easily lend itself to a GIS-based analysis. I refer to my earlier posting on wilderness for a peek into the complex world of wilderness philosophy.

It is possible to set up a system like FME to analyse changes in wilderness due to new encroachments. The procedure I made generates a wilderness degradation data set based on wilderness and (new) encroachment data. It follows the procedures used by the Norwegian government in their analysis of wilderness and encroachment, but should be easy to adapt to different preconditions by changing the number of wilderness zones and/or their buffer distances.
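As a minimal sketch of the zone logic behind such a procedure (not the FME workbench itself), the classification of distance-to-nearest-encroachment into wilderness zones could look like the following. The 1, 3 and 5 km thresholds mirror the zones used in the Norwegian analysis, but the thresholds and zone names alike are assumptions meant to be adjusted:

```python
# Wilderness zone classification by distance to the nearest
# encroachment. Thresholds and names are example values only.
ZONES = [
    (5.0, "wilderness core"),   # 5 km or more from any encroachment
    (3.0, "zone 1"),            # 3-5 km
    (1.0, "zone 2"),            # 1-3 km
]

def classify(distance_km):
    """Return the wilderness zone for a given distance (km) to the
    nearest encroachment, or None for areas closer than 1 km."""
    for threshold, name in ZONES:
        if distance_km >= threshold:
            return name
    return None

def degradation(before_km, after_km):
    """Zone before and after a new encroachment; a change between
    the two signals wilderness degradation."""
    return classify(before_km), classify(after_km)
```

Changing the number of zones or their buffer distances then amounts to editing the ZONES table, which is the kind of flexibility mentioned above.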

To visualize the degradation of wilderness it is necessary to make a categorization and, furthermore, establish a cartography to carry the information to the reader. In my view this cannot be done unless the author/mapmaker to some extent takes sides on what is good or bad with respect to wilderness degradation. This will be the focus of a forthcoming posting, but I will touch on the issues here as well.

FME is but one of the potential solutions for producing the results. QGIS, GeoKettle and even PostGIS could be good alternatives. The GitHub project has room for alternative implementations.


Wilderness 3: About the examples

A philosophical approach towards wilderness analysis has been written. An FME script has been designed and duly discussed. Now the time has come to run the script on national level data to prove the concept and to open up a discussion on the challenges with the method and its results.

I have done this for several countries and will present the results and discuss them.

All examples are delivered as-is. There is no governmental context to the analysis. The data used for the analysis is OpenStreetMap data.

At the time of writing, the number of examples has yet to be decided. The following are available as of this posting:

General challenges with the presented data are:

  • OSM data coverage
  • Encroachment criteria selection
  • Lack of context

In later postings I will look at the following:

  • Change analysis of wilderness
  • Cartography and wilderness mapping
  • Conclusions and discussion

The midpoint of Scandinavia – a place in Norway

Where is the midpoint of Scandinavia? This is a question that came up in my family a couple of years ago. My brother has a cabin in Rendalen and figured it had to be quite close to the midpoint of Scandinavia. Right or wrong? The family geographer took the case.

Since I know a bit about maps and geographic information systems (GIS), finding out was not that hard. The method I chose is just one of many possible ones. This post tells a little about how I solved the task.

For those looking for a hiking destination, the midpoint of Scandinavia could be a fine one. I can already reveal that it lies in Trysil municipality, about 1080 metres from the nearest road. The nearest house is a few metres closer.

As for my brother's cabin, it is a bit too far away for a stroll to the midpoint.
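As one of the many possible methods hinted at above, a naive geographic midpoint can be computed by averaging boundary points as 3D unit vectors and converting the mean vector back to latitude and longitude. The corner coordinates below are purely illustrative, not the data I actually used:

```python
import math

def midpoint(points):
    """Naive geographic midpoint: average (lat, lon) points as 3D
    unit vectors and convert the mean vector back to lat/lon."""
    x = y = z = 0.0
    for lat, lon in points:
        la, lo = math.radians(lat), math.radians(lon)
        x += math.cos(la) * math.cos(lo)
        y += math.cos(la) * math.sin(lo)
        z += math.sin(la)
    n = len(points)
    x, y, z = x / n, y / n, z / n
    return (math.degrees(math.atan2(z, math.hypot(x, y))),
            math.degrees(math.atan2(y, x)))

# Rough, hypothetical corner points of Scandinavia:
corners = [(57.7, 8.0), (55.3, 14.2), (69.1, 31.1), (71.2, 25.7)]
lat, lon = midpoint(corners)
```

A proper analysis would feed in the full boundary polygon (or work on an equal-area projection) rather than four hand-picked corners, but the principle is the same.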


Converting shapefiles to Mission Planner .poly-files

Mission Planner by Michael Oborne is an impressive piece of software. It is used to program the open-source APM autopilot, which controls planes, copters and rovers. I have used Mission Planner a lot and cannot do without it.

Some years ago I started making a map of the island of Mindland in the Norwegian archipelago using a GPS, OpenStreetMap and Bing aerial imagery. The main driver for this project was to document old place names. With drones becoming something of a hobby last year, I thought it would be nice to also establish a properly open-licensed orthophoto for the island. Mission Planner has what it takes to approach such a task in a structured manner, save for one thing: the polygon tool only imports .poly files.

When working with maps, some of us tend to stick with shapefiles or geodatabases. I have made a small script which converts a shapefile in a geographic coordinate system (WGS84) into as many .poly files as there are objects in the shapefile. Adding this functionality to Mission Planner itself has been indicated as possible, but has yet to materialise. Until then the script associated with this posting remains relevant.
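For reference, a bare-bones version of the writing side of such a conversion might look like this. I am assuming here that a Mission Planner .poly file is simply one "lat lon" pair per line; the shapefile reading itself would need a library such as pyshp or OGR and is left out, and the coordinates below are made up for illustration:

```python
def write_poly(ring, path):
    """Write one polygon ring, given as (lon, lat) pairs in WGS84,
    as a Mission Planner style .poly file: one "lat lon" per line
    (format assumed from Mission Planner's own output)."""
    with open(path, "w") as f:
        for lon, lat in ring:
            f.write("%.7f %.7f\n" % (lat, lon))

# One .poly file per object, as in the script described above:
rings = [[(12.0, 66.0), (12.1, 66.0), (12.05, 66.05)]]
for i, ring in enumerate(rings):
    write_poly(ring, "feature_%d.poly" % i)
```

Note the swap from the shapefile's (lon, lat) order to the (lat, lon) order expected in the output file.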

Clearinghouse for the environment – the girders (II)

In February 2012 I wrote about “Environmental Spatial Data Infrastructure” on this blog. Later that year the topic matured somewhat, and in August I wrote the posting “Clearinghouse for the environment – the scaffolding (I)”.

Since then my colleagues and I have had the opportunity to test the systems at full scale by contributing to the implementation of clearinghouses in partner countries.

Last time I showed how a stack consisting of a hardware layer with VMware as one of the basic modules could form the basis of an environmental spatial data infrastructure. In some ways this was a rather optimistic setup.

A stack of hardware and software which could become very useful…

One of our main challenges with the above setup was the maintenance of physical equipment, so we removed that layer. Maintaining a complex virtual machine environment, or getting access to local environments, proved in general to be difficult, so we ditched that as well.

To get the systems running we needed:

  • Shared access to the systems for administrative purposes
  • A flexible backup-system
  • An option to duplicate successful setups
  • Scalability
  • High availability
  • Flexible security system

Other things we considered important:

  • The system should not tie our partners up in future licensing costs
  • Compliance with central standards
  • An option for partners to move the systems to physical infrastructure if necessary
  • An option to keep traffic outside our own company networks, since these are de facto external systems paid for by external partners

All in all, a lot of considerations to take into account.

Since internet access across borders would in any case be needed for retrieving external map layers, we started looking at how we could use Amazon services. I already had experience running virtual machines on Amazon EC2. Amazon helped us out with many of the issues mentioned above, so in short we moved the whole setup to Amazon. The following figure illustrates the setup.

sdi_onestop_amazon

In addition to the components relying on EC2, we have found that using Amazon Simple Storage Service (S3) for storing survey data of some size could be a good idea. S3 allows the user to distribute files using “secure” links and even the BitTorrent protocol for files up to 5 GB.

We now have one such system built and under testing. It looks good, but as always the technology is but a small part of the equation. Establishing information flows, using standards and so on represent the major parts of a national environmental spatial data infrastructure.

Should the need arise to develop custom made solutions it should be possible to add more virtual machines in the setup.


Provided that our partners find this setup trustworthy, we will probably suggest it as an entry-level spatial data infrastructure for environmental data.

WordPress, directly and through countless plugins, supports many standards for embedding information. How information should flow between the different systems in this setup has been given some thought. I will try to elaborate on this in a later posting – hopefully in less than two years' time.

QDGC shapefiles available for national and continental scale distribution

A new set of the Quarter Degree Grid Cell (QDGC) shapefiles has been generated. The update is global and delivers an error fix for the country level files as well as a new product: continent level files.

The QDGC shapefiles contain centre lon/lat coordinates and the QDGC string for each square. The files are offered down to level four. For a country around the equator, a level four square covers around 45 square kilometres, with sides a little under seven kilometres.

Read more about the use of QDGC on this page.

The calculations and export took around 60 hours of computer processing time, including generation of the world fishnet at the different sizes, square area calculations, assignment of QDGC strings, compression and more.
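To make the QDGC string assignment concrete, here is a sketch of how a code can be derived for a point. The quadrant labelling (A = NW, B = NE, C = SW, D = SE) and the exact degree-square naming are my assumptions about the convention, so treat this as illustrative rather than authoritative:

```python
import math

def qdgc(lon, lat, level=4):
    """Sketch of QDGC code generation for a point. Each level halves
    the square and appends a quadrant letter (assumed: A=NW, B=NE,
    C=SW, D=SE)."""
    # Name of the enclosing one-degree square, e.g. E011N03 or W012S04.
    code = "%s%03d%s%02d" % (
        "E" if lon >= 0 else "W", abs(int(math.floor(lon))),
        "N" if lat >= 0 else "S", abs(int(math.floor(lat))))
    x = lon - math.floor(lon)   # fractional position in the square
    y = lat - math.floor(lat)
    for _ in range(level):
        north, east = y >= 0.5, x >= 0.5
        code += {(True, False): "A", (True, True): "B",
                 (False, False): "C", (False, True): "D"}[(north, east)]
        x = (x - 0.5 if east else x) * 2   # rescale into the quadrant
        y = (y - 0.5 if north else y) * 2
    return code
```

At level four the square has been halved four times (1/16 of a degree), which matches the roughly seven kilometre sides mentioned above.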

Integrating OGC WMS GetCapabilities information in WordPress (iframe)

Over the last two years I have worked with WordPress as a content management system for several projects. WordPress has proved to be a flexible platform for publishing documents, files in general, imagery and maps. One thing was missing, though: I wanted to be able to list the map layers available on a given WMS server.

To solve this I have made a small PHP script which allows the user to integrate server capabilities information from a GeoServer-based WMS server. The code is a work in progress and admittedly has some shortcomings.

The feature would not be possible without wms-parser.php and OpenLayers.
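The actual script is PHP, but the core idea translates directly. As a rough Python equivalent, extracting the layer names and titles from a WMS 1.3.0 GetCapabilities response could look like this; the XML is a minimal hand-written sample, not a real server response:

```python
import xml.etree.ElementTree as ET

# Hand-written, minimal stand-in for a GetCapabilities response:
SAMPLE = """<WMS_Capabilities xmlns="http://www.opengis.net/wms">
  <Capability>
    <Layer>
      <Layer><Name>roads</Name><Title>Road network</Title></Layer>
      <Layer><Name>rivers</Name><Title>Rivers</Title></Layer>
    </Layer>
  </Capability>
</WMS_Capabilities>"""

NS = {"wms": "http://www.opengis.net/wms"}

def list_layers(xml_text):
    """Return (name, title) for every named layer in a WMS 1.3.0
    GetCapabilities document. Container layers without a Name
    element are skipped."""
    root = ET.fromstring(xml_text)
    return [(el.findtext("wms:Name", namespaces=NS),
             el.findtext("wms:Title", namespaces=NS))
            for el in root.iter("{http://www.opengis.net/wms}Layer")
            if el.find("wms:Name", NS) is not None]
```

In production the XML would of course be fetched from the server's GetCapabilities URL rather than inlined.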

QDGC global coverage level 1-4

A new set of the Quarter Degree Grid Cell (QDGC) shapefiles has been generated. This time the coverage is global and the publication is for individual countries.

The QDGC shapefiles contain centre lon/lat coordinates and the QDGC string for each square. The files are offered down to level four. For a country around the equator, a level four square covers around 45 square kilometres, with sides a little under seven kilometres.

Read more about the use of QDGC on this page.

The calculations and export took around 60 hours of computer processing time, including generation of the world fishnet at the different sizes, square area calculations, assignment of QDGC strings, compression and more.

A local map for Mindland (home-district mapping)

Skjærseth seen from above


There was no home-district fiction this summer, nor am I particularly interested in writing local history. But place names and maps have interested me a good deal, so it became home-district mapping rather than home-district fiction.

The island of Mindland, like most other places in Norway, is well mapped by the Norwegian Mapping Authority (Kartverket) in cooperation with municipalities, counties and other public agencies. It is mostly impossible to compete with the quality of Kartverket's maps, but the coverage of local place names is not always as good.

American mink – alien species proliferation analysis

Procedure overview for the analysis


The aim of this posting is to document the more technical aspects of establishing the knowledge basis necessary to follow up the action plan against the American mink (Neovison vison), an alien species in the Norwegian fauna. It shows how the Python programming language and relevant libraries (ArcPy and others) are used in an analysis aiming to understand where the mink can spread under given circumstances.

The motivation is to document the process for other relevant projects and to make relevant code and methodological descriptions available to other people and institutions involved in similar work. The work has been made possible by access to other freely available information online, and as such this posting should be considered a timely way of paying back for “services provided”.
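The ArcPy-based steps themselves require an ArcGIS licence, but the core spread idea can be illustrated without it. The following is a deliberately simplified, library-free stand-in for the analysis (my own toy formulation, not the project's actual model): from known observation cells, the mink is assumed to reach any habitable grid cell within a maximum dispersal distance, measured in cell steps:

```python
from collections import deque

def spread(habitat, sources, max_steps):
    """Breadth-first spread on a grid.
    habitat: 2D list of booleans (True = habitable cell).
    sources: list of (row, col) observation cells.
    Returns the set of cells reachable within max_steps moves
    through habitable cells."""
    rows, cols = len(habitat), len(habitat[0])
    reached = {s: 0 for s in sources}
    queue = deque(sources)
    while queue:
        r, c = queue.popleft()
        d = reached[(r, c)]
        if d == max_steps:
            continue  # dispersal limit reached from this cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and habitat[nr][nc] and (nr, nc) not in reached):
                reached[(nr, nc)] = d + 1
                queue.append((nr, nc))
    return set(reached)
```

A real analysis would of course work on raster habitat data with cost surfaces rather than a boolean toy grid, but the reachability reasoning is the same.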