The ESA Sentinel 2 satellites will provide the global community of environmental scientists and managers with fantastic terrestrial multi-spectral high-resolution optical data. ESA will give the general public and its partners access to these data sets. The respective users and countries will then have to process the data sets to render them useful.
In this posting I will present some of the work I did as part of a national-level working group last year. I will also indicate some of the challenges ahead for institutions working with environmental data management in view of Sentinel 2.
Challenges include establishing relevant operational products, coordinating such processes and making sure that time series of the same data remain available. The posting is mostly based on our report to the Norwegian Space Center.
Highlights from ESA tell us that Sentinel 2 will offer a rather nice cocktail of data based on the following capabilities:
- Multi-spectral information with 13 bands in the visible, near-infrared and short-wave infrared parts of the spectrum, in part overlapping with Landsat 8
- Systematic global coverage of land surfaces: from 56° South to 84° North, coastal waters and all of the Mediterranean Sea
- High revisit time: every 5 days at the equator under the same viewing conditions. Not too bad in Europe either
- High spatial resolution at 10m, 20m and 60m depending on bands
- Wide field of view: 290 km
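As a quick reference, the band-to-resolution mapping behind the list above can be sketched in code. The band numbers follow ESA's published Sentinel 2 specification; treat the grouping below as a sketch for orientation, not an authoritative lookup table.

```python
# Sentinel-2 MSI bands grouped by native spatial resolution (metres).
SENTINEL2_BANDS = {
    10: ["B02", "B03", "B04", "B08"],                 # visible + NIR
    20: ["B05", "B06", "B07", "B8A", "B11", "B12"],   # red edge + SWIR
    60: ["B01", "B09", "B10"],                        # atmospheric bands
}

def resolution_of(band: str) -> int:
    """Return the native resolution in metres for a given band."""
    for res, bands in SENTINEL2_BANDS.items():
        if band in bands:
            return res
    raise KeyError(f"Unknown Sentinel-2 band: {band}")
```

Together the three groups add up to the 13 bands mentioned above.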
In addition to delivering data according to the specifications, the ground segment will provide the end users with a steady flow of data. When the regularity and quality are predictable, we have the basis of what can be called operative products.
From the perspective of a geographer working with biologists and environmental managers, this presents both opportunities and challenges. On a professional level I would like to select relevant data and make products which provide the users with both current views of the environment and long time series – all of it following good processes involving multiple stakeholders. After that, I would rather have someone in a central agency magically deliver the data sets to me according to our specifications and upon request, as well as store them for me for later or long-term use.
Understanding satellite-related products
Last year I was part of a working group doing an analysis for the Norwegian Space Center on how to prepare the reception and application of optical sensor data for Norway Digital – the Norwegian government’s initiative to build the national geographical infrastructure. Since 2005, Norway Digital has been a working co-operation and infrastructure with reference data and thematic data covering more than 100 operational web map services, a geoportal and other services (source: Norge Digitalt).
In any case, the report we did (in Norwegian) is available for download here:
[wpfilebase tag=file id=144 tpl=download-button /]
One of the outcomes of the report was to specify national needs, particularly within the Norway Digital co-operation, and to assess current monitoring and surveillance programs. Part of this work was to go through older product surveys and to arrange a workshop. In short, we wanted to understand how Sentinel 2 imagery could be the basis of operative services for Norway Digital partners. To be operational, services should be continuously available with proven quality and regularity. The Sentinel 2 satellite is a good start, but it will require us to establish a chain of products relying on many operations – even after having the data delivered to our national doorstep by ESA.
The below figure (1) shows how the Sentinel 2 data, through a chain of processes and product definitions, finally end up as end-user products.
One finding from the report was that the institutions, offices or persons representing the user level do not necessarily know how to describe their own product. Furthermore, when they did try to describe a product, we sometimes found that they were describing a product used by a neighboring institution – where the only difference was that they were using a different terminology.
We set out to understand what products were out there and came up with a long list of products which we in turn tried to relate to each other. What we got back was initially a rather messy relation diagram. Still, we saw that there was a pattern in there. The below figure is an illustration of product interdependence based on our questionnaires and former product surveys.
We used the yEd graph tool to help us see how the products related. Where we found only minor differences between products, or assumed that they were actually the same product referred to with different terminology, we merged them into one. It was a coarse process, but all we wanted was an overview, so we thought it would be OK. We could always pick the brains of the experts at a later point in time.
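The merging step we did by hand in yEd can also be sketched programmatically. The product names and the alias mapping below are made up for illustration; the point is simply that duplicate nodes are collapsed into one canonical name and their edges re-pointed.

```python
# A sketch of the node-merging step: products judged to be "the same
# thing under a different name" are collapsed, edges re-pointed, and
# any self-loops created by the merge are dropped.

def merge_products(edges, aliases):
    """edges: set of (parent, child) pairs; aliases: {duplicate: canonical}."""
    canon = lambda n: aliases.get(n, n)
    merged = {(canon(a), canon(b)) for a, b in edges}
    return {(a, b) for a, b in merged if a != b}

# Illustrative example: "NDVI" turns out to be the same product
# as "vegetation index", just under a different name.
edges = {
    ("satellite scene", "vegetation index"),
    ("satellite scene", "NDVI"),
    ("vegetation index", "forest map"),
    ("NDVI", "vegetation change"),
}
merged = merge_products(edges, {"NDVI": "vegetation index"})
```

After the merge, all derivative products hang off the single canonical "vegetation index" node.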
More structure, some merging, and this is what we got (figure 2). Satellite products are yellow:
Using our trusty yEd we took a quick look at which products were more important than others. yEd allows for a weighted presentation, and using that functionality we got an interesting visual presentation:
The report referred to above contains a table of products sorted according to how many derivative products sit under a particular product. The big orange square in the figure above is a vegetation index product. Derivative products of the vegetation index include, for example, forestry-related products and a vegetation change index. Our conclusion was that a more structured approach is necessary to understand how a national community of spatial data users (government and others) can find common ground for common products. We will get there, after a few workshops.
What lies ahead?
Where we used to have project-oriented initiatives, we will in the future hopefully have operative products: no longer products based on one-time funding initiatives, but products based on long-term funding, allowing us to establish and maintain recurring versions of good basis products.
At some point the institutions involved will have coordinated their own product needs enough and started to prepare products based on joint efforts. Some products will remain on a national level, while others will have to be coordinated on a lower level. As indicated at the start of this posting, it will be a huge undertaking. Not only will it require us to develop and fund the capacity to prepare our products, it will also require us to prepare and fund a storage strategy for the same products.
Where we nowadays have systems for establishing, storing and disseminating vector-based data, we will in the future be expected to expand that capacity for vector data – and establish systems to handle pixel and scene imagery data.
Even if we are using vector data derived from satellite imagery, both research and management institutions will still have to support the availability of data for verification of scientific or management conclusions. Building a national spatial data infrastructure for satellite imagery will require great efforts. Interacting with such databases from a stakeholder level will require us to keep up to speed with – or even develop – standards for satellite data storage and exchange.
If we add Landsat 8 and other relevant imagery products to the equation, we have a good starting point for many relevant products for understanding and managing our environment.
It is my guess that OGC will play a role in this work, and perhaps they are already involved in related efforts. Bringing satellite data to users will require good specifications all the way from the ESA delivery mechanism to user-level products. This will require excellent systems for communicating the specifications, and it will above all require a great ability to compromise when making them.
In other words – if you are working with GIS and environmental data, change is coming – in a couple of years 🙂
Until then we will have to find answers to these questions and more:
- How should national-level pixel data be stored?
- What metadata should accompany these data?
- What protocols are relevant for transfer of data?
- Are there any standards we can draw upon?
- Who else is working on these issues in relation to Sentinel 2 imagery?
- What are the best practices?
- Which stakeholders should be involved in this work?
- References to advice on how open source software could play a role in this work are most welcome.
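As a possible starting point for the metadata question above, here is a minimal per-scene record. The fields are my own guess at a useful core set, not any agreed national or ESA standard.

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class SceneRecord:
    """A hypothetical minimal metadata record for one satellite scene."""
    scene_id: str
    platform: str           # e.g. "Sentinel-2A"
    sensing_date: date
    tile: str               # tile identifier, e.g. an MGRS grid cell
    crs: str                # coordinate reference system, e.g. an EPSG code
    cloud_cover_pct: float  # scene-level cloud cover estimate
    processing_level: str   # e.g. "L1C" or "L2A"

# Illustrative values only; nothing here describes a real scene.
record = SceneRecord(
    scene_id="example-0001",
    platform="Sentinel-2A",
    sensing_date=date(2016, 1, 1),
    tile="32VNM",
    crs="EPSG:32632",
    cloud_cover_pct=12.5,
    processing_level="L1C",
)
```

A record like this serializes trivially (via `asdict`) for exchange between institutions, which is exactly where the protocol and standards questions above come in.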
If you know the answers to some of these questions, I would like to hear about them – I am curious. Use the comment fields on this blog. If you find errors or inaccuracies, you are also welcome to use the commenting field below.