Data Submission

Here we give an overview of the data submission process for the GTN-P Database. Please read carefully about the hierarchy structure, the submission levels, and the quality control.

 

Database tutorials

Tutorial Active Layer Monitoring Sites

Tutorial Ancillary Measurements

Tutorial Boreholes

Tutorial Citations

Tutorial Datasets and Data Collections

Tutorial Sites

 

Three hierarchies

There are three roles in the data submission hierarchy:

  1. Site Manager
  2. National Correspondent
  3. Database Administrator

(More detailed description in progress.)

 

Submission levels

The GTN-P database is structured by increasing level of data complexity. Three variable levels are progressively implemented in the database.

 

Variables 1st level: one single full temperature profile

These variables are all mandatory and represent the aggregation over the whole measurement period. If any of these variables is missing, the dataset will not be processed and will not be included in the database. These data must be provided as free data.
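As a minimal sketch, a first-level submission can be thought of as a list of depth/temperature pairs covering the whole measurement period. The depths, values, and the completeness check below are illustrative assumptions, not a prescribed GTN-P format:

```python
# Illustrative first-level profile: (depth in metres, mean temperature
# in deg C over the whole measurement period). Values are hypothetical.
profile = [
    (0.5, -1.4),
    (2.0, -2.1),
    (5.0, -2.8),
    (10.0, -3.0),
]

def is_complete(profile):
    """Hypothetical check: a first-level profile with missing depths or
    temperatures would not be processed, since these variables are mandatory."""
    return bool(profile) and all(
        d is not None and t is not None for d, t in profile
    )
```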

Variables 2nd level: one measurement per year (annual minimum/maximum/average)

These variables are very important for many activities; if they are submitted, the site will be involved in more activities and the data quality index will increase. For this reason we strongly suggest submitting these variables as well. These variables indicate the yearly aggregation - one value per year. The reference period must be defined as soon as possible, choosing between: calendar (Jan 1 - Dec 31), hydrological "Arctic" (Sep 1 - Aug 31), and hydrological "Alpine" (Oct 1 - Sep 30).
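The three reference periods above can be sketched as follows. This is an illustration only; the labelling convention (a hydrological year is labelled here by the calendar year in which it ends) is an assumption, not a GTN-P rule:

```python
from datetime import date

def hydro_year(d: date, reference: str = "calendar") -> int:
    """Return the aggregation year a date belongs to.

    calendar: Jan 1 - Dec 31
    arctic:   Sep 1 - Aug 31 (assumed labelled by the ending year)
    alpine:   Oct 1 - Sep 30 (assumed labelled by the ending year)
    """
    if reference == "calendar":
        return d.year
    if reference == "arctic":
        return d.year + 1 if d.month >= 9 else d.year
    if reference == "alpine":
        return d.year + 1 if d.month >= 10 else d.year
    raise ValueError(f"unknown reference: {reference}")

def annual_stats(records, reference="calendar"):
    """Group (date, temperature) records by reference year and return
    {year: (min, max, mean)} - one value set per year, as required for
    second-level variables."""
    by_year = {}
    for d, t in records:
        by_year.setdefault(hydro_year(d, reference), []).append(t)
    return {y: (min(v), max(v), sum(v) / len(v)) for y, v in by_year.items()}
```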

Variables 3rd level: time series

Third-level variables are all additional variables that you consider relevant; they can be submitted within the ancillary measurement object. These variables indicate time series for all nodes in the boreholes (monthly or yearly aggregation). They may be provided as simple text files (e.g. comma separated) and are stored in the database as an archive.
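A simple comma-separated time-series file might look like the sketch below: one row per timestamp, one column per borehole node. The column names, depths, and values are illustrative assumptions, not a prescribed GTN-P layout:

```python
import csv
import io

# Hypothetical third-level time-series layout: monthly aggregation,
# one temperature column per node depth. All names/values are examples.
rows = [
    {"date": "2020-01", "T_1.0m": "-6.2", "T_5.0m": "-4.8", "T_10.0m": "-3.1"},
    {"date": "2020-02", "T_1.0m": "-7.0", "T_5.0m": "-5.1", "T_10.0m": "-3.2"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["date", "T_1.0m", "T_5.0m", "T_10.0m"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```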

 

Quality control

The quality of the data will in the future be calculated automatically, based on a range of parameters such as metadata completeness, data levels, geographic location accuracy, etc.