Standardise#
These functions accept data in specific formats, standardise it to a CF-compliant format and ensure the correct metadata is attached. The data returned from these functions is then stored in the object store.
Measurement Standardisation#
These functions cover the four types of measurement we currently support.
Surface measurements#
- openghg.standardise.standardise_surface(filepaths, source_format, network, site, inlet=None, height=None, instrument=None, sampling_period=None, calibration_scale=None, update_mismatch='never', measurement_type='insitu', overwrite=False, verify_site_code=True, site_filepath=None, store=None)[source]#
Standardise surface measurements and store the data in the object store.
- Parameters:
filepaths – Path(s) of file(s) to process
source_format (str) – Data format, for example CRDS, GCWERKS
site (str) – Site code/name
network (str) – Network name
inlet (Optional[str]) – Inlet height. Format ‘NUMUNIT’ e.g. “10m”. If processing multiple files pass None; OpenGHG will attempt to extract this from the file.
height (Optional[str]) – Alias for inlet.
instrument (Optional[str]) – Instrument name
sampling_period (Union[Timedelta, str, None]) – Sampling period in pandas style (e.g. 2H for 2 hour period, 2m for 2 minute period).
calibration_scale (Optional[str]) – Calibration scale for data
update_mismatch (str) – This determines how mismatches between the internal data “attributes” and the supplied / derived “metadata” are handled. This includes the options:
”never” - don’t update mismatches and raise an AttrMismatchError
”from_source” / “attributes” - update mismatches based on input attributes
”from_definition” / “metadata” - update mismatches based on input metadata
measurement_type (str) – Type of measurement e.g. insitu, flask
overwrite (bool) – Overwrite previously uploaded data
verify_site_code (bool) – Verify the site code
site_filepath (Union[str, Path, None]) – Alternative site info file (see openghg/supplementary_data repository for format). Otherwise the data stored within the openghg_defs/data/site_info JSON file is used by default.
store (Optional[str]) – Name of object store to write to, required if user has access to more than one
- Returns:
Dictionary of result data
- Return type:
dict
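As a minimal sketch, a call might look like the following. The file name, site code, inlet and store name ("user") are all hypothetical examples, not real data; with OpenGHG installed and a writable object store configured, the commented-out call would perform the standardisation:

```python
from pathlib import Path

# Hypothetical raw CRDS file from a DECC network tall-tower site
filepaths = [Path("bsd.picarro.1minute.248m.min.dat")]

kwargs = dict(
    source_format="CRDS",  # one of the supported formats, e.g. CRDS, GCWERKS
    network="DECC",
    site="BSD",
    inlet="248m",          # 'NUMUNIT' format; None lets OpenGHG read it from the file
    sampling_period="1m",  # pandas-style period string
    store="user",          # required if you can write to more than one store
)

# With OpenGHG installed and an object store configured:
# from openghg.standardise import standardise_surface
# results = standardise_surface(filepaths, **kwargs)
# `results` is a dictionary confirming what was processed and stored.
```

Passing `inlet=None` when standardising several files at once lets OpenGHG extract the inlet height from each file individually.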
Boundary Conditions#
- openghg.standardise.standardise_bc(filepath, species, bc_input, domain, period=None, continuous=True, overwrite=False, store=None)[source]#
Standardise boundary condition data and store it in the object store.
- Parameters:
filepath (Union[str, Path]) – Path of boundary conditions file
species (str) – Species name
bc_input (str) – Input used to create boundary conditions. For example: a model name such as “MOZART” or “CAMS”, or a description such as “UniformAGAGE” (uniform values based on AGAGE average)
domain (str) – Region for boundary conditions
period (Union[str, tuple, None]) – Period of measurements, if not passed this is inferred from the time coords
continuous (bool) – Whether time stamps have to be continuous.
overwrite (bool) – Should this data overwrite currently stored data.
store (Optional[str]) – Name of store to write to
- Returns:
Dictionary containing confirmation of standardisation process.
- Return type:
dict
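A sketch of a boundary conditions call, with a hypothetical file name and store name; the commented call assumes OpenGHG and an object store are available:

```python
from pathlib import Path

# Hypothetical CAMS-derived boundary conditions file for the EUROPE domain
filepath = Path("ch4_EUROPE_201301.nc")

kwargs = dict(
    species="ch4",
    bc_input="CAMS",   # model or description used to create the boundary conditions
    domain="EUROPE",
    store="user",
)

# With OpenGHG installed and an object store configured:
# from openghg.standardise import standardise_bc
# results = standardise_bc(filepath, **kwargs)
```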
Emissions / Flux#
- openghg.standardise.standardise_flux(filepath, species, source, domain, database=None, database_version=None, model=None, high_time_resolution=False, period=None, chunks=None, continuous=True, overwrite=False, store=None)[source]#
Standardise flux / emissions data and store it in the object store.
- Parameters:
filepath (Union[str, Path]) – Path of emissions file
species (str) – Species name
source (str) – Emissions source
domain (str) – Emissions domain
date – Date associated with emissions, as a string e.g. “2012” or “201206”. Only needed if this cannot be inferred from the time coords
high_time_resolution (Optional[bool]) – If this is a high time resolution file
period (Union[str, tuple, None]) – Period of measurements, if not passed this is inferred from the time coords
continuous (bool) – Whether time stamps have to be continuous.
overwrite (bool) – Should this data overwrite currently stored data.
store (Optional[str]) – Name of store to write to
- Returns:
Dictionary of the Datasource UUIDs the data has been assigned to
- Return type:
dict
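A sketch of a flux call. The file name, source label, database details and store name are hypothetical placeholders; the commented call assumes OpenGHG and an object store are available:

```python
from pathlib import Path

# Hypothetical anthropogenic methane flux file
filepath = Path("ch4-anthro_EUROPE_2015.nc")

kwargs = dict(
    species="ch4",
    source="anthro",          # emissions source label
    domain="EUROPE",
    database="EDGAR",         # optional provenance metadata
    database_version="v6.0",
    store="user",
)

# With OpenGHG installed and an object store configured:
# from openghg.standardise import standardise_flux
# results = standardise_flux(filepath, **kwargs)
```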
Footprints#
- openghg.standardise.standardise_footprint(filepath, site, domain, model, inlet=None, height=None, metmodel=None, species=None, network=None, period=None, chunks=None, continuous=True, retrieve_met=False, high_spatial_resolution=False, high_time_resolution=False, overwrite=False, store=None)[source]#
Reads footprint data files and returns the UUIDs of the Datasources the processed data has been assigned to
- Parameters:
filepath (Union[str, Path]) – Path of file to load
site (str) – Site name
domain (str) – Domain of footprints
model (str) – Model used to create footprint (e.g. NAME or FLEXPART)
inlet (Optional[str]) – Height above ground level in metres. Format ‘NUMUNIT’ e.g. “10m”
height (Optional[str]) – Alias for inlet. One of height or inlet must be included.
metmodel (Optional[str]) – Underlying meteorological model used (e.g. UKV)
species (Optional[str]) – Species name. Only needed if footprint is for a specific species e.g. co2 (and not inert)
network (Optional[str]) – Network name
period (Union[str, tuple, None]) – Period of measurements. Only needed if this cannot be inferred from the time coords
chunks (Union[int, Dict, Literal['auto'], None]) – Chunk size to use when opening the NetCDF. Set to “auto” for automated chunk sizing
continuous (bool) – Whether time stamps have to be continuous.
retrieve_met (bool) – Whether to also download meteorological data for this footprint’s area
high_spatial_resolution (bool) – Indicates the footprints include both a low and a high spatial resolution.
high_time_resolution (bool) – Indicates the footprints are high time resolution (include an H_back dimension). Note this will be set to True automatically for carbon dioxide data.
overwrite (bool) – Overwrite any currently stored data
store (Optional[str]) – Name of store to write to
- Returns:
Dictionary containing confirmation of standardisation process. None if file already processed.
- Return type:
dict / None
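A sketch of a footprint call. The file name, site code, inlet and store name are hypothetical; the commented call assumes OpenGHG and an object store are available:

```python
from pathlib import Path

# Hypothetical NAME model footprint file for a UK tall-tower site
filepath = Path("TAC-100magl_EUROPE_201607.nc")

kwargs = dict(
    site="TAC",
    domain="EUROPE",
    model="NAME",
    inlet="100m",    # or pass height= as an alias
    metmodel="UKV",
    chunks="auto",   # automated chunk sizing when opening the NetCDF
    store="user",
)

# With OpenGHG installed and an object store configured:
# from openghg.standardise import standardise_footprint
# results = standardise_footprint(filepath, **kwargs)
# Returns a dict of confirmation data, or None if the file was already processed.
```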
Helpers#
Some of the functions above require quite specific arguments, as we must ensure all metadata attributed to the data is as correct as possible. These functions help you find the correct arguments in each case.
Behind the scenes these functions use parsing functions written specifically for each data type. Please see the Developer API for these functions.