1.2.1. Methane satellite observations completeness assessment for greenhouse gas monitoring#
Production date: 03-06-2025
Produced by: CNR
🌍 Use case: Using satellite observations for quantifying global trends and uncertainties in greenhouse gas concentrations#
❓ Quality assessment questions#
How can the variability of spatial and temporal data coverage affect the quantification of the long-term atmospheric methane trends by satellite measurements (XCH\(_4\) Level 3 gridded product)?
What are the uncertainties related to the XCH\(_4\) satellite observations?
Methane (CH\(_4\)) is the second most important anthropogenic greenhouse gas after carbon dioxide (CO\(_2\)), representing about 19% of the total radiative forcing by long-lived greenhouse gases [1]. Atmospheric CH\(_4\) also adversely affects human health as a precursor of tropospheric ozone [2]. Monitoring the long-term CH\(_4\) variability is therefore crucial for assessing emission reductions [3]. In this assessment, atmospheric CH\(_4\) spatial seasonal means and trends are analysed using the XCH\(_4\) Level 3 gridded product (Obs4MIPs, version 4.5 [4]), adopting an approach similar to [5]. Furthermore, we also evaluate the uncertainty associated with this dataset.
The Obs4MIPs XCH\(_4\) product is generated by spatial (5°×5°) and temporal (monthly) gridding of the corresponding EMMA Level 2 data product; for more details see [6]. By analysing an ensemble of Level 2 datasets from various retrieval algorithms, the EMMA algorithm generates a merged dataset built from the individual retrievals. In particular, for each month and 10°×10° grid box, the algorithm whose grid-box mean is closest to the ensemble median is selected. The list of the considered retrieval algorithms is available from [6]. The retrieval algorithms are optimised for different instruments (SCIAMACHY, GOSAT, GOSAT-2), which measure backscattered solar radiation in the near-infrared O\(_2\) band as well as in absorption bands of CO\(_2\) or CH\(_4\).
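As a purely illustrative sketch of the median-selection step described above (the algorithm names and values below are hypothetical; the operational EMMA processing is described in [6]), the selection for a single month and 10°×10° grid box could look like this:

```python
import numpy as np

# Hypothetical grid-box mean XCH4 values (ppb) from different Level 2
# retrieval algorithms, for one month and one 10°x10° grid box
grid_box_means = {
    "algorithm_A": 1868.2,
    "algorithm_B": 1871.5,
    "algorithm_C": 1869.9,
}

# Select the algorithm whose grid-box mean is closest to the ensemble median
median_value = np.median(list(grid_box_means.values()))
selected = min(grid_box_means, key=lambda name: abs(grid_box_means[name] - median_value))
print(f"Ensemble median: {median_value:.1f} ppb; selected algorithm: {selected}")
```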
Code is included for transparency and learning purposes, and it gives users the chance to adapt the code used for the assessment as they wish. Users should always check the product documentation and associated peer-reviewed papers for a more complete reporting of the issues discussed here (e.g., [7], [8]).
📢 Quality assessment statements#
These are the key outcomes of this assessment
The dataset “Methane data from 2002 to present derived from satellite observations” can be used to evaluate CH\(_4\) mean values, climatology and growth rate over the globe, hemispheres or large regions.
Caution should be exercised in certain regions (high latitudes, regions with frequent cloud cover, oceans) where data availability varies over the temporal coverage of the dataset: this must be carefully considered when evaluating global or hemispheric information.
For data in high latitude regions or in regions with frequent cloud cover, users should consult uncertainty and quality flags as appropriate for their applications. In addition, for the years 2003-2008 only values over land are available, which has to be taken into account for possible applications of this dataset.
As the uncertainties associated with the dataset are not constant over space and time, users are advised to consider the evolution of their values over the spatial regions and time periods of interest. The highest uncertainty values were found over high latitudes, the Himalayas and the tropical rainforest zone.
📋 Methodology#
Spatial seasonal means and trends are presented and assessed using the XCH\(_4\) v4.5 Level 3 gridded product (Obs4MIPs), which has been generated using the Level 2 EMMA products [4] as input.
To show how data coverage varies between years and seasons, we have calculated and plotted the average XCH\(_4\) values for the different seasons and years, over the period 2013-2022. This time period has been chosen for the purpose of comparing the calculated trends with the results in [1].
Spatial trends are calculated using a linear model (i.e., Theil-Sen slope estimator) over monthly anomalies (i.e., actual monthly values minus climatological monthly means). This should be treated with caution as the long-term trend of atmospheric CH\(_4\) is not strictly linear. The statistical significance of the trends is assessed using the Mann-Kendall test. Similar to [5], only land pixels are considered to avoid artefacts related to different data availability over oceans (the data product is land only for 2003-2008).
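As an illustration of this approach outside the notebook framework used below (which relies on the `xarrayMannKendall` package), the following minimal single-pixel sketch uses only NumPy and SciPy; the monthly series is synthetic, and the Mann-Kendall significance is approximated here through Kendall's tau against time:

```python
import numpy as np
from scipy import stats

# Synthetic monthly XCH4 series (ppb) for a single pixel, 120 months (2013-2022)
rng = np.random.default_rng(0)
months = np.arange(120)
xch4 = 1850 + 0.8 * months + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, 120)

# Monthly anomalies: actual values minus the climatological mean of each calendar month
climatology = np.array([xch4[months % 12 == m].mean() for m in range(12)])
anomalies = xch4 - climatology[months % 12]

# Theil-Sen slope of the anomalies (ppb/month)
slope, intercept, slope_low, slope_high = stats.theilslopes(anomalies, months)

# Significance of the monotonic trend via Kendall's tau (closely related to the Mann-Kendall test)
tau, p_value = stats.kendalltau(months, anomalies)

print(f"Trend: {12 * slope:.2f} ppb/yr, p-value: {p_value:.3g}")
```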
The global maps of the product uncertainties (“xch4_stderr”) for a specific subset of years (2015-2020) are shown, together with the time series of selected XCH\(_4\) variables (“xch4”, “xch4_stderr” and “xch4_nobs”). Please note that this analysis can be extended to one additional variable, not shown here (“xch4_stddev”), or can be customized by the user for specific regions of interest.
The analysis and results are organised in the following steps, which are detailed in the sections below:
1. Choose the data to use and set-up the code
Import all the relevant packages.
Choose the temporal and spatial coverage, land mask, and possible spatial regions for the analysis.
Cache needed functions.
2. Retrieve XCH\(_4\) data (Obs4MIPs)
In this section, we define the data request to CDS.
3. Compute and plot the global variability of seasonal XCH\(_4\)
To show how data coverage varies between years and seasons, we plot maps of the seasonal mean XCH\(_4\) values for each year. To avoid producing too many maps, this analysis is limited to the period 2013-2022. Seasons are defined as December-February (DJF), March-May (MAM), June-August (JJA) and September-November (SON).
4. Compute global and spatial trends
Trends (over 2013-2022) are calculated using a linear model (i.e., Theil-Sen slope estimator) over monthly anomalies (i.e., actual monthly values minus climatological monthly means). The statistical significance of the trends is assessed using the Mann-Kendall test.
5. Analysis concerning XCH\(_4\) uncertainties
Annual global maps of the XCH\(_4\) uncertainties are presented, for a subset of data (2015-2020).
Global time series of XCH\(_4\), related uncertainties and data availability over the entire data period (2003-2022) are shown.
📈 Analysis and results#
1. Choose the data to use and set-up the code#
Import all relevant packages#
In this section, we import all the relevant packages needed for running the notebook.
Show code cell source
import math
import os

import cartopy.crs as ccrs
import matplotlib.pyplot as plt
import xarray as xr

from c3s_eqc_automatic_quality_control import diagnostics, download, plot, utils
from xarrayMannKendall import Mann_Kendall_test

plt.style.use("seaborn-v0_8-notebook")
plt.rcParams["hatch.linewidth"] = 0.5

# Point the CDS API client to the credentials file (adjust the path as needed)
os.environ["CDSAPI_RC"] = os.path.expanduser("~/putero_davide/.cdsapirc")

# Set to True to invalidate the caching
download.INVALIDATE_CACHE = True
Choose temporal and spatial coverage, land mask#
In this section, we define the parameters to be ingested by the code (that can be customized by the user), i.e.:
the temporal period of analysis;
the activation/deactivation of the land masking;
the regions selected for the analysis. Please note that, for this notebook, only the global maps and time series are reported.
The analyses presented in this notebook cover different time periods:
with the aim to compare our results with existing literature [1], the global variability and trend analyses are limited to 2013-2022;
the global plots of the XCH\(_4\) uncertainties refer to the years from 2015 to 2020;
the time series of XCH\(_4\), related uncertainties and data availability cover the entire period of data coverage.
Show code cell source
# Choose variable
variable = "xch4"
assert variable in [
f"{prefix}{suffix}"
for prefix in ("xch4", "xco2")
for suffix in ("", "_nobs", "_stderr", "_stddev")
]
# Choose a time period (to be used for the global variability and trend analysis)
year_start = 2013
year_stop = 2022
# Minimum value of land fraction used for masking
min_land_fraction = 0.5 # Use None to switch off
# Define regions for analysis
lon_slice = slice(-180, 180)
lat_slice = slice(-90, 90)
regions = {
"global": {"lon_slice": slice(-180, 180), "lat_slice": slice(-90, 90)}
}
Cache needed functions#
In this section, we define and cache the functions used in the analyses.
The function `get_da` (`get_da_nomask`) is used to subset the data for the defined time period and spatial region, applying (not applying) the land mask (as a function of `min_land_fraction`).
The function `convert_units` rescales the XCH\(_4\) mole fraction to parts per billion (ppb).
The `seasonal_weighted_mean` function extracts the regional means over the selected domains, using spatial weighting to account for the latitudinal dependence of the grid-cell size in the regular lon/lat grid. It is used by the function `compute_seasonal_timeseries_nomask` to provide the seasonal XCH\(_4\) average value for each year (Fig. 1).
The function `compute_anomaly_trends` is used to calculate the trend and the related statistical significance.
The `compute_monthly_anomalies` function is used to derive the monthly XCH\(_4\) anomalies before the trends are calculated.
The `mask_scale_and_regionalise` function extracts the XCH\(_4\) data over the selected spatial region. It uses the `convert_units` function for rescaling the values to ppb, and it applies the threshold (if any) on the minimum land fraction.
Show code cell source
transform_func_kwargs = {
"min_land_fraction": min_land_fraction,
"variable": variable,
"year_start": year_start,
"year_stop": year_stop,
"lon_slice": lon_slice,
"lat_slice": lat_slice,
}
transform_func_kwargs_nomask = {
"year_start": year_start,
"year_stop": year_stop,
}
def convert_units(da):
if da.name.endswith("_nobs"):
return da
with xr.set_options(keep_attrs=True):
if da.name.startswith("xch4") and da.attrs["units"] != "ppb":
da *= 1.0e9
da.attrs["units"] = "ppb"
elif da.name.startswith("xco2") and da.attrs["units"] != "ppm":
da *= 1.0e6
da.attrs["units"] = "ppm"
return da
def mask_scale_and_regionalise(ds, min_land_fraction, lon_slice, lat_slice):
if min_land_fraction is not None:
ds = ds.where(ds["land_fraction"] >= min_land_fraction)
for var, da in ds.data_vars.items():
if (fill_value := da.attrs.pop("fill_value", None)) is not None:
da = da.where(da != fill_value.astype(da.dtype))
ds[var] = convert_units(da)
if lon_slice or lat_slice:
ds = utils.regionalise(ds, lon_slice=lon_slice, lat_slice=lat_slice)
return ds
def get_da(
ds, min_land_fraction, variable, year_start, year_stop, lon_slice, lat_slice
):
ds = mask_scale_and_regionalise(
ds.sel(time=slice(str(year_start), str(year_stop))),
min_land_fraction,
lon_slice,
lat_slice,
)
da = ds[variable]
da = utils.regionalise(da, lon_slice=lon_slice, lat_slice=lat_slice)
if min_land_fraction is not None:
return da.where(ds["land_fraction"] >= min_land_fraction)
return da
def get_da_nomask(ds, year_start, year_stop):
mask = (ds["time"].dt.year >= year_start) & (ds["time"].dt.year <= year_stop)
ds = ds.where(mask.compute(), drop=True)
ds = mask_scale_and_regionalise(ds, None, None, None)
(varname,) = set(ds.data_vars) & {"xch4"}
da = ds[varname]
return da
def compute_seasonal_timeseries_nomask(ds, year_start, year_stop):
# Shift years (shift -1 to get D(year-1)J(year)F(year))
da = get_da_nomask(ds, year_start, year_stop)
da = da.assign_coords(year=ds["time"].dt.year.shift(time=-1).astype(int))
# Get rid of the 1st JF and last D, so it becomes [MAM, JJA, SON, DJF, ..., SON]
da = da.isel(time=slice(2, -1))
da = da.groupby("year").map(diagnostics.seasonal_weighted_mean)
return da.to_dataset()
def compute_statistics(ds, **get_da_kwargs):
da = get_da(ds, **get_da_kwargs)
da = diagnostics.spatial_weighted_statistics(da)
return da.to_dataset()
def compute_monthly_anomalies(ds, **get_da_kwargs):
da = get_da(ds, **get_da_kwargs)
with xr.set_options(keep_attrs=True):
da = da.groupby("time.month") - da.groupby("time.month").mean()
return da
def compute_mann_kendall_trend(da, **mann_kendall_kwargs):
coords_name = {"time": "time", "x": "longitude", "y": "latitude"}
ds_trend = Mann_Kendall_test(
da, coords_name=coords_name, **mann_kendall_kwargs
).compute()
return ds_trend.rename({k: v for k, v in coords_name.items() if k != "time"})
def compute_seasonal_detrended_anomaly(da, **polyfit_kwargs):
da_trend = xr.polyval(
da["time"], da.polyfit("time", **polyfit_kwargs).polyfit_coefficients
)
da_detrended = da - da_trend
return da_detrended.groupby("time.year").map(diagnostics.seasonal_weighted_mean)
def compute_anomaly_trends(ds, **get_da_kwargs):
da_anomaly = compute_monthly_anomalies(ds, **get_da_kwargs)
# Mann-Kendall
ds_mann_kendall = compute_mann_kendall_trend(
da_anomaly, alpha=0.05, method="theilslopes"
).where(da_anomaly.notnull().any("time"))
ds_mann_kendall["trend"].attrs = {
"long_name": f"Trend of anomalies of {da_anomaly.attrs.get('long_name', da_anomaly.name)}",
"units": f"{da_anomaly.attrs['units']}/month",
}
ds_mann_kendall["std_error"].attrs = {
"long_name": f"Standard error of anomalies of {da_anomaly.attrs.get('long_name', da_anomaly.name)}",
"units": f"{da_anomaly.attrs['units']}/month",
}
# Detrended anomalies
da_detrended = compute_seasonal_detrended_anomaly(da_anomaly, deg=1)
da_detrended.attrs = {
"long_name": f"Detrended of anomalies of {da_anomaly.attrs.get('long_name', da_anomaly.name)}",
"units": f"{da_anomaly.attrs['units']}",
}
ds_mann_kendall["detrended_anomaly"] = da_detrended
return ds_mann_kendall
2. Retrieve XCH\(_4\) data (Obs4MIPs)#
In this section, we define the data request to CDS (data product Obs4MIPs, Level 3, version 4.5, XCH\(_4\)) and download the dataset.
Show code cell source
request = (
"satellite-carbon-dioxide" if variable.startswith("xco2") else "satellite-methane",
{
"processing_level": "level_3",
"variable": variable.split("_")[0],
"sensor_and_algorithm": "merged_obs4mips",
"version": "4_5",
"format": "zip",
},
)
Show code cell source
datasets = {}
for region, kwargs in regions.items():
print(f"{region=}")
ds = download.download_and_transform(
*request,
transform_func=mask_scale_and_regionalise,
transform_func_kwargs={"min_land_fraction": min_land_fraction} | kwargs,
)
for da in ds.data_vars.values():
if da.attrs.get("units") in ["1", 1]:
da.attrs.pop("units")
datasets[region] = ds
region='global'
3. Compute and plot the global variability of seasonal XCH\(_4\)#
To show how data coverage varies between years and seasons, in this section we plot maps of the seasonal mean XCH\(_4\) values for each year. To limit the number of plots produced, this analysis is restricted to the period 2013-2022. Seasons are defined as: December-February (DJF), March-May (MAM), June-August (JJA), and September-November (SON). The seasonality of measurement availability is evident, with values available at Southern Hemisphere (SH) mid and high latitudes from September to March, whereas values at Northern Hemisphere (NH) mid and high latitudes are mostly available from April to August.
Show code cell source
# Analysis of global variability of seasonal XCH4
# To invalidate the cache you can pass the argument invalidate_cache=True
ds_seasonal = download.download_and_transform(
*request,
transform_func=compute_seasonal_timeseries_nomask,
transform_func_kwargs={"year_start": year_start - 1, "year_stop": year_stop},
)
ds_seasonal = ds_seasonal.sel(year=slice(year_start, year_stop))
da = ds_seasonal[variable]
da = da.where(da < 1.0e4)
_ = plot.projected_map(
da,
projection=ccrs.Robinson(),
col="season",
row="year",
robust=True,
cbar_kwargs={"orientation": "horizontal", "pad": 0.05},
)

Global annual variability of seasonal XCH\(_4\) over 2013-2022. The different panels represent the individual seasons (columns) and years (rows). Seasons are defined as: December-February (DJF), March-May (MAM), June-August (JJA), and September-November (SON).
4. Compute global and spatial trends#
In this section, XCH\(_4\) spatial trends are calculated and their statistical significance is assessed for land pixels (“land_fraction > 0.5”). Global statistics across pixels are also reported in ppb/month. Please note that the low growth rates observed in the tropics are associated with high statistical errors in the trend calculation, probably because frequent cloud cover results in sparse data availability. Over Greenland, high retrieval uncertainty due to the high surface albedo may also be a factor.
For 2013-2022, the mean global XCH\(_4\) growth rate is 9.71 \(\pm\) 2.21 ppb/yr (\(\pm\) standard error of the trend calculation). Taking into account the associated uncertainties, the different calculation methods and the different vertical representativeness of satellite observations with respect to near-surface measurements, this value is reasonably consistent with the mean absolute increase of 10.20 ppb/yr over the past 10 years, as reported by the WMO’s global in-situ observations [1].
Show code cell source
#Calculation of global and pixel trends
#To invalidate the cache you can pass the argument invalidate_cache=True
ds_trend = download.download_and_transform(
*request,
transform_func=compute_anomaly_trends,
transform_func_kwargs=transform_func_kwargs,
)
plot.projected_map(ds_trend["trend"], robust=True, projection=ccrs.Robinson())
plot.projected_map(
ds_trend["p"],
plot_func="contourf",
show_stats=False,
cmap="none",
add_colorbar=False,
levels=[0, 0.05, 1],
hatches=["", "/" * 5],
)
_ = plt.suptitle(
f" Trend {year_start}-{year_stop}"
)

Show code cell source
plot.projected_map(ds_trend["std_error"], robust=True, projection=ccrs.Robinson())
_ = plt.suptitle(
f" Standard error of trend {year_start}-{year_stop}"
)

Trends (upper map) and related standard error (bottom map) of XCH\(_4\), given in ppb/month, calculated using a linear model (Theil-Sen estimator) and the Mann-Kendall test for statistical significance. For each pixel, trends are calculated from monthly anomalies over land (“land_fraction > 0.5”). In the upper map, hatched areas indicate pixels that did not pass the Mann-Kendall significance test. Global statistics across pixels are shown to the right of the plot.
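For reference, a global-mean growth rate in ppb/yr, comparable to the value quoted above, can be obtained by spatially averaging the pixel trends computed in the cells above and converting from ppb/month; this is an illustrative aggregation that reuses the `ds_trend` dataset and may differ slightly from the exact computation behind the numbers reported in the text:

```python
# Spatially weighted average of the pixel trends and standard errors (land only),
# converted from ppb/month to ppb/yr; an illustrative aggregation only
trend_global = diagnostics.spatial_weighted_mean(ds_trend["trend"]) * 12
error_global = diagnostics.spatial_weighted_mean(ds_trend["std_error"]) * 12
print(
    f"Global mean XCH4 growth rate {year_start}-{year_stop}: "
    f"{float(trend_global):.2f} ± {float(error_global):.2f} ppb/yr"
)
```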
5. Analysis concerning XCH\(_4\) uncertainties#
Global maps of XCH\(_4\) uncertainties#
In this section, we show the multi-year (2015-2020) global maps of the reported XCH\(_4\) uncertainties, i.e., the “xch4_stderr” variable. This parameter is defined as the standard error of the average, including single-sounding noise and potential seasonal and regional biases [4]. The yearly averages are calculated by averaging the monthly values provided by the dataset. Throughout all the selected years, some common features emerge, with higher uncertainties mainly over the Himalayas, South Asia, high latitudes, and the tropical rainforest zone. This is mainly due to the sparseness of sampling caused by frequent cloud cover, as well as to the large solar zenith angles at high latitudes, which are a challenge for accurate XCH\(_4\) retrievals [5]. It is worth noting that regions characterised by large uncertainties are also affected by spatial trends that deviate from the global average (see the previous figure). Users should therefore exercise caution when deriving long-term trends for regions affected by high uncertainty.
Show code cell source
time_range = slice("2015", "2020")
for region in ["global"]:
ds = datasets[region].sel(time=time_range)
for variable in ["xch4_stderr"]:
da = diagnostics.annual_weighted_mean(ds[variable])
facet = plot.projected_map(da, col="year", col_wrap=2, figsize=(10, 5), robust=True)
facet.fig.suptitle(f"{region =} {variable =}", y=1.01)

Annual global maps of the XCH\(_4\) standard error for the years 2015 to 2020, derived from the XCH4_OBS4MIPS dataset (version 4.5). The title indicates the spatial region and the selected variable.
Time series analysis for the different variables#
In this section, we show the time series for three XCH\(_4\) variables (i.e., “xch4”, “xch4_stderr”, and “xch4_nobs”), considering the entire dataset. For each variable, the plot displays the monthly global spatial average, along with the corresponding monthly standard deviation (except for “xch4_nobs”). This analysis provides the user with:
the global time series of XCH\(_4\) values (“xch4”), highlighting the overall positive trend;
the change over time of the uncertainty associated with the XCH\(_4\) values (“xch4_stderr”);
the time series of “xch4_nobs”, which denotes the number of individual XCH\(_4\) Level 2 observations used to compute the monthly Level 3 data.
The XCH\(_4\) values were nearly constant until 2007, after which a positive trend is detected. This is likely due to a combination of increasing natural (e.g., wetlands) and anthropogenic (e.g., fossil-fuel related) emissions, and possibly decreasing sinks, although this is still under investigation (see [3] and references therein).
Looking at the uncertainties (“xch4_stderr”), the most evident feature is the sharp decrease after 2009, likely due to the introduction of algorithms based on GOSAT observations into the calculation of the median XCH\(_4\) data [6]. A further increase in the uncertainties is observed from 2020 onwards, probably linked to the introduction of GOSAT-2 measurements [6].
The temporal variability of “xch4_nobs” traces the changes over time in the availability of the input data and in the number of algorithms used to obtain the merged Level 2 data products from which the Obs4MIPs product is derived (please note that the standard deviations are not reported for “xch4_nobs” to improve the plot readability). The first period (up to April 2009) is characterised by a significant number of observations from the SCIAMACHY WFMD product only. The following period (up to 2022) reflects the reduced number of soundings provided by SCIAMACHY, together with the contributions from GOSAT and, from January 2019, GOSAT-2. For details about the different Level 2 products used as input for the generation of the Level 3 XCH\(_4\) data, see [5] and [6].
Please note that this analysis can be customized by the user to limit it to specific regions, or to include one additional variable (not shown here): “xch4_stddev”, which represents the standard deviation of the XCH\(_4\) Level 2 observations within each grid box.
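As an example of such a customization, the `regions` dictionary defined at the beginning of the notebook could be extended with additional entries (the region names and bounds below are hypothetical) before re-running the download and plotting cells:

```python
# Hypothetical additional regions: each entry defines the longitude/latitude
# slices used to subset the data (same structure as the "global" entry above)
regions = {
    "global": {"lon_slice": slice(-180, 180), "lat_slice": slice(-90, 90)},
    "northern_hemisphere": {"lon_slice": slice(-180, 180), "lat_slice": slice(0, 90)},
    "europe": {"lon_slice": slice(-10, 40), "lat_slice": slice(35, 70)},
}
```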
Show code cell source
for variable in ["xch4", "xch4_stderr", "xch4_nobs"]:
means = []
stds = []
for region, ds in datasets.items():
da = ds[variable]
means.append(diagnostics.spatial_weighted_mean(da).expand_dims(region=[region]))
stds.append(diagnostics.spatial_weighted_std(da).expand_dims(region=[region]))
da_mean = xr.concat(means, "region")
da_std = xr.concat(stds, "region")
facet = da_mean.plot(col="region", figsize=(7, 4))
for ax, sel_dict in zip(facet.axs.flatten(), facet.name_dicts.flatten()):
if variable == "xch4_nobs":
    # For the number of observations, the shading starts from zero
    lower = xr.zeros_like(da_mean.sel(sel_dict))
else:
    lower = da_mean.sel(sel_dict) - da_std.sel(sel_dict)
ax.fill_between(
    da["time"],
    lower.where(lower > 0, 0),
    da_mean.sel(sel_dict) + da_std.sel(sel_dict),
    alpha=0.5,
)
# If the variable is "xch4_nobs", use a log scale for the y-axis
if variable == "xch4_nobs":
    from matplotlib.ticker import LogLocator, NullFormatter

    ax.set_yscale("log")
    # Set automatic logarithmic tick locators
    ax.yaxis.set_major_locator(LogLocator(base=10))
    ax.yaxis.set_minor_locator(LogLocator(base=10, subs="all"))
    ax.yaxis.set_minor_formatter(NullFormatter())
ax.grid()
facet.fig.suptitle(f"{variable = }", y=1.01)
plt.show()



Global monthly time series of XCH\(_4\) (top panel) and their associated uncertainties (middle panel) for the entire dataset. The blue lines represent the monthly spatial average, while the shaded areas indicate \(\pm\)1 standard deviation. The bottom panel shows the time series of monthly spatial average of the number of individual XCH\(_4\) Level 2 observations used to compute Obs4MIPs (Level 3) data. The main titles indicate the variable shown, while the subtitles specify the corresponding region.
ℹ️ If you want to know more#
Key resources#
The CDS catalogue entries for the data used were:
Methane data from 2002 to present derived from satellite observations: https://cds.climate.copernicus.eu/datasets/satellite-methane?tab=overview
Code libraries used:
C3S EQC custom functions, `c3s_eqc_automatic_quality_control`, prepared by B-Open
Users interested in obtaining updated figures for methane growth rates and global trends are directed to official C3S sources for precise reporting: [7], [8].
Users interested in near-real-time detection of hot-spot locations for methane emissions can consider using the CAMS Methane Hotspot Explorer: https://atmosphere.copernicus.eu/ghg-services/cams-methane-hotspot-explorer?utm_source=press&utm_medium=referral&utm_campaign=CH4-app-2025
References#
[1] World Meteorological Organization (2023). WMO Greenhouse Gas Bulletin, No. 19, ISSN 2078-0796.
[2] West, J. J., Fiore, A. M., Horowitz, L. W., and Mauzerall, D. L. (2006). Global health benefits of mitigating ozone pollution with methane emission controls. Proceedings of the National Academy of Sciences USA, 103, 3988–3993.
[3] Saunois, M., Martinez, A., Poulter, B., Zhang, Z., Raymond, P., Regnier, P., Canadell, J. G., Jackson, R. B., Patra, P. K., Bousquet, P., Ciais, P., Dlugokencky, E. J., Lan, X., Allen, G. H., Bastviken, D., Beerling, D. J., Belikov, D. A., Blake, D. R., Castaldi, S., Crippa, M., Deemer, B. R., Dennison, F., Etiope, G., Gedney, N., Höglund-Isaksson, L., Holgerson, M. A., Hopcroft, P. O., Hugelius, G., Ito, A., Jain, A. K., Janardanan, R., Johnson, M. S., Kleinen, T., Krummel, P., Lauerwald, R., Li, T., Liu, X., McDonald, K. C., Melton, J. R., Mühle, J., Müller, J., Murguia-Flores, F., Niwa, Y., Noce, S., Pan, S., Parker, R. J., Peng, C., Ramonet, M., Riley, W. J., Rocher-Ros, G., Rosentreter, J. A., Sasakawa, M., Segers, A., Smith, S. J., Stanley, E. H., Thanwerdas, J., Tian, H., Tsuruta, A., Tubiello, F. N., Weber, T. S., van der Werf, G., Worthy, D. E., Xi, Y., Yoshida, Y., Zhang, W., Zheng, B., Zhu, Q., Zhu, Q., and Zhuang, Q. (2024). Global Methane Budget 2000–2020, Earth System Science Data Discussion, in review.
[4] Buchwitz, M. (2024). Product User Guide and Specification (PUGS) – Main document for Greenhouse Gas (GHG: CO\(_2\) & CH\(_4\)) data set CDR7 (01.2003-12.2022), C3S project 2021/C3S2_312a_Lot2_DLR/SC1, v7.3.
[5] Reuter, M., Buchwitz, M., Schneising, O., Noël, S., Bovensmann, H., Burrows, J. P., Boesch, H., Di Noia, A., Anand, J., Parker, R. J., Somkuti, P., Wu, L., Hasekamp, O. P., Aben, I., Kuze, A., Suto, H., Shiomi, K., Yoshida, Y., Morino, I., Crisp, D., O’Dell, C. W., Notholt, J., Petri, C., Warneke, T., Velazco, V. A., Deutscher, N. M., Griffith, D. W. T., Kivi, R., Pollard, D. F., Hase, F., Sussmann, R., Té, Y. V., Strong, K., Roche, S., Sha, M. K., De Mazière, M., Feist, D. G., Iraci, L. T., Roehl, C. M., Retscher, C., and Schepers, D. (2020). Ensemble-based satellite-derived carbon dioxide and methane column-averaged dry-air mole fraction data sets (2003–2018) for carbon and climate applications, Atmospheric Measurement Techniques, 13, 789–819.
[6] Reuter, M. and Buchwitz, M. (2024). Algorithm Theoretical Basis Document (ATBD) – ANNEX D for products XCO2_EMMA, XCH4_EMMA, XCO2_OBS4MIPS, XCH4_OBS4MIPS (v4.5, CDR7, 2003-2022), C3S project 2021/C3S2_312a_Lot2_DLR/SC1, v7.1b.
[7] Copernicus Climate Change Service (C3S) and World Meteorological Organization (WMO). (2025). European State of the Climate 2024. https://doi.org/10.24381/14j9-s541
[8] Copernicus Climate Change Service (C3S). (2025). Global Climate Highlights 2024.