Streaming cloud-hosted ICESat-2 ATL15 (Gridded Antarctic/Arctic Land Ice Height) to calculate dh/dt trends

Contents

  • Key learning outcomes
  • What is ICESat-2?
    • What is ATL14/15?
  • Streaming cloud-hosted data from NASA Earthdata Cloud
  • Define functions
  • Greenland Ice Sheet dh/dt
  • Antarctica Ice Sheet ATL15 pre-processing
    • Antarctic Ice Sheet dh/dt


Authors

  • Wilson Sauthoff (https://wsauthoff.github.io)

  • Matthew Siegfried (https://mrsiegfried.github.io/)

Key learning outcomes

  • Learn how to open, plot, and explore gridded raster data from the ICESat-2 ATL15 data product.

  • Use Xarray to import multi-dimensional data stored in “the cloud” and apply a function across a data cube.

  • Use gridded time series data to calculate per pixel height change (dh/dt) trends across the full duration of the ICESat-2 mission (2019 to latest release date).

# This tutorial requires a server instance with ~8 GB of memory

# Import libraries
import earthaccess
from IPython.display import Image, display
import matplotlib.colors as mpl_colors
import matplotlib.dates as mdates
import matplotlib.font_manager as fm
import matplotlib.gridspec as gridspec
import matplotlib.patches as patches
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
from mpl_toolkits.axes_grid1.anchored_artists import AnchoredSizeBar
from mpl_toolkits.axes_grid1 import make_axes_locatable
import numpy as np
import pandas as pd
import xarray as xr

What is ICESat-2?

ICESat-2 (Ice, Cloud, and land Elevation Satellite 2), part of NASA’s Earth Observing System, is a satellite mission for measuring ice sheet elevation and sea ice thickness, as well as land topography, vegetation characteristics, and clouds. It does so using an altimeter, an instrument that measures the height of a surface above a fixed reference level (a datum). Height is typically derived by measuring the time it takes a lidar or radar pulse emitted by the satellite to travel to the surface, reflect, and return to an onboard detector. ICESat-2’s laser altimeter splits its pulses into six beams arranged in three pairs, and a photon-counting detector records the reflected photons.
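To make the timing idea concrete, here is a minimal sketch of converting a two-way pulse travel time into a range (the travel time below is illustrative, not an actual ICESat-2 measurement):

# Minimal sketch: converting a two-way pulse travel time to a range
# The travel time below is illustrative, not a real ICESat-2 measurement
c = 299_792_458  # speed of light (m/s)
two_way_travel_time = 3.33e-3  # seconds, roughly a ~500 km round trip
range_to_surface = c * two_way_travel_time / 2  # divide by 2 for the one-way distance
print(f"{range_to_surface / 1e3:.1f} km")  # ~499.2 km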

[Figure: ICESat-2 laser configuration (from Smith and others, 2019)]

What is ATL14/15?

ATL15 is one of ICESat-2’s gridded (raster) data products. It provides height change at 3-month intervals on grids of several resolutions (1 km, 10 km, 20 km, and 40 km) (Smith and others, 2022), allowing estimation of dh/dt at each grid cell.

ATL14 is an accompanying high-resolution (100 m) digital elevation model (DEM) that provides spatially continuous gridded ice sheet surface heights at a reference datetime. ATL15 height changes are relative to the ATL14 DEM.
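Conceptually, the absolute surface height at any ATL15 epoch is the ATL14 reference height plus the ATL15 height change. A minimal illustration with invented numbers:

# Illustrative only: these numbers are invented; real values come from the two data products
atl14_reference_height = 1850.00  # m, ATL14 DEM height at some grid cell
atl15_delta_h = -0.42             # m, ATL15 height change at that cell and epoch
surface_height = atl14_reference_height + atl15_delta_h
print(surface_height)  # 1849.58 m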

Learn more about the ICESat-2 ATL14/15 Gridded Antarctic and Arctic Land Ice Height Change data products here.

Streaming cloud-hosted data from NASA Earthdata Cloud

We will be working with cloud-hosted data files. This guide explains how to find and access Earthdata cloud-hosted data. Here is a complete list of Earthdata cloud-hosted data products currently available from NSIDC.

Streaming cloud-hosted NASA Earthdata requires a free account. Get one here.

# Log into NASA Earthdata to search for datasets
earthaccess.login()
<earthaccess.auth.Auth at 0x7ff27abe7990>
# Find ICESat-2 ATL15 version four data granules in the northern hemisphere
results = earthaccess.search_data(
    doi='10.5067/ATLAS/ATL15.004',
    bounding_box=(180, 60, -180, 90),  # (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat)
    cloud_hosted=True,
)
# Open data granules as file-like objects to stream
files = earthaccess.open(results)
files
[<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_SV_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_GL_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CS_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_SV_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_GL_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_SV_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_RA_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_RA_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_IS_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_IS_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CN_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CN_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CN_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_IS_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CS_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CS_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_SV_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_IS_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_RA_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_GL_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CN_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_CS_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_RA_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_GL_0321_10km_004_01.nc>]
# After viewing files, index the files you wish to open
print(files[19])  # 1-km resolution Greenland ATL15
<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_GL_0321_01km_004_01.nc>
# Open ATL15 Greenland data granule into an xarray dataset
ATL15_ds = xr.open_dataset(files[19], group='delta_h')

# View file structure and metadata of dataset
ATL15_ds
<xarray.Dataset> Size: 2GB
Dimensions:              (x: 1541, y: 2741, time: 21)
Coordinates:
  * x                    (x) float64 12kB -6.7e+05 -6.69e+05 ... 8.7e+05
  * y                    (y) float64 22kB -3.35e+06 -3.349e+06 ... -6.1e+05
  * time                 (time) datetime64[ns] 168B 2019-01-01T06:00:00 ... 2...
Data variables:
    Polar_Stereographic  int8 1B ...
    ice_area             (time, y, x) float32 355MB ...
    delta_h              (time, y, x) float32 355MB ...
    delta_h_sigma        (time, y, x) float32 355MB ...
    data_count           (time, y, x) float32 355MB ...
    misfit_rms           (time, y, x) float32 355MB ...
    misfit_scaled_rms    (time, y, x) float32 355MB ...
Attributes:
    description:  delta_h group includes variables describing height differen...
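The delta_h group is only one of several groups in each granule. If you are curious what else a granule contains, one option (a sketch, assuming h5py is available in your environment) is to list the top-level HDF5 groups directly:

# Optional sketch: list the top-level HDF5 groups in the granule
# Assumes h5py is installed; group names are documented in the ATL15 user guide
import h5py
with h5py.File(files[19], 'r') as f:
    print(list(f.keys()))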

Define functions

def linear_trend(y, time):
    """
    Computes the linear trend (slope) of a time series using linear regression.
    
    Parameters:
    -----------
    y : array-like
        The dependent variable (e.g., data values) for which the trend is to be computed. 
        This is typically a 1D array or list of observations over time.
        
    time : array-like
        The independent variable (e.g., time points). This should be a 1D array of time values 
        corresponding to the `y` values. `time` and `y` must have the same length.
    
    Returns:
    --------
    slope : float
        The slope of the linear trend. If there is not enough data or all values are NaN, 
        the function returns `np.nan`.
    
    Description:
    ------------
    This function computes the slope of the linear regression line that fits the given data 
    points `(time, y)`. It ignores any `NaN` values in `y` and performs the regression only 
    on valid data points. If there are fewer than two valid data points, the function returns `np.nan`.
    
    Example:
    --------
    >>> y = [2.0, 3.1, 4.2, 5.4, np.nan, 6.8]
    >>> time = [0, 1, 2, 3, 4, 5]
    >>> linear_trend(y, time)
    0.97  # Approximate slope of the linear trend
    
    """
    # Convert inputs to numpy arrays so boolean masking works for lists and pandas indexes alike
    y = np.asarray(y, dtype=float)
    time = np.asarray(time, dtype=float)
    
    # Create a mask that is True for finite (non-NaN) values in y, and False for NaN values
    mask = np.isfinite(y)
    
    # If there are fewer than two valid data points, return NaN because we can't fit a line
    if np.sum(mask) < 2:  # Not enough valid data points to compute a trend
        return np.nan
    
    # Perform a linear fit (regression) on the valid data points (ignoring NaNs)
    # np.polyfit returns [slope, intercept]; [0] extracts the slope
    return np.polyfit(time[mask], y[mask], 1)[0]  # Return the slope of the fitted line
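A quick sanity check on synthetic data (values chosen purely for illustration) confirms the NaN handling: when the remaining points lie exactly on a line, a single gap should not change the fitted slope.

# Sanity check on synthetic data (illustrative values)
t_check = np.arange(6, dtype=float)   # six evenly spaced "years"
y_check = 2.0 + 0.5 * t_check         # a perfect 0.5 m/yr trend
y_check[3] = np.nan                   # inject a gap, as in real gridded data
print(linear_trend(y_check, t_check))  # 0.5 -- the gap does not change the fit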
def plot_timeseries_stat(timeseries_stat_da, cmap_extent_abs_value, clb_label, scale_bar_len):
    """
    Plots a timeseries statistic using a diverging colormap and displays a colorbar.
    
    Parameters:
    -----------
    timeseries_stat_da : xarray.DataArray
        The data array containing the timeseries statistic to be plotted.
        
    cmap_extent_abs_value : float
        This value is used to control the minimum/maximum value in the colormap to avoid extreme outliers
        and have a symmetric colorbar.
        
    clb_label : str
        The label for the colorbar. This should describe what the color scale represents.

    scale_bar_len: float
        Length of the displayed scale bar in km.
    
    Returns:
    --------
    None
        Displays a plot of the data with a colorbar.
    
    Description:
    ------------
    This function visualizes a 2D timeseries statistic using a diverging colormap 
    (red-blue). It clips the color scale at +/- cmap_extent_abs_value so extreme 
    outliers do not dominate the display, and centers the colormap at zero for 
    balance between positive and negative values. It also adds a horizontal 
    colorbar below the plot, labels it, and hides axis tick marks and labels for 
    a cleaner presentation.
    """
    # Create a figure and axis with a specific size for the plot
    fig, ax = plt.subplots(figsize=(10, 10))

    # Set the background of the plot area to grey
    ax.set_facecolor('grey')  # This changes the color inside the plot area

    # Customize the border around the plot
    for spine in ax.spines.values():
        spine.set_edgecolor('black')  # Set border color to black
        spine.set_linewidth(2)        # Set the width of the border to 2 (thicker)

    # Get the minimum and maximum values of the x and y dimensions
    # These values will be used to set the extent of the plot
    x_min = timeseries_stat_da.x.min()
    x_max = timeseries_stat_da.x.max()
    y_min = timeseries_stat_da.y.min()
    y_max = timeseries_stat_da.y.max()

    # Create a diverging normalization for the color scale, centered at 0,
    # which is useful for visualizing differences relative to zero
    norm = mpl_colors.TwoSlopeNorm(vmin=-cmap_extent_abs_value, vcenter=0, vmax=cmap_extent_abs_value)

    # Plot the data array using imshow
    # - extent sets the range of the x and y axes
    # - cmap sets the color scheme (RdBu is red to blue for diverging data)
    # - norm applies the normalization we defined above
    # - origin='lower' makes sure the plot is oriented correctly
    m = ax.imshow(timeseries_stat_da, 
                  extent=[x_min, x_max, y_min, y_max],
                  cmap='RdBu', norm=norm,
                  origin='lower')

    # Set the limits for the x and y axes using the min and max values we got earlier
    ax.set(xlim=(x_min, x_max), ylim=(y_min, y_max))

    # Remove the tick marks and labels on both the x and y axes to clean up the plot
    ax.tick_params(left=False, right=False, top=False, bottom=False)  # Hide tick marks
    ax.set_xticklabels([])  # Remove labels on the x-axis
    ax.set_yticklabels([])  # Remove labels on the y-axis

    # Add a colorbar below the plot to show the color scale for the data
    # - divider is used to add the colorbar below the plot (bottom) with a specific size and padding
    divider = make_axes_locatable(ax)
    cax = divider.append_axes('bottom', size='2.5%', pad=0.2)
    
    # Add the colorbar to the plot and extend it both ways to indicate out-of-range values
    # - extend='both' adds arrowheads to the colorbar to show that values extend beyond the range
    # - orientation='horizontal' places the colorbar below the plot
    cbar = fig.colorbar(m, cax=cax, extend='both', orientation='horizontal')
    # Add a label to the colorbar for clarity
    cbar.set_label(clb_label, fontsize=16)
    
    # Adjust the fontsize of the colorbar tick labels
    cbar.ax.tick_params(labelsize=14)

    # Add a scale bar (length given in km by the caller)
    scalebar = AnchoredSizeBar(ax.transData,
                               scale_bar_len*1e3,  # Convert km to meters (the data units)
                               '{} km'.format(scale_bar_len),  # Label for the scale bar
                               loc='lower right',  # Location on the plot
                               pad=0.5,
                               color='black',
                               frameon=False,
                               size_vertical=1,
                               fontproperties=fm.FontProperties(size=14))  # Font size for the label

    ax.add_artist(scalebar)  # Add the scale bar to the axis

    # Show the plot on the screen
    plt.show()
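Note on the design: TwoSlopeNorm pins the center of the diverging colormap at zero, so surface lowering and surface rise get equal visual weight regardless of the data’s actual range, and the symmetric vmin/vmax set by cmap_extent_abs_value keeps a handful of extreme pixels from washing out the rest of the map.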
def fractional_year_rounded(date):
    """
    Converts a given date to a fractional year and rounds it to the nearest 0.25 increment.

    Parameters:
    -----------
    date : numpy.datetime64, pandas.Timestamp, or similar
        The input date to be converted to a fractional year. It can be provided as a 
        numpy datetime64 object or any object convertible to a pandas datetime.
    
    Returns:
    --------
    rounded_fractional_year : float
        The fractional year, rounded to the nearest quarter (0.25) year. The fractional year
        represents the year plus the fraction of the year that has passed based on the day of the year.
    
    Description:
    ------------
    This function calculates the fractional part of the year based on how many days have passed 
    in the year (accounting for leap years if applicable). It then rounds the result to the 
    nearest 0.25 increment, representing the year in quarters.
    
    Example:
    --------
    >>> fractional_year_rounded(np.datetime64('2022-03-15'))
    2022.25
    >>> fractional_year_rounded(pd.Timestamp('2022-11-20'))
    2023.0
    """
    # Convert to pandas datetime to easily extract year and day information
    # (pd.to_datetime handles numpy datetime64 and pandas Timestamp inputs alike)
    date = pd.to_datetime(date)
    
    # Get the year and day of year
    year = date.year
    day_of_year = date.dayofyear
    
    # Determine if it's a leap year
    days_in_year = 366 if pd.Timestamp(year, 12, 31).is_leap_year else 365
    
    # Compute the fractional year
    fractional_year = year + (day_of_year / days_in_year)
    
    # Round to the nearest .25 increment
    rounded_fractional_year = np.round(fractional_year * 4) / 4
    
    return rounded_fractional_year
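Applied to the first epoch of the Greenland dataset shown above, the helper collapses 2019-01-01T06:00 to the nearest quarter-year:

# Quick check against the dataset's first epoch (from the listing above)
print(fractional_year_rounded(np.datetime64('2019-01-01T06:00:00')))  # 2019.0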

Greenland Ice Sheet dh/dt

This code will take a few minutes to run.

# Extract the time values from the DataArray's time coordinate
time = ATL15_ds['delta_h']['time'].values

# Convert the time values from datetime64 to a numerical format (decimal years since the first time point)
time_numeric = pd.to_datetime(time)
time_numeric = (time_numeric - time_numeric[0]) / pd.Timedelta(days=365.25)  # Time in years

# Apply the linear trend function over the time dimension
dhdt_trend = xr.apply_ufunc(
    linear_trend,                    # The function to apply
    ATL15_ds['delta_h'],             # The DataArray on which to apply the function
    input_core_dims=[['time']],      # The core dimension for the input (time axis of the DataArray)
    kwargs={'time': time_numeric},   # Pass 'time_numeric' as a keyword argument (numeric time values)
    vectorize=True,                  # Vectorize the function (apply it element-wise)
    dask="parallelized",             # If using Dask for larger datasets
    output_dtypes=[float]            # Output type (trend slope is a float)
)

# Now 'dhdt_trend' contains the dh/dt linear trend for each pixel
# Name dataarray to reflect this
dhdt_trend.name = 'annual_dhdt_trend'
# Get the start and end dates for ATL15 dataset
start_date = ATL15_ds.time[0].values
end_date = ATL15_ds.time[-1].values

# Compute fractional years for both dates
start_fractional_year = fractional_year_rounded(start_date)
end_fractional_year = fractional_year_rounded(end_date)

# Combine the two dates with a hyphen
date_range = f"{start_fractional_year}-{end_fractional_year}"

# Use function to plot dhdt_trend
plot_timeseries_stat(dhdt_trend, 1., 'dh/dt [m/yr]\n'+date_range, 500)
[Figure: map of dh/dt trends across the Greenland Ice Sheet]
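With the RdBu colormap and the zero-centered normalization defined above, negative dh/dt (surface lowering) plots in red and positive dh/dt (surface rise) plots in blue.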

Antarctica Ice Sheet ATL15 pre-processing

ATL15 data covering Antarctica requires an additional preprocessing step because the data are served in four separate quadrants (A1, A2, A3, and A4): the Cartesian quadrants of the polar stereographic coordinate system centered on the geographic South Pole. To make a continental-scale plot of ATL15, we first need to stitch these individual data files together.

[Figure: ICESat-2 ATL15 regions (from the ATL15 dataset user guide; Smith and others, 2024)]
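Because adjacent quadrants share their boundary row or column of grid cells, we drop the duplicated slice from one side before concatenating. A toy example (with made-up coordinates) of the pattern used below:

# Toy illustration of dropping the shared boundary before concatenating
left = xr.DataArray([10, 11, 12], coords={'x': [-2, -1, 0]}, dims='x')
right = xr.DataArray([12, 13, 14], coords={'x': [0, 1, 2]}, dims='x')
# Both sides include x=0, so trim it from one side to avoid a duplicated coordinate
stitched = xr.concat([left.isel(x=slice(0, -1)), right], dim='x')
print(stitched.x.values)  # [-2 -1  0  1  2]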

# Find ICESat-2 ATL15 version four data granules in the southern hemisphere
results = earthaccess.search_data(
    doi='10.5067/ATLAS/ATL15.004',
    bounding_box=(180, -90, -180, -60),  # (lower_left_lon, lower_left_lat, upper_right_lon, upper_right_lat)
    cloud_hosted=True,
)

# Open data granules as file-like objects to stream
files = earthaccess.open(results)
files
[<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A2_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A3_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A3_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A4_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A4_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A3_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A2_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A3_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A2_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A1_0321_40km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A1_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A1_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A4_0321_10km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A4_0321_01km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A1_0321_20km_004_01.nc>,
 <File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A2_0321_10km_004_01.nc>]
# We first index the data files we wish to open
print(files[11])
print(files[0])
print(files[1])
print(files[-3])
<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A1_0321_01km_004_01.nc>
<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A2_0321_01km_004_01.nc>
<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A3_0321_01km_004_01.nc>
<File-like object HTTPFileSystem, https://n5eil01u.ecs.nsidc.org/DP8/ATLAS/ATL15.004/2019.01.01/ATL15_A4_0321_01km_004_01.nc>
# Open each file
ATL15_A1 = xr.open_dataset(files[11], group='delta_h')
ATL15_A2 = xr.open_dataset(files[0], group='delta_h')
ATL15_A3 = xr.open_dataset(files[1], group='delta_h')
ATL15_A4 = xr.open_dataset(files[-3], group='delta_h')
# We will drop unneeded variables to conserve memory

# Specify the variables to keep
variables_to_keep = ['time', 'y', 'x', 'delta_h']

# List of xarray datasets
datasets = [ATL15_A1, ATL15_A2, ATL15_A3, ATL15_A4]

# Function to drop variables not in variables_to_keep from a dataset
def drop_unwanted_variables(dataset):
    variables_to_drop = [var for var in dataset.variables if var not in variables_to_keep]
    return dataset.drop_vars(variables_to_drop)

# Apply the function to each dataset
ATL15_A1, ATL15_A2, ATL15_A3, ATL15_A4 = [drop_unwanted_variables(ds) for ds in datasets]
# Use xarray concatenation to stitch two quadrants together
# Use xarray index selection to exclude the duplicated x=0 column of data
ATL15_A12 = xr.concat([ATL15_A2.isel(x=slice(0,-1)), ATL15_A1], dim="x")
# Use xarray concatenation to stitch two quadrants together
# Use xarray index selection to exclude the duplicated x=0 column of data
ATL15_A34 = xr.concat([ATL15_A3.isel(x=slice(0,-1)), ATL15_A4], dim='x')
# Use xarray concatenation to stitch the two concatenated halves together
# Use xarray index selection to exclude the duplicated y=0 row of data
ATL15_ds = xr.concat([ATL15_A34.isel(y=slice(0,-1)), ATL15_A12], dim='y')
# Delete unneeded dataarrays and datasets to reduce memory consumption
del ATL15_A1, ATL15_A12, ATL15_A2, ATL15_A3, ATL15_A34, ATL15_A4
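As a quick sanity check (exact sizes depend on the release), the stitched dataset should now have single contiguous x and y dimensions spanning all four quadrants:

# Sanity check: one contiguous grid covering all four quadrants
print(ATL15_ds.sizes)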

Antarctic Ice Sheet dh/dt

The following code performs a continental-scale calculation of dh/dt and takes 20-30 minutes to run.
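If you only want to test the workflow, a faster option is to run it on a coarser-resolution granule first. Judging by the file listing above, the 40 km files can be picked out of files by the '40km' substring in their names (this relies on the naming pattern shown in that listing):

# Optional: select the coarser 40 km granules for a much faster test run
# This relies on the '40km' substring visible in the file listing above
coarse_files = [f for f in files if '40km' in str(f)]
print(coarse_files)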

# Extract the time values from the DataArray's time coordinate
time = ATL15_ds['delta_h']['time'].values

# Convert the time values from datetime64 to a numerical format (decimal years since the first time point)
time_numeric = pd.to_datetime(time)
time_numeric = (time_numeric - time_numeric[0]) / pd.Timedelta(days=365.25)  # Time in years

# Apply the linear trend function over the time dimension
dhdt_trend_2 = xr.apply_ufunc(
    linear_trend,                    # The function to apply
    ATL15_ds['delta_h'],             # The DataArray on which to apply the function
    input_core_dims=[['time']],      # The core dimension for the input (time axis of the DataArray)
    kwargs={'time': time_numeric},   # Pass 'time_numeric' as a keyword argument (numeric time values)
    vectorize=True,                  # Vectorize the function (apply it element-wise)
    dask="parallelized",             # If using Dask for larger datasets
    output_dtypes=[float]            # Output type (trend slope is a float)
)

# Now 'dhdt_trend_2' contains the dh/dt linear trend for each pixel
# Name dataarray to reflect this
dhdt_trend_2.name = 'annual_dhdt_trend'
# Get the start and end dates for ATL15 dataset
start_date = ATL15_ds.time[0].values
end_date = ATL15_ds.time[-1].values

# Compute fractional years for both dates
start_fractional_year = fractional_year_rounded(start_date)
end_fractional_year = fractional_year_rounded(end_date)

# Combine the two dates with a hyphen
date_range = f"{start_fractional_year} to {end_fractional_year}"

plot_timeseries_stat(dhdt_trend_2, 1., 'dh/dt [m/yr]\n'+date_range, 500)
[Figure: map of dh/dt trends across the Antarctic Ice Sheet]

By CryoCloud Team and Community, Colorado School of Mines