# Scenes

The Scenes submodule provides a higher-level, object-based interface that makes many interactions with the Descartes Labs platform easier. The API provides a subset of the functionality in the Catalog and Raster clients. We recommend that new users start with Scenes. Use Catalog if you need to add or update raster products and Raster if you need advanced raster functionality.

The most important capabilities provided by the Scenes submodule include:

- the Scene class, representing an individual scene or image
- the search() function, for finding Scenes by geospatial coordinates and other important properties
- the SceneCollection API, for conveniently working with collections of individual scenes in aggregate
- the GeoContext and related APIs, which provide consistent geospatial parameters for use in rastering images
- the display() interface, for rendering the resulting imagery in the client using matplotlib

The Scenes API is available under the descarteslabs package as descarteslabs.scenes, and the full API documentation is available at Scenes.

Note

For information about API Quotas and limits see our Quotas & Limits page.

## A Quick Tour

Define our Area of Interest:

>>> aoi_geometry = {
...     'type': 'Polygon',
...     'coordinates': [[
...         [-93.52300099792355, 41.241436141055345],
...         [-93.7138666, 40.703737],
...         [-94.37053769704536, 40.83098709945576],
...         [-94.2036617, 41.3717716],
...         [-93.52300099792355, 41.241436141055345]
...     ]]
... }
...


Search for Scenes within it:

>>> import descarteslabs as dl

>>> scenes, ctx = dl.scenes.search(
...     aoi_geometry,
...     products=["landsat:LC08:PRE:TOAR"],
...     start_datetime="2013-07-01",
...     end_datetime="2013-09-01",
...     limit=10
... )
...
>>> scenes
SceneCollection of 10 scenes
* Dates: Jul 07, 2013 to Aug 24, 2013
* Products: landsat:LC08:PRE:TOAR: 10
>>> ctx
AOI(geometry=<shapely.geom...x7f2a2e172438>,
resolution=15.0,
crs='EPSG:32615',
align_pixels=True,
bounds=(-94.37053769704536, 40.703737, -93.52300099792355, 41.3717716),
bounds_crs='EPSG:4326',
shape=None)


Use SceneCollection operations to order scenes and quickly inspect metadata:

>>> scenes = scenes.sorted("properties.date")
>>> scenes.each.properties.id
'landsat:LC08:PRE:TOAR:meta_LC80260312013188_v1'
'landsat:LC08:PRE:TOAR:meta_LC80260322013188_v1'
'landsat:LC08:PRE:TOAR:meta_LC80270312013195_v1'
'landsat:LC08:PRE:TOAR:meta_LC80260312013204_v1'
'landsat:LC08:PRE:TOAR:meta_LC80260322013204_v1'
'landsat:LC08:PRE:TOAR:meta_LC80270312013211_v1'
'landsat:LC08:PRE:TOAR:meta_LC80260312013220_v1'
'landsat:LC08:PRE:TOAR:meta_LC80260322013220_v1'
...
>>> scenes.each.properties.date.month
7
7
7
7
7
7
8
8
...


Load and display specific bands for a scene at a lower resolution:

>>> scene = scenes[-1]
>>> ctx_lowres = ctx.assign(resolution=60)
>>> arr = scene.ndarray("red green blue", ctx_lowres)
>>> dl.scenes.display(arr)


Iterate over the SceneCollection by months, retrieving stacked groups of scenes, and compute monthly median composites of NDVI within our area of interest:

>>> import numpy as np
>>> monthly_composites = {}
>>> for month, month_scenes in scenes.groupby("properties.date.month"):
...     stack = month_scenes.stack("red nir", ctx_lowres)
...     stack = stack.astype(float)  # avoid truncating to uint16
...     red, nir = stack[:, 0], stack[:, 1]
...     ndvi = (nir - red) / (nir + red)
...     ndvi_composite = np.ma.median(ndvi, axis=0)
...     monthly_composites[month] = ndvi_composite
...


And the mean NDVI value of each month’s composite is:

>>> {month: composite.mean()
...     for month, composite in monthly_composites.items()}
...
{7: 0.32396951619824726, 8: 0.3973330207454807}


View the computed NDVI composites:

>>> dl.scenes.display(*monthly_composites.values(),
...                   title=list(monthly_composites.keys()))
...


## Data Types

### Scene

The Scene class represents a single scene or image within the DL Catalog. A Scene object instance contains all the relevant metadata about the available imagery (unique identifier, geometry, acquisition date, available bands, resolution, etc.), and can be used to retrieve the imagery as an ndarray or download the image files.

Scene instances can be created in one of two ways: by retrieving many scenes as a SceneCollection using the search() function, or by using the from_id() class method to retrieve a single Scene given its unique identifier:

>>> import descarteslabs as dl

>>> scene, ctx = dl.scenes.Scene.from_id(
...     'landsat:LC08:PRE:TOAR:meta_LC80260322013188_v1'
...     )
...


And individual properties can be consulted:

>>> scene.properties.id
'landsat:LC08:PRE:TOAR:meta_LC80260322013188_v1'
>>> scene.properties.date
datetime.datetime(2013, 7, 7, 16, 56, 1, 614582)


Note that the GeoContext returned by from_id() is identical to that returned by the default_ctx() method:

>>> ctx == scene.default_ctx()
True


The coverage() method returns the fraction of a target geometry that is covered by the scene's geometry:

>>> scene.coverage(
...     {'type': 'Polygon',
...     'coordinates': [[[-95,39],[-92,39],[-92,42],[-95,42],[-95,39]]]}
...     )
...
0.4192807721178378
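
Conceptually, coverage() divides the area of intersection by the area of the target geometry. A toy sketch of that arithmetic using axis-aligned boxes (box_coverage is a hypothetical helper, not the library's implementation, which handles arbitrary geometries):

```python
def box_coverage(scene_box, target_box):
    """Fraction of target_box covered by scene_box.

    Boxes are (minx, miny, maxx, maxy) tuples -- a stand-in for real geometries.
    """
    ix = max(0.0, min(scene_box[2], target_box[2]) - max(scene_box[0], target_box[0]))
    iy = max(0.0, min(scene_box[3], target_box[3]) - max(scene_box[1], target_box[1]))
    target_area = (target_box[2] - target_box[0]) * (target_box[3] - target_box[1])
    return (ix * iy) / target_area

# A scene covering the western half of the target yields 0.5:
print(box_coverage((-95, 39, -93.5, 42), (-95, 39, -92, 42)))  # 0.5
```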


The ndarray() method can be used to retrieve the raster data as an ndarray for further processing by the client:

>>> arr = scene.ndarray(bands="red green blue", ctx=ctx.assign(resolution=120))
>>> arr.shape
(3, 1975, 1942)


Similarly, the download() method can be used to retrieve the raster data into a file or file-like object in formats such as TIFF, PNG, JPEG:

>>> filepath = scene.download(bands="red green blue", ctx=ctx.assign(resolution=120))
>>> filepath
'landsat:LC08:PRE:TOAR:meta_LC80260322013188_v1-red-green-blue.tif'


Several of the methods accept a scaling and a data_type parameter. These parameters are treated consistently across the different methods, and merit some explanation and examples.

When band raster data is retrieved, it can be scaled and converted to a variety of data types as required by the user. When neither of these parameters is provided, the original band data is copied into the result without change, and the resulting data type is selected automatically, based on the data types of the bands, so that it can hold all the data without loss of precision or range.
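
Selecting an output type that can hold every band's data is essentially numpy type promotion. A sketch of the idea using numpy directly (not the library's actual selection logic):

```python
import numpy as np

# The smallest type that can represent both a uint16 band and a uint8 band:
print(np.promote_types(np.uint16, np.uint8))  # uint16

# Mixing unsigned and signed bands forces a wider signed type:
print(np.promote_types(np.uint16, np.int8))   # int32
```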

However, the user may specify several alternative treatments of the band data. One of four automated scaling modes can be specified, directing the operation to rescale the pixel values in each band according to either the range of data in the image or ranges defined in the band properties, targeting an appropriate output data type.
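
Each of these modes boils down to a linear mapping from an input range to an output range, optionally clipped. A minimal sketch of that mapping (rescale is a hypothetical helper, not the library's implementation):

```python
def rescale(value, in_range, out_range, clip=False):
    """Linearly map value from in_range to out_range."""
    (in_lo, in_hi), (out_lo, out_hi) = in_range, out_range
    scaled = (value - in_lo) / (in_hi - in_lo) * (out_hi - out_lo) + out_lo
    if clip:
        scaled = max(out_lo, min(out_hi, scaled))
    return scaled

# A pixel value of 2000 against a [0, 4000] range maps to the middle of [0, 255]:
print(rescale(2000, (0, 4000), (0, 255)))  # 127.5
```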

The raw mode is equivalent to no scaling: the data is preserved as is, and the output data type is selected to hold all the band data without loss of precision or range.

>>> [scene.properties["bands"][b]["dtype"] for b in ["red","green","blue"]]
['UInt16', 'UInt16', 'UInt16']
>>>
>>> arr = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling="raw"
... )
...
>>> arr.dtype
dtype('uint16')
>>> np.min(arr)
412
>>> np.max(arr)
10239


The auto mode automatically scales from the actual range of the band data to the standard display range of [0, 255]. This scaling is done independently for each band, thus this has the effect of “stretching” the dynamic range of the data in each band.

>>> arr = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling="auto"
... )
...
>>> arr.dtype
dtype('uint8')
>>> np.min(arr)
13
>>> np.max(arr)
255
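
The per-band stretch seen above can be sketched with plain numpy (an illustration of the idea, not the library's code; auto_stretch is a hypothetical helper):

```python
import numpy as np

def auto_stretch(band):
    """Stretch a band from its actual [min, max] to the display range [0, 255]."""
    lo, hi = float(band.min()), float(band.max())
    return ((band - lo) / (hi - lo) * 255).astype(np.uint8)

# The minimum maps to 0 and the maximum to 255, regardless of the input range:
band = np.array([[412, 5000], [8000, 10239]])
print(auto_stretch(band).tolist())  # [[0, 119], [196, 255]]
```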


The display mode scales from the default_range value in the band properties to the standard display range of [0, 255]. Typically this leads to clipping or compression of large pixel values, having the effect of brightening the image.

>>> arr = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling="display"
... )
...
>>> arr.dtype
dtype('uint8')
>>> np.min(arr)
26
>>> np.max(arr)
255


The physical mode scales from the data_range value in the band properties to the physical_range value in the band properties, returning the result as floating point data.

>>> arr = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling="physical"
... )
...
>>> arr.dtype
dtype('float64')
>>> np.min(arr)
0.0412
>>> np.max(arr)
1.0239


The scaling parameter can also accept a list of scaling values, one for each band in the bands argument, including everything accepted by the Raster client API calls such as Raster.ndarray. Any element may also be one of the automated mode keywords above, although in general different modes cannot be mixed, with the exception of auto and display, which can be intermixed. Additionally, when using the tuple notation it is possible to specify a percentage (as a string ending with ‘%’), and the numeric bound will be computed automatically from the appropriate range in the band’s properties (e.g. data_range or physical_range). For example, a tuple of ("25%","75%") with a default_range of [0, 4000] yields (1000, 3000).

>>> arr = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling=[("25%", "75%"), ("25%", "75%"), ("25%", "75%")]
... )
...
>>> arr.dtype
dtype('uint8')
>>> np.min(arr)
0
>>> np.max(arr)
255
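
The percentage arithmetic itself is simple; a hypothetical helper (percent_bounds is not part of the API) showing how a percent string resolves against a band range such as default_range:

```python
def percent_bounds(scaling_tuple, band_range):
    """Resolve "25%"-style entries in a scaling tuple against a band range."""
    lo, hi = band_range

    def resolve(v):
        if isinstance(v, str) and v.endswith("%"):
            return lo + (hi - lo) * float(v[:-1]) / 100.0
        return v

    return tuple(resolve(v) for v in scaling_tuple)

# ("25%", "75%") against a default_range of [0, 4000]:
print(percent_bounds(("25%", "75%"), (0, 4000)))  # (1000.0, 3000.0)
```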


Finally, it is possible to pass a dictionary (or other Mapping type) as the scaling parameter. In this case, each band in the list of bands is looked up in the mapping to find its corresponding scaling value. If a band does not appear in the mapping and its type is not “mask” or “class” (band types which are rarely scaled), the key "default_" is looked up and its value, if any, is used. If no value is found, the scale parameter for that band is set to None. The mapping form is supported as a convenience: a standard set of scaling parameters can be defined by band name once, and then reused across many calls to any of the Scene or SceneCollection methods that accept the scaling parameter with varying lists of band names.

>>> scaling = {
...     "derived:ndvi": (40000, 65535),
...     "nir": (0, 10000),
...     "default_": "display"
... }
...
>>> rgb = scene.ndarray(
...     bands="red green blue",
...     ctx=ctx.assign(resolution=120),
...     scaling=scaling
... )
...
>>> ndvi = scene.ndarray(
...     bands="derived:ndvi green nir",
...     ctx=ctx.assign(resolution=120),
...     scaling=scaling
... )
...
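
The lookup rules described above can be sketched as a small function (an illustration of the documented behavior; resolve_scaling is a hypothetical name, not part of the API):

```python
def resolve_scaling(band_name, band_type, scaling_map):
    """Resolve one band's scaling value from a mapping."""
    if band_name in scaling_map:
        return scaling_map[band_name]
    if band_type not in ("mask", "class"):
        return scaling_map.get("default_")  # None if no default is present
    return None

scaling = {"nir": (0, 10000), "default_": "display"}
print(resolve_scaling("nir", "spectral", scaling))     # (0, 10000)
print(resolve_scaling("red", "spectral", scaling))     # display
print(resolve_scaling("cloud-mask", "mask", scaling))  # None
```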


For users familiar with the scales and data_type parameters used by the Raster class methods, the convenience method scaling_parameters() returns the full scales and data_type values that the Scene class methods will generate. This can be useful for understanding in detail how scaling is performed.

>>> scales, data_type = scene.scaling_parameters(
...     bands="red green blue",
...     scaling="display"
... )
...
>>> scales
[(0.0, 4000.0, 0, 255), (0.0, 4000.0, 0, 255), (0.0, 4000.0, 0, 255)]
>>> data_type
'Byte'


For a full description of the scaling and data_type parameters, please see the documentation of scaling_parameters().

### SceneCollection

SceneCollection objects are created using the search() function. They can then be manipulated to filter, iterate, or retrieve the individual scenes as well as to retrieve the raster data either through stacking or mosaicking.

Create a SceneCollection by searching:

>>> import descarteslabs as dl

>>> aoi_geometry = {
...     'type': 'Polygon',
...     'coordinates': [
...         [[-95.27841503861751, 42.76556057019057],
...          [-93.15675252485482, 42.36289849433184],
...          [-93.73350276458868, 40.73810018004927],
...          [-95.79766011799035, 41.13809376845988],
...          [-95.27841503861751, 42.76556057019057]]
...     ]
... }
...
>>> scenes, ctx = dl.scenes.search(
...     aoi_geometry,
...     products=["landsat:LC08:PRE:TOAR"],
...     limit=10
... )
...
>>> scenes
SceneCollection of 10 scenes
* Dates: Apr 18, 2013 to Sep 09, 2013
* Products: landsat:LC08:PRE:TOAR: 10
>>> ctx
AOI(geometry=<shapely.geom...x7f2a2dd01828>,
resolution=15.0,
crs='EPSG:32615',
align_pixels=True,
bounds=(-95.79766011799035, 40.73810018004927, -93.15675252485482, 42.76556057019057),
bounds_crs='EPSG:4326',
shape=None)


A SceneCollection functions as a random-access collection:

>>> scenes[0].properties.id
'landsat:LC08:PRE:TOAR:meta_LC80260312013108_v1'


Use each() and filter() to subselect Scenes you want:

>>> # which month is each scene from?
>>> scenes.each.properties.date.month.combine()
[4, 5, 5, 6, 6, 7, 7, 8, 8, 9]
>>> spring_scenes = scenes.filter(lambda s: s.properties.date.month <= 6)
>>> spring_scenes
SceneCollection of 5 scenes
* Dates: Apr 18, 2013 to Jun 21, 2013
* Products: landsat:LC08:PRE:TOAR: 5


Operate on related Scenes with groupby():

>>> for month, month_scenes in spring_scenes.groupby("properties.date.month"):
...     print("Month {}: {} scenes".format(month, len(month_scenes)))
...
Month 4: 1 scenes
Month 5: 2 scenes
Month 6: 2 scenes


Load data with stack() or mosaic(). Stacking yields an ndarray with an additional dimension corresponding to the different scenes in the collection, while mosaicking will render a single mosaic image.

>>> ctx_lowres = ctx.assign(resolution=120)
>>> stack = spring_scenes.stack("red green blue", ctx_lowres)
>>> stack.shape
(5, 3, 1845, 1862)
>>> mosaic = spring_scenes.mosaic("red green blue", ctx_lowres)
>>> mosaic.shape
(3, 1845, 1862)
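
The shape difference can be reproduced with plain numpy: stacking adds a leading scene axis, while one simple mosaic rule fills each masked pixel from the next scene. A simplified sketch of the idea (not the library's compositing logic):

```python
import numpy as np

# Two fake single-band "scenes" (1 band, 2x2 pixels); the first has a masked pixel:
a = np.ma.array([[[1, 2], [3, 4]]], mask=[[[False, True], [False, False]]])
b = np.ma.array([[[9, 9], [9, 9]]], mask=False)

stack = np.ma.stack([a, b])  # leading scene axis: (scenes, bands, y, x)
print(stack.shape)           # (2, 1, 2, 2)

# Where the first scene is masked, take the pixel from the second:
mosaic = np.ma.where(a.mask, b, a)
print(mosaic.tolist())       # [[[1, 9], [3, 4]]]
```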


Download georeferenced images with download() and download_mosaic():

>>> spring_scenes.download("red green blue", ctx_lowres, "rasters")
>>> spring_scenes.download_mosaic("nir red", ctx_lowres)
'mosaic-nir-red-alpha.tif'


All of the methods for retrieving raster data support the same scaling and data_type parameters as the Scene class. The behavior is equivalent, with one important difference: a SceneCollection can contain scenes from more than one product, and in order to render them with scaling the relevant properties of the bands from the multiple products must be compatible, or an error will be raised.

### GeoContext

GeoContext objects provide for the consistent specification of geospatial properties, including coordinates, coordinate systems, and resolution, across all parts of the Scenes module. While it is possible to construct a GeoContext explicitly, they are normally obtained via the from_id() method or the search() function, based either on the geometry of an area of interest specified by the user or on the geospatial properties of the raw imagery.

The AOI and DLTile classes provide specializations of GeoContext for specific purposes.

>>> import descarteslabs as dl

>>> scene, default_ctx = dl.scenes.Scene.from_id(
...     'landsat:LC08:PRE:TOAR:meta_LC80260322013188_v1'
...     )
...
>>> default_ctx
AOI(geometry=None,
resolution=15.0,
crs='EPSG:32615',
align_pixels=False,
bounds=(347692.5, 4345807.5, 580732.5, 4582807.5),
bounds_crs='EPSG:32615',
shape=None)


GeoContexts are immutable; instead, create copies with new values using assign(). (Assigning new values to a DLTile is only supported for the pad parameter.) Let’s use a lower resolution to load images faster:

>>> lowres = default_ctx.assign(resolution=75)
>>> lowres_arr = scene.ndarray("red green blue", lowres)
>>>
>>> dl.scenes.display(lowres_arr, size=4,
...                   title="Default GeoContext, 75-meter resolution")
...


You can also create GeoContexts explicitly. We’ll make a new polygon half the size of the scene’s full extent, and then use Web Mercator.

>>> import shapely.affinity
>>>
>>> new_cutline = shapely.affinity.scale(scene.geometry, xfact=0.5, yfact=0.5)
>>> webmerc_cutline_aoi = dl.scenes.AOI(
...     geometry=new_cutline,
...     resolution=75,
...     crs="EPSG:3857"  # "EPSG:3857" is the code for the Web Mercator
... )                    # coordinate reference system, see http://epsg.io/3857
...
>>> webmerc_cutline_arr = scene.ndarray("red green blue", webmerc_cutline_aoi)
>>> dl.scenes.display(webmerc_cutline_arr, size=4,
...                   title="Same scene, with cutline and Web Mercator")
...


Let’s assign our new cutline to the default GeoContext to see the difference between the coordinate reference systems:

>>> with_cutline = lowres.assign(geometry=new_cutline)
>>> with_cutline_arr = scene.ndarray("red green blue", with_cutline)
>>> dl.scenes.display(with_cutline_arr, size=4,
...                   title="Original GeoContext with new cutline")
...


Why is there all that empty space around the sides? We assigned a new geometry, but we didn’t change the bounds. Bounds determine the x-y extent that is rasterized; the geometry just clips within that extent. You can pass bounds="update" to compute new bounds when assigning a new geometry.

>>> cutline_bounds = lowres.assign(geometry=new_cutline, bounds="update")
>>> cutline_bounds_arr = scene.ndarray("red green blue", cutline_bounds)
>>> dl.scenes.display(cutline_bounds_arr, size=4,
...                   title="Original GeoContext, new cutline and bounds")
...
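
Under the hood, computing new bounds from a geometry amounts to taking the envelope of its coordinates. A sketch in plain Python (geometry_bounds is a hypothetical helper), using the tour's AOI polygon, whose envelope matches the bounds shown in the first search example:

```python
def geometry_bounds(geojson_polygon):
    """Envelope (minx, miny, maxx, maxy) of a GeoJSON polygon's exterior ring."""
    xs, ys = zip(*geojson_polygon["coordinates"][0])
    return (min(xs), min(ys), max(xs), max(ys))

aoi_geometry = {
    "type": "Polygon",
    "coordinates": [[
        [-93.52300099792355, 41.241436141055345],
        [-93.7138666, 40.703737],
        [-94.37053769704536, 40.83098709945576],
        [-94.2036617, 41.3717716],
        [-93.52300099792355, 41.241436141055345],
    ]],
}
print(geometry_bounds(aoi_geometry))
# (-94.37053769704536, 40.703737, -93.52300099792355, 41.3717716)
```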


Bounds can be expressed in any coordinate reference system, set in bounds_crs. They’re typically either in the native CRS of the Scene, or in WGS84 when clipping to a geometry. Note that when computing bounds from a geometry, bounds_crs is automatically set to EPSG:4326 (short for WGS84 lat-lon coordinates), since that’s the CRS in which the geometry is also defined.

You can also use DLTiles to split up regions along a grid:

>>> tiles = dl.scenes.DLTile.from_shape(
...     new_cutline,
...     resolution=75,
... )
...
>>> len(tiles)
38
>>> tile0_arr = scene.ndarray("red green blue", tiles[0])
>>> tile1_arr = scene.ndarray("red green blue", tiles[1])
>>> dl.scenes.display(tile0_arr, tile1_arr,
...                   title=[tiles[0].key, tiles[1].key], size=3)
...


## Functions

### Display

The display() function has been used throughout this guide to generate plots of imagery. There is nothing Scene-specific about display(), but it is a very useful helper for working with numpy arrays containing imagery derived from scenes, whether those arrays contain spectral or classification data. It uses the matplotlib package to generate the plots.

As seen previously, display() accepts one or more ndarrays (such as those returned by ndarray(), stack(), or mosaic()), rendering each in a separate figure within the plot. It can accept images with one band, three bands, or three bands plus an alpha band (as long as the three bands are not themselves masked). Colormaps may be applied when the input image contains a single band of either spectral or classification data.

There are plenty of examples already in this guide demonstrating the use of display() with ordinary RGB spectral data. Here’s an example of working with some other kinds of data:

>>> import descarteslabs as dl

>>> aoi_geometry = {
...     "type": "Polygon",
...     "coordinates": [
...         [[-121.80, 39.607],
...          [-121.38, 39.607],
...          [-121.38, 39.880],
...          [-121.80, 39.880],
...          [-121.80, 39.607]]
...     ]
... }
...
>>> scenes, ctx = dl.scenes.search(
...     aoi_geometry,
...     products = ["landsat:LC08:01:T1:TOAR"],
...     start_datetime = "2018-10-22",
...     end_datetime = "2018-12-19"
... )
...
>>> ctx = ctx.assign(resolution=60)
>>> pre_scenes = scenes.filter(lambda s: s.properties.date.month == 10)
>>> post_scenes = scenes.filter(lambda s: s.properties.date.month == 12)
>>>
>>> pre_mosaic = pre_scenes.mosaic("nir swir2 swir2", ctx)
>>> post_mosaic = post_scenes.mosaic("nir swir2 swir2", ctx)


Let’s have a look at some false color IR data:

>>> dl.scenes.display(pre_mosaic, post_mosaic, size=4,
...                   title=['Pre-fire nir and swir',
...                          'Post-fire nir and swir'])
...


Calculate a Normalized Burn Ratio and display it in greyscale:

>>> pre_nir = pre_mosaic[0,:,:].astype('float')
>>> pre_swir = pre_mosaic[1,:,:].astype('float')
>>> pre_nbr = (pre_nir - pre_swir) / (pre_nir + pre_swir)
>>> post_nir = post_mosaic[0,:,:].astype('float')
>>> post_swir = post_mosaic[1,:,:].astype('float')
>>> post_nbr = (post_nir - post_swir) / (post_nir + post_swir)
>>> dl.scenes.display(pre_nbr, post_nbr, size=4,
...                   title=['Pre-fire Normalized Burn Ratio',
...                          'Post-fire Normalized Burn Ratio'])
...


Calculate and display the change in Normalized Burn Ratio (“d-NBR”) using a more interesting colormap. Clipping the computed d-NBR at the unburned (0.1) and severely burned (0.66) levels, and negating it before display, makes the best use of the colormap:

>>> dnbr = pre_nbr - post_nbr
>>> dnbr[dnbr < 0.1] = 0.1
>>> dnbr[dnbr > 0.66] = 0.66
>>> dl.scenes.display(-dnbr, size=4, colormap='RdYlGn',
...                   robust=False, title='d-NBR')
...