Tasks

The Tasks API provides scalable compute capabilities to parallelize your computations. It works by packaging your Python code and executing it on nodes hosted by Descartes Labs in our cloud infrastructure. These nodes can access imagery at extremely high throughput which, paired with horizontal scaling, lets you execute computations over nearly any spatio-temporal scale.

You can view your task groups through the monitor UI.

All features described here require a recent version of the Descartes Labs Python client. See these instructions for installing the latest client.

Basic Example

This basic example shows how to create a new task group and submit a single task to the group.

Note

All the following examples use Python 3.6. If you run a different Python version, change the image argument to match it. See Choosing Your Environment for the available images.

from descarteslabs.client.services.tasks import Tasks

def hello(i):
    import geopandas

    print(geopandas)
    return "hello {}".format(i)

client = Tasks()

print("creating function")
async_func = client.create_function(
    hello,
    name='my-task-hello',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2018.11.27",
    requirements=[
        'geopandas==0.4.0',
    ],
)

# submit a task to the task group
print("submitting a task")
task = async_func(5)

# print the task result and logs
print("waiting for the task to complete")
print(task.result)
print(task.log)

We define a function called hello which prints information about the geopandas package and returns the string hello <argument>.

Then we generate a new task group using the create_function method which specifies the entrypoint function hello, gives the task group a name, and specifies a Docker image that defines the environment in which the code will be executed.

Finally, we call the async function to submit a single task. This submits the task to the task group created by the create_function call, and the reference to the task is stored in the task variable. This also triggers instances to spin up on the backend to execute the task. Instance management is handled in the background. Instances are created or destroyed as needed to match the compute resources required by the job.

A few important features of the Tasks API are highlighted by this example:

  • You can pass any JSON-serializable argument to a task, e.g. arguments with type str, dict, list, None, or any numeric data type.
  • You can import non-standard Python packages to be used in your entrypoint if the packages are specified as requirements or already present in the image you’ve selected.
  • You can access any logging or debugging information, including print statements executed inside your function, through the logs stored in task.log. Logs and details for individual tasks are also available through the monitor UI.

Advanced Tasks Usage

Advanced features of Tasks allow you to

  • organize your code using standard Python package and module conventions instead of writing all of your code inside a single function
  • add Python dependencies and specify particular version requirements
  • include data files that your group requires to run

We recommend that you use these features to improve the readability of your code and better control the environment your code executes on.

Python Package Example

This example shows the features available when you organize your code as a Python package. This and the following examples require some example code. Download the example code.

See scripts/complete_example.py.

from descarteslabs.client.services.tasks import Tasks

client = Tasks()

print("creating function")
async_func = client.create_function(
    "task_examples.complete.simplify",
    name='my-task-hello',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.06",
    include_modules=[
        'task_examples'
    ],
    requirements=[
        "descarteslabs[complete]>=0.17.1",
        'geopandas==0.4.0',
    ],
    include_data=[
        'task_examples/data/*.json'
    ]
)

# submit a task to the task group
print("submitting a task")
task = async_func(5)

# print the task result and logs
print("waiting for the task to complete")
print(task.result)
print(task.log)

Instead of defining our entrypoint function in the deployment script, we’ve organized our code using common Python conventions. We’ve created a task_examples.complete module which contains the simplify function. Additionally, we tell the group to include this package, some additional data, and specific Python requirements for it to run successfully.

Including local packages (include_modules). Your function can make use of any local modules and packages. Specify them by the name you would use to import them. In this example, the assumption is that there is a local directory task_examples containing a complete.py file that defines a simplify function. All submodules of the task_examples package will be included.

Making Python dependencies available to your code (requirements). Your function can make use of any external Python dependencies that you specify as requirements. In this example, we specify a descarteslabs client version and a geopandas version. As long as you pick an image with your desired Python version (Python 3.6 in this case), you can upgrade or downgrade any of your other package dependencies as needed.

Including data files (include_data). You can include local data files that your function and included code can read. Wildcard patterns such as the * (asterisk) - meaning any string - are supported. Your code must use the pkg_resources API to read data files (see below).

Code Organization

We suggest that you use customary ways of organizing the code for a Python project. A common way to organize your source repository looks like this:

myproject/
├── my_package/
|   ├── data/
|   |   └── my_data.txt
|   ├── __init__.py
|   ├── models.py
|   └── utils.py
├── scripts/
|   └── deploy_task_group.py
└── requirements.txt
  • The project’s Python code is all contained within a package called my_package.
  • Data is co-located with code within my_package so it can be referenced relative to the source code.
  • A requirements file at the top level lists all the dependencies for the source code. The same requirements file can be given when creating a task group.
  • A deploy_task_group.py script creates a new task group and kicks off tasks. It contains an entrypoint function (see below) which imports code from my_package to use.

This example follows some general guidelines, but you are not restricted to a single package; you can organize your code any way you want, as long as you can put it together as a list of module names importable in your current local Python environment.
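
Under these conventions, a minimal deploy_task_group.py might look like the sketch below; the entrypoint name my_package.my_entrypoint and the group name are placeholders for your own.

from descarteslabs.client.services.tasks import Tasks

client = Tasks()

# Create the task group from the entrypoint in my_package, shipping the
# package, its requirements, and its data files along with it.
async_func = client.create_function(
    "my_package.my_entrypoint",
    name='my-task-group',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.06",
    include_modules=[
        'my_package',
    ],
    requirements="requirements.txt",
    include_data=[
        'my_package/data/*.txt',
    ],
)

# Kick off one task per work item.
tasks = async_func.map(range(10))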

Entrypoint Function

You can specify an entrypoint function in two ways. As a referenced function:

def f(x):
    from my_package import my_entrypoint

    return my_entrypoint(x)

async_func = tasks.create_function(
    f,
    name='hello-world',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py2/default:v2019.02.06",
    include_modules=[
        'my_package',
    ],
)

Alternatively, you can use a fully-qualified function name:

async_func = tasks.create_function(
    'my_package.my_entrypoint',
    name='hello-world',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py2/default:v2019.02.06",
    include_modules=[
        'my_package',
    ],
)

Some restrictions apply to one or both methods of passing an entrypoint function:

  • *function references only* The function needs to be completely self-contained (see the sketch after this list). Globals (variables defined in the top-level module namespace) cannot be referenced. Define any variables and constants within the function’s local scope. All modules it uses need to be imported within the function.
  • *fully-qualified function name* Any modules referenced in your packages and submodules need to be locally importable.
  • You can only return JSON-serializable values from the function. If a function returns a value that cannot be JSON-serialized, your tasks will fail.
  • You can only pass JSON-serializable arguments to the function, e.g. arguments with type str, dict, list, None, or any numeric data type.
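
To illustrate the self-contained requirement for function references, here is a minimal sketch; the names SCALE, broken_task, and self_contained_task are hypothetical:

import numpy as np  # module-level import: NOT available inside the task

SCALE = 2.0  # module-level global: NOT available inside the task

def broken_task(x):
    # Fails when run as a task: np and SCALE live in the module
    # namespace, which is not shipped with a referenced function.
    return float(np.sqrt(x) * SCALE)

def self_contained_task(x):
    # Works: everything the function needs is imported and defined
    # within its local scope.
    import numpy as np

    scale = 2.0
    return float(np.sqrt(x) * scale)  # float() keeps the result JSON-serializable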

Python Dependencies

You can specify your Python dependencies in two ways. You can give a list of dependencies:

async_func = tasks.create_function(
    requirements=[
        "descarteslabs[complete]==0.17.1",
        "scikit-image==0.13.1".
        "scipy>=1.0.0",
    ],
    ...
)

If you already have your dependencies in a standard requirements file you can give a path (absolute or relative to the current working directory) to that:

async_func = tasks.create_function(
    requirements="path/to/requirements.txt",
    ...
)

The dependency specification and requirements file use the same format you are used to from standard Python packaging tools such as pip. For exhaustive details on this see PEP 508 for dependency specification and the pip documentation on requirements files.
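
For instance, a requirements file in this format might contain:

# requirements.txt
descarteslabs[complete]>=0.17.1
geopandas==0.4.0
scikit-image==0.13.1
scipy>=1.0.0,<2.0.0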

If you specify a different version for a requirement that already exists on the image, your specified version will take precedence over the existing version, allowing you to upgrade or downgrade dependencies as required.

Build Failures

If you give Python dependencies for your task, they are essentially installed with pip from PyPI in your image before a task is run. There is a chance that this dependency build fails. Here are a few reasons why it might fail:

  • You have a typo in your list of requirements and the package doesn’t exist
  • A package version you request is not compatible with the environment (e.g. incompatible Python version)
  • A package needs system libraries or tools to build that are not present in the environment
  • The package fails to download from PyPI because of a transient network problem
  • Data or code files you included are too large (they shouldn’t exceed a total of 10MB)

If a problem occurs during the build, the task group will be in a “build failed” state and not accept tasks anymore. In the monitor UI you can find a URL to the build log that will help you diagnose the problem before attempting to create a group again.

Custom Environments

To reiterate, Python dependencies are installed in the environment of the image you give at group creation time. In other words, the build environment and the runtime environment are the same. If you have unusual build-time requirements such as specific system libraries, you may create a custom image that fulfills them. Contact support for help with creating a custom image.

Data Files

You can specify data files to be included as a list of patterns:

async_func = tasks.create_function(
    include_data=[
        'my_package/data/*.txt',
        'my_package/data/image??.png',
        'my_package/data/document.rst',
    ],
    ...
)

This supports Unix-style pattern expansion as per the glob module in the Python standard library.
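
Because the rules match the glob module, you can preview locally which files a pattern will include:

import glob

# ? matches a single character, * matches any string within a path segment
print(glob.glob('my_package/data/*.txt'))
print(glob.glob('my_package/data/image??.png'))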

In your code you must read data files using the standard pkg_resources API - not by looking for and opening files directly:

import pkg_resources
import my_package

# Read a file as a string
text = pkg_resources.resource_string(my_package.__name__, "data/data.txt")

# Open a file as a file-like object
file_like = pkg_resources.resource_stream(my_package.__name__, "data/data.txt")

We reference data files relative to the package that contains them. For example, the original inclusion path for the file referenced here would have been my_package/data/data.txt, in the package my_package. Colocate your data with your code in a package as much as possible.

The pkg_resources API is part of setuptools; read more about it in its documentation.

More Examples

Multiple Tasks Example

This example illustrates the more typical use case of submitting multiple tasks to a new group.

See scripts/multiple_tasks.py

from descarteslabs.client.services.tasks import Tasks, as_completed

# create the task group
client = Tasks()

print("creating function")
async_func = client.create_function(
    "task_examples.basic.generate_random_image",
    name='my-task-random-image',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.06",
    include_modules=[
        'task_examples'
    ]
)

# submit 20 tasks to the task group
print("submitting tasks")
tasks = async_func.map(range(20))

# print the shape of the image array returned by each task
print("starting to wait for task completions")
for task in as_completed(tasks):
    if task.is_success:
        print(task.result.shape)
    else:
        print(task.exception)
        print(task.log)

Here, we reference the "task_examples.basic.generate_random_image" function, which uses numpy to generate a random image with the number of bands given by its num_bands parameter.
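
The real implementation ships with the example code download, but a minimal sketch of such an entrypoint could look like this:

# task_examples/basic.py (sketch; see the example download for the real file)
import numpy as np

def generate_random_image(num_bands):
    # Build a random image with the requested number of bands and
    # convert it to nested lists so the result is JSON-serializable.
    image = np.random.rand(num_bands, 256, 256)
    return image.tolist()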

This example highlights a few additional features of the Tasks API:

  • To submit tasks to the task group, we are using the map method to submit a task for each of the elements in the list. This is typically the most efficient way to submit tasks to a task group, particularly if the number of tasks is large. You are also able to submit tasks one at a time, e.g. within a for-loop.
  • We use the as_completed method to retrieve the task results for each task as it is completed. Within this loop, we also catch exceptions and print the logs of any failed task.

It’s important to note that the return value from the entrypoint function is converted to a list because return values must be JSON-serializable.
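
You can verify locally that a value survives this constraint using only the standard library:

import json

def check_serializable(value):
    # json.dumps raises TypeError for anything that is not
    # JSON-serializable, which would cause the task to fail.
    json.dumps(value)
    return value

check_serializable([1, 2.5, "hello", None, {"bands": [3, 4]}])  # fine
# check_serializable(numpy.zeros(3))  # would raise TypeError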

GPU-enabled Task Example

In this example, we use GPU-enabled tasks to do TensorFlow matrix multiplication on a GPU. We do this by defining a TensorFlow task that explicitly places operations on the GPU, and we request a GPU when creating the task group. We use a GPU-enabled container image for this task.

See scripts/gpu_example.py

from descarteslabs.client.services.tasks import Tasks


client = Tasks()

# We create our task function, which creates a new task group in the process.
# Note that we use a public GPU-enabled image that Descartes provides: py3.6-gpu.
print("creating function")
async_function = client.create_function(
    "task_examples.gpu.gpu_tf_ex",
    image='us.gcr.io/dl-ci-cd/images/tasks/public/py3.6-gpu/default:v2019.02.06',
    name='gpu_tf_ex',
    # You can request GPUs exactly as you would request CPUs.
    gpus=1,
    include_modules=[
        'task_examples'
    ]
)

# We launch a task, wait for it to complete, and print the result.  This task will only
# succeed if TensorFlow was able to execute the matrix multiplication on the GPU.
print("submitting a task")
task = async_function(3)

print("waiting for the task to complete")
async_function.wait_for_completion(show_progress=True)
print(task.result)

When creating a GPU-enabled task group, add the gpus=1 keyword argument to your client.create_function, client.create_or_get_function, or client.new_group function call. The only other thing you’ll need to run GPU-enabled tasks is a container image that contains the NVIDIA CUDA library and GPU-supporting libraries for whatever computation you wish to do (e.g. the tensorflow-gpu library for TensorFlow with GPU support). We provide several public GPU-oriented container images for Python 2.7, 3.5, and 3.6; see the GPU-enabled images in the Choosing Your Environment section.

This task will only succeed (and return a result) if the TensorFlow matrix multiplication can be placed on a GPU. We only do this for demonstration purposes, however. If you’re using a machine learning framework with a backend that does automatic device placement for operators (e.g. Keras with a TensorFlow backend), you won’t have to explicitly place operations on the GPU; the backend will automatically do that for you. If you’ve already defined a task function that does Keras model training or inference, you shouldn’t have to change that code to take advantage of GPU training/inference within our tasks service.
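
The shipped task_examples/gpu.py does this explicit placement; a rough sketch of the idea, written against the TensorFlow 1.x API found in the GPU images, might look like this:

def gpu_tf_ex(n):
    import numpy as np
    import tensorflow as tf

    # Pin the matrix multiplication to the first GPU; the op fails if no
    # GPU is available, which makes the task a useful GPU smoke test.
    with tf.device('/gpu:0'):
        a = tf.constant(np.random.rand(n, n))
        b = tf.constant(np.random.rand(n, n))
        c = tf.matmul(a, b)

    # Disallow soft placement so TensorFlow cannot silently fall back to CPU.
    config = tf.ConfigProto(allow_soft_placement=False)
    with tf.Session(config=config) as sess:
        return sess.run(c).tolist()  # JSON-serializable return value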

In-Memory File Storage

You can use the directory located at /cache for file storage while your task is running. This directory is a memory-backed filesystem and is a good place to write temporary files. Keep in mind that any data written to /cache counts against the memory limit of your task. Use /cache if you can tolerate the memory usage and you have heavy or high-performance I/O needs.

See scripts/cache_example.py

from descarteslabs.client.services.tasks import Tasks


def hello(i):
    from task_examples.cache import hello

    # specify a file location in the cache to write files to
    return hello(i, "/cache/geometry.wkt")

client = Tasks()

print("creating function")
async_func = client.create_function(
    hello,
    name='my-task-hello',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.06",
    include_modules=[
        "tasks_examples"
    ]
)

# submit a task to the task group
print("submitting a task")
task = async_func(5)

# print the task result and logs
print("waiting for the task to complete")
print(task.result)
print(task.log)

Troubleshooting Tasks

Understanding Tasks Concurrency

Task concurrency can be an indicator of the health of your task group, but it can be tricky to interpret what the concurrency of your group means. Here are some guidelines to help you understand your task concurrency.

If your task group has been running for a while, but you have 0 concurrency, check that you have submitted tasks and there are still tasks pending to be processed.

If you know you have tasks pending but the concurrency is 0 or lower than your minimum_concurrency, it’s likely that your task group is waiting for resources to become available. If the Tasks service is under heavy load, or you’ve asked for a large resource allocation for your tasks, it can take some time for resources to become available for scheduling your workers.

If you’ve set a minimum_concurrency, a best effort is made to schedule the minimum number of workers requested as soon as your task group starts up. If resources to schedule your minimum_concurrency are not available, we will continually attempt to meet at least the minimum. Once scheduled, your group should not drop below the minimum_concurrency until all the remaining tasks are completed. Certain events may cause your group to reschedule workers, which causes the process to start over. These cases are rare, but may account for missing workers.
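
The minimum is requested when you create the group; for example, reusing the entrypoint from the multiple tasks example:

async_func = client.create_function(
    "task_examples.basic.generate_random_image",
    name='my-task-min-concurrency',
    image="us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.06",
    include_modules=[
        'task_examples'
    ],
    # Best effort is made to keep at least 3 workers running
    # while tasks remain.
    minimum_concurrency=3,
)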

Choosing Your Environment

The execution environment for your function in the cloud is defined by the Docker image you pick when creating the function. The images below cover typical use cases. If none of the provided images suits your needs, contact support@descarteslabs.com about customizing an image.

Match your local Python version to the image you choose. Your function will be rejected or might not run successfully if there is a mismatch between your local Python version and the Python version in the image. A differing bugfix version (the “x” in Python version “2.7.x”) is fine.
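
A quick way to check your local version before picking an image:

import sys

# e.g. (3, 6, 8) -> choose a py3.6 image
print(sys.version_info[:3])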

If you need GPU support, choose one of the GPU-enabled images. All GPU-supporting images end with -gpu.

Current Images

Python 2.7.12, Ubuntu 16.04
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py2/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
backports.functools-lru-cache==1.5
backports.weakref==1.0.post1
bleach==1.5.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cryptography==1.2.3
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
enum34==1.1.2
Fiona==1.7.11.post1
funcsigs==1.0.2
futures==3.2.0
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
html5lib==0.9999999
idna==2.7
ipaddress==1.0.16
Keras==2.1.5
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==2.2.3
mercantile==1.0.4
mock==2.0.0
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
pathlib2==2.3.3
pbr==5.1.2
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pyasn1==0.1.9
pyOpenSSL==0.15.1
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scandir==1.9.0
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.10.0
snuggs==1.4.2
subprocess32==3.5.3
tensorboard==1.7.0
tensorflow==1.7.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 2.7.12, Ubuntu 16.04, NVIDIA/CUDA GPU-enabled
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py2-gpu/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: cuda 9.0, GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
backports.functools-lru-cache==1.5
backports.weakref==1.0.post1
bleach==1.5.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cryptography==1.2.3
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
enum34==1.1.2
Fiona==1.7.11.post1
funcsigs==1.0.2
futures==3.2.0
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
html5lib==0.9999999
idna==2.7
ipaddress==1.0.16
Keras==2.1.5
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==2.2.3
mercantile==1.0.4
mock==2.0.0
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
pathlib2==2.3.3
pbr==5.1.2
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pyasn1==0.1.9
pyOpenSSL==0.15.1
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scandir==1.9.0
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.10.0
snuggs==1.4.2
subprocess32==3.5.3
tensorboard==1.7.0
tensorflow-gpu==1.7.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.4.9, Ubuntu 16.04
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.4/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow
Other libraries and tools: GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
bleach==1.5.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
html5lib==0.9999999
idna==2.7
Keras==2.1.5
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==2.2.3
mercantile==1.0.4
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
pathlib2==2.3.3
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pycurl==7.43.0
pygobject==3.20.0
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-apt==1.1.0b1+ubuntu0.16.4.2
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scandir==1.9.0
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
tensorboard==1.7.0
tensorflow==1.7.0
termcolor==1.1.0
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.5.2, Ubuntu 16.04
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.5/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
bleach==1.5.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
html5lib==0.9999999
idna==2.7
Keras==2.1.5
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==3.0.2
mercantile==1.0.4
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
pathlib2==2.3.3
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pycurl==7.43.0
pygobject==3.20.0
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-apt==1.1.0b1+ubuntu0.16.4.2
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
tensorboard==1.7.0
tensorflow==1.7.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.5.2, Ubuntu 16.04, NVIDIA/CUDA GPU-enabled
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.5-gpu/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: cuda 9.0, GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
idna==2.7
Keras==2.1.5
Keras-Applications==1.0.7
Keras-Preprocessing==1.0.9
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==3.0.2
mercantile==1.0.4
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
pathlib2==2.3.3
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pycurl==7.43.0
pygobject==3.20.0
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-apt==1.1.0b1+ubuntu0.16.4.2
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
tensorboard==1.12.2
tensorflow-gpu==1.12.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.6.8, Ubuntu 16.04
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.6/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
bleach==1.5.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
html5lib==0.9999999
idna==2.7
Keras==2.1.5
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==3.0.2
mercantile==1.0.4
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pycurl==7.43.0
pygobject==3.20.0
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-apt==1.1.0b1+ubuntu0.16.4.2
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
tensorboard==1.7.0
tensorflow==1.7.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.6.8, Ubuntu 16.04, NVIDIA/CUDA GPU-enabled
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.6-gpu/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy, Tensorflow, PyTorch
Other libraries and tools: cuda 9.0, GEOS 3.5.1, proj 4.9.2, FFTW 3.3.4
absl-py==0.7.0
affine==2.2.2
astor==0.7.1
astropy==2.0.5
atomicwrites==1.3.0
attrs==18.2.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
gast==0.2.2
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
grpcio==1.18.0
h5py==2.7.1
idna==2.7
Keras==2.1.5
Keras-Applications==1.0.7
Keras-Preprocessing==1.0.9
kiwisolver==1.0.1
Markdown==3.0.1
matplotlib==3.0.2
mercantile==1.0.4
more-itertools==5.0.0
munch==2.3.2
networkx==2.1
numpy==1.11.0
pandas==0.22.0
Pillow==5.1.0
pluggy==0.8.1
protobuf==3.6.1
psutil==5.4.5
py==1.7.0
pycurl==7.43.0
pygobject==3.20.0
pyparsing==2.3.1
pyproj==1.9.5.1
pytest==4.2.0
python-apt==1.1.0b1+ubuntu0.16.4.2
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
PyYAML==3.13
rasterio==0.36.0
requests==2.20.1
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
tensorboard==1.12.2
tensorflow-gpu==1.12.0
termcolor==1.1.0
torch==0.4.0
torchvision==0.2.1
urllib3==1.24.1
Werkzeug==0.14.1
xarray==0.10.3
Python 3.7.1, Ubuntu 18.04
Image: us.gcr.io/dl-ci-cd/images/tasks/public/py3.7/default:v2019.02.12
Date: 02/12/2019
Python highlights: GDAL, numpy, pandas, scikit-image, scikit-learn, scipy
Other libraries and tools: GEOS 3.6.2, proj 4.9.3, FFTW 3.3.7
affine==2.2.2
astropy==3.0.5
attrs==18.2.0
blosc==1.5.1
cachetools==2.0.1
certifi==2018.11.29
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cloudpickle==0.4.0
cycler==0.10.0
Cython==0.29
dask==1.1.1
decorator==4.3.2
descartes==1.1.0
descarteslabs==0.17.1
Fiona==1.7.11.post1
GDAL==2.2.2
geojson==2.4.1
geopandas==0.3.0
h5py==2.7.1
idna==2.7
kiwisolver==1.0.1
matplotlib==3.0.2
mercantile==1.0.4
munch==2.3.2
networkx==2.1
numpy==1.15.4
pandas==0.23.4
Pillow==5.1.0
psutil==5.4.5
pygobject==3.26.1
pyparsing==2.3.1
pyproj==1.9.5.1
python-apt==1.6.3+ubuntu1
python-dateutil==2.8.0
pytz==2018.9
PyWavelets==1.0.1
rasterio==1.0.9
requests==2.20.1
scikit-image==0.14.1
scikit-learn==0.20.0
scipy==1.0.1
Shapely==1.6.4.post1
six==1.12.0
snuggs==1.4.2
toolz==0.9.0
torch==0.4.1.post2
torchvision==0.2.1
urllib3==1.24.1
xarray==0.10.3