General questions
I have a problem, please help!
Please post a question following this guide, providing as much information as possible (see the Stack Overflow guide to asking good questions). This usually means:
- sending a link to a script in the Code Editor (click on the "Get Link" button to obtain it)
- sharing any assets needed to run the script
- for failed batch tasks, reporting the ID of the failed task. Task IDs look like this: 4C25GIJBMB52PXTEJTF4JLGL. They can be found in the Tasks tab of the Code Editor. Learn more.
Where can I read about the Earth Engine architecture?
Please see this paper: Gorelick et al. 2017.
Are there any recommended Earth Engine tutorials not created by Google?
See the EDU and Training Resources pages.
Are there any recommended remote sensing tutorials?
See this free EE course by Ujaval Gandhi, which includes a video with an introduction to remote sensing.
Earth Engine programming
What are some common coding errors?
See the debugging guide.
Why can't I use simple math like ee.Image("image") * 2?
In EE you should not mix server-side and client-side objects or operations. All operations on EE objects are performed server-side. Any client-side computations will not do what you intend them to do. Please see this page for more details.
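As a sketch of the distinction, the following Code Editor snippet contrasts the client-side expression (which does not do what you intend) with the server-side method that performs the same operation. The dataset ID is just an example.

```javascript
// Client-side: JavaScript's * operator knows nothing about ee.Image
// objects, so this does NOT scale pixel values (it evaluates to NaN).
// var wrong = ee.Image('CGIAR/SRTM90_V4') * 2;

// Server-side: use the ee.Image method so the computation happens on
// Earth Engine's servers.
var right = ee.Image('CGIAR/SRTM90_V4').multiply(2);
print(right);
```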
How can I use for loops or if/else statements?
Earth Engine programming is done using a functional language, so loops and conditional operations should be expressed using equivalent concepts like map or filter. Please see this page for more details.
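For instance, a client-side for loop that doubles a list of numbers can be rewritten server-side with map; this sketch runs in the Code Editor:

```javascript
// Instead of: for (var i = 0; i < 5; i++) { ... }
// map a function over a server-side list.
var doubled = ee.List.sequence(1, 5).map(function (n) {
  return ee.Number(n).multiply(2);
});
print(doubled);  // [2, 4, 6, 8, 10]
```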
How do I show text labels in my image or video?
There is no built-in support for text labels, but you can:
- Use a third-party JS package. See example.
- Use the third-party Python package geemap.
- Use QGIS, bringing in EE images via the EE QGIS plugin.
Can I use some standard color palettes?
Use a third-party JS package ee-palettes.
How do I create my own website that uses Earth Engine?
Use Earth Engine Apps for simple applications. In more complex cases, you can build EE-powered App Engine apps.
How do Map IDs work?
Map IDs (called mapid throughout the API) are keys which allow clients to fetch map tiles. Each ID is a hash created by providing an image expression to the getMapId endpoint. The resulting IDs are keys which point to both the image expression and the user credentials that will be used to generate tiles in a later stage.
Requesting map tiles involves specifying the location of the tile (x, y, zoom) as well as the mapid (the key to the image and credentials). The same ID can be reused to load many map tiles. There are no limits on the reuse of mapid keys, but they expire after a few hours. We don't currently publish guarantees about how long they persist, but any code you write should be resilient to the ID expiring.
Creating these IDs involves storing a small amount of data and validating credentials, so it's best to reuse them for as long as possible. There's no API quota specifically associated with the getMapId endpoint, but any workflow which involves creating mapid objects at a rate anywhere near the rate of fetching tiles is probably doing something wrong. Earth Engine has no API endpoint to remove, list, or manage these IDs, since they are transient resources.
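A minimal sketch of that flow in the JavaScript client library, where the ID is created once and then reused for every tile request (the tile coordinates here are arbitrary examples):

```javascript
// Create a mapid once for an image expression...
var image = ee.Image('CGIAR/SRTM90_V4').visualize({min: 0, max: 3000});
var mapId = image.getMapId();

// ...then reuse it for as many tile fetches as needed.
// ee.data.getTileUrl builds the URL for the tile at (x, y, zoom).
var url = ee.data.getTileUrl(mapId, 163, 395, 10);
print(url);
```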
Why does ee.Algorithms.If() print both the true and false case?
function a() { print("true"); }
function b() { print("false"); }
// Prints 'true' and 'false'.
ee.Algorithms.If(true, a(), b());
The If() algorithm works just like every other algorithm in Earth Engine: all of its arguments have to be evaluated before the algorithm itself can run. The algorithm receives both the trueCase and falseCase results and then picks and returns one based on the condition argument, but both paths had to be executed in order for those values to be passed into the algorithm in the first place.
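The same eager-evaluation behavior can be seen in plain JavaScript (outside Earth Engine), where a function's arguments are always evaluated before its body runs:

```javascript
var log = [];
function trueCase() { log.push('true'); return 1; }
function falseCase() { log.push('false'); return 0; }

// pick() receives already-computed values, just like ee.Algorithms.If().
function pick(condition, a, b) { return condition ? a : b; }

var result = pick(true, trueCase(), falseCase());
// Both branches ran before pick() was entered: log is ['true', 'false'].
console.log(log, result);
```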
I get the error "Request payload size exceeds the limit"
You are trying to send a very large request to Earth Engine. This typically happens when the code uses a lot of client-side 'for' loops or constructs a FeatureCollection out of many geometry objects. In the latter case, instead of creating such geometries in your script, generate a CSV file containing them and upload it as a table asset.
What is the difference between ee.Image.clip() and ee.Filter.bounds()?
See this GIS Stack Exchange thread.
The ee.Image.clip() function masks out pixels that do not intersect a given ee.Geometry or ee.Feature, making them transparent in visualizations and excluded from computations. You can conceptualize it as clipping off pixels from an image.
The ee.Filter.bounds() function filters ee.Image objects out of an ee.ImageCollection based on image intersection with an ee.Geometry or ee.Feature. It is used to limit the scope of an analysis to only the images that intersect a given region, which helps to optimize expressions.
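To make the contrast concrete, this Code Editor sketch applies each approach to a Landsat 8 collection over a hypothetical region:

```javascript
var region = ee.Geometry.BBox(-122.6, 37.2, -121.8, 37.9);
var collection = ee.ImageCollection('LANDSAT/LC08/C02/T1_L2');

// ee.Filter.bounds: keep only images whose footprints intersect the region.
var filtered = collection.filter(ee.Filter.bounds(region));

// ee.Image.clip: mask the pixels of a single image outside the region.
var clipped = filtered.first().clip(region);
Map.addLayer(clipped);
```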
How to convert image pixels to feature collections, with one feature per pixel?
Use the ee.Image.sample() function. See usage examples on the function's API reference page.
What is the difference between ee.ImageCollection.merge() and ee.ImageCollection.combine()?
The ee.ImageCollection.merge() function merges all of the images from two collections into one collection, regardless of whether their respective images have coincident bands, metadata, CRS, or scale. It is the union of two collections. The combine() method combines the bands of matching images from two collections into a single collection. Matching images share the same ID (the system:index property). It is an inner join of two collections based on image ID, where bands from matching images are combined. For matching images, bands from the secondary image are appended to the primary image (overwrite is optional). If there are no matching images, an empty collection is returned.
How to filter image collections on multiple date intervals?
See this GIS Stack Exchange thread.
Either merge() multiple collections together or use ee.Filter.or().
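A sketch of the ee.Filter.or() approach, keeping Sentinel-2 images from two separate summer windows (the dates are arbitrary examples):

```javascript
var col = ee.ImageCollection('COPERNICUS/S2_HARMONIZED')
  .filter(ee.Filter.or(
      ee.Filter.date('2020-06-01', '2020-09-01'),
      ee.Filter.date('2021-06-01', '2021-09-01')));
print(col.size());
```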
How to create a bounding box around a given point?
// Buffer the point by a desired radius and then get the bounding box.
var LNG = -117.298;
var LAT = 45.162;
var point = ee.Geometry.Point([LNG, LAT]);
var buffer = point.buffer(30000); // half of box width as buffer input
var box = buffer.bounds(); // draw a bounding box around the buffered point
Map.centerObject(box);
Map.addLayer(point);
Map.addLayer(box);
// Map the buffer and bounds procedure over a point feature collection.
var pointCol = ee.FeatureCollection([
ee.Feature(ee.Geometry.Point([LNG + 1, LAT])),
ee.Feature(ee.Geometry.Point([LNG - 1, LAT]))
]);
var boxCol = pointCol.map(function(feature) {
var box = feature.buffer(30000).bounds();
return feature.setGeometry(box.geometry());
});
Map.addLayer(boxCol);
Data Catalog
Can you add dataset X?
File a dataset request bug following this guide.
You can also upload data into your Earth Engine home folder. See Importing Raster Data and Importing Table Data.
An existing dataset has a new version
File a dataset bug following this guide and indicate that you are requesting a dataset update.
An existing dataset is not updated or is missing assets
Before reporting a problem, please verify, if possible, that the desired assets actually exist on the dataset provider's site. See this page for more details.
If you are looking for an asset by filtering an ImageCollection, make sure your filters are not too restrictive.
In particular, note that:
Sentinel-2 SR (Level 2 data) were not produced by ESA for early Level 1 scenes.
Landsat does not have worldwide coverage before 2000.
An existing dataset has wrong values
Please post on the developers forum. Include a script that zooms into the native resolution of the asset and makes it easy to see what values are wrong. Please explain where exactly you observed the alternative value.
Where else can I ask about datasets?
For questions about NASA datasets, see this forum.
For questions about the toolboxes operating on Copernicus datasets, see the S1, S2, and S3 forums.
How large is the EE catalog?
As of October 2023, the catalog contains over 1000 datasets. Its size on disk is over 90 petabytes (after taking into account lossless compression).
How often are data in EE updated?
Normally, all ongoing datasets are updated at least daily (though not all such datasets have new data every day). Some datasets are updated several times a day. However, there is no policy to guarantee the presence of the most recent assets in the catalog.
How do I view the contents of the EE catalog programmatically?
The list of datasets is exported in the STAC format to a Google Cloud Storage bucket, gs://earthengine-stac. The entry file is catalog.json.
Can I use Google Maps data or imagery for analysis?
Google does not license or sell basemap data for analysis.
How can I find the date an asset was ingested?
The 'system:version' asset property is the ingestion timestamp, formatted as microseconds since the Unix epoch. Here is an example that converts the ingestion timestamp of a Landsat image to a human-readable format.
var image = ee.Image('LANDSAT/LC08/C02/T1_L2/LC08_044034_20210508');
print('Ingest date', ee.Date(image.getNumber('system:version').divide(1000)));
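Outside of Earth Engine, the same conversion is ordinary arithmetic: JavaScript's Date takes milliseconds since the epoch, so divide the microsecond value by 1000. A sketch using a hypothetical ingestion timestamp:

```javascript
// 'system:version' is in microseconds since the Unix epoch.
function ingestDate(versionMicros) {
  return new Date(versionMicros / 1000);  // Date expects milliseconds
}

var d = ingestDate(1620500000000000);  // hypothetical value
console.log(d.toISOString());  // 2021-05-08T18:53:20.000Z
```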
Does Earth Engine Catalog have JSON-LD metadata?
Yes, JSON-LD metadata are embedded in the catalog HTML pages. For example, the Sentinel-2 page contains the following block:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "BreadcrumbList",
"itemListElement": [{
"@type": "ListItem",
"position": 1,
"name": "Earth Engine Data Catalog",
"item": "https://developers.google.com/earth-engine/datasets"
},{
"@type": "ListItem",
"position": 2,
"name": "Harmonized Sentinel-2 MSI: MultiSpectral Instrument, Level-1C",
"item": "https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2_HARMONIZED"
}]
}
</script>
Landsat
How is the simpleComposite algorithm implemented?
The server-side implementation is equivalent to this JavaScript code.
How can I create cloud-free composites from Landsat surface reflectance data?
Landsat Level 2 (surface reflectance) data have several quality bands that can be used to mask out clouds and other unwanted image artifacts. An example of using these bands to process Landsat 8 SR images and create a median cloud-free composite is provided in this GIS Stack Exchange post. The same procedure is used to build a cloud-free composite for use in supervised classification examples in the Developer's Guide.
Is cross-sensor Landsat surface reflectance harmonization needed?
Roy et al., 2016 included an analysis of reflectance differences between Landsat 7-8 TOA and surface reflectance. They published the OLS and RMA coefficients so readers could transform the reflectance values of one sensor's data to another. The final line of the paper states: "Although sensor differences are quite small they may have significant impact depending on the Landsat data application." However, this analysis was based on pre-collection data.
The improvements made during Collection 1 and Collection 2 reprocessing may influence the relationship between sensors, but as far as we know, there have been no analyses similar to Roy et al. (2016) for Collection 1 or Collection 2 data. Despite no formal analysis, there seems to be a general consensus among influential Landsat users that no correction is needed for Collection 2, Level 2 (surface reflectance) data. For example, in a reply to a question regarding the need for Collection 2, Level 2 harmonization, Mike Wulder of the Landsat Science Team noted that depending on the nature of the application of interest (including land cover mapping and change detection), the Collection 2 surface reflectance products are highly suitable and reliable, without need for cross-sensor adjustment.
How can I mask clouds and cloud shadows in MSS imagery?
The third-party msslib module for the JavaScript Code Editor includes an implementation of the MSScvm algorithm, as well as other helpful functions for exploring and preparing MSS data.
Data Management
Who owns the data that I upload?
According to the Earth Engine Terms of Service, customers own the data that they upload to Earth Engine.
I can't upload data!
Please check the upload task status in the Tasks pane in the upper right-hand corner of the Code Editor. You can also view the dedicated task page.
If there is no task, you have probably tried uploading your file through the Code Editor, but due to a networking problem the file never finished uploading, so the task was never created. Please try using a different browser or a different computer.
If there is a failed task, please examine the error it shows. If there is no specific error message, first verify that your file is not corrupt by running gdalinfo for raster files or ogr2ogr for vector files. These commands will try to read all the data from the source files and display errors if the files are corrupt.
Example gdalinfo call:
gdalinfo -mm -stats -checksum file.tif
Example ogr2ogr call that will convert in.shp to out.csv:
ogr2ogr -lco GEOMETRY=AS_WKT -f CSV out.csv in.shp
If the file looks valid, please post the failed task ID as text (not as a screenshot) on the developers mailing list. Task IDs have this format: 4C25GIJBMB52PXTEJTF4JLGL. Please also make your source file publicly readable if possible. If it's a private file, please share it with just earthengine@google.com if you would like the Earth Engine team to examine it. If it's not possible to share the source file, please at least provide the output of gdalinfo -mm -stats -checksum.
If Earth Engine does not support a certain projection, you will need to reproject the data before uploading using, for example, gdalwarp.
How do I upload a file in NetCDF or another unsupported raster format?
Currently only GeoTIFFs can be uploaded to Earth Engine. Other GDAL-compatible formats can be converted to GeoTIFFs using gdal_translate. Example:
gdal_translate -co COMPRESS=DEFLATE file.nc file.tif
Note that some NetCDF or HDF files consist of multiple subdatasets that can be discovered with gdalinfo. The gdal_translate command in this case will look like this (note the file path between the double quotes):
gdal_translate HDF4_EOS:EOS_GRID:"/tmp/MCD12Q1.A2001001.h00v08.005.2011055224312.hdf":MOD12Q1:Land_Cover_Type_1 file.tif
NetCDF files sometimes do not carry a projection that GDAL recognizes. In this case you would need to set the projection and spatial extent in the gdal_translate command line. Example:
gdal_translate -a_srs EPSG:4326 -a_ullr -180 90 180 -90 file.nc file.tif
My raster ingestion has been running for days and has not finished.
Using gdalinfo, check whether your file has the following GDAL option set: INTERLEAVE=PIXEL. For files with this option and many bands, ingestion might never finish because the layout of such files makes reads very slow.
Try converting such files to the band-interleaved layout before uploading:
gdal_translate -co "INTERLEAVE=BAND" src.tif dst.tif
My uploaded rasters do not match the basemap.
If the data are slightly offset from the basemap, the projection probably has an incorrect datum (assumption about the shape of the Earth). This happens most often with the sinusoidal projection, which cannot be fully encoded in GDAL metadata. When you know what the target projection should be (e.g., SR-ORG:6974 for files using the MODIS sinusoidal projection), set the --crs flag during command-line upload or the crs field of the upload manifest.
If the data appear grossly distorted and/or in the completely wrong place, the projection or the affine transform is likely wrong.
My raster only shows up over the Eastern hemisphere.
You probably uploaded a global raster that spans the longitude range [0, 360]. However, Earth Engine requires rasters to be in the range [-180, 180]. Please swap the left and right halves of the raster before ingestion. See these GIS Stack Exchange suggestions.
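One possible recipe for the swap using standard GDAL tools (a sketch, assuming GDAL is installed and a hypothetical 3600x1800-pixel global raster named src.tif spanning [0, 360]; adjust the pixel offsets for your file's dimensions):

```shell
# 1. Extract the two halves of the raster.
gdal_translate -srcwin 0 0 1800 1800 src.tif east.tif      # lon 0..180
gdal_translate -srcwin 1800 0 1800 1800 src.tif west.tif   # lon 180..360
# 2. Reassign the western half's bounds to [-180, 0].
gdal_edit.py -a_ullr -180 90 0 -90 west.tif
# 3. Mosaic the halves back into a single [-180, 180] raster.
gdal_merge.py -o dst.tif west.tif east.tif
```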
Why does my ingested classification image look speckled?
You probably used the default MEAN pyramiding policy. For classification images, the MODE pyramiding policy should be used. For QA/bitmask images, the SAMPLE pyramiding policy should be used.
I get the following error: No data value -128.0 cannot be applied to band #0 of type Short<0, 255>.
GDAL cannot treat single-byte bands as containing signed integers, so it reads such bands as unsigned integers. This would conflict with negative no data values.
If your values are actually signed integers, use manifest upload and add this to the tileset section containing your file: data_type: "INT8"
If your values are unsigned integers, your file has a bad nodata value. You can override it on upload with the correct nodata value (or a value that never occurs, if one exists). You can also use gdal_translate -a_nodata to change the nodata value or gdal_edit.py -unsetnodata to remove it.
How do I upload a file in GeoJSON or another unsupported vector format?
Use ogr2ogr to translate OGR-compatible formats into CSV or SHP. Example:
ogr2ogr -f "ESRI Shapefile" file.shp file.kml
Note that in CSV uploads, the geometry column can contain GeoJSON.
I want to upload data using Python, or upload many files at once.
Use command-line upload. Such uploads require source files to be first placed into a GCS (Google Cloud Storage) bucket. GCS usage does not cost anything if you stay within the free tier limits - see the pricing page.
I want to upload a large raster mosaic split into many tiles.
If the files all have the same projection and pixel size, just upload them together into the same asset; they will be mosaicked automatically.
If the files have different projections or pixel sizes, they cannot be mosaicked into a single raster asset. Instead, upload each tile as a separate asset into the same ImageCollection, which can then be mosaicked using ImageCollection.mosaic().
I am trying to upload a mosaic and get errors about mismatched tiles.
Tiles for Earth Engine raster mosaics must have the same projection and pixel size. In addition, the tiles must align exactly on pixel boundaries.
I am trying to upload a file from a GCS bucket, but Earth Engine cannot find it.
You may have used different Google accounts for the GCS upload and for connecting to Earth Engine. Make sure the GCS file is readable by the account you use to connect to Earth Engine. If browser multilogin makes this confusing, connect to Earth Engine in an anonymous/incognito browser window.
I want to export many assets at once.
We are working on a single-command export of ImageCollections. For now, please export each image separately.
I want to move or delete a Folder or an ImageCollection with a single command.
Currently you have to first move or delete each asset, then move or delete the parent folder or collection. If there are a lot of child assets, write a shell or Python loop to iterate over them.
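Such a loop can be sketched with the earthengine command-line tool (assuming you are authenticated and the collection path below, which is a placeholder, is replaced with your own asset ID):

```shell
# Delete every child asset of a collection, then the collection itself.
COLLECTION="users/username/my_collection"
earthengine ls "$COLLECTION" | while read -r child; do
  earthengine rm "$child"
done
earthengine rm "$COLLECTION"
```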
I want to directly access Earth Engine data from outside of EE.
QGIS has an Earth Engine plugin.
GDAL has an Earth Engine driver.
Other systems can use the EE REST API.
I would like to update a part of my Earth Engine asset without completely reingesting it.
Currently it's not possible to update raster or vector data uploaded into EE. Only asset metadata properties can be updated.
I am losing access to my account. What do I do with my assets?
If the policies of the original account allow data transfer, please share your assets with another account, then copy the assets to be owned by the new account. Use the command-line copy utility if there are many assets to move.
If an account is deleted, the assets owned by it are no longer accessible.
My exported image is in the wrong place.
For some projections, such as sinusoidal or conic, GeoTIFF files sometimes cannot store all the necessary projection parameters. This results in an exported file that appears in the wrong place when viewed in desktop GIS tools or reingested into EE.
To fix this, specify an export crs parameter that's known to work well with GeoTIFF files, for example the EPSG code for the UTM zone containing your area of interest.
What Cloud Storage bucket location should I use to store COG assets?
The answer depends on what you are trying to optimize for. If you are optimizing for low latency computation access, the best GCS bucket locations to store COG assets are US-CENTRAL*. See the Bucket locations page for information on other considerations.
Exported feature collection assets do not preserve properties I set.
No Export.table.* functions preserve table-level properties in the output. For many output formats (e.g., CSV, GeoJSON), there is no support for such metadata. The Export.table.toAsset function could support table-level properties, but does not at this time.
Tables exported to Drive in CSV format are converted to XLSX format.
Depending on your Google Drive settings, CSV tables that you export from Earth Engine can be converted to XLSX files with unintended effects, such as data type conversions. Follow these steps to modify the behavior for subsequent exports.
- In Google Drive on the web, click the Settings cog at the top right.
- Click Settings.
- Scroll down and uncheck "Convert uploaded files to Google Docs editor format".
Code Editor
I cannot log in to the Code Editor because it prompts me to use the wrong account.
Please log out, select the account that is registered to use Earth Engine in the "Choose an account" page, and then re-select the same account in the second "Choose an account to continue to Earth Engine Code Editor" page (exact wording may be different).
I want to screenshot a global map, but don't like the Web Mercator projection.
The map projection used in the Code Editor is Web Mercator ('EPSG:3857'). It inflates the size of objects away from the equator, making high-latitude regions appear much larger than they actually are compared to regions near the equator. You cannot change the projection of the Code Editor's map canvas, but you can "paint" an image in the projection of your choice onto the Web Mercator canvas using the ee.Image.changeProj method. See the method's API reference page for an example of displaying a global DEM on the Code Editor map in the Robinson projection.
Note that this technique should only be used for visualization purposes, as the Code Editor's inspector and drawing tools still operate in Web Mercator.
My script does not save (Script error: Unknown commit).
If you receive a Script error: Unknown commit message when saving a script, it likely means that the repository you are saving to is out of sync. The cause of this state is variable and difficult to identify. To resolve the issue, try refreshing the script list using the button in the upper right corner of the Scripts tab. If that does not work, try creating a new repository from the New button in the Scripts tab and saving your script there (you may need to move scripts into the new repo from the out-of-sync repo).