Earth Engine and
Google Cloud Platform
Slides Courtesy: Matt Hancher, Engineering Manager, Google Earth Engine
Runcy Oommen
https://runcy.me
A whirlwind tour of Earth Engine
Use these slides as a quick reference later.
Prerequisites: GCP free tier; basic command line knowledge
Objectives
Agenda
Introduction to Google Cloud Platform
Getting Started with Cloud Platform
Command Line Tools: gcloud, gsutil, and earthengine
Exporting Data, Maps, and Map Tiles
Compute Engine and Container Engine
Earth Engine on Google Cloud Platform (GCP)
For the past 19 years, Google has been building out the fastest,
most powerful, highest-quality cloud infrastructure on the planet.
“Google is living a few years
in the future and sending
the rest of us messages”
Doug Cutting - Hadoop Co-Creator
Over 15 Years of Tackling Big Data Problems
[Timeline, 2002 to 2016: Google papers (GFS, MapReduce, BigTable, Dremel, Flume Java, Spanner, Millwheel, Dataflow, TensorFlow), the open source projects they inspired, and the Google Cloud products built on them (BigQuery, Pub/Sub, Dataflow, Bigtable, ML).]
Use the best of Google’s innovation to
solve the problems that matter most to you.
GCP Regions
[Map: current and future GCP regions with their zone counts (Oregon, Iowa, S Carolina, N Virginia, Montreal, California, São Paulo, Finland, Belgium, Netherlands, London, Frankfurt, Mumbai, Singapore, Taiwan, Tokyo, Sydney), over 100 edge points of presence, and leased and owned fiber. Trailing 3-year CAPEX investment: $29.4 billion.]
Solid Foundation
Serverless Data Platform
Let Developers Just Code
The Journey to a Web-Scale Cloud
Phase 1: Physical/Colo. Phase 2: Virtualized. Phase 3: Serverless/No-ops.
(Each phase spans storage, processing, memory, and network.)
Serverless Data Platform
Pub/Sub: just send events
BigQuery: just run queries
Dataflow: just write pipelines
1 Billion Users
Agenda
Introduction to Google Cloud Platform
Getting Started with Cloud Platform
Command Line Tools: gcloud, gsutil, and earthengine
Exporting Data, Maps, and Map Tiles
Compute Engine and Container Engine
All Google Cloud Platform resources live within a project.
Projects manage settings, permissions, billing info, etc.
Cloud Projects
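A minimal sketch (not from the original slides) of creating and selecting a project with the gcloud command line tool; my-project-id is a placeholder:
gcloud projects create my-project-id
gcloud config set project my-project-id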
Cloud Console
Log in at console.cloud.google.com
Cloud Shell
Learn more at cloud.google.com/shell
Durable and highly available object storage (i.e. file storage) in the cloud,
as well as static content serving on the web.
Several storage types all use the same APIs and access methods.
Cloud Storage
Files in Google Cloud Storage are called objects.
You store your objects in one or more buckets.
Buckets live in a single global namespace.
Cloud Storage URL: gs://my-bucket/path/to/my-object
Objects and Buckets
Objects can be either public or private.
You control object permissions using Access Control Lists (ACLs).
ACLs grant READER, WRITER, and OWNER permissions to one or more grantees.
You can set the default ACL for newly-created objects in a bucket.
Cloud Storage Permissions
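For example, a hedged sketch of setting a bucket's default ACL so that newly-created objects are world-readable (my-bucket is a placeholder):
gsutil defacl set public-read gs://my-bucket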
A simple user interface to:
• Create and manage buckets
• Upload and manage objects
Cloud Storage Console
Find it at console.cloud.google.com/storage
Serving Static Content
Public objects are served directly over HTTPS:
https://storage.googleapis.com/my-bucket/path/to/my-object
Private objects can be accessed from a browser by logged-in users too,
but it is slower and involves a URL redirection:
https://console.cloud.google.com/m/cloudstorage/b/my-bucket/o/path/to/my-object
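A small sketch of making an existing object public so it can be served at the storage.googleapis.com URL above (bucket and path are placeholders):
gsutil acl ch -u AllUsers:R gs://my-bucket/path/to/my-object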
Agenda
Introduction to Google Cloud Platform
Getting Started with Cloud Platform
Command Line Tools: gcloud, gsutil, and earthengine
Exporting Data, Maps, and Map Tiles
Compute Engine and Container Engine
The gsutil and gcloud Command Line Tools
gsutil
• Copy data into and out of Cloud Storage.
• Manage your Cloud Storage buckets, ACLs, etc.
gcloud
• Create and manage virtual machines in Compute Engine.
• Create and manage clusters in Dataproc and Container Engine.
• Create and manage Cloud SQL databases.
• ...and much more.
Both come with the Google Cloud SDK: cloud.google.com/sdk
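After installing the SDK, a typical first step (a sketch, not from the original slides) is to authenticate and choose a default project:
gcloud init
gcloud components update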
The earthengine Command Line Tool
earthengine
• Copy, move, and remove assets.
• Upload images and tables from Cloud Storage.
• View and modify asset ACLs.
• Create folders and image collections.
• Manage long-running batch tasks.
Comes with the Earth Engine Python SDK:
developers.google.com/earth-engine/python_install
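A minimal setup sketch, assuming a working Python and pip installation:
pip install earthengine-api
earthengine authenticate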
Manage Assets and Files
List assets and files with ls:
earthengine ls users/username/folder
gsutil ls gs://my-bucket/folder
Copy and move assets and files with cp and mv:
earthengine cp users/username/source users/username/destination
gsutil mv gs://my-bucket/source gs://my-bucket/destination
Remove assets and files with rm:
earthengine rm users/username/asset_id
gsutil rm gs://my-bucket/filename
Create Buckets, Folders, and Collections
Create a Cloud Storage Bucket:
gsutil mb gs://my-new-bucket
Create an Earth Engine folder:
earthengine create folder users/username/my-new-folder
Create an Earth Engine image collection:
earthengine create collection users/username/my-new-collection
Upload images from Cloud Storage to Earth Engine
Simple image upload:
earthengine upload image --asset_id my_asset gs://my-bucket/my_file.tif
Control the pyramid of reduced-resolution data:
--pyramiding_policy sample
(Options are mean, sample, mode, min, and max. The default is mean.)
Control the image’s mask:
--nodata_value=255
--last_band_alpha
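Putting it together, a hedged end-to-end sketch that stages a local GeoTIFF in Cloud Storage and then ingests it with the flags above (file and asset names are placeholders):
gsutil cp my_file.tif gs://my-bucket/my_file.tif
earthengine upload image --asset_id my_asset --pyramiding_policy sample --nodata_value=255 gs://my-bucket/my_file.tif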
Upload Tables from Cloud Storage to Earth Engine
A simple table upload:
earthengine upload table --asset_id my_asset gs://my-bucket/my_file.shp
Shapefiles consist of multiple files. Specify the URL to the main .shp file.
Earth Engine will automatically use sidecar files that have the same base filename but
different extensions.
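For example, a sketch that copies the main .shp file plus its usual sidecars before ingesting (filenames are placeholders; your shapefile may have additional sidecar files):
gsutil cp my_file.shp my_file.shx my_file.dbf my_file.prj gs://my-bucket/
earthengine upload table --asset_id my_asset gs://my-bucket/my_file.shp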
Manage Image Metadata in Earth Engine
Set a metadata property on an image asset:
earthengine asset set -p name=value users/username/asset_id
Set the special start time property on an image asset:
earthengine asset set --time_start 1978-10-15T12:34:56 users/username/asset_id
(You can use the same flags to set properties when uploading an image!)
Dump information about an asset:
earthengine asset info users/username/asset_id
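A sketch that combines these flags at upload time, assuming (as the slide notes) they behave the same way there; names and values are placeholders:
earthengine upload image --asset_id my_asset -p name=value --time_start 1978-10-15T12:34:56 gs://my-bucket/my_file.tif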
Manage Access Permissions
Access Control Lists (ACLs) are how you manage access permissions for private
data.
Get an asset’s or object’s ACL with “acl get”:
earthengine acl get users/username/asset_id
gsutil acl get gs://my-bucket/path/to/my/file
Set a “public” (world-readable) or “private” ACL with “acl set”:
earthengine acl set public users/username/asset_id
gsutil acl set private gs://my-bucket/path/to/my/file
Manage Access Permissions (Part 2)
Copy an ACL from one asset to others with “acl get” and “acl set”:
gsutil acl get gs://my-bucket/source > my_acl
gsutil acl set my_acl gs://my-bucket/destination/*
Change an individual user’s access with “acl ch”:
gsutil acl ch -u user@domain.com:R gs://my-bucket/source
Use :W to grant write access, or -d to delete the user’s permissions.
Use the special AllUsers user to control whether all users can see your object.
(These all work the same way in earthengine, too.)
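For example, the earthengine equivalent of the "acl ch" command above (a sketch, assuming the same user:permission syntax):
earthengine acl ch -u user@domain.com:R users/username/asset_id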
Agenda
Introduction to Google Cloud Platform
Getting Started with Cloud Platform
Command Line Tools: gcloud, gsutil, and earthengine
Exporting Data, Maps, and Map Tiles
Compute Engine and Container Engine
Exporting Images to Cloud Storage
You can export images directly to Cloud Storage.
// Export an image to Cloud Storage.
Export.image.toCloudStorage({
image: image,
description: 'MyImageExport',
bucket: 'my-bucket',
fileNamePrefix: 'my_filename',
scale: 30,
region: geometry,
});
This will produce a file named gs://my-bucket/my_filename.tif.
If the image is too large it will be automatically split across multiple files.
Exporting Images to Cloud Storage
Or do it in Python.
from ee.batch import Export
# Export an image to Cloud Storage.
task = Export.image.toCloudStorage(
image=image,
description='MyImageExport',
bucket='my-bucket',
fileNamePrefix='my_filename',
scale=30,
region=geometry,
)
task.start()
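Once a task has started, one way to monitor it is with the earthengine command line tool (a sketch; TASK_ID is a placeholder reported by the task list):
earthengine task list
earthengine task info TASK_ID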
Exporting Tables to Cloud Storage
Export tables (i.e. FeatureCollections) directly to Cloud Storage.
# Export a table to Cloud Storage.
task = Export.table.toCloudStorage(
collection=features,
description='MyTableExport',
bucket='my-bucket',
fileNamePrefix='my_filename',
)
This will produce a file named gs://my-bucket/my_filename.csv.
In addition to CSV, you can also export Shapefiles, GeoJSON, KML, or KMZ.
Exporting Videos to Cloud Storage
Export videos directly to Cloud Storage.
# Export a video to Cloud Storage.
task = Export.video.toCloudStorage(
collection=images,
description='MyVideoExport',
bucket='my-bucket',
dimensions=720,
framesPerSecond=12,
region=geometry,
)
This will produce a file named gs://my-bucket/MyVideoExport.mp4 (taken from the description, since no fileNamePrefix is given).
Exporting Maps and Map Tiles
Export map tiles directly to Cloud Storage.
# Export a map to Cloud Storage.
task = Export.map.toCloudStorage(
image=image,
description='MyMapExport',
bucket='my-bucket',
path='my_folder',
region=geometry,
maxZoom=5,
)
This will produce a folder named gs://my-bucket/my_folder/ containing your tiles.
Exporting Maps and Map Tiles
The map tile path is: folder/Z/X/Y
Z: The zoom level. Level 0 is global, and each higher level is twice the
resolution.
X, Y: The x and y positions of the tile within the zoom level. 0/0 is the upper
left.
The map tiles are in the Google Maps Mercator projection, which is used by most web
mapping applications.
If you specifically request PNG or JPG tiles then they will have a .png or .jpg extension.
By default they are a mix of PNG and JPG (a.k.a. “AUTO”) and have no file extension.
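For instance, after the map export sketch earlier (path 'my_folder', maxZoom 5), you could list the tiles at a given zoom level (a hedged example; bucket and folder names are placeholders):
gsutil ls gs://my-bucket/my_folder/5/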
Exporting Maps and Map Tiles
Browse the output in the Cloud Storage Browser: console.cloud.google.com/storage/browser
index.html: A Simple HTML Map Viewer
Quickly view your map tiles.
Share a link.
Embed in an <iframe>.
index.html: A Simple HTML Map Viewer
Or, use the code as a starting point
for a custom app.
If you expect significant traffic, be sure to
sign up for a Maps API key.
earth.html: View in Google Earth!
View and share your imagery in the
new Google Earth on the web, iOS,
and Android!
Agenda
Introduction to Google Cloud Platform
Getting Started with Cloud Platform
Command Line Tools: gcloud, gsutil, and earthengine
Exporting Data, Maps, and Map Tiles
Compute Engine and Container Engine
Google Compute Engine
• Virtual machines in Google's advanced data centers.
• Scale up from single instances to whatever you need, instantly.
• Custom machine types let you pay for only what you need.
• Long-running workloads are automatically discounted.
• Our efficient infrastructure is powered entirely by renewable energy.
Compute Engine with Earth Engine
Two common reasons to use Compute Engine and Earth Engine together:
Run third-party binaries or legacy tools, or run computations that can't be expressed
in the Earth Engine API, using data from the Earth Engine data catalog.
Run applications built with the EE Python API, such as custom-built web applications.
(But also consider App Engine for this use case; it's often simpler.)
Data never has to leave the cloud. Use Cloud Storage as a staging area.
Compute Engine Console
Find it at console.cloud.google.com/compute
Getting Started with Compute Engine
Compute Engine Quick Start:
cloud.google.com/compute/docs/quickstart-linux
Install the Earth Engine SDK:
sudo apt-get update
sudo apt-get install libffi-dev libssl-dev python-dev python-pip
sudo pip install cryptography google-api-python-client earthengine-api
Using your Ordinary Google Account
Create your virtual machine instance:
gcloud compute instances create my-instance --machine-type f1-micro --zone us-central1-a
Log into your instance via ssh:
gcloud compute ssh my-instance
Now authenticate your instance to EE:
earthengine authenticate
It will print a URL to log in via your browser. Copy and paste the code back into the shell.
Using your Ordinary Google Account
Now you can easily authenticate to Earth Engine from Python:
import ee
ee.Initialize()
That's it! Once you've logged in and authenticated, your credentials are stored locally
on the VM and are used by default.
Using your Compute Engine Service Account
Configure the appropriate scopes when you create your Compute Engine instance:
GCP_SCOPE=https://www.googleapis.com/auth/cloud-platform
EE_SCOPE=https://www.googleapis.com/auth/earthengine
gcloud compute instances create my-instance \
    --machine-type f1-micro --scopes ${GCP_SCOPE},${EE_SCOPE}
Note: Today you can only create Compute Engine instances with custom scopes using
the gcloud command line tool, not via the Compute Engine web UI.
Using your Compute Engine Service Account
Now you can easily authenticate to Earth Engine from Python:
import ee
from oauth2client.client import GoogleCredentials
ee.Initialize(GoogleCredentials.get_application_default())
That's it! In a properly-configured VM you never have to worry about managing service
account credentials.
Authorizing your Compute Engine Service Account
Email earthengine@google.com to authorize your service account for EE.
(You only need to do this once: all your Compute Engine instances can share the same service
account. If you prefer, you can configure additional service accounts to isolate apps from each other.)
To find your Service Account id:
gcloud compute instances describe my-instance
...
serviceAccounts:
- email: 622754926664-compute@developer.gserviceaccount.com
...
Remember to share any private assets you need with that account!
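For example, a sketch of granting the service account read access to one of your private assets (reusing the example account ID above; the asset path is a placeholder):
earthengine acl ch -u 622754926664-compute@developer.gserviceaccount.com:R users/username/asset_id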
Compute Engine Pricing
You can choose standard machine sizes,
or you can configure a custom machine size.
Automatic discounts for sustained use.
Preemptible VMs are discounted to around 21% of the base rate!
Free usage tier: one f1-micro VM instance.
Typical US prices for a few machine types:
Type            CPUs        Memory    Typical Price / Hour    Sustained Price / Month
f1-micro        1 (shared)  0.60 GB   $0.007                  $3.88
n1-standard-1   1           3.75 GB   $0.0475                 $24.27
n1-standard-16  16          60 GB     $0.76                   $388.36
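Two hedged sketches of the pricing levers above, a custom machine type and a preemptible VM (instance names and sizes are placeholders):
gcloud compute instances create my-custom-instance --custom-cpu 2 --custom-memory 8GB
gcloud compute instances create my-preemptible-instance --machine-type n1-standard-1 --preemptible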
Google Container Engine
A powerful automated cluster manager for running
container clusters on Compute Engine.
Lets you set up a cluster in minutes, based on
requirements you define (such as CPU and memory).
Built on Docker and the open-source Kubernetes system.
cloud.google.com/container-engine/docs
console.cloud.google.com/kubernetes
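A minimal sketch of bringing up a cluster and pointing kubectl at it (names, zone, and sizes are placeholders):
gcloud container clusters create my-cluster --num-nodes 3 --machine-type n1-standard-1 --zone us-central1-a
gcloud container clusters get-credentials my-cluster --zone us-central1-a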
Containers at Google
We launch over 2 billion containers per week.
Earth Engine as a GCP Service
The Earth Engine Cloud API is an HTTP/REST
interface to Earth Engine.
Available for early access (pre-Alpha) now:
● Access data in the Earth Engine catalog
● GDAL interface (in partnership with Planet)
● Ingest data and manage batch tasks
Coming soon:
● On-the-fly computation and queries
● Batch computation and exports
Coming later:
● Cloud IAM integration, etc.
Example: Get Info About an Asset
GET /v1/assets/CIESIN/GPWv4/population-density/2000
Returns:
{
"type": "IMAGE",
"path": "CIESIN/GPWv4/population-density/2000",
"updateTime": "2016-12-16T19:51:16.107Z",
"time": "2000-01-01T00:00:00Z",
"bands": [
{
"name": "population-density",
"dataType": {
"precision": "FLOAT32"
},
...
"sizeBytes": "198246654"
}
(Note: Output is truncated to fit on one slide!)
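A hedged sketch of issuing this request from the command line, assuming the pre-Alpha API is served from earthengine.googleapis.com (the hostname is an assumption, not from the original slides) and that your credentials are authorized:
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://earthengine.googleapis.com/v1/assets/CIESIN/GPWv4/population-density/2000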
Example: Fetch Pixels from an Asset
POST /v1/assets:getPixels
{
"path": "LANDSAT/LC8/LC80440342017037LGN00",
"bandIds": ["B5", "B4", "B3"],
"visualizationParams": {
"ranges": [{"min": 0, "max": 25000}]
},
"encoding": "JPEG",
"pixelGrid": {
"affine_transform": {
"scaleX": 30,
"scaleY": -30,
"translateX": 580635,
"translateY": 4147365
},
"dimensions": {
"width": 256,
"height": 256
}
}
}
Returns: the requested pixels, encoded as a JPEG image.
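A similar command-line sketch for this call, with the request body above saved as request.json (the hostname and filename are assumptions, not from the original slides):
curl -X POST -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json" -d @request.json https://earthengine.googleapis.com/v1/assets:getPixels > tile.jpg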
Thank you!
Madagascar Lemur