Fusion Tables to Google Earth Engine (Interface 1 to Interface 2)

Google Fusion Table Migration with & within Google Earth Engine

Google is retiring Google Fusion Tables at the end of 2019, and while most of us are blissfully okay never using a fusion table again, those who use Fusion Tables to feed the Maps API, or who enjoy their integration with Google Earth Engine, need a good way of migrating them. Back in June 2017, Google Earth Engine introduced the concept of Earth Engine tables, along with a method to upload shapefiles directly instead of first converting them into fusion tables, so it became possible to simply ingest your shapefiles as tables. However, those who were happy using both shapefiles and fusion tables will, with Fusion Tables going away, need to make sure they have:

  1. Migrated the fusion tables in their Google Drive to Google Earth Engine as tables.
  2. Tracked down fusion tables that are referenced only in their scripts rather than stored in their own Google Drive, and brought those into their Earth Engine (EE) account as well.
  3. Last but not least, for those who are simply interested in converting fusion tables into shapefiles and exporting them out, this tutorial lets you convert all fusion tables into EE tables and then batch export them as shapefiles.

To do this I built a Python command line interface with a tool for each task, and I am going to walk through each one, since you first need to get some credential files and can then simply use the tools.

Exporting all fusion tables from your Google Drive to EE

Now this step requires you to generate credentials, so I am going to lay it out step by step to make it easy to follow:

  • Go to the Google Drive Rest API v3 Guides Page
  • Choose the Python option from the left-hand side under Quickstarts and click on Enable Drive API; if you are not logged in, it will ask you to log in or re-login as needed.
  • This will generate a client ID and secret, and you can download them by clicking Download Client Configuration, which saves a credentials.json file. This file is important, so hold on to it.
Download Client Configuration as credentials.json file
  • Head over to the GitHub page or install the tool needed
    pip install ft2gee
  • Run the following code
ft2gee drive2tab --gee "users/johndoe/vectors" --credentials "C:\users\johndoe\credentials.json"
  • The first time you run this, your browser will open and ask you to authenticate; the credentials file is then used to create a token.json file. You only have to do this once.
Credentials fetch tokens via OAuth for Google Drive Access
  • At this point sit back and relax: the script finds all fusion table files in your Google Drive, matches them by unique IDs and names, and exports them into a Google Earth Engine asset folder. The script also removes special and invalid characters from names to prevent the export from failing.
The tool also creates folders as needed and skips over existing files
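The one-time token step above can be sketched roughly as follows. This is my own illustrative code, not ft2gee's actual internals: if a cached token file already exists it is reused, otherwise the OAuth flow runs (opening your browser) and the result is saved for next time.

```python
import json
import os
import tempfile

# Sketch (an assumption, not ft2gee's actual code) of one-time token caching:
# reuse token.json if present, otherwise run the OAuth flow and cache it.
def load_or_create_token(path, run_flow):
    if os.path.exists(path):
        with open(path) as fh:
            return json.load(fh)            # reuse the cached token
    token = run_flow()                      # opens the browser the first time
    with open(path, "w") as fh:
        json.dump(token, fh)
    return token

# demo with a stand-in flow instead of a real OAuth round trip
tmp = os.path.join(tempfile.mkdtemp(), "token.json")
print(load_or_create_token(tmp, lambda: {"access_token": "abc"}))
print(load_or_create_token(tmp, lambda: {"access_token": "new"}))  # cached copy wins
```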

One thing to note: the export takes some time, because each and every fusion table is exported into a table within Google Earth Engine. But the tasks are created in batch, so you can always check back later.
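The name cleanup mentioned above can be sketched like this. The helper name and the exact character rules are my own assumptions (ft2gee's actual rules may differ); the underlying constraint is that Earth Engine asset IDs only allow letters, digits, hyphens, and underscores.

```python
import re

def sanitize_asset_name(name):
    """Strip characters that are invalid in an Earth Engine asset ID.

    EE asset names may only contain letters, digits, hyphens, and
    underscores; everything else is replaced with an underscore.
    (Illustrative sketch of the cleanup step the tool performs.)
    """
    return re.sub(r"[^A-Za-z0-9_-]", "_", name.strip())

print(sanitize_asset_name("My Table (2019)!"))  # My_Table__2019__
```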

Exporting all fusion tables from your GEE Scripts to EE Tables

This has a different aim altogether: it relies on the idea that you have downloaded a copy of your scripts from your GEE Git repository. You can git clone your entire Earth Engine scripts library, including repositories to which you have access. Once you have downloaded your scripts, you can point the tool at a particular folder and let it find all references to any fusion tables and export them. Steps involved once you have your scripts downloaded:

  • Setup is as simple as
ft2gee gee2tab --local "C:\johndoe\scripts" --gee "users/johndoe/vec"
  • The script automatically creates a folder as needed and skips existing tables in the folder. Since multiple scripts can repeat the same names, this script does have a limitation if a single name refers to multiple fusion tables, but it avoids duplicate exports by using sets instead of lists, saving room on your task queue.
Export fusion table from scripts to Google Earth Engine Tables
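The scan-and-deduplicate step above can be sketched as follows. The pattern and function names are my own assumptions about how such a scan might work, not ft2gee's actual internals: fusion tables appear in Earth Engine scripts as `ee.FeatureCollection('ft:<id>')`, and collecting matches into a set means each table is only queued for export once.

```python
import re

# Fusion table references look like ft:<document-id> inside EE scripts.
FT_PATTERN = re.compile(r"ft:[A-Za-z0-9_-]+")

def find_fusion_tables(script_text):
    """Return the unique fusion table references found in a script."""
    return set(FT_PATTERN.findall(script_text))  # set -> no duplicate exports

script = """
var roads = ee.FeatureCollection('ft:1Ab_cD2eF');
var again = ee.FeatureCollection("ft:1Ab_cD2eF");  // duplicate reference
var zones = ee.FeatureCollection('ft:9Xy-Zw');
"""
print(sorted(find_fusion_tables(script)))  # ['ft:1Ab_cD2eF', 'ft:9Xy-Zw']
```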

Scriptcheck: updating your GEE Scripts to use EE Tables

This is an experimental tool that parses a script, finds fusion table references, and replaces them with the probable path of a table, depending on whether you exported your fusion tables to that location.

ft2gee scriptcheck --local "C:\johndoe\scripts\CloudShadow" --gee "users/johndoe/vec"

Currently, it also tags the script name with an ‘_FT’ suffix and inserts a timestamp header to make sure you know the script has been modified. You can then re-upload the scripts to Google Earth Engine after you check them.

Tool to replace the instance of Fusion tables with Table reference (experimental)
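The replacement step can be sketched roughly like this. All names here are hypothetical, not the tool's internals, and the real tool adds an ‘_FT’ tag and a time header rather than this simplified comment; the idea is simply to swap each `ft:<id>` reference for the probable path of the exported table under your asset folder.

```python
import re

def rewrite_script(text, id_to_name, gee_folder):
    """Swap each ft:<id> reference for the probable exported table path."""
    def repl(match):
        ft_id = match.group(0)[3:]          # drop the 'ft:' prefix
        name = id_to_name.get(ft_id)
        # leave the reference untouched if that table was never exported
        return f"{gee_folder}/{name}" if name else match.group(0)
    body = re.sub(r"ft:[A-Za-z0-9_-]+", repl, text)
    # simplified stand-in for the tool's '_FT' tag and time header
    return "// Modified by scriptcheck\n" + body

src = "var roads = ee.FeatureCollection('ft:1AbC');"
print(rewrite_script(src, {"1AbC": "roads"}, "users/johndoe/vec"))
```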

There are a few more interesting implications here: those who would simply like to use EE tables as a stepping stone to get their fusion tables out could modify the script to export the fusion tables as shapefiles into Google Drive or GCS buckets. That being said, there are limitations on the size of each file created, so set up a good experiment. For me, this set of tools resolved the issue of losing valuable data on Fusion Tables. Once the experimental scriptcheck tool is completed, it will be able to check whether suggested table paths are actually active, and will be able to batch this for entire folders. I hope you find this tool and this tutorial useful, and I look forward to the interesting and alternative approaches that other Earth Engine users might be using to achieve the same results.

Star the GitHub project if you find it useful, and if this guide helped you, click the clap 👏 button a couple of times below to show your support (you can clap more than once :D)! And if you have used it to recreate something or to create a better workflow, let me know.

Samapriya Roy

Remote sensing applications, large scale data processing and management, API applications along with network analysis and geostatistical methods
