Raspberry Pi single-board computers had sold over 12.5 million units by 2017

Baking API Clients in a Raspberry Pi: Planet and Earth Engine in a Box

Samapriya Roy
6 min read · Nov 22, 2017

Over the last couple of years, both Amazon and Google have replicated satellite imagery and made it part of globally accessible dataset catalogs. Earth on AWS and Google’s public release datasets are massive projects that allow users to batch import and process imagery and datasets using existing data endpoints. Along with cloud-native geospatial platforms such as Google Earth Engine and others like Vane and Raster Foundry, it became the standard idea that it was time to move the algorithms to the datasets rather than moving the datasets out of the buckets. Chris Holmes of Planet Labs, who helped develop and contribute to the Cloud Optimized GeoTIFF format, discusses the basic architecture of these technologies further in his Medium posts Part I, Part II, and Part III.

To truly understand what might be possible with an approach that rids the client side of any data storage and analysis needs is to understand not just data democratization but also analysis democratization: you can write inexpensive algorithms and send them as requests to existing analysis platforms. This proof of concept was an experiment in putting barebones remote sensing tools and Python command-line interfaces on a $35 computer to understand how such an implementation holds up.

This allows a user with even a bare-minimum setup to approach remote sensing while still maintaining full functionality.

Neofetch System Readout for Raspberry Pi where I am running all API based tools

Perhaps one of the most interesting ideas of the last couple of years is that, since these datasets are nearly “streamable” in applications of interest, the client side is almost freed from any heavy lifting. Users can build infrastructure or software insights around existing cloud services with the bare minimum required on the client side.
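As a sketch of what such a lightweight client-side request can look like, the snippet below builds a Planet Data API (v1) quick-search payload as plain JSON. The geometry, dates, and item type are placeholder values, and in practice the payload would be POSTed to the hosted quick-search endpoint rather than evaluated locally.

```python
import json

# Planet Data API v1 quick-search filter: an AndFilter combining a date
# range and an area of interest. All values here are placeholders.
search_filter = {
    "type": "AndFilter",
    "config": [
        {
            "type": "DateRangeFilter",
            "field_name": "acquired",
            "config": {"gte": "2017-01-01T00:00:00Z", "lte": "2017-11-01T00:00:00Z"},
        },
        {
            "type": "GeometryFilter",
            "field_name": "geometry",
            "config": {
                "type": "Polygon",
                "coordinates": [[[-122.5, 37.7], [-122.3, 37.7], [-122.3, 37.9],
                                 [-122.5, 37.9], [-122.5, 37.7]]],
            },
        },
    ],
}

# The search body names the item types to query alongside the filter; the
# server does the heavy lifting, the client only ships this small payload.
search_request = {"item_types": ["PSScene4Band"], "filter": search_filter}
print(json.dumps(search_request)[:60], "...")
```

The entire client-side cost is assembling and sending a few hundred bytes of JSON, which is exactly why a $35 computer is sufficient.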

Planet API Client on Raspberry Pi

To test this idea of running lightweight APIs on barebones infrastructure, the Planet and Earth Engine APIs, along with the custom-built Planet-Earth Engine Pipeline and Clip and Ship Client, were installed on a Raspberry Pi.

Earth Engine API on Raspberry Pi

This is also unique because standard libraries built for specific hardware are not implemented the same way on the ARM architecture common among single-board computers (SBCs).

The Setup

This will let you set up a Raspberry Pi Model B+ or above to run all the tools and API clients of interest. It assumes you already have a Raspberry Pi; my advice would be to buy one of the bundles, which include the charging cable and a micro-SD card, among a few other components such as a case should you want one.

  • I am using a Raspberry Pi 2, but you can buy a Raspberry Pi 3, which is improved and comes with built-in WiFi and Bluetooth. The same setup can be installed on other systems such as the UDOO and Rock64, but the Raspberry Pi has one of the largest community support platforms.
  • The next step is to get an OS, and the Ubuntu MATE Raspberry Pi image serves this function well. Ubuntu is one of the most common Linux distros, and Ubuntu MATE is a great flavor to install on a Raspberry Pi. It already comes with some of the basic Python libraries you will need for this to work. Installation instructions can be found here. Once setup is complete, the machine can be accessed over SSH for remote login or by simply connecting a display to the HDMI port.
Ubuntu Mate 16.04 Welcome Screen on Raspberry Pi
  • To install the Planet and Earth Engine APIs along with the custom add-ons and tools, you can follow earlier posts on getting started, or simply the set of instructions here and included below [GEE Addon, Clip & Ship CLI, Planet Earth Engine CLI]. Please note this might require you to install additional libraries as needed by the system, but the requirements file should take care of most of them.
Running core and custom API clients on raspberry pi
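A quick post-install sanity check can be done with nothing but the standard library. The module and CLI names below (`planet` for the Planet client, `ee`/`earthengine` for the Earth Engine package) are assumed to match what the pip packages install; the sketch just reports whether each is visible.

```python
import importlib.util
import shutil

def client_status(module_name, cli_name):
    """Report whether a Python module and its CLI entry point are installed."""
    return {
        "module": importlib.util.find_spec(module_name) is not None,
        "cli": shutil.which(cli_name) is not None,
    }

# Assumed names: the planet package exposes `planet`, and the
# earthengine-api package exposes the `ee` module and `earthengine` CLI.
for module_name, cli_name in [("planet", "planet"), ("ee", "earthengine")]:
    print(module_name, client_status(module_name, cli_name))
```

Running this after the pip installs tells you immediately whether a missing system library silently broke one of the clients.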

Most of the native clients do not require a sudo operation, but a couple of the built tools and commands need superuser status to store application logs and errors.

Detailed Setup Instructions to follow along within Raspberry Pi (this was tested with Ubuntu 16.04)

Applications

While the Raspberry Pi experiment is just a proof of concept at running individual boxes with extremely low overhead and rapid deployment, it also gets at some interesting issues.

  • Portability becomes critical. Having a bootable OS is not new, but having a support architecture that can run headless as well as render output to a screen makes the form factor of these SBCs really attractive.
  • Coupled with an external hard drive that attaches to such devices, such as the WD Labs PiDrive, you can get 375 GB of storage for well under a dollar per GB. This is currently my setup, and it becomes interesting because if you want to run a long task that requires staging or scratch space, you can implement it easily and cheaply.
  • In areas with power limitations, this can run off portable power banks and has an extremely low power footprint. This means that even offline, the device can run analysis and secondary operations with bare-minimum requirements.
  • Integration with additional lightweight APIs, such as Slack notifications and Twilio-based text messaging, allows the device to communicate directly using triggers or calls.
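As an illustration, here is a minimal sketch of such a Slack integration using an incoming webhook; the webhook URL is a placeholder and the message format is just one possible choice.

```python
import json
import urllib.request

# Placeholder; a real URL comes from Slack's incoming-webhooks setup page.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def build_alert(scene_ids, aoi_name):
    """Format a Slack message summarizing newly found scenes for an AOI."""
    listing = "\n".join("- " + sid for sid in scene_ids)
    return {"text": f"{len(scene_ids)} new scene(s) over {aoi_name}:\n{listing}"}

def send_alert(payload):
    """POST the message payload to the Slack incoming webhook."""
    request = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
```

A trigger in any workflow, a completed download, a failed task, new imagery, can call `send_alert` and the Pi reports in without any display attached.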

Currently my Raspberry Pi checks for new PlanetScope and RapidEye images in my area of interest and sends me an alert on Slack every day. It is also capable of downloading images and pre-processing them, further acting as a staging buffer.
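That daily check can be sketched as a function that builds a quick-search body for imagery acquired in the last 24 hours. `PSScene4Band` and `REOrthoTile` are assumed here as the PlanetScope and RapidEye item types, and the actual POST to the Planet API is left out.

```python
from datetime import datetime, timedelta

def daily_search_body(aoi_geojson, hours=24):
    """Build a Planet quick-search body for scenes acquired in the last `hours`."""
    since = (datetime.utcnow() - timedelta(hours=hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {
        # Item types assumed for PlanetScope and RapidEye products.
        "item_types": ["PSScene4Band", "REOrthoTile"],
        "filter": {
            "type": "AndFilter",
            "config": [
                {"type": "DateRangeFilter", "field_name": "acquired",
                 "config": {"gte": since}},
                {"type": "GeometryFilter", "field_name": "geometry",
                 "config": aoi_geojson},
            ],
        },
    }
```

A daily cron entry on the Pi can call this, send the body to the search endpoint, and pass any new scene IDs to the Slack alert.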

The purpose of this experiment was to run a portable, deployable, low-cost machine capable of making calls to both the Planet API and the Google Earth Engine API, and of performing tasks such as acting as a resource buffer or staging area for imagery, preprocessing and cleaning up images as needed, and interacting between multiple API endpoints. It was inspired by the series of articles by Chris Holmes from Planet on cloud-native geospatial technologies.
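The staging-area role implies periodic cleanup so the small drive does not fill up. A minimal sketch, assuming staged files can simply be aged out by modification time:

```python
import time
from pathlib import Path

def prune_staging(directory, max_age_days=7):
    """Delete staged files older than max_age_days; return the removed names."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in Path(directory).iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return sorted(removed)
```

Run nightly alongside the imagery check, this keeps the Pi's scratch space bounded without any manual attention.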

The normative trend in building portable technologies is probably to find a middle ground with more flexibility, such that a user gets more control than an app-based environment with no back-end interaction allows, while keeping a fairly simple way to interact with the engine that runs these large, complex data and process chains. In the geospatial world, as remote sensing tools and algorithms get ported and translated into open-source cloud-native methods, calling a compute operation will become simpler and cheaper on both open buckets and commercially closed buckets.

Even with virtual instances replacing portable hardware such as these SBCs, experiments and setups like this still allow sensor environments to be integrated into existing cloud-native ingestion, processing, and analysis platforms. The next step is to look at what is possible from such projects and workflows within the community building software and hardware bridges for the next generation of remote sensing and geospatial analysis, while creating tools and technologies that serve all the architectures and computing resources available at large, making both data and analysis cloud-native.

Note: Most of the hardware resources mentioned in this post are simply what I have used for my personal setup and are not necessarily the only options available but serve as starting point for research while building your own setup.



Written by Samapriya Roy

Remote sensing applications, large scale data processing and management, API applications along with network analysis and geostatistical methods
