Imagine some of the harshest operational conditions in the world, with large swings in temperature and pressure. A world that is profound and yet vastly less explored than most missions undertaken by humanity, and yet something we rely on every single day. With the need for more ocean data and continuous monitoring, the Argo program deployed its first float in the year 2000. Since then these drifting sentinels, these record keepers of ocean data, have grown to number in the thousands, and with every cycle they add to a story that measures the oceans across millions of points in space and time.
The Argo Network
The Argo network is a global array of autonomous floats, or profilers, deployed across the world’s oceans. With some 4,000 floats in total, they measure temperature and salinity and relay those measurements via satellite links to data centers. The international Argo program is further integrated into the Copernicus program and the Global Earth Observation System of Systems (GEOSS).
Each instrument (float) spends almost all its life below the surface. The name Argo was chosen because the array of floats works in partnership with the Jason earth-observing satellites that measure the shape of the ocean surface (in Greek mythology, Jason sailed on his ship the Argo in search of the Golden Fleece). To learn more about Argo, how it works, its data and technology, and its scientific and environmental impact, click here. The best part: all of this data is open and freely available for use.
Argovis and accessing data
The Argo float network has received tremendous support from users across different groups, who have created wonderful libraries like argopy and ArgoFloats for R to explore the datasets from these thousands of floats and millions of profiles. Argovis was created by a group at the University of Colorado Boulder and was designed to “Access Argo profiles via API, or visualize temperature, salinity, and BGC data by location”. You can check out the interface at https://argovis.colorado.edu/
Before we go any further, let’s clear up two terms you will hear quite often. Each float is known as a platform (a platform is usually anything that carries a sensor or an array of sensors). Each time a float submerges and resurfaces, it takes measurements across time and across a depth range in the ocean; these are referred to as profiles, or platform profiles.
The argofloats Python tool was motivated by making these datasets easily searchable and exportable for use alongside other open ocean datasets. I built it on top of the Argovis API, and it is a work in progress to support the effort to make this data more searchable and quickly usable as a command-line tool. You can find the latest docs here and the GitHub page here.
Getting started with the argofloats Python package: Installation
This assumes that you have Python and pip installed on your system. You can test this by going to the terminal (or Windows command prompt) and running
python --version and then pip --version
To install argofloats, a simple CLI for Argovis and Argo floats, you can use one of two methods.
pip install argofloats
or you can also try
git clone https://github.com/samapriya/argofloats.git
cd argofloats
python setup.py install
On Linux, use sudo, or try
pip install argofloats --user
Quick overview tool
The overview tool fetches the latest information about the platform and profile counts and the last-updated time, as well as the distribution of datasets across the different data assembly centers (DACs). The tool takes no arguments and simply serves up existing database information.
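If you are curious what a call like this looks like under the hood, here is a minimal sketch of fetching overview-style information from Argovis yourself with the standard library. The endpoint path below is an assumption modeled on the Argovis web interface, not the tool’s actual code, so treat it as illustrative:

```python
import json
from urllib.request import urlopen

ARGOVIS_BASE = "https://argovis.colorado.edu"

def overview_url(base: str = ARGOVIS_BASE) -> str:
    # NOTE: the "/selection/overview" path is an assumption;
    # check the current Argovis API docs for the real route.
    return f"{base}/selection/overview"

def fetch_overview() -> dict:
    # Fetch platform/profile counts and DAC distribution as JSON
    with urlopen(overview_url(), timeout=30) as resp:
        return json.load(resp)
```

The argofloats overview tool wraps this kind of request and pretty-prints the response for you.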
Platform and Profile Metadata
The platform and profile metadata tools are designed to provide the user with specific information about a given platform or a profile associated with a platform. For platforms, the tool fetches metadata about a platform and pretty-prints the information as a JSON object. This requires a platform ID, also called the WMO number, for the Argo float.
Each platform consists of profiles, where each profile is attached to a platform and represents a single cycle of data collection. So platforms can have multiple profiles, and profiles are generally represented as PlatformID_ProfileNo. This is the argument used by the tool, and it pulls the metadata for that specific profile of that platform/Argo float.
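To make the PlatformID_ProfileNo convention concrete, here is a small sketch of how such an identifier can be built and turned into a request URL. The helper names and the catalog route are my own illustrative assumptions, not the tool’s internals:

```python
def profile_id(platform_id: int, cycle: int) -> str:
    # Profiles are addressed as PlatformID_ProfileNo, e.g. "3900737_12"
    # (platform 3900737, data-collection cycle 12)
    return f"{platform_id}_{cycle}"

def profile_metadata_url(platform_id: int, cycle: int,
                         base: str = "https://argovis.colorado.edu") -> str:
    # NOTE: the "/catalog/profiles/" route is an assumption modeled
    # on the Argovis interface; verify against the current API docs.
    return f"{base}/catalog/profiles/{profile_id(platform_id, cycle)}"
```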
Argovis Platform and Profile exploration
Argovis makes it really easy for a user to follow a platform and export both platform-level and/or profile-level data from the interface. The user can either click on a single platform or draw an area to select multiple profiles, and can also change the date range to expand the search.
The argofloats tool was built on the same principle: selection using a single platform, a profile ID, or even an area of interest (AOI). It is also possible to do an all-time export of all profiles associated with a single platform. The tool also includes date selection similar to Argovis, as shown below.
Argofloats tool Platform Profiles and Profiles export
The platform profiles tool: each platform consists of profiles, where each profile is attached to a platform and represents a single cycle of data collection. So platforms can have multiple profiles, generally represented as PlatformID_ProfileNo. This tool fetches all profiles linked to a specific platform and exports them as a single CSV, including the platform profile ID and measurements. If no path is provided, the profile CSV is saved to the home folder.
The profiles export tool is more general-purpose and allows you to search the Argo floats database using either a lat-lon point and a buffer area, a geometry.geojson file, or a given profile ID. This tool is also capable of running long-time searches, overcoming the 3-month limit imposed by the Argovis API. All use cases are shown below. The outputs are written as CSV files with the prefix argoprofile_
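The way a long-time search gets around a 3-month API limit is conceptually simple: split the requested date range into windows no longer than three months and issue one query per window. Here is a minimal sketch of that chunking logic (the 90-day window size is my approximation of the limit, not a value taken from the tool’s source):

```python
from datetime import date, timedelta

def date_windows(start: date, end: date, max_days: int = 90):
    """Split the inclusive range [start, end] into consecutive
    windows of at most max_days each, so every window stays
    under an API query-length limit."""
    windows = []
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=max_days - 1), end)
        windows.append((cursor, window_end))
        cursor = window_end + timedelta(days=1)
    return windows
```

Each (start, end) pair can then be sent as a separate query and the results concatenated into one CSV.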
Using the point geometry, you can supply a lat-lon and a buffer to create a search and export the results.
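One simple way a point plus a buffer becomes a searchable area is to expand the point into a bounding box and then into a closed polygon ring. The degree-based square buffer below is an illustrative simplification (argofloats may well buffer in kilometres instead):

```python
def point_buffer_bbox(lat: float, lon: float, buffer_deg: float):
    """Return (min_lon, min_lat, max_lon, max_lat) for a square
    search box of +/- buffer_deg around the point."""
    return (lon - buffer_deg, lat - buffer_deg,
            lon + buffer_deg, lat + buffer_deg)

def bbox_to_polygon(bbox):
    # Closed ring of corner coordinates as [lon, lat] pairs,
    # following the GeoJSON coordinate order
    min_lon, min_lat, max_lon, max_lat = bbox
    return [[min_lon, min_lat], [max_lon, min_lat],
            [max_lon, max_lat], [min_lon, max_lat],
            [min_lon, min_lat]]
```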
The area search allows you to search using a geojson file instead of a point-and-buffer setup.
Open Oceans project and Google Earth Engine (GEE) compatibility
This tool is part of the Open Oceans project I started a few weeks ago to help curate open ocean tools and datasets. As I was building the data sandbox in Google Earth Engine, I decided to make all exports from this tool GEE-compatible, so you can import these CSVs directly into Earth Engine for analysis, as well as locate other datasets that might be collocated in time and space.
I hope to continue developing this tool with support and suggestions from the user community. As I said in my earlier blog on aqualink & pyaqua, building a better understanding of the oceans is critical, and it is tied to measuring challenging amounts of data in harsh environments.
Access isn’t the same as accessibility
There is always more than one way for users to access, read, and then follow up on the data. Asking the right questions, tinkering with solutions, and building as a community are what allow growth across different domains. You can also follow me on Twitter to get more updates as I build more for the open oceans.