Descartes Labs now has a virtual supercomputer on demand – That’s GCP for you!


Descartes Labs, headquartered in Santa Fe, New Mexico, has created a cloud-based supercomputing platform that applies machine intelligence to enormous datasets. The company’s mission is to better understand the planet, both for profit and for the good of humanity. It now uses advanced data analytics and satellite imagery processing on Google Cloud Platform to model complex systems on the planet, enabling organizations to understand and “see” the world in a whole new way.

What does Descartes Labs put on the table for the world?

Forecasting - Agriculture

A changing climate and a growing population demand accurate agricultural forecasting. Descartes Labs is helping the world identify early signs of famine and address food security crises. Using machine learning, the company can analyze years of scientifically attuned satellite imagery to predict crop yield and crop health, and it can provide programmatic, instant access to satellite images of any geographic location.

The company also gives its customers insight into the global food supply chain by combining deep learning, large-scale remote sensing, and high-performance computing. Academic researchers, governments, and food producers can use this information to ensure that crop harvests are sufficient and that critical links in the food chain remain healthy.
Beyond crop health, the forecasting platform also offers insights into natural resources, human health, the growth of cities, the state of available drinking water, and the spread of forest fires across the world.

How did GCP help Descartes Labs on its journey?


This broad scope of work involves massive and growing datasets, which is what led Descartes to Google Cloud Platform. Google Earth Engine hosts the entire NASA Landsat archive, with imagery dating back to 1973, natively on Cloud Storage, and Descartes made full use of it. What the company still needed was a practical way to process more than a petabyte of satellite imagery covering U.S. corn production without setting up a vast physical infrastructure.
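
To give a concrete sense of what “natively on Cloud Storage” means, here is a minimal sketch that lists Landsat scene files with the google-cloud-storage client. The bucket and prefix are assumptions based on GCP’s public Landsat dataset, not Descartes Labs’ own setup:

```python
# Minimal sketch: list Landsat scene files straight from Cloud Storage.
# Assumes the google-cloud-storage package and GCP's public Landsat bucket
# (gs://gcp-public-data-landsat); the prefix layout is an assumption, and
# none of this reflects Descartes Labs' internal storage.
from google.cloud import storage

client = storage.Client.create_anonymous_client()  # public bucket, no auth needed
blobs = client.list_blobs(
    "gcp-public-data-landsat",
    prefix="LC08/01/",  # Landsat 8, Collection 1 (assumed layout)
    max_results=10,
)
for blob in blobs:
    print(blob.name, blob.size)
```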

As a startup, the company could not afford to wait months to prove its capability in the agricultural market. For added performance, it used Compute Engine and leveraged high-end Intel Xeon Scalable processors. It can now scale compute, storage, and networking to process the entire Landsat image archive in just over 15 hours. With historical back-testing enabled, it can predict corn yields more accurately, and faster, than government organizations.
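
To put that figure in perspective, a back-of-the-envelope calculation (assuming the archive is roughly a petabyte, in line with the figure mentioned above) shows the sustained throughput such a run implies:

```python
# Back-of-the-envelope throughput for ~1 PB processed in ~15 hours.
# The 1 PB figure is an assumption based on the archive size mentioned above.
archive_bytes = 1e15                # ~1 petabyte
wall_clock_seconds = 15 * 3600      # 15 hours
throughput_gb_per_s = archive_bytes / wall_clock_seconds / 1e9
print(f"~{throughput_gb_per_s:.1f} GB/s sustained")  # ~18.5 GB/s
```

Sustaining roughly 18–19 GB/s is far beyond what a single machine’s I/O can deliver, which is why the work has to be spread across many instances at once.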

Supercomputing with GCP


By using GCP, the company can now scale its proprietary ML tools on demand to process even the biggest datasets. From some of these satellites, the company has developed the first-ever global composite views, which show different frequency bands used to monitor the Earth’s surface and changes in vegetation.
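
As a simple illustration of how two frequency bands can be combined to monitor vegetation, here is a generic NDVI computation in NumPy; this is a textbook index, not Descartes Labs’ proprietary compositing pipeline:

```python
# Generic NDVI (Normalized Difference Vegetation Index) from two bands.
# Illustrative only; not Descartes Labs' proprietary compositing method.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI in [-1, 1]; higher values indicate denser vegetation."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against divide-by-zero

# Toy 2x2 scene: bright NIR over vegetation, dim NIR over bare soil.
print(ndvi(np.array([[0.8, 0.7], [0.3, 0.2]]),
           np.array([[0.1, 0.2], [0.3, 0.25]])))
```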

To keep costs low, Descartes Labs uses preemptible virtual machines: Compute Engine instances that are affordable because they are short-lived. It runs thousands of CPUs to keep performance high while ingesting imagery into Cloud Storage over high-bandwidth links.
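
A minimal sketch of launching one such short-lived worker with the google-cloud-compute client; the project, zone, machine type, and boot image are hypothetical placeholders, not Descartes Labs’ actual configuration:

```python
# Minimal sketch: launch one preemptible Compute Engine worker.
# Project, zone, names, and machine type below are hypothetical.
from google.cloud import compute_v1

def create_preemptible_worker(project: str, zone: str, name: str):
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/n1-highcpu-16",
        # Preemptible = deeply discounted but reclaimable at any time, which
        # suits short-lived, restartable image-processing jobs.
        scheduling=compute_v1.Scheduling(preemptible=True),
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    source_image="projects/debian-cloud/global/images/family/debian-12",
                ),
            )
        ],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
    )
    # Returns a long-running operation; a caller would poll it to completion.
    return compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
```

In practice a fleet like this is typically driven by a job queue, so any work lost to preemption is simply retried on the next available machine.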

Descartes also uses Google Kubernetes Engine (GKE), which helps it get code into production quickly and provide better APIs to its customers. The company’s solution on GCP is powered by the Intel Xeon Scalable series, which includes Advanced Vector Extensions 512 (AVX-512) support. This lets each instruction process twice as much data, improving compression speeds.
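
For a rough idea of what shipping a containerized API service to GKE can look like from code, here is a minimal sketch using the official kubernetes Python client; the image, names, and replica count are hypothetical, not Descartes Labs’ actual manifests:

```python
# Minimal sketch: create a Deployment on a GKE cluster with the official
# kubernetes Python client. All names and the image are hypothetical.
from kubernetes import client, config

def deploy_api(image: str = "gcr.io/my-project/imagery-api:latest"):
    config.load_kube_config()  # assumes kubectl already points at the cluster
    container = client.V1Container(
        name="imagery-api",
        image=image,
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="imagery-api"),
        spec=client.V1DeploymentSpec(
            replicas=3,
            selector=client.V1LabelSelector(match_labels={"app": "imagery-api"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "imagery-api"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )
```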

Building with BigQuery

The company captures information about each satellite image as vector data and uses Cloud Pub/Sub, together with a microservice hosted on GKE, to carry that information into BigQuery for geospatial analysis. This has helped it simplify several of its GIS processes while taking advantage of serverless scalability. The company also uses BigQuery to analyze logs from the APIs and applications it provides to its customers.
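
A minimal sketch of the kind of geospatial query BigQuery GIS enables once image footprints are stored as GEOGRAPHY values; the table, columns, and polygon here are hypothetical, not Descartes Labs’ actual schema:

```python
# Minimal sketch: find scenes whose footprints intersect an area of interest
# using BigQuery GIS. Table name, columns, and the polygon are hypothetical.
from google.cloud import bigquery

QUERY = """
SELECT scene_id, captured_at
FROM `my-project.catalog.scene_footprints`
WHERE ST_INTERSECTS(
    footprint,  -- GEOGRAPHY column holding each image's outline
    ST_GEOGFROMTEXT('POLYGON((-104 31, -104 37, -109 37, -109 31, -104 31))')
)
ORDER BY captured_at DESC
LIMIT 100
"""

for row in bigquery.Client().query(QUERY).result():
    print(row.scene_id, row.captured_at)
```
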
One of the major advantages the company finds in GCP is access to the same tools Google uses to power its own services, which has helped improve developer efficiency.

The way forward

With the cost of satellites coming down, the volume of Earth observation imagery is increasing. By using Intel and GCP, Descartes Labs has enabled an understanding of the world at a level, scale, and cost that was impossible a few years ago. As its datasets grow, the company is exploring new approaches, including Google’s TPUs, to make model training more effective. Beyond that, it is also looking forward to exploring artificial intelligence-driven use cases on GCP.
Within a few years, the company expects to have more than 30 petabytes of data on GCP. Descartes Labs believes its collaboration with Google Cloud has been invaluable and has helped it grow.
