Humans have always taken inspiration from the natural world to solve complex problems. Some call it biomimicry, some call it being a copycat. Whatever your preferred terminology, the world of remote sensing is no different and, in fact, has more similarities to nature than you’d expect. To get us started with the series, let’s take inspiration from the animal kingdom. We all know that animals have different ways of experiencing their environment, but how is it similar to remote sensing?[1]
While we, as naturally flightless beings, are typically accustomed to viewing things from the ground, remote sensing gives us a bird’s-eye view of the world. Why bird’s-eye view? Well, some of our first iterations of remote sensing actually involved attaching cameras to pigeons.[2] The very first instances of what we consider remote sensing today happened back in 1850, when photos were first taken from a hot air balloon.[3] People quickly explored other ways to capture aerial imagery, including outfitting carrier pigeons with cameras. This was first done in the early 1900s and was largely motivated by World War I.[4] Soon enough, though, airplanes with cameras proved a bit more reliable than the pigeons.
Though remote sensing technology has come a long way since the 1900s, the original, core idea remains just as relevant today: capture images from a higher vantage point – be it a bird’s-eye view or a satellite in space – to see what the land below looks like to our eyes. The result? Truecolor imagery, or satellite imagery that displays visible light much as human eyes (typically) perceive it on their own.
Many animals can see wavelengths that humans can’t. This includes the fire chaser beetle (Melanophila acuminata), which can sense infrared from tens of miles away.[5] These beetles rely on recently burnt trees as critical food for their young. To help find this food, these beetles are equipped with the ability to sense infrared, which serves as a proxy for heat and guides them toward recently burned areas.
While humans can’t see infrared with our own eyes, we can use remote sensing to help us. Color infrared images, also known as false-color imagery, take the infrared band captured by remote sensors and display it using visible light bands. These types of images can show us phenomena that aren’t visible in truecolor imagery, such as an active fire or the difference between deciduous and coniferous trees.
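The band-remapping idea behind color infrared imagery can be sketched in a few lines. This is a minimal illustration, not any particular satellite product's recipe: it assumes the bands arrive as same-shaped NumPy arrays of reflectance values, and the function name is our own.

```python
import numpy as np

def color_infrared_composite(nir, red, green):
    """Build a color-infrared (false-color) image by shifting bands over:
    near-infrared -> red channel, red -> green channel, green -> blue channel.
    Healthy vegetation reflects near-infrared strongly, so it appears
    bright red in the resulting image."""
    # Stack the three bands into an (height, width, 3) RGB array
    composite = np.stack([nir, red, green], axis=-1).astype(float)
    # Scale to the 0-1 range expected by most image display tools
    return composite / composite.max()
```

In a real workflow the bands would come from a multispectral sensor that records near-infrared alongside visible light; the remapping simply reassigns which display channel each band drives, which is why the result is called "false color."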
Stay tuned for more on color infrared imagery later in this series.
While most remote sensing satellites use the sun’s light to image the earth – these are known as passive remote sensors – some emit their own radiation and measure what is reflected back.[6] These sensors are known as active remote sensors, and we can think of them operating similarly to bats using echolocation.
To echolocate, bats emit sound waves and, based on the information the ‘echo’ returns, glean information about their surroundings, from the distance between things to the size of objects around them.[7][8]
Two common types of active remote sensing are Synthetic Aperture Radar (SAR) and Light Detection and Ranging (LiDAR).[9] LiDAR closely mimics echolocation: the sensor emits light via a laser and, based on the time it takes for that light to return, determines the distance to the surface below. This data is often used to create elevation models of the earth’s surface, which are critical for disaster response operations and flood modeling, among other applications.[10]
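The time-of-flight calculation at the heart of LiDAR (and echolocation) is simple enough to sketch. This is an idealized illustration assuming a straight-down (nadir) pulse and ignoring atmospheric effects; the function names are our own.

```python
# Speed of light in a vacuum, meters per second
SPEED_OF_LIGHT = 299_792_458.0

def lidar_range(round_trip_time_s):
    """Distance from sensor to surface. The pulse travels out and back,
    so the one-way range is half the round-trip distance."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def ground_elevation(sensor_altitude_m, round_trip_time_s):
    """For a nadir-pointing pulse, the surface elevation is the sensor's
    altitude minus the measured range."""
    return sensor_altitude_m - lidar_range(round_trip_time_s)
```

Because light is so fast, the round-trip times involved are tiny: a surface one kilometer below the sensor returns the pulse in under seven microseconds, which is why LiDAR instruments need very precise timing hardware.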
And finally, chameleons! Chameleons can control their eyes independently of one another, enabling them to see forwards and backwards at the same time. By comparing two images of the same place captured at different times, side by side, we can do something similar, seeing forwards and backwards in time.
Remote sensing has a long history, and today we see it applied to conservation, agriculture, urban and regional planning, and many other areas.
Although our own senses can’t cover the entire spectrum of seeing, hearing, or sensing, we can be intentional about the technology we use to give us the tools to monitor and steward the landscapes and environments we care for.
Now that we’re familiar with some of the (natural) history of remote sensing, we turn to our next piece in this series where we’ll learn about the electromagnetic spectrum, which builds our foundation for diving deeper into false-color imagery and remote sensing indices.
This piece was written by Lens team member Katie Tyler with credit to Lens software engineer T Zhang, who originally led an internal lunch and learn for the Upstream Tech team on this concept back in early January of this year.