This article is part of the Science in Sci-fi, Fact in Fantasy blog series. Each week, we tackle one of the scientific or technological concepts pervasive in sci-fi (space travel, genetic engineering, artificial intelligence, etc.) with input from an expert.
Please join the mailing list to be notified every time new content is posted.
About the Expert
Judy L. Mohr is an engineer by background, but a writer at heart. Her PhD specialized in astronomical instrumentation, where she used stellar light to measure the horizontal air movement above the McLellan 1-m Telescope, Tekapo, New Zealand. Her post-doctoral research was with the MARS research team, who are developing the world’s first color-CT scanner. Needless to say, optics, imaging and light are her thing.
It should be no surprise that Judy’s fictional writing has a science fiction and fantasy slant. You can follow Judy on Twitter, or visit her at www.judylmohr.com.
Imaging Over Long Distances
You can’t hide from spy satellites.
The satellite whizzes overhead, realigned by a technician in a bunker at some secret location. After a few moments of clicking at the keyboard, a series of images flickers across the screen. Details of the landscape come into focus, but that detail is not enough. The technician taps the keyboard, clicks the mouse, and the cameras on the satellite overhead zoom in. They’ve found him. They can see exactly what he’s wearing and the backpack he has slung across his shoulder. Oh no… The hero is now in danger. RUN, JASON BOURNE! RUN!
While Hollywood would, in reality, film those zooming-in shots using a hovering drone, believe it or not, the concept the filmmakers are trying to portray is very real. As much as you might try to hide, you can’t; the spy satellite will see you.
Imaging over long distances has been an area of research for many years. Granted, much of that research has been driven by military and spy-related activities, but other sectors also benefit from the technology: astronomy (which is how I came to be in this field myself), telecommunications, civil aviation, and users of Google Earth, just to name a few. You read that correctly: Google Earth has taken advantage of satellite imagery for years.
Google Earth Satellite Images
Roughly eight years ago, Google Earth undertook a massive campaign to image my home city, Christchurch, New Zealand. The initial images, built from a combination of satellite imagery, aerial photography and geographic information systems, blew everyone away. Since then, Christchurch has suffered massive earthquakes, leveling whole areas of the city, including the central business district, and making all of those initial Google Earth images obsolete. New images have been taken, but most of the residential areas are now covered by cheap, minimally rendered satellite images. Even so, the detail in the images is impressive.
Take the image of my own house.
You can clearly see my big red roof, and you can just make out my chimney flue. There are all the trees on my property, the neighbor’s picnic table on her patio, the playground that borders my back fence and, surprisingly, my green, red and yellow wheelie bins for rubbish. (I should also point out that the image shown was taken at least two years ago. That playground in the image no longer exists.) If that’s the level of detail that a cheap, quick-and-dirty satellite image can provide, just imagine what information could be gleaned if you actually took the time to focus and acquire a series of images, using a much better imaging system than the one Google used to take those two-year-old images of my house.
Advancements in digital imaging are made every day. We all know this. The latest smartphones now use 16-megapixel cameras, and then there’s the digital zoom…
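As an aside, it’s worth remembering what digital zoom actually does: it crops the centre of the frame and interpolates it back up, so no new detail is created. Here is a minimal sketch of that idea in Python using Pillow; the file name and zoom factor are hypothetical, purely for illustration.

```python
# Minimal sketch of what "digital zoom" actually does: crop the centre of the
# frame, then interpolate it back up to full size. No new detail is created.
# The file name and zoom factor below are hypothetical.
from PIL import Image

def digital_zoom(path, factor=2):
    img = Image.open(path)
    w, h = img.size
    # Central crop covering 1/factor of the field of view in each direction
    box = (w // 2 - w // (2 * factor), h // 2 - h // (2 * factor),
           w // 2 + w // (2 * factor), h // 2 + h // (2 * factor))
    # Upsample the crop back to the original frame size (interpolation only)
    return img.crop(box).resize((w, h), Image.BICUBIC)

zoomed = digital_zoom("rooftop.jpg", factor=4)  # hypothetical image file
zoomed.save("rooftop_zoom4x.jpg")
```

Optical zoom, by contrast, changes the focal length and genuinely captures more detail, which is why a proper telescopic imaging system beats any smartphone crop.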
Can you see now why those spy films are reasonably accurate when it comes to the technology? No?
Imaging Over Long Distances from the Ground
Okay, if that argument hasn’t convinced you, what about imaging over long distances across the ground, using telescopic lenses? Some people say that it’s a simple task, but it’s not.
Imagine that you’re driving down the road on a sunny day. If you look into the distance, you will see a shimmering above the road: heat rising from the tarmac only 50 meters or so in front of you (approximately 160 feet). The further into the distance you look, the greater the scintillation. This is more commonly known as the mirage effect.
But those spy films are taking images from miles away. Over those distances, the scintillation would make a standard image so fuzzy that detail would be hard to distinguish. Or would it?
In 2002, researchers based at the Australian Defence Force Academy (ADFA) in Canberra, Australia presented to the world single-frame images of a house located 10 kilometers (over 6 miles) away from their imaging system.[1] If there had been a person standing at the window at the time the photos were taken, you would have seen them in those photos (see the image below, taken from that paper).

Single frame from a telescopic sequence obtained over a horizontal distance of 10 km (over 6 miles). (Source: Reference [1].)
Given the publicly available research in this field, most of which is years old, it’s no stretch of the imagination to believe that a satellite could take a photo of you and determine whether you were holding a tennis ball or a basketball, and which hand you were holding it in. However, that spy is going to need ground-based surveillance footage if they want to see you peel that orange.
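To put some rough numbers on that claim, here is a back-of-envelope sketch of the diffraction limit (the Rayleigh criterion) in Python. The mirror diameter and orbital altitude are assumed values chosen purely for illustration, not the specifications of any real satellite.

```python
# Back-of-envelope sketch of the diffraction limit (Rayleigh criterion).
# The mirror diameter and orbital altitude are assumed values for
# illustration only, not the specs of any real satellite.
wavelength = 550e-9   # green light, in metres
mirror_d   = 2.4      # assumed primary-mirror diameter, metres
altitude   = 250e3    # assumed orbital altitude, metres

theta = 1.22 * wavelength / mirror_d   # smallest resolvable angle, radians
ground_spot = theta * altitude         # size of one resolvable spot on the ground

print(f"Angular resolution: {theta:.2e} rad")
print(f"Ground resolution:  {ground_spot * 100:.0f} cm per resolvable spot")
# With these numbers the answer is roughly 7 cm: a basketball (~24 cm) spans
# several resolvable spots while a tennis ball (~6.7 cm) barely covers one,
# so you could tell them apart -- but the segments of a peeled orange are
# hopeless, even before atmospheric turbulence makes matters worse.
```

In other words, even a generously sized telescope in orbit runs out of detail at the scale of a few centimeters, which is exactly why the orange-peeling shot belongs to a ground-based camera.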
Meanwhile, poor Jason Bourne is on the run again. Hopefully, he can find some peace soon.
References:
[1] Jahromi et al. (2002), “Image Restoration of Images Obtained by Near-Horizontal Imaging through the Atmosphere,” DICTA 2002: Digital Image Computing Techniques and Applications, 21–22 January 2002, Melbourne, Australia.