Phone cameras can take in more light than the human eye – that’s why low-light events like the northern lights often look better through your phone camera

A May 2024 solar storm made the northern lights visible across parts of the northern U.S. AP Photo/Lindsey Wasson

Douglas Goodwin, Scripps College

Smartphone cameras have significantly improved in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Photos of the northern lights, or aurora borealis, provide one particularly striking example.

If you saw the northern lights during the geomagnetic storms in May 2024, you might have noticed that your smartphone made the photos look even more vivid than reality.

Auroras, known as the northern lights (aurora borealis) or southern lights (aurora australis), occur when the solar wind disturbs Earth’s magnetic field. They appear as streaks of color across the sky.

Two images of the northern lights, the left labeled 'eye' and the right labeled 'camera.' The 'eye' image is darker with the colors more muted.
The left side shows the aurora as seen with the naked eye. The right side reveals how a smartphone camera can capture brighter and more colorful lights. Douglas Goodwin

What makes photos of these events even more striking than they appear to the eye? As a professor of computational photography, I’ve seen how the latest smartphone features overcome the limitations of human vision.

Your eyes in the dark

Human eyes are remarkable. They allow you to see footprints in a sun-soaked desert and pilot vehicles at high speeds. However, your eyes perform less impressively in low light.

Human eyes contain two types of cells that respond to light – rods and cones. Rods are numerous and much more sensitive to light. Cones handle color but need more light to function. As a result, at night our vision relies heavily on rods and misses color.

A diagram of a human eye, with a zoomed panel showing rod and cone receptors. The rods are cylindrical, while the cones are conical.
Rods and cones in your eyes are photoreceptors that process black and white as well as color. Blume, C., Garbazza, C. & Spitschan, M., CC BY-SA

The result is like wearing dark sunglasses to watch a movie. At night, colors appear washed out and muted. Similarly, under a starry sky, the vibrant hues of the aurora are present but often too dim for your eyes to see clearly.

In low light, your brain prioritizes motion detection and shape recognition to help you navigate. This trade-off means the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to increase their brightness.

Taking the perfect picture

Smartphones have revolutionized how people capture the world. These compact devices use multiple cameras and advanced sensors to gather more light than the human eye can, even in low-light conditions. They achieve this through longer exposure times – how long the camera takes in light – larger apertures and higher ISO settings, which amplify the sensor’s response to the light it receives.
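As a rough illustration of how these three settings combine, photographers summarize them in a single exposure value (EV); lower values correspond to darker scenes. The sketch below uses the standard EV formula with made-up aurora-style settings (f/1.8, a 10-second shutter, ISO 3200) that are not taken from the article:

```python
import math

def exposure_value(f_number, shutter_s, iso):
    """Exposure value relative to ISO 100.

    EV100 = log2(N^2 / t); raising ISO shifts the scale down,
    meaning the camera can handle a darker scene.
    """
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

# Hypothetical night-sky settings: wide aperture, long shutter, high ISO
ev_aurora = exposure_value(1.8, 10.0, 3200)

# Hypothetical sunny-day settings for comparison
ev_daylight = exposure_value(16, 1 / 125, 100)
```

The aurora settings land around EV −6 to −7, deep in night-photography territory, while the daylight settings land around EV +15, which is why the same tiny sensor needs such different behavior in the two situations.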

But smartphones do more than adjust these settings. They also leverage computational photography to enhance your images using digital techniques and algorithms. Image stabilization reduces the camera’s shakiness, and exposure settings optimize the amount of light the camera captures.

Multi-image processing creates the perfect photo by stacking multiple images together. A setting called night mode can balance colors in low light, while LiDAR capabilities in some phones keep your images in precise focus.

A diagram showing a stack of grainy images flattened down to one clear image.
Image stacking involves aligning and combining several noisy photos to enhance the final image’s quality. Averaging these images together suppresses random sensor noise. This results in a clearer and more detailed picture than any of the photos alone. Douglas Goodwin
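A minimal sketch of that averaging step, using NumPy on synthetic frames (the gradient "scene" and noise level here are invented for illustration, not real sensor data):

```python
import numpy as np

rng = np.random.default_rng(0)

# A simple synthetic scene: a horizontal brightness gradient
truth = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))

# Simulate 16 noisy exposures of the same (already aligned) scene
frames = [truth + rng.normal(0.0, 0.2, truth.shape) for _ in range(16)]

# Stacking: average the frames pixel by pixel
stacked = np.mean(frames, axis=0)

# Random noise in one frame vs. the stack; averaging N frames
# suppresses it by roughly sqrt(N), so 16 frames -> about 4x cleaner
noise_single = np.std(frames[0] - truth)
noise_stacked = np.std(stacked - truth)
```

Real phones must also align the frames first, since your hands shake between exposures; this sketch skips that step by reusing one pre-aligned scene.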

LiDAR stands for light detection and ranging, and phones with this setting emit laser pulses to calculate the distances to objects in the scene quickly in any kind of light. LiDAR generates a depth map of the environment to improve focus and make objects in your photos stand out.
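The distance calculation behind LiDAR is plain time-of-flight arithmetic: a pulse travels out and back at the speed of light, so the range is half the round-trip time multiplied by c. A minimal sketch (the example timing value is illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds):
    # The pulse covers the distance twice (out and back),
    # so divide the round-trip path by two
    return C * t_seconds / 2.0

# A return detected roughly 13.3 nanoseconds after emission
# corresponds to an object about 2 metres away
d = distance_from_round_trip(13.34e-9)
```

The nanosecond timescales involved are why phone LiDAR needs dedicated hardware rather than the ordinary image sensor.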

Two images, the left labeled 'optical' and the right labeled 'depth' of a person dancing. The 'optical' image shows how the person would look normally in the photo, while the 'depth' image shows their silhouette in white against a black background.
Smartphone cameras don’t just capture flat images – they collect depth information too. The left side shows a regular photo, while the right side illustrates the depth map, with lighter pixels closer to the camera and darker ones farther away. Normally hidden, this depth data enables smartphones to apply effects such as artificial background blur to mimic the look of the northern lights against a night sky. Douglas Goodwin

Artificial intelligence tools in your smartphone camera can further enhance your photos by optimizing the settings, merging bursts of frames and using super-resolution techniques to recover really fine detail. They can even identify faces in your photos.

AI processing in your smartphone’s camera

While there’s plenty you can do with a smartphone camera, regular cameras do have larger sensors and superior optics, providing more control over the images you take. Camera manufacturers like Nikon, Sony and Canon typically avoid tampering with the image, instead letting the photographer take creative control.

These cameras offer photographers the flexibility of shooting in raw format, which allows you to keep more of each image’s data for editing and often produces higher-quality results.

Unlike dedicated cameras, modern smartphone cameras use AI both while and after you snap a picture to enhance your photos’ quality. While you’re taking a photo, AI tools analyze the scene you’re pointing the camera at and adjust settings such as exposure, white balance and ISO, while recognizing the subject you’re shooting and stabilizing the image. These adjustments help ensure you get a great photo when you hit the button.

You can often find features that use AI such as high dynamic range, night mode and portrait mode, enabled by default or accessible within your camera settings.

AI algorithms further enhance your photos by refining details, reducing blur and applying effects such as color correction after you take the photo.

All these features help your camera take photos in low-light conditions and contributed to the stunning aurora photos you may have captured with your phone camera.

While the human eye struggles to fully appreciate the northern lights’ otherworldly hues at night, modern smartphone cameras overcome this limitation. By leveraging AI and computational photography techniques, your devices allow you to see the bold colors of solar storms in the atmosphere, boosting color and capturing otherwise invisible details that even the keenest eye will miss.

Douglas Goodwin, Visiting Assistant Professor in Media Studies, Scripps College

This article is republished from The Conversation under a Creative Commons license. Read the original article.



Incredible Drone Video of an Insanely Powerful Multi-Vortex Tornado Destroying Wind Turbines

Check out this drone video from meteorologist Reed Timmer showing a multi-vortex tornado destroying several wind turbines. The video left me speechless! Nature can be so powerful!

From Reed Timmer:

The most incredible #tornado footage ever captured by drone near Greenfield, Iowa with up-close helical suction vortex action. This tornado was very strong and damaged many windmills along its path and also struck the community of Greenfield. Team Dominator intercepted this tornado.

[Reed Timmer]


Eddie Murphy is Back as Axel Foley in Trailer for New Beverly Hills Cop Sequel

Axel Foley is back! Eddie Murphy returns as the iconic detective in the trailer for the new Beverly Hills Cop sequel, coming to Netflix on July 3rd. Joined by fresh faces and familiar allies, Foley dives into a high-stakes conspiracy to protect his daughter and uphold justice. Get ready for action-packed nostalgia!

Dyson spheres: astronomers report potential candidates for alien megastructures – here’s what to make of it

Dyson Sphere
A Dyson sphere. Image generated via Stable Diffusion.

Simon Goodwin, University of Sheffield

There are three ways to look for evidence of alien technological civilisations. One is to look out for deliberate attempts by them to communicate their existence, for example, through radio broadcasts. Another is to look for evidence of them visiting the Solar System. And a third option is to look for signs of large-scale engineering projects in space.

A team of astronomers have taken the third approach by searching through recent astronomical survey data to identify seven candidates for alien megastructures, known as Dyson spheres, “deserving of further analysis”.

This is a detailed study looking for “oddballs” among stars – objects that might be alien megastructures. However, the authors are careful not to make any overblown claims. The seven objects, all located within 1,000 light-years of Earth, are “M-dwarfs” — a class of stars that are smaller and less bright than the Sun.

Dyson spheres were first proposed by the physicist Freeman Dyson in 1960 as a way for an advanced civilisation to harness a star’s power. Consisting of floating power collectors, factories and habitats, they’d take up more and more space until they eventually surrounded almost the entire star like a sphere.

What Dyson realised is that these megastructures would have an observable signature. Dyson’s signature (which the team searched for in the recent study) is a significant excess of infrared radiation. That’s because megastructures would absorb visible light given off by the star, but they wouldn’t be able to harness it all. Instead, they’d have to “dump” excess energy as infrared light with a much longer wavelength.
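The temperature of that dumped energy follows from a simple blackbody balance. This sketch assumes a hypothetical shell at 1 AU that absorbs the full solar output and re-radiates it from both faces (the constants are standard physical values, not figures from the article):

```python
import math

L_SUN = 3.828e26   # solar luminosity, W
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
AU = 1.496e11      # Earth-Sun distance, m

# Energy balance for a thin shell radiating from both its faces:
# L = 2 * (4 * pi * d^2) * sigma * T^4, so
# T = (L / (16 * pi * sigma * d^2))^(1/4)
T = (L_SUN / (16 * math.pi * SIGMA * AU**2)) ** 0.25

# Wien's displacement law gives the peak wavelength of that glow,
# converted here to micrometres
peak_um = 2.898e-3 / T * 1e6
```

The shell comes out near room temperature, with its emission peaking around 10 micrometres, squarely in the infrared, which is exactly the kind of excess the survey looked for.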

Unfortunately, such light can also be a signature of a lot of other things, such as a disc of gas and dust, or discs of comets and other debris. But the seven promising candidates aren’t obviously due to a disc, as they weren’t good fits to disc models.

It is worth noting there is another signature of a Dyson sphere: visible light from the star dips as the megastructure passes in front of it. Such a signature has been found before. There was a lot of excitement about Tabby’s star, or KIC 8462852, which showed many really unusual dips in its light that could have been due to an alien megastructure.

Image of Tabby's Star in infrared and ultraviolet.
Tabby’s Star in infrared (left) and ultraviolet (right) wikipedia

It almost certainly isn’t an alien megastructure. A variety of natural explanations have been proposed, such as clouds of comets passing through a dust cloud. But it is an odd observation. An obvious follow-up on the seven candidates would be to look for this signature as well.

The case against Dyson spheres

Dyson spheres may well not even exist, however. I think they are unlikely to be there. That’s not to say they couldn’t exist, rather that any civilisation capable of building them would probably not need to (unless it was some mega art project).

Dyson’s reasoning for considering such megastructures assumed that advanced civilisations would have vast power requirements. Around the same time, astronomer Nikolai Kardashev proposed a scale on which to rate the advancement of civilisations, which was based almost entirely on their power consumption.

In the 1960s, this sort of made sense. Looking back over history, humanity had just kept exponentially increasing its power use as technology advanced and the number of people increased, so they just extrapolated this ever-expanding need into the future.

However, our global energy use has started to grow much more slowly over the past 50 years, and especially over the last decade. What’s more, Dyson and Kardashev never specified what these vast levels of power would be used for; they just (fairly reasonably) assumed they’d be needed to do whatever it is that advanced alien civilisations do.

But as we now look ahead to future technologies, we see efficiency, miniaturisation and nanotechnology promising vastly lower power use (the performance per watt of pretty much all technologies is constantly improving).

A quick calculation reveals that, if we wanted to collect 10% of the Sun’s energy at the distance the Earth is from the Sun, we’d need a surface area equal to 1 billion Earths. And if we had a super-advanced technology that could make the megastructure only 10km thick, that’d mean we’d need about a million Earths’ worth of material to build it.

A significant problem is that our Solar System only contains about 100 Earths worth of solid material, so our advanced alien civilisation would need to dismantle all the planets in 10,000 planetary systems and transport it to the star to build their Dyson sphere. To do it with the material available in a single system, each part of the megastructure could only be one metre thick.

This is assuming they use all the elements available in a planetary system. If they needed, say, lots of carbon to make their structures, then we’re looking at dismantling millions of planetary systems to get hold of it. Now, I’m not saying a super-advanced alien civilisation couldn’t do this, but it is one hell of a job.
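The scale of the argument above can be reproduced to order of magnitude with round numbers. This back-of-envelope sketch assumes a shell at 1 AU covering 10% of the sphere; the results land within an order of magnitude of the figures in the text, which is as close as a calculation this rough can claim:

```python
import math

AU = 1.496e11       # Earth-Sun distance, m
V_EARTH = 1.08e21   # volume of the Earth, m^3

# Collector area: 10% of a full sphere at 1 AU
shell_area = 0.10 * 4 * math.pi * AU**2

# Material needed if the structure is 10 km thick,
# expressed in Earth-volumes (article: "about a million Earths")
earths_needed = shell_area * 1.0e4 / V_EARTH

# Conversely, spreading ~100 Earths of solid material (roughly
# one planetary system's worth) over that area gives the
# achievable thickness (article: around a metre)
thickness_one_system = 100 * V_EARTH / shell_area
```

Either way you run the numbers, a single planetary system falls short of the raw material by several orders of magnitude, which is the core of the argument.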

I’d also strongly suspect that by the time a civilisation got to the point of having the ability to build a Dyson sphere, they’d have a better way of getting the power than using a star, if they really needed it (I have no idea how, but they are a super-advanced civilisation).

Maybe I’m wrong, but it can’t hurt to look.

Simon Goodwin, Professor of Theoretical Astrophysics, University of Sheffield

This article is republished from The Conversation under a Creative Commons license. Read the original article.