Categories: Web and IT News

The Most Expensive iPhone Photos Ever Taken: Inside Artemis II’s Orbital Camera Test

Four astronauts aboard the Orion spacecraft just took iPhone photos from space. Not as a novelty. As a NASA experiment.

The Artemis II crew — Commander Reid Wiseman, Pilot Victor Glover, Mission Specialist Christina Koch, and Canadian Space Agency astronaut Jeremy Hansen — captured images of Earth during their 10-day lunar flyby mission using standard Apple iPhones. NASA released a selection of those photographs on April 5, and they are, by any reasonable standard, stunning. Sweeping views of Earth’s curvature. The thin blue line of atmosphere. Cloud formations sprawling across oceans. All shot on hardware you can buy at an Apple Store.

But the real story isn’t the beauty of the images. It’s why NASA put iPhones on a spacecraft in the first place.

A Consumer Device 230,000 Miles From the Nearest Genius Bar

According to AppleInsider, NASA included the iPhones as part of a technology demonstration to evaluate how consumer-grade camera hardware performs in the space environment. The agency has long relied on specialized imaging equipment — Hasselblad cameras during Apollo, Nikon DSLRs aboard the International Space Station — but the computational photography capabilities built into modern smartphones represent something different. Real-time HDR processing. Sensor fusion. Neural Engine-driven noise reduction. Features that professional cameras simply don’t offer in the same integrated package.
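The core idea behind the multi-frame HDR processing mentioned above is easy to sketch: the camera captures several bracketed exposures and merges them, weighting each pixel by how well-exposed it is. The toy example below is a simplified, Mertens-style exposure fusion in NumPy, not a description of Apple's actual pipeline:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Merge bracketed exposures into one image by weighting each
    pixel by how close it sits to mid-gray (i.e., well-exposed).
    frames: list of float arrays in [0, 1], all the same shape."""
    stack = np.stack(frames)                       # (n, H, W)
    # Gaussian weight centered on 0.5: clipped shadows/highlights count less
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy bracket: one underexposed and one overexposed "frame"
dark = np.full((2, 2), 0.1)
bright = np.full((2, 2), 0.9)
fused = fuse_exposures([dark, bright])
```

Real pipelines add alignment, tone mapping, and learned denoising on top; the weighted merge is just the part that recovers detail from both shadows and highlights.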

The Artemis II mission, which launched in early April 2025, marks the first crewed flight beyond low Earth orbit since Apollo 17 in December 1972. The crew isn’t landing on the Moon this time. Instead, they’re performing a lunar flyby — swinging around the far side of the Moon and returning to Earth — to validate Orion’s life support systems, navigation, and communication capabilities before the Artemis III landing mission. The iPhone photography test was a secondary objective, but it’s drawn outsized public attention.

And for good reason. The images NASA shared show Earth as seen from distances no human has experienced in over five decades. The photographs have a clarity and color fidelity that immediately distinguishes them from Apollo-era Hasselblad shots, though comparisons to those iconic images are inevitable.

NASA hasn’t disclosed which iPhone model the crew used. AppleInsider speculates it’s likely an iPhone 16 Pro or iPhone 16 Pro Max, given the 48-megapixel main sensor and the 5x optical zoom on those devices. The agency also hasn’t detailed what, if any, modifications were made to the phones for spaceflight — radiation shielding, thermal management, or software adjustments. These details matter. Consumer electronics aren’t designed for the radiation environment beyond Earth’s magnetosphere, and the Artemis II trajectory takes the crew well outside that protective bubble.

The space radiation question is nontrivial. Beyond low Earth orbit, devices are exposed to galactic cosmic rays and solar energetic particles that can cause single-event upsets in semiconductor chips — bit flips, latch-ups, or permanent damage to CMOS image sensors. The fact that the iPhones apparently functioned well enough to produce high-quality images suggests either that the exposure duration was short enough to avoid significant degradation, or that Apple’s chip architecture proved more resilient than expected. Possibly both.
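The effect of a single-event upset on image data is simple to picture: one high-energy particle strike flips one bit in memory, and a pixel value changes, sometimes drastically. A hypothetical illustration (not based on any detail of Apple or NASA hardware):

```python
import random

def flip_random_bit(pixel_value, bits=8, rng=random):
    """Simulate a single-event upset: flip one random bit in an
    8-bit pixel value, as a cosmic-ray strike on memory might."""
    bit = rng.randrange(bits)
    return pixel_value ^ (1 << bit)

# Severity depends entirely on which bit is hit: flipping the most
# significant bit of mid-gray (128) zeroes the pixel, while flipping
# the least significant bit barely changes it (128 -> 129).
assert 128 ^ (1 << 7) == 0
assert 128 ^ (1 << 0) == 129
```

Radiation-hardened imagers mitigate this with error-correcting memory and redundant logic; consumer chips generally do not, which is part of what makes the Artemis II result interesting.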

Why This Matters Beyond the Photo Op

NASA’s interest in commercial off-the-shelf technology for human spaceflight has accelerated dramatically over the past decade. The agency’s partnerships with SpaceX, Blue Origin, and other private contractors reflect a broader strategic shift toward buying capability rather than building it from scratch. Testing an iPhone in deep space fits this pattern. If consumer imaging hardware can perform adequately on lunar missions, the cost savings over custom camera systems would be substantial.

There’s also a communication dimension. NASA has struggled for years to maintain public enthusiasm for its programs. The Apollo missions captivated the world partly because of their photography — Earthrise, the Blue Marble, bootprints in lunar regolith. Those images were taken on medium-format film cameras that cost, in today’s dollars, a fraction of the mission budget. But they became the mission’s most enduring legacy. NASA clearly understands that the Artemis program needs its own iconic imagery, and putting cameras in the hands of astronauts that produce immediately shareable, high-resolution digital photos is a calculated move.

The timing of NASA’s image release also coincided with broader coverage of the Artemis II mission’s progress. The crew successfully completed their lunar flyby trajectory and, as of early April, was on the return leg to Earth. Coverage from NASA’s official Artemis II mission page confirmed that all primary mission objectives were being met, with Orion’s heat shield, life support, and navigation systems performing within expected parameters.

Social media reaction was immediate and intense. On X, the images drew millions of impressions within hours of posting. Some users compared them favorably to photographs taken by professional equipment on the ISS. Others pointed out the surreal quality of the moment — a device designed for selfies and food photography now capturing views of Earth from cislunar space.

Not everyone was impressed in the same way. Some space photography enthusiasts noted that without RAW file access, the iPhone’s computational processing pipeline makes editorial choices about color, contrast, and sharpness that may not accurately represent what the human eye sees from orbit. A fair critique. Apple’s image processing is optimized for consumer appeal, not scientific accuracy. NASA’s professional cameras remain the instruments of record for mission documentation.

But that distinction may matter less than purists think. The Apollo program’s most famous photograph — the Blue Marble, shot on Hasselblad’s 70mm film — was itself heavily reproduced and color-corrected for public consumption. Every space photograph involves processing decisions. The iPhone just makes those decisions automatically, in milliseconds, using machine learning models trained on millions of terrestrial images. Whether those models produce accurate results when pointed at the Earth from 230,000 miles away is exactly the kind of question NASA’s technology demonstration is designed to answer.

The Canadian Space Agency’s involvement through astronaut Jeremy Hansen adds another layer. Canada’s contribution to Artemis — the Canadarm3 robotic system for the planned Lunar Gateway station — has been well documented, but Hansen’s presence on the crew makes Canada only the second nation to send an astronaut beyond low Earth orbit. His iPhone photographs carry national significance for Canada’s space program, and the CSA has been actively sharing them through its own channels.

What Comes Next for Cameras in Space

The data NASA collects from this iPhone experiment will inform imaging decisions for future Artemis missions. Artemis III, planned to land astronauts near the Moon’s south pole, will require extensive surface photography for both scientific and public affairs purposes. If consumer smartphone cameras prove viable in that environment — the lunar surface presents even harsher thermal and radiation conditions than cislunar space — NASA could integrate them into EVA suit configurations alongside traditional camera systems.

Apple has made no public comment on NASA’s use of its hardware. The company has historically been quiet about space applications of its products, even though iPhones and iPads have been used aboard the ISS for years, primarily as interfaces for science experiments rather than as cameras. This deeper-space deployment is a different matter entirely, and the promotional value — if Apple chose to capitalize on it — would be considerable.

For now, the photographs speak for themselves. Earth, impossibly blue against the void. Cloud systems tracing weather patterns across continents. The geometry of coastlines rendered in startling detail. All captured on a device that fits in a flight suit pocket.

The Artemis II crew is expected to splash down in the Pacific Ocean in mid-April. When they do, their iPhones will likely undergo detailed post-flight analysis — sensor degradation measurements, radiation dose assessments, storage integrity checks. The glamorous part is the photos. The important part is the data.

Sometimes they’re the same thing.
