
No-iFrills iPhone Lightning Imaging

gdlewen

TL;DR: To reduce artifacts in iPhone lightning imaging due to overexposure, the iPhone SLO-MO video option is used at a nominal rate of 240 fps. The individual video frames are then merged in post-processing to create the effect of a single long exposure spanning the entire flash. Due to limited opportunities, this has not been fully tested with close, daytime cloud-to-ground lightning.

INTRODUCTION
Since @Laney Cole posted a question (Can someone help tell me this type of lightning), I have been playing around with using my iPhone 15 for lightning photography. Just a vanilla iPhone 15. And with no further investment in image acquisition or processing software: all freeware and no third-party apps. Budgetary constraints are the primary driving force, but there's also the challenge of what's called "desert island physics": making do with only "coconuts, palm fronds, and sand."


AVOIDING OVEREXPOSED IMAGES
To avoid problems with overexposed frames and rolling shutter, I use the iPhone SLO-MO video mode, which permits recording at up to 240 fps. The hope is to reduce the per-frame image collection time from, say, 16 ms to about 4 ms and thereby count fewer photons per frame. (NOTE: this is largely successful, although the brightest parts of the flash can still produce a few rolling-shutter/overexposed frames.) Using SLO-MO also has the advantage that you are more likely to capture the entire flash, including the initial branching, rather than only the main channel sans branches.
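The photon-budget argument above can be sketched numerically. The numbers are illustrative: the only assumption is that a frame can integrate light for at most the frame interval, 1/fps.

```python
# Upper bound on per-frame shutter (light-collection) time, in milliseconds.
# At 240 fps the ceiling drops to roughly a quarter of the 60 fps value,
# so each frame counts far fewer photons.
def max_exposure_ms(fps: float) -> float:
    """Longest possible per-frame exposure: the frame interval, 1/fps."""
    return 1000.0 / fps

for fps in (30, 60, 240):
    print(f"{fps:3d} fps -> at most {max_exposure_ms(fps):.1f} ms per frame")
```

The actual shutter time the camera chooses can of course be shorter than this ceiling; the point is only that a higher frame rate forces it down.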


AUTO FOCUS
From what I can determine, the native iPhone camera app offers limited options for focusing video. There are apps you can buy that provide more control, but this is "no iFrills" stuff. The iPhone offers an AE/AF LOCK* feature, but what if there is no object in the field of view on which to focus? I could find nothing on the Internet for this use case, and frankly, after a while I stopped reading the repetitive tech-sites that regurgitate the same information.

Empirically I determined that, when AE/AF LOCK is invoked on a blank screen, the iPhone 15 focuses at infinity. I tested this by focusing on stars at night, taking a still image, and then covering the camera aperture and repeating the process. In video mode, AE/AF LOCK appears to work the same way, but I continue to test this, and if anyone has an authoritative reference please post a reply.
[* Auto Exposure/Auto Focus LOCK]


FRAME EXTRACTION
At a nominal frame rate of 240 fps, a one-minute clip takes a lot of space and contains thousands of frames, so I edited the movies in iMovie to reduce the final frame count; this step is optional.
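A back-of-envelope calculation shows why trimming first is worthwhile. The frame size here is an assumption (1920x1080, uncompressed 8-bit RGB TIFF); the real footprint depends on resolution and TIFF compression.

```python
# Frames and disk space for a one-minute clip at the nominal SLO-MO rate.
# Assumes 1920x1080 frames extracted as uncompressed 8-bit RGB TIFFs
# (an assumption for illustration; actual sizes will differ).
fps, seconds = 240, 60
frames = fps * seconds
bytes_per_frame = 1920 * 1080 * 3          # 8-bit RGB, no compression
total_gib = frames * bytes_per_frame / 2**30

print(frames)                              # frames to extract and merge
print(f"~{total_gib:.0f} GiB of TIFFs")
```

Even a few seconds of trimming in iMovie cuts the extraction step from tens of thousands of files to a few hundred.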

Use FFMPEG to extract individual frames for processing. I'm not sure why AirDrop converted my SLO-MO video to 8-bit SDR, but according to HandBrake, that's what came across. Nevertheless, in the Terminal app, the following simple command did the trick, converting CLIP_6635.mov to a series of TIFF files named frame0001.tiff, frame0002.tiff, etc.

Command:
ffmpeg -i CLIP_6635.mov frame%04d.tiff


MERGE FRAMES
The application Hugin offers command line tools that align and merge (enfuse) target frames.
  1. Definitely use a tripod so you don't need to align the frames in post-processing, especially if you shoot at night: the alignment program uses control points, and a night scene is unlikely to offer more than one or two suitable ones. (More likely, Hugin will try to use successive lightning-channel features as control points, yielding a decidedly psychedelic result.)
  2. The Hugin implementation of the Enfuse algorithm is intended to be used (for instance) in HDR photography, in which the same scene is photographed at various exposures and fused (exposure fusion). But that's not how we will use it: each frame is essentially a different subject, and the overall imaging is inherently high-contrast. As a result, the fact that Enfuse works at all is a bit surprising.
  3. Because of how the enfusing is performed, at night we don't get the brightness contrast between the leader steps and the actual lightning discharge. For example, recoil leaders appear just as bright as the lightning channel, and the overall effect is a bit surreal.
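To build intuition for why a per-pixel merge can emulate one long exposure, here is the classic "lighten" stack often used for lightning composites: keep the brightest value seen at each pixel across all frames. To be clear, this is not what Enfuse does (Enfuse selects by local contrast with the weights shown below), just a minimal pure-Python sketch on toy grayscale frames:

```python
def lighten_merge(frames):
    """Per-pixel maximum across equally sized grayscale frames (lists of rows)."""
    merged = [row[:] for row in frames[0]]
    for frame in frames[1:]:
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                if v > merged[y][x]:
                    merged[y][x] = v
    return merged

# Three toy 1x4 "frames": the flash lights a different pixel in each frame,
# but the merged result shows the whole channel at once.
f1 = [[10, 200, 10, 10]]
f2 = [[10, 10, 220, 10]]
f3 = [[10, 10, 10, 240]]
print(lighten_merge([f1, f2, f3]))  # [[10, 200, 220, 240]]
```

Contrast-weighted fusion behaves similarly on a dark background, which is one way to see why Enfuse happens to work here despite being designed for exposure bracketing.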
Anyway: in Terminal, execute the following command, which will take all of the TIFF files in the current directory and create the generic output file "enfused.tif". (Note: the path to enfuse in the example is specific to my local environment.)

Command:
/Applications/Hugin_2020/tools_mac/enfuse --exposure-weight=0 --saturation-weight=0 --contrast-weight=1 --hard-mask --contrast-window-size=5 --output=enfused.tif *.tiff

Honestly, I tried changing these values and kept circling back to them. Perhaps this is proof of the maxim, "When you are on the top of a mountain, every step forward is a step down." More likely, I did not have an analytic measure of improvement and did not want to develop one.


After all, we passed "Science" about 3 exits back. This is pure art.


POST-PROCESSING OF ENFUSED IMAGE
I use GIMP to adjust the levels a bit in the final image, but that is mostly to satisfy personal aesthetics. The enfused images tend to be slightly low in contrast, which makes sense because we are fusing as many as a hundred images in which only a few features change from frame to frame.
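A levels adjustment of this kind is essentially a linear remap of the input black and white points to the full output range. The input points here (20 and 235) are hypothetical, not the values I used; a minimal sketch for one 8-bit channel:

```python
def levels(value, in_black, in_white):
    """Linearly remap [in_black, in_white] to [0, 255], clamping outside it."""
    if value <= in_black:
        return 0
    if value >= in_white:
        return 255
    return round(255 * (value - in_black) / (in_white - in_black))

# Pulling the black point up to 20 and the white point down to 235
# stretches a low-contrast image back across the full tonal range.
print([levels(v, 20, 235) for v in (10, 20, 128, 235, 250)])  # [0, 0, 128, 255, 255]
```

GIMP's Colors > Levels dialog does this per channel (plus an optional gamma term not shown here).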


RESULTS:

Just a few examples of the output of the process. (Only one movie file, though.) All video was taken with an iPhone 15 in SLO-MO mode at a FOV setting of ".5x".

iPhone video of a CA-CG flash taken August 11, 2025 from Owasso, OK, facing east.
[Image: CLIP_6635.jpg]
Final "Enfused" Lightning Still Photograph


Two More Examples from the August 11, 2025 Session:
[Images: CLIP_6627_all_clip_frames.jpg, MOV_6623.jpg]

Could I have done better with my Nikon Z6? Of course... well, maybe: I don't have a wide-angle lens, so this FOV wouldn't have been possible.

When I'm out chasing I am not concerned with daytime lightning photography, except for the serendipitous flash captured while photographing a storm. The one time I did focus on trying to get a CG flash with this technique, I let an HP supercell sneak up on me. But when I do get a good one using this technique, I will follow up with a post here.

P.S. I'm not trying to say there is anything groundbreaking here. This kind of imagery has been around for a long time. See, for example, here. Not to forget the stunning high-speed video @Dan Robinson posts. But how close can you come with an iPhone and a Mac and nothing else? I hope to do even better in time.