Certain types of lightning fare better with rolling shutter than others. Slower discharges like upward bolts and horizontal "anvil crawlers" can stay lit long enough that brightness barely changes during the chip scan. Fast phenomena, like cloud-to-ground bolts and their individual return strokes, are over too quickly for even the fastest rolling shutter sensor to capture in a single frame. Occasionally you get a CG with a continuing-current pulse long enough to stay lit through the entire sensor scan, but it's very rare. More distant bolts can also avoid splits more easily since they take up less of the sensor area.
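To put rough numbers on that, here's a quick back-of-envelope comparison. This is a sketch using textbook ballpark durations, not measurements, and the 1/30 second scan time assumes a video-mode readout matched to the frame rate:

```python
# Rough duration comparison: which lightning phenomena outlast a
# rolling-shutter scan. Durations are ballpark assumptions.

scan_s = 1 / 30                         # full-sensor readout sweep (~33 ms)

durations_s = {
    "return stroke": 100e-6,            # ~0.1 ms: far shorter than the sweep
    "CG with continuing current": 0.2,  # can stay lit through a whole sweep
    "anvil crawler": 0.5,               # propagates slowly, spans many frames
}

for name, d in durations_s.items():
    verdict = "outlasts scan" if d >= scan_s else "splits/partial"
    print(f"{name:<28} {d * 1e3:8.1f} ms -> {verdict}")
```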
I've done a little research now. Partial scans are only a problem when the shutter speed is shorter than the frame interval, which you can avoid if you have manual shutter control. The main issue is that most DSLRs in video mode set the sensor read rate to match the frame rate. If your shutter speed matches the frame interval, you will capture most of the light from even the briefest flash, so capturing the flash isn't the problem. The problem is that, unless you get extremely lucky, individual bright flashes lasting less than a single frame are split into pairs: one part of the flash appears at the bottom of one frame while the rest appears at the top of the next. Sometimes faster frame-rate footage actually looks worse, because you get dark horizontal strips between very strong discrete flashes, whereas a slower frame rate would at least blend the flash pairs together. Positive CGs and anvil crawlers turn out okay regardless of frame rate because the main visible bolts pulse at intervals much longer than 1/30 second, staying lit across several consecutive frames. You still get flicker lines from the dimmer but faster-pulsing negative in-cloud channels that precede the main visible flashes, though.
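Here's a minimal toy model of that split, assuming an instantaneous flash and a readout sweep that takes the full frame period (the row count and timings are made up for illustration):

```python
# Toy model of a rolling-shutter sensor imaging a brief flash.
# All numbers are illustrative assumptions, not camera specs.

FPS = 30
FRAME_T = 1.0 / FPS        # frame interval (s)
ROWS = 1080                # sensor rows
READ_T = FRAME_T           # video mode: the readout sweep takes a whole frame

def frame_of_flash(flash_t, row):
    """Which frame a given row records an instantaneous flash into.

    The read pointer sweeps rows top to bottom once per frame; a row
    already read this frame pushes the flash into the next frame.
    """
    frame = int(flash_t // FRAME_T)
    sweep_pos = ((flash_t % FRAME_T) / READ_T) * ROWS  # row being read now
    return frame if row >= sweep_pos else frame + 1

# A flash 40% of the way through frame 0 splits: rows above the sweep
# land in frame 1 (top of next frame), rows below stay in frame 0.
flash_t = 0.4 * FRAME_T
print(frame_of_flash(flash_t, row=100))   # -> 1 (top of the next frame)
print(frame_of_flash(flash_t, row=900))   # -> 0 (bottom of this frame)
```

With READ_T equal to the full frame interval, the sweep is always in progress, so every sub-frame flash gets split somewhere; shrink READ_T and a split only happens when the flash lands inside the sweep window.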
The real solution is to stick with a 30 fps frame rate but let the camera read the sensor at the speed it would use when shooting at 120 fps. Then a near-instantaneous flash would have at least a 75% chance of landing intact in a single frame, since the sensor read/reset sweep would only occupy 1/4 of each frame interval. CMOS sensors in DSLRs do seem to operate this way when shooting still photos: you don't get severe rolling shutter artifacts shooting stills at a 1/30 second shutter speed. Only in video mode does the camera fail to use the sensor's full readout speed, and I haven't had anyone explain why. Simply reading the data as fast as the sensor allows wouldn't consume extra CPU power, because the same amount of data would be read per frame, just followed by a "rest" period where nothing is read from the sensor at all. Perhaps the read circuitry itself draws more battery power at higher speeds, regardless of the breaks. I know that on GoPros the sensor read is matched to the frame rate for the sake of their electronic image stabilization algorithms.
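A quick Monte Carlo check of that 75% figure, under the same instantaneous-flash assumption (the frame and readout times here are hypothetical, not pulled from any real camera):

```python
# Monte Carlo check of the "75% chance of no split" claim: an
# instantaneous flash splits only if it lands during the readout
# sweep, which here lasts 1/4 of the frame interval.
import random

FRAME_T = 1.0 / 30         # 30 fps frame interval
READ_T = FRAME_T / 4       # hypothetical 120 fps-speed sweep at 30 fps

def is_split(flash_t):
    # the flash splits only if the read pointer is mid-sweep
    return (flash_t % FRAME_T) < READ_T

trials = 1_000_000
splits = sum(is_split(random.uniform(0, 1)) for _ in range(trials))
print(f"split fraction: {splits / trials:.3f}")   # ~0.250, so ~75% land intact
```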
Rolling shutter is a big enough problem that there is real progress toward global shutter CMOS chips. I think in 5 years or so we might start seeing some prosumer-level cameras with those improvements.
From what I understand, unlike CCD sensors, CMOS sensors have no way to instantaneously store pixel values. A CCD could effectively transfer the whole image at once because it shifted the charge, in the analog domain, into an unexposed buffer, which the processor then read out digitally behind the scenes. The digital readout on a CMOS sensor, by contrast, has to convert pixel values through ADCs a row (or a handful of rows) at a time. At the moment, the chips on the sensor just aren't capable of digitizing an entire 20MP analog array into memory in parallel. It's the analog-to-digital conversion that takes time, and if the analog data isn't read directly off the photosites it has to be stored somewhere in analog form first. Apparently that extra analog storage step is what degrades image quality in low light (and also generates extra heat).
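To see why that serial conversion pins the scan time to roughly a frame interval, here's a back-of-envelope calculation. The per-row time is an assumed ballpark for a column-parallel ADC readout, not a spec for any real sensor:

```python
# Why a CMOS readout sweep takes milliseconds: rows are digitized
# one at a time, so scan time = rows x per-row conversion time.
# A ~20 MP sensor at 3:2 is roughly 5472 x 3648 pixels.

rows = 3648
row_time_us = 9.0          # assumed per-row conversion time (ballpark)

scan_ms = rows * row_time_us / 1000
print(f"full-sensor scan: ~{scan_ms:.1f} ms")   # ~32.8 ms, about one 1/30 s frame
```

Cut the per-row time by 4x and the sweep drops to roughly 8 ms, which is exactly the "read at 120 fps speed, display at 30 fps" idea above.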