The idea is to compare the time-distance results for 30-second and 60-second cadence full-disk data. If the time-distance noise is basically shot noise, then the noise difference should be well-defined: averaging pairs of 30-second frames should reduce it by a factor of sqrt(2) relative to subsampling. If the noise is basically granulation, the answers should be the same.
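As a quick check on that expected scaling, here is a toy numpy simulation (the frame count and noise models are made up purely for illustration): averaging pairs of frames cuts uncorrelated noise by sqrt(2), while noise that is correlated from frame to frame, like granulation, is unaffected.

  import numpy as np

  rng = np.random.default_rng(0)
  n_frames = 2048                                       # e.g. 1024 minutes of 30-second data

  # Two toy per-frame noise series: uncorrelated (shot-like) and
  # correlated over ~4 minutes (granulation-like)
  shot = rng.normal(0.0, 1.0, n_frames)
  granulation = np.repeat(rng.normal(0.0, 1.0, n_frames // 8), 8)

  for name, series in [("shot", shot), ("granulation", granulation)]:
      subsampled = series[::2]                          # keep every other 30-s frame
      averaged = 0.5 * (series[0::2] + series[1::2])    # block-average pairs to 60-s cadence
      print(name, subsampled.std() / averaged.std())    # ~1.41 for shot, ~1.0 for granulation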

To do a bang-up job of it, I'll use an interval when we have continuous 30-second cadence data and do a track_region cube at 30-second cadence. Then I'll do two analyses: one on the data subsampled by two, and a second on the data block-averaged in pairs to give 60-second cadence.
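A minimal numpy sketch of the two reductions (the cube file name and shape are hypothetical; the real cube comes out of track_region):

  import numpy as np

  cube_30s = np.load("tracked_cube_30s.npy")            # (time, y, x) at 30-second cadence; placeholder name

  # Analysis 1: subsample by two -> 60-second cadence, per-frame noise unchanged
  cube_sub = cube_30s[::2]

  # Analysis 2: block-average adjacent pairs -> 60-second cadence;
  # any uncorrelated noise should drop by sqrt(2)
  n_pairs = cube_30s.shape[0] // 2
  cube_avg = cube_30s[:2 * n_pairs].reshape(n_pairs, 2, *cube_30s.shape[1:]).mean(axis=1)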

We have a nice sequence from June 19-30, 1997, which seems like the right time to use. It would be good to do a 1024-minute (roughly 17-hour) sequence. The prog name for the 30-second cadence data is fd_V_30s_01h. The data is lumped into one-hour chunks with the same series numbering as the 60-second cadence data. It looks quiet at disk center on June 19-21, 1997, so any time in this area would be ok. Decided on a time centered on 1997.06.20_00:00:00_TAI. To get a background image to subtract, exported hour 39144. Used one 30-second image to fit a central area. I need to keep reminding myself to use the one-hour averages dataset for this purpose.
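A sketch of the background-subtraction step, assuming the hour-39144 one-hour average and a single 30-second frame have been exported to FITS (the file names, and the scale-plus-offset fit over a central box as my reading of "fit a central area", are guesses, not the actual procedure):

  import numpy as np
  from astropy.io import fits

  background = fits.getdata("fd_V_avg_01h.39144.fits")  # hypothetical export of the hour-39144 average
  frame = fits.getdata("fd_V_30s_single.fits")          # one 30-second full-disk image

  # Fit the background to the single frame over a central box (scale + offset), then subtract
  ny, nx = frame.shape
  cy, cx, half = ny // 2, nx // 2, 128
  box = (slice(cy - half, cy + half), slice(cx - half, cx + half))
  A = np.column_stack([background[box].ravel(), np.ones((2 * half) ** 2)])
  (scale, offset), *_ = np.linalg.lstsq(A, frame[box].ravel(), rcond=None)
  residual = frame - (scale * background + offset)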

The noise in the oi signal is the same in the two analyses for a distance of 1 degree. So this suggests that the noise difference between hi-res and full-disk is not shot noise.