Iridium brightness compared

ROB MATSON (ROBERT.D.MATSON@cpmx.saic.com)
12 Jan 1998 20:47:29 -0800

On January 7th, 1998, Ed Cannon wrote about the brightness
of an Iridium 30 flare as compared to the Moon:

"...Since the Moon was in easy view, I made a quick back-and-forth
one-power comparison.  It appeared to me that the Iridium flare
was much more intense than the Moon's light.  I don't know the 
physics and/or optics to know what to make of that in terms of 
apparent magnitude.  Iridiums are nearly point sources, while 
the Moon is not.  In any case, I'd say that, in some very real 
sense, the Iridium flare was brighter than the Moon ..."

Ed is correct.  While the integrated visual magnitude of a full
moon is around -12.2, the Moon's *radiance* is much less than
that of an Iridium flare.  The confusion results from the difference
between radiance and irradiance.  Radiance is measured in
watts/cm^2 per steradian, whereas irradiance is in watts/cm^2.
Thus radiance takes into consideration the angular size of the
source.
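The distinction can be made concrete with a short sketch (the numbers here are purely illustrative, not the article's):

```python
# For an unresolved source, the irradiance reaching the observer is the
# source radiance times the solid angle the source subtends.
radiance = 1.0e-3        # W/cm^2 per steradian (illustrative value)
solid_angle = 2.0e-9     # steradians subtended by a tiny source (illustrative)
irradiance = radiance * solid_angle   # W/cm^2 at the observer
```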

For example, suppose you observe a flare from an Iridium satellite
when it is at a range of 1000 km.  The dimensions of an individual
main mission antenna (MMA) are 188 cm x 86 cm, so the area is about
1.62 square meters.  Let's say the geometry is such that your
view of the MMA reduces the projected area to 1.5 square meters.
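As a quick check of the arithmetic (the 1.5 square-meter projected area is just the assumed viewing geometry for this flare):

```python
# MMA dimensions from the article
width_m = 1.88    # 188 cm
height_m = 0.86   # 86 cm

area = width_m * height_m    # full antenna area, ~1.62 m^2

# Assumed foreshortening for this particular flare geometry
projected_area = 1.5         # m^2
```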

At a range of 1000 km, the human eye cannot resolve an object
this small.  Thus, if you were to double the size of the MMA, the
apparent brightness (irradiance) would double.  You could continue
to increase the size of the MMA quite a bit before it would cease
to be a "point source".  Beyond that point, the surface brightness
wouldn't increase, just the angular size of the array.

A young, dark-adapted eye has a pupil diameter of about 8mm,
which leads to an angular resolution of about a quarter of an
arcminute.  That's about 73 meters at a range of 1000 km.  Thus,
a reflecting surface with a 73-meter diameter would be just resolvable.
That's an area of around 4150 square meters, compared to the 1.5
square meters of the real array:  2770 times larger.  This means
that if the moon had the same *radiance* as an Iridium flare, a
resolution-sized patch of the moon would be 2770 times brighter
than the flare.
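Working the same numbers in Python (using the quarter-arcminute figure directly, rather than deriving it from pupil optics):

```python
import math

theta = (0.25 / 60) * math.pi / 180      # a quarter arcminute, in radians
range_m = 1000e3                         # 1000 km slant range

spot_diameter = theta * range_m                   # ~73 m: smallest resolvable spot
spot_area = math.pi * (spot_diameter / 2) ** 2    # ~4150 m^2
mma_projected = 1.5                      # m^2, projected MMA area from above
area_ratio = spot_area / mma_projected   # ~2770
```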

But the moon doesn't have the same radiance.  When you consider
that the sun shines at magnitude -26.7, a full moon (of approximately
the same angular size) is 14.5 magnitudes dimmer:  a factor of
631000!  Since the MMA is providing a specular reflection of the
sun, the radiance of the MMA is the same as that of the sun,
multiplied by the reflectivity of the MMA.  The reflecting surface is
silver-coated Teflon, so the reflectivity is very high -- say 90%.  This
means the MMA radiance is about 568000 times the full-moon
radiance.  (In other words, if you could only see a small
piece of the moon -- a piece with the same angular size as the Iridium
MMA -- then the MMA would be over half a million times brighter.)
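The magnitude arithmetic can be sketched the same way (the 90% reflectivity is an estimate, as above):

```python
sun_mag = -26.7
moon_mag = -12.2

# Each 2.5 magnitudes is a factor of 10 in flux
flux_ratio = 10 ** ((moon_mag - sun_mag) / 2.5)   # ~631,000
reflectivity = 0.9                                # silver-coated Teflon, estimated
mma_vs_moon = flux_ratio * reflectivity           # ~568,000
```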

But the moon *IS* resolvable, so you have to compare the radiance
of the MMA with the radiance contribution from a resolution-sized
piece of the moon.  This piece will have a solid angle 2770 times
larger than that subtended by the Iridium array; dividing 568000 by
2770, you get 205.  In other words, an Iridium flare is about 200
times "brighter" than the moon for a dark-adapted eye.  (Note that
if the pupil diameter is less than 8mm, that factor of 205 drops by
the square of the ratio of pupil diameters; e.g., a 6mm pupil gives
a factor of 205 * (6/8)^2 = 115.)  --Rob
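Putting the last two numbers together, with the pupil scaling from the closing note:

```python
radiance_ratio = 568000     # MMA radiance vs. full-moon radiance
solid_angle_ratio = 2770    # resolution element vs. MMA solid angle

factor_8mm = radiance_ratio / solid_angle_ratio   # ~205
factor_6mm = factor_8mm * (6 / 8) ** 2            # ~115
```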