magnitude (of stars, in astronomy)

This entry treats visual magnitude; other types of magnitude used in astronomy are treated in their own entries.

A scale for describing a star's apparent brightness as seen by the human eye on earth. The symbol for visual magnitude is m. The bigger the number, the fainter the star: the sun is about minus 26.5; the brightest star, minus 1.4; the faintest stars visible to the naked eye are around magnitude 6; and the faintest objects visible through the most powerful telescopes on earth are about magnitude 25.
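To give a sense of the range these numbers span, here is a small Python sketch (using the factor-of-100-per-5-magnitudes relation defined later in this entry) that converts the magnitude differences above into brightness ratios; the function name is only illustrative.

```python
def brightness_ratio(m_faint, m_bright):
    """How many times brighter the brighter object is than the fainter one,
    given their visual magnitudes (larger magnitude = fainter)."""
    return 100 ** ((m_faint - m_bright) / 5)

# Sun (about minus 26.5) versus the faintest naked-eye stars (about +6):
print(f"{brightness_ratio(6, -26.5):.3g}")   # ~1e13: ten trillion times brighter

# Faintest naked-eye stars (+6) versus the faintest telescopic objects (~+25):
print(f"{brightness_ratio(25, 6):.3g}")      # ~4e7
```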

Our system for describing the apparent brightness of stars apparently began with Hipparchus (fl 146-127 bce), a Greek astronomer who compiled the first star catalog, over objections that cataloging the gods' abodes was an outrageous impiety. He completed the catalog in 129 bce, but no copy has survived. The title (translated into Latin) and various reviews have survived, and suggest that the catalog gave both position and brightness for about 850 stars. In any case, it became the basis of Ptolemy's catalog, which has reached us (through Arabic) as the Almagest. The brightest stars were called “stars of the first magnitude.” Stars just noticeably dimmer than first magnitude stars were stars of the second magnitude, those noticeably dimmer than stars of the second magnitude were stars of the third magnitude, and so on. Five such steps took Ptolemy (and presumably Hipparchus) down to the faintest stars he could see: sixth magnitude stars.

The invention of the telescope made it possible to see stars fainter than the sixth magnitude, and these stars too were assigned magnitudes. Stars of magnitude 11.5 (even 13 under the best conditions) are visible through a telescope with a 3-inch aperture (the aperture is the diameter of the front objective in a refractor, or of the main mirror in a reflector); magnitude 13.0 through a 6-inch telescope; and magnitude 14 through a 10-inch one.
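These figures roughly follow a common amateur rule of thumb, limiting magnitude ≈ 2 + 5 log₁₀(aperture in mm). The rule is an assumption added here for illustration, not part of the original entry; a minimal Python sketch:

```python
import math

def limiting_magnitude(aperture_mm):
    """Rough faintest visual magnitude reachable with a given aperture, using
    the rule of thumb m_lim ~ 2 + 5*log10(D in mm). This rule is an assumption,
    not taken from the entry; real limits depend on sky and observer."""
    return 2 + 5 * math.log10(aperture_mm)

for inches in (3, 6, 10):
    print(f"{inches}-inch aperture: about magnitude {limiting_magnitude(inches * 25.4):.1f}")
# prints roughly 11.4, 12.9 and 14.0 -- close to the figures quoted above
```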


In 1856 Norman R. Pogson suggested a mathematical definition of magnitude.¹ Like Herschel and other early observers, Pogson had noticed that if two stars differed in brightness by 5 magnitudes (for example, a first magnitude and a sixth magnitude star, or a 10th magnitude and a 15th magnitude star), the brighter one was about 100 times brighter than the fainter one. Observers had reached this conclusion through a fairly simple technique. The amount of light a telescope gathers depends on its aperture. If part of the aperture is covered up, faint stars seem to disappear, because the amount of light the eye receives from them becomes too little to be perceptible. Suppose we choose a star and, while observing it, cover up more and more of the aperture until that star can no longer be seen, and then repeat the procedure with another star. The ratio of the brightnesses of the two stars is the ratio between the areas of the apertures at which each star became too faint to see.

Pogson proposed that successive magnitudes differ by some constant multiplier; call it x. The ratio of the brightness of a first magnitude star to that of a second magnitude star is then 1:x; the ratio of a second to a third is also 1:x, and so on. What is the ratio of a first magnitude star to a third magnitude star? 1:(x times x), or 1:x². Extending this, the ratio between the brightnesses of magnitude 1 stars and magnitude 6 stars must be 1:x⁵. Since observations had shown that this ratio is 1:100, x⁵ = 100. So, Pogson suggested, the ratio between successive magnitudes should be set at 1 : the fifth root of 100, which is about 1:2.512 (2.512⁵ ≈ 100).
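In modern notation Pogson's proposal is usually written m₁ − m₂ = −2.5 log₁₀(b₁/b₂), where b₁ and b₂ are the brightnesses. A minimal Python sketch of the arithmetic in the paragraph above:

```python
import math

# Pogson's ratio: the fifth root of 100
x = 100 ** (1 / 5)
print(round(x, 4))          # 2.5119
print(round(x ** 5, 6))     # 100.0

def magnitude_difference(b1, b2):
    """Pogson's relation in its modern form: m1 - m2 = -2.5 * log10(b1 / b2)."""
    return -2.5 * math.log10(b1 / b2)

# A star 100 times brighter than another is 5 magnitudes "smaller":
print(magnitude_difference(100, 1))   # -5.0
```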

Pogson's definition determines the size of the steps between magnitudes, but not how bright any particular magnitude is. To fix that, astronomers at first assigned magnitudes to a group of stars around the north celestial pole. With the development of more sensitive photoelectric instrumentation, this original definition was no longer good enough. Today zero magnitude is defined by the magnitudes of ten rather average, non-variable stars fairly evenly distributed over the celestial sphere. By adopting Pogson's ratio and choosing the magnitudes assigned to these ten stars appropriately, astronomers have largely preserved the ratings Hipparchus gave the stars in his catalog twenty centuries ago.
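Once the zero point is fixed by standard stars, any other star's magnitude follows from its brightness measured against a standard. A minimal sketch under that assumption (the function and parameter names are hypothetical):

```python
import math

def apparent_magnitude(flux, flux_standard, m_standard=0.0):
    """Magnitude of a star from its measured flux, relative to a standard star
    of known magnitude; names here are illustrative, not standard usage."""
    return m_standard - 2.5 * math.log10(flux / flux_standard)

# A star delivering 1/100 the flux of a magnitude-0 standard is magnitude 5:
print(apparent_magnitude(1.0, 100.0))   # 5.0
```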

1. Norman R. Pogson.
Monthly Notices of the Royal Astronomical Society, vol. 17 (1856), page 12.

further reading

J. B. Hearnshaw.
The Measurement of Starlight: Two Centuries of Astronomical Photometry.
New York: Cambridge University Press, 1996.

sources

The apparent visual magnitude of an object is V(r, Δ, α) = V(1, 0) + Cα + 5 log(rΔ), where r is the heliocentric distance and Δ is the geocentric distance (both in AU), C is the phase coefficient in mag deg⁻¹, and α is the phase angle (deg).

Arthur N. Cox, editor.
Allen's Astrophysical Quantities. 4th ed.
New York: Springer Science+Business Media, LLC, 2004.
Page 162. V(1,0) is the absolute visual magnitude.
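As a worked use of the formula quoted above, here is a minimal Python sketch; the numerical inputs in the example call are placeholders for illustration only, not values taken from Allen's.

```python
import math

def apparent_visual_magnitude(v_1_0, r_au, delta_au, c_mag_per_deg, alpha_deg):
    """V(r, Delta, alpha) = V(1, 0) + C*alpha + 5*log10(r*Delta), with r and
    Delta in AU, C in mag/deg and alpha in degrees, as in the formula above."""
    return v_1_0 + c_mag_per_deg * alpha_deg + 5 * math.log10(r_au * delta_au)

# Illustrative placeholder values only -- not figures from Allen's:
print(round(apparent_visual_magnitude(v_1_0=-1.5, r_au=1.5, delta_au=0.7,
                                      c_mag_per_deg=0.016, alpha_deg=30), 2))
```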

