A Brief History of Illumination
by Vincent Mallette
Copyright © 1999 Inwit Publishing, Inc.
The first artificial source of light was a burning brand plucked from a fire: the torch. That was about a million years ago, and that was all we had for a
million years.1 About 5,000 years ago people put oil in a clay jar, stuck in a wick of cloth or rope, and had the lamp.2 The ancient
world was lit by lamps.3 Many of them survive, and have been fired up by present-day archaeologists. The best ones give several times as much
light as the candle, which started appearing in Rome in the first and second centuries A.D.4 Although not as bright as an oil lamp, the candle, a no-spill
source of light, was very convenient and rivaled the lamp as the main source of light during the Middle Ages. American colonists in the
1700's used a lamp, the beloved "Betty" lamp, for nighttime chores and reserved the more expensive candles for entertaining.
However, neither lamp nor candle gave enough light for fine work such as sewing in the nighttime.5 Mindful of this, many medieval guilds severely
penalized their artisans for doing work at night, because it would be shoddy if done by the sources of artificial light then available. In the late 1700's,
the Swiss physicist and inventor Aimé Argand developed a tubular wick for gas or oil that allowed air to circulate inside as well as outside; this
substantially increased the luminance of the flame and enabled fine work to be done at night for the first time. Unfortunately Argand lamps were tricky and
were not widely available. The cheap, even brighter Welsbach mantle6 was invented in 1885, but by then Edison's electric light (1879) was well on its way.7
Edison's electric light in its original form wasn't really practical or cost-effective, and wasn't a fit replacement for the gas light that limned the
Victorian world. But so great was the public's thirst for the electric light, which didn't exist, that they put up with Edison's wretched carbon filaments
for 30 years until Coolidge's ductile tungsten made decent incandescent lamps a reality for the first time.8
Incandescent lamps have been improved by filling with gas,9 inside-frosting,10 and most recently by using a halogen chemical to recycle
tungsten and reduce blackening. The fluorescent lamp was invented in 193811 and, it would seem, is confined by law to kitchens in houses, though
it is widely used in commercial buildings where its badly skewed spectrum has produced headaches and irritability for decades. High-pressure sodium lamps,
which can be recognized by their salmon-coppery hue, now dominate highway lighting where their speciously human glow has displaced the honest robotic blue
of mercury arcs.
Though not used for illumination per se, neon lights deserve a few lines. Several years ago the country was flooded with colorful brochures from an inane
neon sign manufacturer, depicting fin-de-siècle Paris blazing with neon lights rather like a Victorian Las Vegas. This was of course a farrago of fanciful
nonsense. Neon was not even discovered until 1898 and was not used in a lamp until 1909.12 And most neon signs aren't even neon; they are
mixtures of neon and/or mercury, helium, argon, and krypton.13 Only the reddish-orange "Joe's Bar & Grill" signs, which are clear when unlit,
are pure neon.
Flame and even incandescent electric light sources were hopelessly inadequate for theatrical purposes. Early stage spotlights used a stick of lime heated
white-hot by an oxyhydrogen blowtorch: the fabled limelight, which gave us the phrase "in the limelight." Later, electric carbon arcs, which had been
used in the 19th century for streetlighting, were doped with chemicals to improve their whiteness and were widely used to make movies: the Klieg
light.14 The future of home lighting may center on the so-called "Earth Lights", which are fluorescent tubes driven by ultra-high-frequency
AC. They are several times more efficient and longer-lasting than incandescents, and can produce a very warm, human-friendly glow.
The holy grail of lighting, though, is a decently colored light which is cool: the hallowed "cold light."15 Lamps are hot because they're
inefficient and are much better sources of heat than of light (as any farmer who uses 40-W bulbs for his chicken brooders knows); the average incandescent
puts out 97% heat and 3% light.16 Fluorescents may go as high as 11.2% light,17 but no practical source of white light is anywhere near
the 90's and hence "cool."
Addendum (for the technical-minded)
What is the lux? You buy a camcorder and it's rated to pick up baby's first step in a light level of, say, 10 lux. Ten lux is roughly what used to be called a
foot-candle,18 and it's rather dim. However, after 30 minutes of dark adaptation, the human eye can recognize objects in 0.007
lux!19 (For comparison, I measured late-afternoon sunlight in Atlanta as laying down 130,000 lux.)
Here are some recommended levels of illumination for various activities:
- Hospital operating rooms - 10,000 lux
- Dental clinic, patient's mouth - 2200 lux
- Bookkeeping - 215-320 lux
- General reading - not less than 200 lux
- Washing dishes - 50-100 lux
- Not falling over baby while videotaping her first step - at least 2 lux
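The lux/foot-candle relationship above can be sketched in a few lines of code (a minimal illustration; the conversion factor is the one given in footnote 18, and the function names are my own):

```python
# Illuminance unit conversions: 1 foot-candle = 10.764 lux (footnote 18).
LUX_PER_FOOTCANDLE = 10.764

def lux_to_footcandles(lux):
    """Convert an illuminance in lux to the older foot-candle unit."""
    return lux / LUX_PER_FOOTCANDLE

def footcandles_to_lux(fc):
    """Convert foot-candles to lux."""
    return fc * LUX_PER_FOOTCANDLE

# The camcorder's 10-lux rating is just under one old foot-candle:
print(round(lux_to_footcandles(10), 3))    # 0.929
# Atlanta late-afternoon sun, measured at 130,000 lux:
print(round(lux_to_footcandles(130_000)))  # 12077
```

As the output shows, "10 lux" and "1 foot-candle" are close enough to be used interchangeably in casual talk, which is why the text equates them.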
What is the source, in a microscopic sense, of light? There are exceptions, such as synchrotron light, but in general visible light (400-700 nanometers
wavelength, blue to red) is the result of electron transitions in the outer orbitals of atoms. These transitions can be evoked by purely thermal agitation,
as in an incandescent lamp, or by a flow of electricity through a plasma, as in mercury, sodium, and neon lamps. It's another story for "invisible light."
Ultraviolet light is produced by both outer and inner orbital transitions; X-rays are exclusively inner-orbital or nuclear; gamma rays and cosmic rays are
nuclear. On the other side of visible, most infra-red is due to molecular vibrations; radar and microwaves are molecular rotations; TV, radio, and
everything longer owe their existence to surging electrons in classical oscillators. All these processes give off photons, whose energy is directly
proportional to their frequency. That is why long-wavelength (low-frequency) radiation is harder to detect: each photon bears less energy. By contrast,
the photons of gamma rays bear so much energy that they can produce startling, irreversible effects in matter, such as killing you. So if you invent a
gamma-ray lightbulb, I don't want to hear about it!20
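The energy-frequency relation described above (E = hf = hc/λ) can be made concrete with a short calculation. This is a sketch using standard physical constants; the example wavelengths are illustrative, not taken from the article:

```python
# Photon energy is proportional to frequency: E = h*f = h*c / wavelength.
H = 6.626e-34   # Planck's constant, joule-seconds
C = 2.998e8     # speed of light, meters per second
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in electron-volts."""
    return H * C / wavelength_m / EV

print(f"red   (700 nm): {photon_energy_ev(700e-9):.2f} eV")  # ~1.77 eV
print(f"blue  (400 nm): {photon_energy_ev(400e-9):.2f} eV")  # ~3.10 eV
print(f"gamma (1 pm):   {photon_energy_ev(1e-12):,.0f} eV")  # over a million eV
```

A visible-light photon carries a couple of electron-volts; a gamma-ray photon of picometer wavelength carries over a million, which is why the latter can wreck the matter it strikes.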
1 Homo erectus is generally credited with the discovery or use of fire. It may have been as much as 1.5 million years ago. In any
case, Homo erectus contracted a new disease, Vitamin A poisoning, that may have been due to his intemperate consumption of yummy, fire-cooked meat (Panati's
Browser's Book of Beginnings by Charles Panati, Houghton Mifflin Company, 1984, pp. 41-42).
2 To be fair, wicked lamps (note that wick-ed and wicked are the same word!) have been found at the Lascaux cave in France ("The Lamps of Cosa"
by Cleo Rickman Fitch, Scientific American, Dec. 1982, p. 148). Lascaux, a famous site of prehistoric paintings, has been dated from 15,000 to 30,000
B.C. And there is evidence that crude stone lamps "probably fueled with animal fat and using grass or moss for a wick..." were in use around 79,000 B.C. (The
Timetables of Science by Alexander Hellemans and Bryan Bunch, Simon and Schuster, 1988, p. 5). But lamps didn't come into their own until after 10,000
B.C., when agriculture provided rich sources of oil, olives in particular. And decent clay jars weren't available until the potter's wheel was invented
in Mesopotamia around 3000-3500 B.C. (The Timetables of Science by Alexander Hellemans and Bryan Bunch, Simon and Schuster, 1988, p. 9).
3 Testifying to man's thirst to push back the night, lamps were a mass-produced item even 2,000 years ago: a crate of them, still packed and ready
for shipment, was found at Pompeii ("The Lamps of Cosa" by Cleo Rickman Fitch, Scientific American, Dec. 1982, p. 156). The Chinese had lamps with
asbestos wicks by 308 B.C. (The Timetables of Science by Alexander Hellemans and Bryan Bunch, Simon and Schuster, 1988, p. 27); we have no figures on
their cancer rates!
4 Examples of candles have been found from the 4th millennium B.C. (The Timetables of Science by Alexander Hellemans and Bryan Bunch, Simon
and Schuster, 1988, p. 9), but the lamp was far and away more common until medieval times.
5 Even the Bible recognized this: Jesus said, "...the night cometh, when no man can work" (John 9:4).
6 The mantle was a fascinating, if belated, solution to the problem of converting a flame's copious heat into light. Many had noticed how a stick
of clay or metal flared up brightly in the intense, but nearly invisible, flame produced by the burner invented by Bunsen in 1855 (his least important
discovery!). The mantle in its evolved form is a cloth bag soaked in solutions of certain refractory oxides, often rare earths. After being fitted over the
flame (which may be oil, illuminating gas, or propane), the bag burns away and leaves a lacy skeleton of fused ceramic, which glows white-hot for weeks, and
produces a very satisfactory light, as any user of a Coleman lantern can attest.
7 We need to be reminded how really dim and poor unimproved flames are as sources of light. The sparkling parties and dances of Napoleonic times,
such as are depicted in movies like War and Peace (where the actors are in fact lit by 10,000-watt Klieg lights!), were so dim in reality that in spite of
massive banks of candles, you couldn't recognize your date across the dance floor.
8 This is of course a cranky and personal judgment. Still, the fact is that Edison carbon-filament light bulbs have been constructed in modern
times according to his recipe, and I am sure you would pronounce them wretched (reddish, dim, fragile, short-lived) alongside a tungsten-filament
lamp. Edison's first commercial bulbs had an absolute efficiency of less than a quarter of a percent! The most primitive tungsten bulbs were 6 times better
than that. I admit, though, that even Edison's Mark 1 was slightly more efficient than a low-pressure gas mantle. And metalized carbon (the "G.E.M" lamps)
approached tungsten in efficiency. (Efficiencies from American Institute of Physics Handbook, Third Edition, page 6-209.) For what it's worth, historians of
technology acknowledge that Joseph Swan's invention of the practical electric light was contemporaneous with Edison's (Information Please Almanac, Houghton
Mifflin Company, 1996, p. 551; The Universal Almanac 1996, Andrews and McMeel, 1995, p. 591). Recognizing this co-fathership, electric lamps for a while were
marketed under the name "Ediswan" (The People's Almanac # 2 by David Wallechinsky and Irving Wallace, Bantam Books, 1978, p. 432). Swan, an English
chemist, lived from 1828 to 1914; although knighted, he is little known in this country. I concede that Swan's lamp took honors second to Edison's at the
Paris exposition of 1881. On the other hand, Swan and other chemists perfected "squirted" cellulose for carbon filaments, a product even Edison was
eventually forced to use. (Edison: A Biography by Matthew Josephson, John Wiley & Sons, Inc., 1959, p. 258 and p. 236.)
9 Irving Langmuir did this in 1913 (The Universal Almanac 1996, Andrews and McMeel, 1995, p. 594), about the same time that Coolidge was
perfecting the tungsten filament. So the real inventors of the modern, practical incandescent lamp are Coolidge and Langmuir.
10 Although frosting would seem like an obvious thing to do to a light bulb to reduce the hideous glare of a naked filament, it was only done
sporadically until Pipkin's inside-frost process was invented in 1924. (By the way, inside frosting is about four times more efficient than outside frosting,
due to total internal reflection.) Ironically, today clear light bulbs usually cost twice as much as frosted bulbs.
11 Actually, demonstrations were conducted in 1936, but there were hardly any commercial sales until 1938. Both GE and Westinghouse are credited
with the invention; no individual inventor is singled out. Fluorescent lamps produce light in an indirect manner: mercury vapor in the tube is electrically
excited to give off powerful ultraviolet radiation, which impinges on the phosphor which coats the inside of the tube; the phosphor converts the UV to
visible light, in this case the rather sickly blue glow which we have all come to know and love as fluorescent illumination.
12 The Frenchman Georges Claude is usually credited; the year is often given as 1911 (Information Please Almanac, Houghton-Mifflin Company, 1996).
13 For example, blue light can be produced by a neon-argon or neon-argon-helium mixture.
14 These lights, like all unfiltered electric arcs, were very rich in harmful ultraviolet and produced a burning sensation in the eyes of the
actors, which was spoken of as "klieg eyes" (Merriam-Webster Third New International Dictionary, 1986, p. 1248.) As a protection, Hollywood actors began
wearing sunglasses all the time on the set. This touched off a fad: stars, starlets, and groupies started wearing sunglasses day and night.
15 A light source is spoken of as "cold" or "cool" if its equivalent black-body "characteristic" color temperature is very much higher than its
actual operating temperature. The yellowish-green of fireflies has a characteristic temperature of many thousands of degrees, yet the little critter is
barely warm, so efficient is its beacon. Very little energy is required to produce light if the conversion efficiency is high: the energy of a pea falling
an inch, if completely converted into light, would produce a faint glimmer for every man, woman, and child who ever lived. For the record, 1 watt equals 680
lumens at the wavelength of maximum luminosity for the human eye, 555 nanometers (American Institute of Physics Handbook, Third Edition, page 6-10). This is
called "the least mechanical equivalent of light," and no light source can ever be more efficient than this.
16 Actually, a 40-W incandescent bulb has an absolute efficiency of about 1¾% and a 100-W bulb runs about 2½%. The efficiency goes up as the
wattage goes up. So a single 200-W bulb gives more light than two 100-W bulbs. Five-thousand-watt incandescent bulbs, which are seldom seen in homes, run as
high as 4 2/3 % efficiency. Light output should never be measured in watts, contrary to the cartons in which some bulbs are sold: "Light of a 150-watt bulb
from only 135 watts!" etc. Responsible manufacturers give the electric power consumption in watts and the light output in lumens, so lumens per watt is the
rational measure here.
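The lumens-per-watt measure advocated here can be sketched against the 680 lm/W theoretical maximum cited in footnote 15. The lumen figure below is an illustrative round number for a typical 100-W incandescent, not a value from the article:

```python
# Luminous efficacy (lumens per watt) and absolute efficiency relative to
# the theoretical maximum of 680 lm/W at 555 nm (footnote 15).
MAX_LM_PER_W = 680.0

def efficacy(lumens, watts):
    """Luminous efficacy in lumens per watt: the rational measure of a lamp."""
    return lumens / watts

def absolute_efficiency(lumens, watts):
    """Fraction of electrical power emerging as visible light."""
    return efficacy(lumens, watts) / MAX_LM_PER_W

# A typical 100-W incandescent emits roughly 1700 lumens:
print(f"{efficacy(1700, 100):.1f} lm/W")        # 17.0 lm/W
print(f"{absolute_efficiency(1700, 100):.1%}")  # 2.5%
```

The 2.5% result agrees with the "about 2½% for a 100-W bulb" figure in footnote 16, which is why watts alone say nothing useful about light output.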
17 This is for an 8-foot industrial slimline T8. A more typical figure, for a 40-W "standard warm white" household fluorescent tube, would be
about 9½%. The efficiency depends on the color: a pure green fluorescent tube, such as you might see at a carnival, runs over 12%. A pure red tube is
a pitiful 0.53%. This is the reason that fluorescent tubes tend toward the green and grotesque: it's cheap! (Several companies offer so-called
"full-spectrum" fluorescent tubes, which claim to have a color approximating daylight. The ones I've seen have not impressed me.) (Efficiencies from
American Institute of Physics Handbook, Third Edition, page 6-209.)
18 To be more precise, 1 lux = 0.092902 foot-candle, and 1 foot-candle = 10.764 lux.
19 Encyclopedia of Physics, Second Edition ed. by Rita G. Lerner and George L. Trigg, VCH Publishers, Inc., 1991, p. 1342. This would not,
however, impress the harbor seal, which can see perfectly well in the moonlight that filters through 1400 feet of seawater.
20 Some people can see blue flashes in their eyes from x-rays or gamma rays; this is the wail of a dying retinal cell.
Copyright © 1982-2018 Christopher D. Watkins. All Rights Reserved. LEGAL
Orlando, Florida, USA | Wed 25 Aug 2004