As heralded by a cover story in Time (Feb. 3, 1997) and a special issue of Newsweek (spring/summer ‘97), the big story of the past year has been the most astonishing breakthrough ever in the field of child development - proof that brain-cell propagation during the critical formative years (birth to 6 years) is largely a function of the environment.
An environment rich in social stimulation and opportunity to explore invigorates brain development; an impoverished one retards it. It now appears that while genes may set an upper limit on a child’s general abilities, the more specific the aptitude (language, music, fine-motor skills), the more likely it is that early experience is the primary shaper.
These findings have direct bearing on the most controversial subject in education: attention deficit disorder.
Until now, the dialogue concerning ADD has been largely dominated by a troika: professionals who specialize in the diagnosis and treatment of ADD; a lobby-support group called Children and Adults with Attention Deficit Disorder; and Ciba-Sandoz, the manufacturer of Ritalin, a stimulant administered to some 90 percent of ADD children. The “ADD Establishment,” as I call the three, has steadfastly insisted that ADD is genetic in origin.
The latest evidence from the field of brain development, however, indicates that a deficiency of this sort can be explained primarily (but perhaps not exclusively) in terms of early environmental factors.
At this point, the proverbial sticky wicket arises, because disproportionate numbers of ADD-diagnosed children come from middle- and upper-middle class families. Their parents, understandably, tend to be indignant toward any suggestion that they failed to create optimal environments.
But in nearly every one of the homes in question, there sits at least one television set, and we know the average American child watches in excess of 5,000 hours of television before he reaches first grade.
The picture on a television screen changes, or flickers, every three to four seconds. A child watching a 30-minute TV program, therefore, isn’t paying attention to any one image for longer than a few seconds. Multiply that by 5,000 hours of watching (one-fourth of the child’s waking time!) during the years most crucial to brain development, and it is hardly far-fetched to suppose that the attention span of the child in question will be compromised.
Multiply that one child by the number of preschool children from all socioeconomic classes who spend disproportionate time watching television, and you’ve got an epidemic of kids with short attention spans, who therefore are impulsive, disorganized and forgetful (other symptoms associated with ADD).
The ADD Establishment will, as always, howl at this hypothesis, claiming it “blames” the parents. The fact is, if one has no way of knowing - as was the case - that something can harm children, then one can hardly be blamed for exposing a child to it.
Besides, if it turns out that ADD can result from letting a young child watch television, then parents are actually empowered. With early detection and proper intervention, ADD can be significantly reversed, if not “cured.”
A researcher, commenting on the relationship between brain development and the number of words a young child hears on an average daily basis, recently stressed to The New York Times that the words need to come from “an attentive, engaged human being.” Someone, in other words, who exists in three dimensions and doesn’t flicker every few seconds. People on TV simply don’t qualify.