3   America’s Hundred-Year War against Precocity

 

 

 

            Adults have never liked two types of adolescents: those who act like youths, and those who act like adults. Teens who behave immaturely annoy grownups, but precocious ones are even scarier. Both kinds have prompted a century of confused measures that now subject American youths to the worst of both worlds. The U.S. Supreme Court has explicitly ruled that policy-makers may impose adult responsibilities and punishments on individual youths as if they were adults, even as laws and policies abrogate adolescents’ rights en masse as if they were children.

            This chapter explores three theories that attempt to explain, from a historical perspective, the relationship between adults and adolescents: linear, shock, and cyclical.

 

 

Linear theory: Degeneration

 

            This theory is simple: each new generation of youth acts worse than its parents--ruder, dumber, lazier, more violent, hypersexed, less heroic, generally hellbound. Most adults throughout history appear to have subscribed to linear degeneration theory, even though its logical implication is that the apex of human civility, nobility, and diligence was Australopithecus.

            There has always been plenty of intergenerational tension. Perhaps the oldest continuous theme in adult commentary is to deplore the younger generation as worse than its grandparents. A hieroglyphic inscription from an ancient Egyptian tomb, circa 4000 BC, reads: “Today’s young people no longer respect their parents. They are rude and impatient. They have no self-control.” If earlier cave-wall inscriptions could be translated, doubtless they’d read that today’s boys can’t hunt and girls don’t gather like their halcyon ancestors of yore.

            In 700 BC, for example, the Greek poet Hesiod proclaimed “no hope for the future of our people if they are to be dependent upon the frivolous youth of today, for certainly all youth are reckless beyond words, exceeding ‘wise’ and impatient of restraint.” Socrates complained that youth loved luxury and gluttony, displayed bad manners, and tyrannized their teachers (and, apparently, him). Plato accused the young of being lawless, disrespectful, disobedient, and morally decayed. By 400 BC, adults were already running out of epithets. None of the old-versus-young vitriol regularly repeated since has improved on these ancient tirades.

            In 1700, American religious icon Increase Mather declared that “if the body of the present generation be compared with what was here forty years ago, what a sad degeneracy is evident.” Youth of 1850 were derided by elder erudite George Templeton Strong as “so much gross dissipation redeemed by so little culture.”

            Similarly derogatory comments by adults about each younger generation recur below as we look at post-1900 American adolescent history.

 

 

“Future Shock” Theory: Baffled adults = frightened adults

 

            One modern trend prompting greater adult fear of youth is rapid progress and social change, or what anthropologists call the difference between postfigurative and prefigurative societies. In postfigurative societies, writes Margaret Mead (1970), change is slow and imperceptible; the lives of grandchildren resemble those of grandparents. In such repeating societies, which dominated human existence for hundreds of thousands of years and can still be found in dwindling numbers, the older adult embodies what the culture was, is, and will be--its rituals and its wisdom. The job of parents is to transmit the customs of their parents to their children.

            Because of the shorter lifespans of the past, the child in a postfigurative culture was acquainted only with the most robust elders, those who had survived to old age. The elders were respected as the strongest, most relevant, and most valuable members of the tribe and were confident in their ability to understand and predict the world their children would grow up in. In such societies, children and present generations face criticism from the old for alleged failures to live up to the traditions of legendary ancestors; the Old Testament prophets are stern examples.

            In contrast, the prefigurative society is characterized by rapid, jolting social change. In such a dynamic society, the elders find their experiences and wisdom increasingly irrelevant in a world in which the children are the true symbols of the culture. Parents’ attempts to transmit the values of their parents, or even their own, to children guarantee conflict in a radically transformed present--conflict most visible between first-generation immigrants and their American-born children. As Mead wrote in Culture and Commitment:

 

                In the past, there were always some elders who knew more than any children in terms of their experience of having grown up within a cultural system. Today there are none. It is not only that parents are no longer guides, but that there are no guides... There are no elders who know what those who have been reared within the last twenty years know about the world into which they were born...

                In this sense we must recognize that we have no descendants, as our children have no forebears (1970, p. 61).

 

It is one thing to be the postfigurative, patriarchal repository of one’s culture’s eternal wisdom, around whose gnarled feet youth must gather to learn the unchanging ways of the static tribe; it is quite another to be a baffled old coot who needs a fuzz-lipped, prefigurative kid to explain the gramophone, how to steer a Model T, New Math, why on earth anyone would pay for water in a bottle, or Netscape 12.5 for Windows 2010.

            In a society whose rapid changes favor the young and adaptable over the grayhairs set in their ways, teens get in their sarcastic jibes against the ossified old, adding to grandpa’s fears that he is yesterday’s debris to be replaced by the latest upgrade. Over-30 obsolescence lends a special anger to modern adult assertions of moral superiority and imaginings of endless crises among the young, but it does not explain why fear and anger against youth have taken hold so virulently in the United States and not so much in equally dynamic societies.

            In prefigurative cultures, grownups are apt to feel useless and baffled, less able to understand the current state of society or what “kids today” are thinking and doing. It does not follow that adults in prefigurative societies such as the U.S. should see the modern world of their children as frightening and menacing, but that has been the common response to each new cultural and technological innovation mastered by kids long before grownups. It is ironic (if perhaps unsurprising) that Americans, living in the most changeable and dynamic of societies, should be so fearful of every new change that seems to give their kids more options and advantages--that is, independent power. Boston University political science professor Alan Wolfe, who glowingly chronicles America’s upper-middle-class values, reported that parents are “unambiguously upset about the number of options their kids have.” In addition to taking the peculiar stance that young people should have fewer options, parents thought youths would benefit from suffering more “hard times” to “strengthen character,” an option these affluent parents did not mention considering for themselves (Wolfe 1997, pp. 116-32, 288-89).

            The “sense of distance” and “feeling of lacking a living connection” between generations in prefigurative societies “sometimes takes bizarre forms,” Mead continues. She recounts how a group of American clergy traveled to Sweden in 1968 to meet with young American conscientious objectors and concluded, “We are persuaded that these are our children.” But which generation in a prefigurative society can be called alienated? Speaking (at age 68) as a member of the older generation, Mead argued:

 

In most discussions of the generation gap, the alienation of the young is emphasized, while the alienation of their elders may be wholly overlooked... We still hold the seats of power and command the resources and the skills necessary to keep order and organize the kinds of societies we know about. We control the educational systems, the apprenticeship systems, the career ladders up which the young must climb, step by step. The elders in the advanced countries control the resources needed by the young.

 

            In both prefigurative and postfigurative cultures, adults are prone to look at children and not like what they see. But in postfigurative societies, the child’s failings (imagined or real) are familiar to the adult, who knows how to correct them because the culture is a continuous recapitulation of the past.

            However, in prefigurative societies, adult alarm is heightened because the children--certainly those of the society at large, and often one’s own as well--may appear entirely alien, and the adult does not know what to do about it. In a prefigurative society, children face incessant criticism for not living up to elders’ traditional mores (whatever those are remembered to be) even as the children cannot possibly do so, because the world has changed so dramatically--in fact, adults don’t follow the traditional ways, either.

            “Today’s elders have to treat their own past as incommunicable” as well, Mead writes (1970, p. 61). Thus, adults constantly express longings for a lost, idealized past in which life was simpler, more moral, less violent, and more polite. Such times did not exist, of course, no matter how often we hear that “kids didn’t act like that when I was growing up.” It is amusing to contrast older adults’ reconstituted memories of a crime-free Depression, for example, with actual police reports showing every form of crime and violence rampant in the 1930s (murder rates, for example, peaked in the United States in 1931 at levels not seen since). It becomes dangerous when adults attempt to forcibly restore the imagined past by imposing harsh regimens and draconian expectations on the young.

            An identical evolution from postfigurative to prefigurative society has occurred in Europe and Japan over the last century without arousing nearly the degree of fear found in the U.S. One possible reason is that while European or Japanese children may seem to think very differently than their elders, they at least look like their elders. In the multiracial U.S., as noted, growing racial and ethnic diversity means that the children appear to both think and look differently than their parent generation, so that diversity itself is subconsciously (and sometimes overtly) seen as the cause of society’s imagined degeneration. In America, degeneration over time is imagined whether it is occurring or not.

            “As long as any adult thinks that he, like the parents and teachers of old, can become introspective, invoke his own youth to understand the youth before him, he is lost,” Mead concluded. “...The adult imagination, acting alone, remains fettered to the past” (1970, pp. 63, 73). This natural allegiance of elders to past ways of judging and solving problems can be a crippling impediment to dealing with the vastly new challenges youth confront. Mead argues that adult perceptions of social realities in recent decades, particularly as they apply to youths, are especially distorted due to the newness of prefigurative societies.

            In many ways, this is the central problem in studying adolescents in dynamic, changing societies: adult researchers are enchained by outdated assumptions. Again and again in this book, we will see prominent, modern experts making bizarrely wrong assertions about adolescent and adult behavior that derive from the thinking of 30 or 40 years ago. The “top school problem” hoax that opened this book exemplifies the limited thinking of adults in prefigurative societies, including the most prominent and educated, and their readiness to accept even the direst, most obvious fictions when viewing the world of the young.

            Today’s criminologists, for example, still operate on anachronistic notions about crime that originated in research of decades ago--dubious enough at the time--and have since proven inapplicable to today’s crime patterns. Media critics assume television must exert controlling influence on today’s young in a complex, multimedia world because, back in their day, TV was the new, premier medium. Teens of 2003 handle and use drugs and tobacco very differently than baby-boom teenagers of 1973, yet both the leaders of America’s “war on drugs” and those who want to reform drug policies remain mired in their own decades-old Sixties dogma.

            When adults’ natural orientation to a bygone past combines with difficulty in admitting troubling present realities--such as the recent explosions in drug abuse and crime among supposedly stable middle-agers--entire information systems on social problems break down. The most popular and most quoted scientists, speaking to and for their own generation, are the ones who lend their professional veneer to confirming popular prejudices and affirming the political imperatives of their day--that is, of the past. Statements are made, and policy is based on those statements, that have little to do with the realities today’s young people face.

            Some scholars have proposed designating change itself as a social problem, creating “future shock” (in the words of Alvin Toffler’s 1970s best seller), an “historic crisis of adaptation” requiring scientific management to slow it to a humanly manageable pace. However, if Mead is right, the most recent generations of adults are uniquely muddled, for they live amid the contradictory expectations of older postfigurative and new prefigurative societies. Now that postfigurative cultures have all but disappeared, Mead suggests, future generations will become more used to the obvious reality that the child’s world no longer recapitulates the parent’s and grandparent’s. For the present generations, the inability of adults to adapt to changing realities--in California, for example, the transition to a multiracial, multicultural state in which mutual intergenerational support is critical--and older generations’ threatened, resistant stance toward these changes pose immediate difficulties for both young people and the social fabric.

 

 

Cyclical Theory: Generational succession

 

            Generational historians Bill Strauss and Neil Howe, authors of Generations (1990), 13th Gen: Abort, Retry, Ignore, Fail (1993), and Millennials Rising (2000), argue that American history is characterized by repeating cycles of four generational types, divided into dominant (civic, idealistic) and recessive (reactive, adaptive).

 

Dominant:

Civic generations emphasize outer-driven community, technology, economic growth.

Idealistic generations favor inner-driven moral principle, reform, and education.

 

Recessive:

Adaptive generations follow civic generations and stress pluralism, expertise, and social justice--qualities often trampled by civics in pursuit of community progress.

Reactive generations follow idealistic generations to re-stabilize society after turbulent reform eras, stressing pragmatism and survival.

 

Strauss & Howe trace these generational cycles back to the 1600s and argue that they occur in a regular progression.

            In our era, they identify the “GIs” (born 1901-24) as civic, a generation known for communitarian duty, sacrifice, and faith in technology, which labored through the Great Depression, won World War II, and provided over three decades of stable national leadership, including every president from Kennedy through George Bush Sr. Following was the Silent Generation (born 1925-42), which consolidated the scientific gains of the Civics during the 1950s while stressing a more open and tolerant society (e.g., respect for civil rights) as they aged. Then came the volatile Baby Boomers (born 1943-60), whose youth featured wrenching lifestyle and social change, experimentation, and invocation of moral principles (albeit often applied intolerantly and hypocritically). To stabilize society after the Boomer era, “13ers” (born 1961-81 and often called “Generation X”) brought a pragmatic tone and, though much vilified as “slackers” and little credited with anything, drove down the massive rates of social problems Boomers inflicted. Now a new Civic generation, the Millennials (born 1982-2001), is moving into young adult years, supposedly to return to more communitarian (or “scoutlike,” as Strauss and Howe put it) values.

            Strauss & Howe differ from conventional adult linear thinking in their argument that each generation fulfills a needed purpose--to correct the excesses of the previous generation, whether the excess is of the tumultuous or the conformist sort. There is no generational deterioration, although stresses build to the point that America suffers a major crisis every 70 to 80 years or so (Revolution, Civil War, Depression, and now a Mystery Crisis coming due).

            Their typology applies well to California’s more extreme patterns. Here, civics are really civic, idealists are over the top, and in-between generations really consolidate--to the point that California swings unpredictably back and forth:

 

·        California’s “Civics” are best exemplified by Governors Edmund G. “Pat” Brown and Ronald Reagan, who in succession in the 1950s, ‘60s, and early ‘70s forged the future of the state’s two largest institutions, schools and prisons.

·        The Civics’ building of California schools and government during the Brown era was heavily influenced by “Silent” values (embodied in former UC president and visionary Clark Kerr) stressing nonconformity and personal rights. These attitudes, though inconsistently handled by Kerr and other Silents, influenced the initiation of state civil rights agencies during the 1960s.

·        California “Idealists” ran amok during the 1960s, as Boomer youth (often allied with Silent adults) helped force civil rights and education reforms, leading university revolts and demonstrations in the hundreds of thousands and San Francisco’s world-famous counter-culture--and spawning a massive drug abuse and crime problem that plagues Boomers to this day as well as a host of moralistic, repressive measures in adulthood.

·        Of all recent generations, California Xers may be the most heroic, dramatically bringing down the staggering crime, drug abuse, suicide, and school failure rates of Boomers even as Xers became poorer, and stabilizing the state even as older generations relentlessly denigrated their morals and de-funded their institutions.

·        Finally, Golden State Millennials have continued the work of Xers, bringing a variety of youth problems to all-time lows by the late 1990s and early 2000s--but, contrary to Strauss and Howe’s predictions, they are feared even more than the goth-gang-image Xers were.

 

California deserves its reputation as an extreme state, and its youth are a big part of its barrier-pushing tradition. Strauss and Howe’s theories apply better to past generations than to what is going on now, however--which is typical of the difficulty of writing (in philosopher Michel Foucault’s famous phrase) “the history of the present.”

            Combined with the “future shock” effect of prefigurative societies, Strauss and Howe’s generational succession typology can be re-read as a history of adult misconception resulting from lagged reaction time. Their works, read in this fashion, demonstrate how adults in changing societies misconstruct the image of emerging younger generations, misperceiving the young through the outmoded framework of the world the old faced when young. Civics, growing up in an era in which Nazis and Communists directly threatened the U.S., were unable to see what younger protesters saw clearly: that Vietnamese nationalists were very different. Baby Boomers had extreme, ongoing drug abuse problems and remain unable to imagine drug abuse today as anything other than an affliction of young people (in fact, drug abuse now is overwhelmingly a middle-aged crisis).

            In combination, cyclical/future-shock theory--while more optimistic than the perpetual-degeneration dogma of linear theory--is also pessimistic. The U.S. will continue to repeat its historical cycle of adults misconstruing and misplanning for the real needs of the young, imposing unnecessary or harmful policies to fight yesterday’s crises and failing to provide youth with the opportunities needed in the new world. This perpetual adult failure is why Strauss and Howe predict recurring cataclysm every 75 years or so. Since the last one was the Great Depression, 1929-35, by their reckoning the next upheaval is due sometime very soon in the New Millennium.

 

 

The Invention of the Adolescent

 

            As seen, American scientists invented “adolescence” a century ago as part of a panoply of bogus racial, gender, nativist, and other demographic hierarchies. The dominant generation during this period, by Strauss and Howe’s charting, was the idealistic Missionaries (born 1860-82), a moralistic cohort (similar to the later Baby Boomers) that brought both important social reforms (e.g., women’s suffrage, the “child savers,” and the New Deal) and the violent, crazed repression of Prohibition, anti-drug crackdowns, and racist scientific classifications. The results of this unfortunate era in late-1800s/early-1900s sociology, medicine, and psychology have been disastrous. The wildly embellished fears that leading authorities such as psychologist G. Stanley Hall attached to the dark and violent world of adolescence led to repeated panics and adult repressions that maximized the odds that the very horror frightening adults the most would emerge--“youth peer culture,” or what Mead might have termed “organized prefiguration.”

            “Scientifically validated” fear of darker races and immigrants merged with fear of women and youth to produce a distinctly American psychology of dread: the inferior, drug-crazed, hypersexed, violent, volatile Negro, Oriental, Mexican, and Catholic stand ready to corrupt the superior White Northern European races via the latter’s weakest components: white women and youth. This fear rages unchecked today. Witness the popular resonance of the 2000 movie, “Traffic” (black and Mexican heroin dealers seduce an elite, white teenage girl) and incessant “expert” pronouncements that “the youth scourges of the inner city have arrived in the suburbs.”

            Before 1900 or so, adolescents were viewed as what their label’s Latin root literally means--adults in progress. There was nothing special or terrifying about them. It was not remarkable that Shakespeare’s Romeo was 16 and commanded legions or that he and Juliet (13 or 14) were capable of adultlike romance and suicide.

            In 1800s America, as in most pre-industrial and industrializing societies, boys continued work on family farms and in family businesses or trained in apprenticeships; the elite continued schooling; girls worked around their parents’ homes until men five to 10 years their senior married them. Urban youth gangs, girls prematurely knocked up, massive numbers of teenaged prostitutes, and drunken college boys were routinely deplored in the 1800s, much as were similar shenanigans among adults.

            In the early 1900s, organized by the writings of psychologist Hall (see Chapter 2) and others, a scare over teens erupted. Hall was not alone; organized groups calling themselves “child savers” were concerned with protecting children, a category that included adolescents, from what they saw as the increasing dangers of a nation that was looking less and less “American.” Several decades of heavy immigration had produced large areas of cities and industrialized states that were no longer dominated by native-born Western European whites. Drinking and opiate drugs (unfairly attributed to immigrants) were seen as particular threats to childhood innocence, as were regimented schools, a harsh criminal justice system, and precocity in general. Hall’s fear of all things adolescent and precocious--particularly his claim that youths of 1900 were descending into debauchery and violence in ways only hardened criminals had a generation earlier--gave voice and impetus to disorganized anxieties. Hall’s claims rested more on isolated, scary anecdotes than on any systematic examination of trends.

            Several outgrowths of the child-saver movement have persisted, though in peculiar forms. The most famous is the juvenile court, first inaugurated in Chicago in 1899 to deal specifically with juvenile crime, with Denver and Boston quickly following. Separate courts and judges were seen as necessary to redirect, more than punish, the innocent and malleable young offender led astray by the corruptions of adult society. Juvenile courts, which quickly spread to all states, stressed tailoring services “in the best interests of the child” rather than the lengthy prison terms seen as appropriate for hardened adult criminals--prisons served only as training schools in crime at which young offenders were recruited and schooled by wily adult cons. (As we will see, the juvenile court today has almost completely reversed this mission and, in California, functions to imprison youths for longer periods than adults.) Likewise, child labor laws sought to protect younger children from workplace hazards, though their success owed more to the declining need for child labor as jobs were increasingly filled by low-paid immigrant workers.

            A wing of the “child savers” was called the “boyologists” due to their belief that boys in particular were becoming feminized by the disappearance of manly proving grounds such as the western frontier, agriculture, craft apprenticeships, and war, and by their subjugation to feminine institutions such as schools. Boyologists were determined to reverse this wussification trend lest boys shrivel into sniveling wimps. President Theodore Roosevelt warmly recommended that real boys get into at least one fistfight a day. (Roosevelt’s daughter, Alice, also caught the toughness, creating scandals by smoking, swearing, and drinking champagne at the White House.) Rugged outdoor programs such as the Boy Scouts, sports programs, and physical education were created both to prolong boyhood innocence (in recognition of the corrupting influences of precocity) and to toughen young lads’ souls and sinews.

            Girls also came under increasing protection as child-savers found young females were not so benignly protected by family and society as previously assumed. Age-of-consent reforms revised state laws--which in most states set the age at which young girls could consent to sexual intercourse with adult men at 10 or 12 (in Delaware, the age of consent was seven, based on ancient English laws setting the age of a squire)--upward to around 14 to 16. Reformers such as feminist Anna Garlin Spencer, in a 1913 article, pointed out the massive numbers of teenage prostitutes: in New York City, there was one prostitute for every 19 men in the population, 40% of prostitutes were age 16 or younger, and one-fourth of young prostitutes died of disease or violence every year. Opponents of raising the age of consent, including authorities such as the editors of Medical Age, who seemed to harbor a chronic fear of teenage girls, berated the power of “the licentious, designing demi mondaine, many of whom are under the age of eighteen” to seduce innocent males and then have them charged with criminal offenses.

            As with today’s liberal groups seeking to rescue the young from corruption, reformers of the early 1900s sounded a hostile note as well. In 1911, a middle-aged commentator penned “Letter to the Rising Generation” in Atlantic Monthly, accusing the young of “mental rickets and curvature of the soul,” indulging a “culte du moi”, and of growing up “painfully commercialized even in their school days.” And to the young: “What excuse have you, anyhow, for turning out flimsy, shallow, amusement-seeking creatures?” Responses to the article accused neglectful and indulgent parents of letting their kids grow up too fast (Strauss & Howe 1990, p. 250). Hardly a word would have to be changed for the same discussion to run today under the authorship of any number of youth-protection lobbies.

            Despite the efforts to force youth out of the labor market while simultaneously accusing them of being shallow pleasure-seekers, more youths were employed than at any time before or since. Reflecting the heavy work ethic of millions of new immigrants and the first massive wave of migration of African Americans from the rural South to the Northern cities, one in five youths ages 10-14 and three in five ages 15-19 held paying jobs in 1910. Many worked in industrial “sweatshops” (a term coined in that era), in home seamstress piecework, in street jobs such as newspaper hawkers, bootblacks, scavengers, messengers, cigar-rollers, and prostitutes, or as low-paid coal miners.

            They earned, and they spent. Youth income created and sustained the first youth-focused retail businesses: nickelodeons and candy stores (peddling jelly beans, Tootsie Rolls, Hershey bars, bubble gum) that doubled the national sugar consumption in one generation. During this period, advertising pioneer Stanley Resor teamed up with dour child psychologist John Watson at the fledgling J. Walter Thompson advertising firm to promote products based not on rational arguments of their superiority, but by playing on consumers’ “irrational desires” stemming from fear, hunger, vanity, and sexuality. Watson’s theory of behavior control, which held that children were best managed by adults’ manipulation of fear, rage, and love, morphed into promoting mass consumerism of ever-changing styles that maintained a perpetual state of consumer anxiety and dissatisfaction with the old ones (Spring 2004).

            Though youths were heavily employed, the share of them in school declined during the period. World War I intelligence tests claimed to show the average young draftee had a mental age of under 12, prompting psychologist Henry Goddard to coin the term “moron” to join “idiot” and “imbecile” in grading the newly discovered degrees of stupidity. In the great influenza epidemic of 1918, 250,000 teens and young adults died, along with another 50,000 in what young antiwar cynics called the “sausage machine” of World War I.

            For all the efforts to “save” them, adolescents of the 1900-1920 era remained more at work, more out of school, and faster to grow up than previous generations. Turn-of-the-century youth, the “Lost Generation” (born 1883-1900, generational ancestors of the later Generation X), were “growing up fast amid gangs, drugs, saloons, big-city immigration” and the turbulence of social reform (Strauss & Howe 1990, pp. 254, 255).

            Note how similar descriptions of the early-1900s “Lost” are to those later hurled at the 1980s “X.” “Their hardened precocity sat badly with values-focused elders,” who regularly decried the consumerism, cynicism, ruthlessness, and lack of innocence of youth “weaned on violence and noise,” as author Thomas Wolfe lamented. Precocity, let it be said, never sits well with American grownups--but neither does childishness.

            Much of the early-1900s fear of and for youth came to center on what many assume is a modern crisis--teen suicide. “Suicides, like all forms of crime, are becoming more and more precocious,” warned famed Stanford University child psychologist Lewis Terman in a 1913 paper. “In these days children leave their marbles and tops to commit suicide, tired of life even before they have tasted it.”

            Based on available coroner reports and research, Terman estimated that 2,000 teens committed suicide every year--if accurate, a rate double today’s level. The “appalling rate of child suicide,” wrote Literary Digest’s editors in 1921, “is a frightful indictment of our Christian civilization... the average age of boys is sixteen years and girls fifteen.” What was to blame for kids killing themselves as never before? Then as now, whatever the commentator thought was wrong with youths was what was wrong with America in general.

            Terman blamed “cheap theatres, pessimistic literature, sensational stories, the newspaper publicity given to crime and suicides” as well as severe schooling, parental harshness, family disgraces, and excessive pressures on young people to succeed. Others blamed feminism, alcohol prohibition, lax discipline, materialism, weakening grownups, and generational degeneration. “Our children are not so clean and innocent as those of an earlier generation,” the Catholic Universe lamented in 1921. Girls were killing themselves, driven to despair by Suffragettes’ “insidious propaganda” that “women were the coming mistresses of civilization; men were back numbers; marriage was a relation of convenience; the world had been made a mess by the ignoramuses now in control,” announced the Baltimore American in 1920.

            While it is doubtful that feminism was the cause, teen girls’ suicide rate in 1915, at 3.4 per 100,000 population age 10-19, was higher than boys’ rate for the only known time in our history--as well as higher than the suicide rate of teen girls today (2.2 in 2001). And for those who worry today that college students’ mental disturbance has risen... well, students have always been thought crazy. Press and experts claimed a “terrifying” “wave of suicides” among college men in 1927--blamed on rising cynicism and materialism among youth. While this suicide wave was debunked by later statistical study, suicide would continue to haunt the generation that came of age in the early 1900s more than any later generation.

 

 

Getting Hot about “Teen Sex”

 

            The massive growth of the public high school from 1890 to 1920, as well as child labor laws and increasingly automated industries that pushed youths out of the workplace, contributed to the segregation of adolescents from adults and the growth of youth culture. In 1900, 95,000 youths (38,000 boys and 57,000 girls) graduated from high school, just 6% of the 17-year-old population. By 1920, graduations had tripled to over 300,000, equal to one in six 17-year-olds; by 1940, graduates would top 1.2 million. A major motive behind the creation of large institutions to manage youths was the growing desire of adults to separate themselves from adolescents and to actively prevent youths from entering what was seen as the vice-filled grownup world--one grownups seemed to thoroughly enjoy. With the rapid growth of schooling came the spread of formal testing, developed by psychologists to classify students according to temperament and educational aptitude and, hopefully, redirect them to the wholesome lives that adults clearly had not chosen for themselves.

            No sooner did institutions recognize the implications of the youth culture created by youths spending more time in peer-dominated environments than concerns arose. A 1912 survey found a large majority of those attending urban dance halls were ages 13 to 20. Testified one shocked investigator:

 

I saw one of the women smoking cigarettes, most of the younger couples were hugging and kissing... they were all singing and carrying on, they kept running around like a mob of lunatics let loose (Spring 2004, p. 76).

 

New amusement parks such as Coney Island drew hordes of young people and more alarmed exclamations about a burgeoning youth culture of scandalous new dancing styles, sexual mingling, and what is now called “hooking up.”

            The psychologists, child-savers, and educators had intended something quite different, of course. In his famous “Elmtown” study of youth organizations in a small town, sociologist August Hollingshead later provided a cynical judgment of the motives:

 

By segregating young people into special institutions such as the school, Sunday school, and later into youth organizations such as Boy Scouts and Girl Scouts for a few hours each week, adults apparently hope that the adolescent will be spared the shock of learning the contradictions of the culture. At the same time, they believe that these institutions are building a mysterious something variously called “citizenship,” “leadership,” or “character,” which will keep the boy or girl from being “tempted” by the “pleasures” of adult life. Thus the youth-training institutions provided by the culture are essentially negative in their objectives, for they segregate adolescents from the real world that adults know and function in. By trying to keep the maturing child ignorant of this world of conflict and contradictions, adults believe they are keeping him “pure” (Elmtown’s Youth, 1941, p. 149).

 

            The temporary flush times of the 1920s saw fears over the first manifestations of a distinct youth culture with its own dress, language, music, and entertainment, epitomized by the “flapper”--the modishly dressed young woman who dated unchaperoned, drank, smoked, cussed, and did who knows what else. The best-selling Flaming Youth, published in 1922, catalogued incidents of this newly expressive female sexuality, “a sheer femaleness that’s going to make trouble” as only sheer femaleness can. The flapper “has forgotten how to simper; she seldom blushes; and it is impossible to shock her,” wrote journalist H.L. Mencken (Strauss & Howe 1990, p. 255; see also Hine 1999; Moran 2000). Also distressing was that “Jazz Age” white youths of the 1920s were expressing dangerous affinity for Negro music along with popularizing boisterous dancing styles. (“Jazz,” after all, was a euphemism for sex). Dozens of fearfully titillating books with titles such as Passionate Youth, Blind Youth, Wild Youth, Gilded Youth, The Plastic Age, The Mad Whirl, and Youthful Folly followed, as worried and ignorant as anything published today.

            Retorted “‘These Wild Young People,’ by One of Them” in Atlantic Monthly in 1920: “Magazines have been crowded with pessimistic descriptions of the younger generation” but “the older generation has certainly pretty well ruined this world before passing it on to us” (Strauss & Howe 1990, p. 256). Young people who had earned their own money, suffered and survived their own way from a young age, and returned as veterans from World War I were not about to have a bunch of grizzled old Prohibitionists force their stern morality on them. Interestingly, though each side was blind to it in the other, adult and teenage morals were undergoing similar changes; opinion polls of the day found little difference between youths and their parents on promiscuity and other behaviors. In fact, as feminist Anne Temple pointed out, “preoccupation with the nature of (youth’s) sex life” allowed adults to be both voyeurs and condemners, as well as to safely sort out their own values (Kett 1977, pp. 262-64).

            Naturally, this era inaugurated the flowering of fears over youthful sexuality and the first sex education classes. Youths of the 1920s “pursue pleasure with an ardor that leaves the more recently emerged puritans of an older generation astonished and aghast,” declared one social commentator. “The girl of fourteen is the problem of today,” a Boston Sunday newspaper headlined, while Atlantic magazine in 1922 bemoaned “the perfect freedom of intercourse [then a term, like today’s “hooking up,” that maximized vague horrors by denoting everything from conversation to coitus] between the sexes, the unchaperoned motor-flights at night, the intimacies of modern dancing, the scantiness of modern dress, the frankness of conversation between young men and girls.”

            Drinking, all warned, was a big part of youth mating rituals even amid Prohibition, leading to warnings of a generation consumed with delirium tremens. A 1920s rewrite of the nursery rhyme captured the new young lady: “Today’s Little Miss Muffet/Sat on a tuffet/Drinking her whisky and gin/Along came a spider and sat down beside her/Said she, ‘It’s the D.T.s again.’” Also sexually menacing were darkened dance halls and movie theaters, sites not only of normal lascivious temptations fueled by racy race music and pictures, but also of widely publicized (but never documented) tales of girls kidnapped and sold into the white slave trade. When not depicted as posing actual violence, poorer groups were depicted as threatening middle-American morals again: “To the extent that middle-class youths were likely to experience premarital coitus, they were simply accepting sexual mores long common among those of lower social status” (Kett 1977, p. 261).

            Interestingly, Hall and his followers were not prudes; they counted among the most ardent champions of sex education in schools. The reason? They worried that sexual degeneration among white teenagers would drag the entire race down to the level of savage darker cultures. Indeed, Hall’s theory of the unique threat adolescence posed centered on its inchoate sexuality. Young rebels such as Randolph Bourne, Max Eastman, Margaret Sanger, and Floyd Dell also championed a franker approach to sex as part of a larger movement toward women’s equality and independence from mandatory motherhood (Kett 1977). By 1927, some four in 10 American high schools provided sex education curricula, and a Commonwealth Fund study around the same time found classroom teachers rating “heterosexual activity” the worst student behavioral problem, followed closely by stealing, masturbation, “obscene notes and pictures,” and truancy. It is not clear how teachers became aware of these youthful vices, or whether they were taking place at school or merely assumed to be happening elsewhere.

            The rise of sex education ignited a century-long, perhaps eternal, war. While sex educators warned of rising pregnancy and crippling venereal diseases abetted by the “conspiracy of silence” regarding sexual information, traditionalists (led by Anthony Comstock, who successfully pushed the Comstock anti-obscenity laws) viewed adolescents as sexually innocent and crusaded against sex education as planting prurient ideas in pristine teenage minds. Sex educators, then as now, denied any romantic or healthy aspect of adolescent sexuality and largely saw their role as deploying a combination of scare tactics about VD and pregnancy and moral imprecations to suppress it in all its forms. The degeneration of the sex education debate in subsequent decades reveals America at its delusional, youth-fearing worst.

            Indeed, nationwide birth statistics issued in 1920 might have given pause to both sides. They revealed a quarter-million births to mothers age 19 and younger, including 2,000 to mothers 14 and younger--a hint that teenagers were not sexually innocent, as the Comstockians hallucinated. However, 70% of the fathers of babies born to mothers under age 15, and 90% of those born to mothers ages 15-19, were adults ages 20 and older; one-third of the fathers in “teen” births were over age 25. (For example, my grandfather, 25, married my grandmother, age 16, in 1912, an entirely routine liaison of that day and the beginning of a 70-year marriage.) That, in turn, might have given sex educators pause about treating the teen-sex issue as simply a matter of “boys and girls.” The fact that most “teen pregnancy” really is “adult-teen pregnancy” has proven too uncomfortable a topic for any side in the supposedly frank sex-ed debate to incorporate to this day.

            In a later survey, 40% of those who were teens in the early 1920s reported heavy petting before the age of 16. Interestingly, the most affluent and educated boys and girls of the time were more likely to restrict their sexual activity to heavy petting (a catch-all term for sexual acts short of intercourse, such as mutual masturbation and oral sex), while the poorer youth were more likely to have intercourse. Eighty years later, furor over a supposedly unheard-of “junior high sexual revolution” involving young teens having non-intercourse sex once again rages senselessly among grownups who ought to know better.

            Lusty Roaring Twenties teenaged girls and young women dancing dirty to Negro music--imagine the panic in a society already primed to fear young white women as menaced by black men’s allures. If they craved the music, would they also crave the music maker? What was most fascinating about the worrisome 1920s youth culture was that it resulted from reformers’ own moves to prolong childhood, concentrate young people in larger schools, and remove them from the supposedly corrupt adult society youths could see adults enjoyed immensely. “The single most important cause for the rise of the flapper, and of the youth culture of which she was the symbol, was the prolonging of youth itself,” Hine (1999, p. 197) wrote, “For ever-growing numbers of young people, the real life of going to work and starting a family was deferred, replaced by a student life, played out almost entirely with people one’s own age.”

            Cynics saw an additional motive in attempts to control disorganized and peer-influenced adolescent sexuality: consumerism. “Over time, commodified sexuality replaced dance halls,” wrote Spring in Educating the Consumer Citizen (2004, p. 76). “Penny machines were introduced to measure kisses. The Tunnel of Love and the Canals of Venice [California] provided opportunities for couples to embrace.” The emergence of the high school prom and other formal dances in the 1930s was intended to replace chaotic, public teenage dating rituals with a controlled rite-of-passage institution--one that incidentally launched a consumer boom of its own.

 

 

 

The Greatest Generation?

 

It is, I believe, the greatest generation any society has ever produced.

   - NBC News anchor Tom Brokaw, The Greatest Generation, 1998, front flap

 

            So say its many admirers now about the youth of the 1930s who lived through the Great Depression and World War II (what Strauss and Howe term the civic-minded “GI Generation”)--but adults of their day thought they were anything but great. The bitter pessimisms aimed at Jazz Age youth were just a warm-up for the anti-youth barrage that gained momentum in the 1930s. Drugs, crime, gun violence, apathy, promiscuity, mental disturbance, and moral collapse were widely depicted as the signatures of a new and frightening generation (Males 1996, pp. 259-64).

            “Youth gone loco! Villain is marijuana!” announced a popular magazine in 1936, part of a massive drug scare that generated such films as Reefer Madness and Cocaine Fiends. “Organized gangs are distributing drugs to every school in this city,” warned Reefer. “Dope peddlers infest our high schools... in every community and hamlet in our country.” Declared one official: “Hundreds of new (drug) cases involving our youth come in every day” while “drug-crazed teens have murdered entire families.”

            In 1935, the FBI reported “the average age of criminals is nineteen.” Twelve teenagers were executed for criminal offenses in 1937: all but one of them were black. In 1937, 11,000 teens died from violent causes, including 1,300 in suicides and homicides, 1,600 by firearms, and 5,000 in car crashes, and 300,000 teen girls gave birth, all massive increases from the 1920s. Only half of all teens were enrolled in high school. Testing by the American Youth Commission found 75 percent of young people tested “were suffering from some health defect induced mainly by mental anxiety.” Journalist Maxine Davis’s nationwide study of young people, The Lost Generation (1936), warned that adolescents were “confused, disillusioned, and disenchanted” and “rapidly approaching a psychosis.” Unlike their intrepid forebears, lamented Davis, “Today’s younger generation accepts whatever happens to it with sheeplike apathy.”

            Surveys in the 1930s found divorce had quadrupled in a generation. Forty-five percent of American college men rated themselves as “very promiscuous,” compared to 12% of college girls, the latter evidently very busy indeed. Sociologists Hornell and Ella Hart reported in 1940 that the proportion of girls who had premarital sex rose from 10% for those born before 1880 to 60% for those born after 1910; for boys, the increase was from 50% to 80%. Venereal disease rates, primarily gonorrhea and syphilis, reached epidemic levels among all age groups, teens included, in the 1930s--higher than any levels seen since. In 1938, Congress approved the first National Venereal Disease Control Act in response to the epidemic. The U.S. Public Health Service estimated one million illegal abortions per year in 1937.

            In Our Movie-Made Children (1933), author Henry James Forman popularized the Payne studies of films’ impacts on youth. The Payne studies, conducted by Ohio State University researchers from 1928 to 1932, interviewed hundreds of children about their movie habits, even wiring children’s beds in state institutions to measure restlessness. The studies concluded movies were “detrimental to normal health and growth,” promoting bad health, school misbehavior, bad grades, truancy, promiscuity, and crime in youths. Anxious demands from educators, psychologists, and politicians, along with sensational books and articles claiming to document disastrous effects of mass media on children, led to adoption of the first broadcast self-censorship codes restricting all offensive topics, including “ridicule of the clergy,” “deliberate seduction of girls,” “use of firearms,” and “sedition” (Spring 2004, pp. 107, 112-13).

            For all the panic about what kids were seeing at the cinema, you would never have guessed there was a Depression going on. American adults were displaying their penchant for fixating hysterically on imaginary moral and pop-culture threats to youth while ignoring real, manifest economic and social crises--a tactic Baby Boomers would later perfect.

            Signs of apocalypse were everywhere. American Magazine, the nation’s largest weekly, reported in 1936 that the “youth problem” was the nation’s most troubling, generating “literally thousands of letters from people of all ages.” It assigned a top reporter to the story, generating a pessimistic cover feature lamenting the growing criminality and confusion among “our muddled youth.”

            “Day by day the newspapers report to us one grave crime after another, one moral delinquency after another and one dereliction of duty after another,” Columbia University president Nicholas Murray Butler thundered in an address entitled, “The Perpetual Youth Problem.” So enraged was he at the sorry state of young people that he compared their morals unfavorably with Congress’s.

            In 1935, scholars George Leighton and Richard Hellman warned that high school students were roaming the nation “armed” and “out for whatever they can get, while it lasts”:

 

                A migratory worker who has traveled back and forth across the country for twenty years has described the comparatively recent appearance of firearms among the young bums. “In my day,” said he, “gats were almost unheard of... It’s different now... you find high school kids armed.”

                ... (A) generation, numbering in the millions, has gone so far in decay that it acts without thought of social responsibility. Assuming the unmitigated demoralization of these people, American society may find itself in the throes of this pathology within another generation. The Lost Generation even now is rotting before our eyes.

 

The motherland would never survive this cohort of young degenerates, they said.

            Seemingly before their eyes, the nation’s young people were unraveling. Rates of youth employment had fallen by half from the 1920s to the 1930s, and most of youths’ meager spending money came from a new institution--allowances from parents for household chores. “Searching aimlessly for a job, a place in the world, or an escape from a stultifying home life, 1,500,000 restless boys and girls flowed out over the nation when the depression struck,” reported Literary Digest in 1936. “’Jungles’ and hobo camps grew to unbelievable proportions” along with “an enormous increase in petty crime.” More than half the nation’s teens were neither in school nor working, a standard measure of high risk today (though the condition affects only a fraction as many youth now as it did during the Depression). The saccharine teen-movie musicals of Mickey Rooney and Judy Garland were diversions from the hard life, not realities.

            The same unemployment, poverty, and Depression woes were also afflicting adults. Despite idyllic memories of the Depression as a time of neighborly cooperation and little crime, FBI and health reports show every social ill exploded. Suicide leaped 40% from 1925 to 1931 to levels one-third higher than recorded today, and murder hit record high levels in the early 1930s that stood for six decades. Among both black and white residents of Los Angeles, coroner records show that murder rates of the 1930s were double those of the 1990s--a fact that would flabbergast elders of both races.

            Students of the 1930s, perhaps because they could afford to, were not without a sense of humor. Adults who decried teenage apathy waxed apoplectic about student activism. The chief source of grownup fear was the American Student Union, which American Magazine branded “a left-wing student organization whose bogeys are capitalism and war.” In 1936, half a million students demonstrated against American economic and foreign policies. Satirical student antiwar chapters arose on many campuses: Veterans of Future Wars (Princeton), Future Gold Star Mothers (Vassar), Profiteers of Future Wars (Rensselaer Polytechnic Institute), and Gold Diggers’ Auxiliary (Russell Sage College for Women). Yet, off to World War II several million 1930s youth would eventually go, where 300,000 would die.

            Meanwhile, unlike today, government rode to the rescue. President Franklin Roosevelt pushed Congress to raise taxes on corporations and funded an array of Depression-era relief programs called the New Deal, many targeting young people--though youth of color were afforded only segregated, underfunded efforts. The Civilian Conservation Corps, established in 1933, eventually employed 2.5 million youths and young adults at 24 to 30 hours per week in outdoor tree-planting, soil conservation, fishery improvement, and construction projects that still yield benefits today. By 1936, the CCC had generated $600 million in appraised work value and provided $250 million in income to workers (the equivalent of some $7 billion today), as well as board and housing.

            The National Youth Administration, for its part, granted $71 million to college students in 1936--a college grant program more generous than today’s (for example, it would have paid the full annual tuitions of 1.4 million University of California students today). In today’s dollars, these programs would translate into $200 billion per year in government-sponsored employment and $15 billion in free grants to college students--far beyond what a richer U.S. provides its young people today.

            As an astounding result of sustained government investment--even given the travails of the Depression and war--youth of the 1930s showed the biggest improvement in educational achievement ever recorded. Compared to the previous generation, average length of schooling rose from ninth grade to 12th grade, the share of young people in college tripled, and aptitude scores soared over those of the “moron” generation of 1920 youths. The first vocational education also appeared, segregated by race and gender--industrial ed for boys, home ec for girls.

            Why were national leaders able to recognize new realities and shift government responses to help turn crisis into opportunity? A large part of the official attitude derived from increasingly sophisticated research on youth and adult malaise by a new breed of social scientists in the late 1920s, led by University of Chicago sociologists (the “Chicago School”). It is an important lesson in how social science can influence political policy for the good.

            The Chicago School carefully mapped youth crime patterns in their city and discovered rates were highest in slum areas with transient populations. One of their revolutionary findings was that crime was a property of neighborhoods, not of the particular race that occupied them. A slum neighborhood had about the same crime rate when it was occupied by Irish or Poles as when it later housed African Americans or Puerto Ricans. Similarly, once a particular racial or ethnic group progressed enough to move away from a slum neighborhood, its crime rates dropped in newer, more affluent surroundings. The Chicago School’s scientific approach to mapping crime, and its argument that “social disintegration” rather than innate racial or personality traits bred crime, proved so persuasive that it supplanted older psychological theories that criminals were born, not made.

            Consider also the innovative attitudes of President Franklin D. Roosevelt (and First Lady Eleanor, as well as many top administration appointees) in avoiding fixation on the past and perceiving the new realities young people faced. In 1936, FDR delivered what may have been the most prefigurative-savvy presidential speech ever to 5,000 youths crowded into the Baltimore armory:

 

                The world in which the millions of you have come of age is not the set old world of your fathers. Some of yesterday’s certainties have vanished: many of yesterday’s certainties are questioned... The facts and needs of civilization have changed more greatly in this generation than in the century that preceded it.

                ...You are measuring the present state of the world out of your own experiences. You have felt the rough hand of the depression. You have walked the streets looking for jobs that never turned up. Out of this has come physical hardship and, more serious, the scars of disillusionment.

                The temper of our youth has become more restless, more critical, more challenging... Youth comes to us and wants to know what we propose to do about a society that hurts so many of them. There is much to justify in the inquiring attitude of youth... It is clear that many of the old answers are not the right answers. No answer, new or old, is fit for your thought unless it is framed in terms of what you face and what you desire--unless it carries some definite prospect of a practical down-to-earth solution of your problems.

                ... Many older people seem to take unmerited pride in the mere fact that they are adults... And the tragedy is that so many young people do... grow up, and in growing up, they grow away from their enthusiasms and their ideals. That is one reason why the world into which they go gets better so slowly (Vital Speeches, 13 April 1936, pp. 442-44).

 

It is difficult to imagine any president today affirming the new realities of young people while questioning the assumptions of the old. While Roosevelt did not endorse the American Youth Act proposed by radical students and educators to invest an additional $3 billion in education, jobs, and services, the funding the government did provide to youth interests was impressive--especially in a depressed economy.

            From “I fear this generation of youth will be lost” (as Eleanor Roosevelt lamented in 1934), these supposedly apathetic, criminal, drugged, amoral 1930s youth are now extolled as war heroes, Nazi beaters, providers of every president from Kennedy to Bush Sr.--that is, The Greatest Generation! “The federal government has directed its attention to whatever age bracket the G.I.s (their term for the generation of the 1930s) has occupied” (Strauss & Howe 1990, p. 266). In other words, the “Greatest Generation” has raked in the greatest amount of money in allowances from their parents and welfare from their parents’ government--though baby boomers were a close second.

 

Zoot suits and V-girls

 

            In big cities and California, white America was discovering disturbing new interracial trends. Race riots rocked Detroit in 1943. In May and June of that same year, a one-sided riot by Los Angeles-based white sailors against zoot-suited Latino youth--misnamed the “zoot suit riots”--was championed by the city’s establishment and frankly racist police authorities. LA’s anti-Mexican riots, together with 1942’s notorious “Sleepy Lagoon” prosecution and imprisonment (later overturned) of 11 Latino youths ages 17 to 21, all alleged to be members of the “38th Street Gang,” in a dubious murder case, were celebrated by the city’s news media and political leaders as necessary to quash the “awful... Mexican juvenile delinquency” problem.

            As black and Chinese men had been labeled genetically violent by social scientists a few decades earlier, and as all teenagers would be so branded a few decades later, L.A. police Captain Duran Ayres (who chiefed the Police Department’s weirdly named Foreign Relations Bureau) pronounced that Mexicans and Indians possessed an “inborn... desire to kill” that “has come down through the ages.” Caucasian kids got into harmless fistfights, Ayres testified in department reports and in court, but Mexican youths “use a knife or some lethal weapon.” These anti-Mexican bigotries and campaigns in Los Angeles were triumphantly cited by Nazi Germany’s and Imperial Japan’s radio propaganda as proof that American authorities endorsed Hitler’s racist doctrines. The crazed saga escalated when the L.A. County sheriff, himself of Latino descent, charged that the city’s Mexican-Americans were agents of the Japanese government bent on helping the Axis win the war.

            California’s brilliant journalist-historian, Carey McWilliams, sardonically presaged the attitudes of many California police forces and gang-profiling data-basers in the 1980s and ‘90s, in a 1948 essay on L.A.’s wartime racial violence: “If you were born of Mexican parents financially unable to move out of certain, specific slum areas,” McWilliams wrote, “you could be a gangster from birth without having to go to all the trouble of committing a crime” (Fool’s Paradise, pp. 186-87). McWilliams’ 1940s writings on L.A.’s police anti-gang sweeps, alarmist media reports on “gangs,” prejudices against Latino youth attire (especially “baggy clothes” allegedly worn to conceal weapons), the public visibility of “pachucos,” and staggering social class divisions that transcended even race could have been published in the 1990s with little alteration.

            In other respects, the 1940s represented something of a hiatus in youth bashing, no doubt because millions were fighting in World War II, where 15-, 16-, and 17-year-old troops and pilots were acquitting themselves well. Not only did the war’s recruitment, as well as killing and maiming, of young adult men open jobs to youths, but the economic boom afterward created millions of now familiar jobs even for younger teenagers in fast food, groceries, and other entry-level employment. Still, pioneers of high school Family Relationships classes noted in 1941 that business owners “found many of the high school graduates lacking in certain personality traits and ignorant of certain elementary rules of courtesy” that hampered their value as employees (Moran, p. 128).



            On the home front, the term “teenager” first appeared during World War II to describe the emergence of a new youth market as well as a threat to the social order, both fueled by the rising wartime purchasing power of adolescents. Worries about the “V-girls”--patriotic girls as young as 12 or 13 (Ladies’ Home Journal warned) determined to help win the war by giving departing soldiers that one last thrill before they headed for the front--and a rising teen “canteen culture” were muted. Teenage musical tastes continued to diverge from those of adults, popularizing swing music, a hard-driving jazz suitable for dancing played by “big bands” such as those led by Glenn Miller, Tommy Dorsey, and Duke Ellington. In 1940, Dorsey hired a young vocalist named Frank Sinatra, who became the first teenage pop star. Mobs of teenage girls, 25,000 teens in all, blocked streets and paralyzed midtown Manhattan when Sinatra sang with Benny Goodman’s orchestra at the Paramount in 1942.

            Suddenly, there was a teenage market with distinct tastes and money. The 1940s and 1950s mark the formal recognition of a “youth culture” in the way America confers recognition most decisively: as a distinct market whose profit potential could be studied and systematically served by commercial interests.

            In the early 1940s, several marketers introduced clothing lines with labels such as “Teentimers” to plumb the new market. In 1944, Seventeen was born as the first magazine specifically targeting teenaged women as a separate demographic. While youth had set the pace for national fashion trends of the 1920s, by the 1940s teens had their own tastes that diverged from those of adults. The emergence of the teenaged consumer led to the first sophisticated marketing research techniques designed to discover what he or she wanted to buy. By 1960, fashion marketers had identified three distinct youth markets: subteen (age 10-13), teen (13-15), and juniors (16 and older) (Kett 1977).

            It took a teen to see the potential. In 1944, Eugene Gilbert, then 18, hatched three ideas essential to modern marketing: that teens would respond to retailers who sought to supply what they wanted; that the best way to find out what teens wanted was to have their peers ask them; and that prominent youths could be recruited to model products and help shape fashion tastes. Commissioned by corporate clients, Gilbert put 5,000 teenage pollsters in the field, conducting surveys and focus groups that both advised marketers and served as material for Gilbert’s weekly syndicated column, “What Young People Are Thinking,” that ran in 300 newspapers.

            Gilbert reported in 1958 that teens’ independent purchasing power was $9.8 billion (the equivalent of $60 billion in today’s dollars, in a population much smaller than today’s). Two-thirds of adolescents’ money came from parents and one-third from their own earnings. Gilbert found that six in 10 teens bought their own music and sports equipment, and four in 10 bought their own clothes. The young age of marriage in the 1950s (18 was the most common single age of marriage for American women then) led Seventeen and other girls’ magazines to advertise housewares, furniture, and other wifely accouterments even more than clothes! (Kett 1977)

            Not only that, teens strongly influenced adult purchases. “Parents tend to be confused about matters of tastes, confused about values, and all ready to abdicate decisions--whether about cereals, car colors, furniture, or clothing--to sons and daughters who have definite opinions, shared with large numbers of their contemporaries,” Gilbert observed. He was also one of the few to note that what we now consider the tranquil Fifties was really a time of great anxiety, as indicated by fears of Russia and the Cold War and the rapid popularization of psychoanalysis and barbiturate tranquilizers among adults. What might help relieve anxiety? Why, if not a little yellow pill, then a good consumer spending orgy.

            And record numbers could afford that palliative. For one of the few times in American history, government and the private sector were promoting greater class and income equality. From 1945 to 1970, real wages (that is, adjusted to factor out inflation) rose rapidly among all age groups--and there was little disparity between young and old. For example, the median income of householders ages 18-34 rose from $22,000 in 1947 to $41,000 in 1970 (in constant 2004 dollars), while that of householders ages 45-64 increased from $25,000 to $51,000 in the same period. The welfare system’s benefits for young and old also increased. Aid to Dependent Children real payments rose from $365 per household per month in 1947 to nearly $900 in 1970, while Social Security for elderly and disabled recipients rose from $340 to $775 per month. The GI Bill, instituted by a grateful nation to reward troops’ World War II service, became America’s largest welfare system, pouring hundreds of billions of dollars into education, housing, business, and other subsidies for tens of millions of families. “Between 1940 and 1960, home ownership rose by more than 100 percent, thanks partly to government money that made it possible for the first time in our history for more people to own than to rent,” wrote Cheryl Merser in Grown-Ups, one of the few Baby Boom authors not self-fawning about her generation. “...When our parents came of age, houses, cars, spouses, and babies were practically presents from the government... it subsidized, among other things, my two sisters and me” (p. 66).

            The first formal efforts at desegregation began with Brown v. Board of Education (in 1954, outlawing racially segregated schools) and civil rights laws aimed at securing more integrated schools and neighborhoods and equal employment opportunities for African Americans. Poorer and richer families alike had more employment, more subsidies--more money to spend. This included teenagers, whose per-capita spending in the 1950s was about equal to today’s levels. As will be seen, the fortunes of poorer, younger families reversed in 1970 and produced today’s huge income gaps.

 

 

The Fifties

 

            But in the 1950s, younger folks were doing better. Jobs that paid enough to allow younger men to support families were plentiful. With greater youth prosperity of the 1950s, the age of marriage dropped dramatically, reaching 22.5 for males and 20.3 for females during the late 1940s and 1950s--which meant nearly half of all new brides were teens. The birth rate soared among all ages; every year during the 1950s, one in 10 teenage females gave birth, a record never since surpassed. The Fifties were the Golden Age of teenage pregnancy--or, more correctly, adult-teen pregnancy.

            High school graduation rates, stagnant at a little more than half the teen population during the 1940s, soared after 1950, rising by 50% per decade to nearly 3 million--three-fourths of all teens--in 1970. Teen-dominated institutions had arrived, with the high school at their center. While “the age group primarily affected by the institutions of adolescence has been dropping,” wrote Kett, “a high degree of institutional segregation has continued to shape the experiences of young people” (1977, p. 269). Radical education reformers such as Edgar Z. Friedenberg worried that the high school, stuck in its antiquated industrial-school preparatory model of a half century earlier, was crushing student individuality--abolishing the dynamism of adolescent identity itself in favor of producing corporate-cast consumer and employee clones.

            Establishment researchers such as James Coleman rediscovered that universal education had molded a fearsome peer culture, a cauldron of youth rebellion. In The Adolescent Society, a 1961 study of Chicago high schools, Coleman reported that in modern teens, adult society no longer confronted “a set of individuals” but “distinct social systems, which offer a united front to overtures made by adult society.” Teens had become a separate nation in America’s midst, with values opposed to those of adults and their schools, Coleman charged. Youth values tended toward socializing and athletics, not academic achievement. Teachers had little influence on teens; parents came first, barely, followed by peers.

            Coleman and others, to affirm what they saw as the oppositional nature of teen society, introduced the nit-picking survey (today in routine use) that reported that nearly all teens were delinquent in some frightening way. Since delinquency could be established by a youth’s admitting such trivial offenses as stealing a pencil or talking out of turn, it is hard to imagine how adults, asked similar questions, could have escaped mass designation as criminals as well. (Indeed, a 1947 survey by two sociologists of 1,700 New York middle-class grownups found 99% had committed felonies serious enough to warrant loss of citizenship; men had logged an average of 16 apiece, women 11!)

            Coleman worried that adolescent society was a “Coney Island mirror” that reflected adult culture in a wildly distorted way. However, Hine observed wryly, “the scariest possibility of all was that adolescent culture was a truer reflection of the society at large than most adults realized”--or were willing to admit (1999, p. 246). In fact, research by a number of social scientists found “absolutely no good body of data on adolescents...which indicates the existence of a really deviant system of norms which govern adolescent life” (Kett 1977, p. 263).

            The 1950s have been depicted as a time of youth tranquility along shaded streets peopled by Ward, June, Wally, and the Beav. Funny, then, to go back to what Fifties adults actually thought of kids. Alarmed at rising juvenile delinquency, best-selling books carried titles like 1,000,000 Delinquents (“their stories emerge like screams of terror in a quiet night”) or The Violent Ones (“powerful stories of the teen-age jungle”). Experts blamed the supposed teen crime orgy on the same suspects that would be cited a half-century later: lax parents, declining morals, and “salacious, sadistic” pop culture led by media moguls accused of “profiting by selling children and young people images of violence, brutality, racism, and sexual degradation” (Hine 1999, p. 241). Ironically, the most violent, brutal, and sexually salacious films of the era were purportedly issued to warn parents and adults of rising teenage depravities, and the most blatantly racist event in popular media was the 1957 cancellation of disc jockey Alan Freed’s popular TV show for showing a black singer, Frankie Lymon, dancing with a white girl--along with “American Bandstand’s” ban against any form of interracial dancing.

            The 1950s brought a barrage of alarming, luridly titled books, movies, and documentaries (chronicled in Teenage Confidential, a fantastic collection of 1930s, 1940s, and 1950s images of youths) warning that teens were forming a dangerous peer culture that defied adult values. The teenage girl was a particular object of fear, depicted as voluptuous, scantily clad, ready for all things evil. “WHERE ARE YOUR CHILDREN?” screamed the title of a Jackie Cooper film, over subheads, “Increase in youth delinquency alarms nation... Young thrill seekers danger to nation... murder, robbery... rape... prison.” Other titles--“Delinquent Daughters,” “Under Age,” “Rebellious Daughters,” “Girls Under 21”--warned: “they start by stealing a lipstick... finish with a slaying!” (Barson & Heller 1998).

            National fears of youth-corrupting media centered on “horror comics,” blamed by psychiatrist Frederic Wertham for instigating a massive new increase in violence by youth. “Younger and younger children are committing more serious violent acts,” wrote Wertham in his popular 1954 book Seduction of the Innocent. “Even psychotic children did not act this way fifteen years ago.” Wertham cited a teen who tortured a four year-old “just because I felt like it,” a 13 year-old who committed a “lust murder” of a six year-old girl, and four boys who beat a candy store proprietor with a hammer and pounded a knife through his head. This, remember, is the 1950s, universally cited by commentators such as former President Clinton and former Los Angeles Police Chief Bernard Williams, among scores of others, as a time when youth crime consisted merely of shoplifting and joyriding.

            Like commentators who make identical assertions today, Wertham did not bother to check history. The same kinds of grisly crimes by youth were indeed cited in the 1930s, in equal tones of alarm that such things had never occurred in the 1910s or 1880s--which they had. (In fact, the nation’s youngest serial killer, the diabolical Jesse Pomeroy, age 14, rampaged in the 1870s.) And in the 1950s, the warning (issued by the Mid-century White House Conference on Children) that “the standards of the lowest classes can reach some of the boys and girls of other social groups” was the same alarm heard in the 1910s and to be heard again in the 1960s and 1990s.

            But when it comes to discussion of youth issues, reality means little; perception is everything. The U.S. Senate established a special Subcommittee on Juvenile Delinquency in 1953 and convened years of sensational hearings, the most famous chaired by Senator Estes Kefauver. In 1956, in a book titled Youth in Danger, a former subcommittee chair issued this assessment amid details of eighth-grade dope parties, elementary school junkies, and high-schoolers’ mass murders:

 

Younger and younger children... thirteen and fourteen... are becoming involved in... the most wanton and senseless of murders... and mass rape... One youth gang, crazed by sixty-cent-a-gallon wine spiked with alcohol, roamed the streets, knives in hand, stabbing indiscriminately everyone they met. This outburst of senseless savagery sent six persons to hospitals with serious injuries and resulted in the deaths of two others... High school students massed in a phalanx and swept through the trolleys, as ruthless and destructive as a tornado, literally tearing the cars apart, beating drivers and robbing their fares... a group of high school football players found a girl alone on a bus and shredded her clothes from her on the spot in a mass attempt at rape (Youth in Danger, 1956, pp. 6-7).

 

            The Senate subcommittee warned that the number of delinquents entering juvenile courts had risen from 300,000 in 1948 to 385,000 in 1953 and predicted it would reach 750,000 by 1960.

            Nationally, reports of sensational teen crimes made the covers of major news media such as Newsweek. “Let’s Face It: Our Teen-Agers Are Out of Hand,” the magazine’s September 6, 1954, cover announced. An inside article, “Our vicious young hoodlums,” deplored “the national teenage problem” led by “an orgy of crime...shocking...growing more and more common every year.” The article reported on four white youths ages 15 to 18 who admitted horsewhipping and setting fire to New York parkgoers, killing two “slowly and painfully” and injuring several more. Their leader called the murder “my supreme adventure.” Another teen gunman bragged, “I just get a kick out of it when I see blood running.” Newsweek’s reporters, like those of other media, roamed the nation to chronicle shootings, beatings, and rapes by youths as young as 10 from Gardena, California, to Atlanta and New York.

            In both Western Europe and the U.S., a new kind of youth gang was emerging--the Teddy Boys in Britain, the Halbstarke in Germany, the blousons noirs in France, and motorcycle gangs in America. The territorial street corner gangs were being replaced by roving, mobile youth associations. While today’s reports on gangs express horror at such barbaric initiations as dumping urine on new members, 1950s gang rites often included excreta showerings (Kett 1977). Veteran gang researcher Malcolm Klein of the University of Southern California notes that gangs of the Fifties did not fit the theatrical West Side Story image; many of the alarming claims about gangs today were present decades ago. In The American Street Gang, Klein wrote:

 

In the gang world of the 1950s and 1960s... people did get hurt; people did get killed. Drive-by shootings were not uncommon, even though today’s police and writers believe that drive-bys are a new feature of gang life. I have three taped interviews with gang members in the mid-1960s. All three describe drive-by shootings in a matter-of-fact way, a part of gang life... Although some writers and officials decry the 8- and 10-year old gang member, they haven’t been in the business long enough to realize that we heard the same reports twenty and forty years ago (1995, pp. 69, 105-06).

 

            In Philadelphia, a 13 year-old boy was charged in more than 50 robberies, rival gangs rumbled on the streets and attacked streetcar riders, and “girl gangs” were accused of torturing their peers. “Gangs have taken the place of parents,” warned the popular 1955 movie and semi-docudrama Blackboard Jungle, showing youths of every color running amuck in an urban school. In my home town, Oklahoma City, a massive dope scandal erupted among middle-class and affluent youth. Lurid tales of mass drug parties involving marijuana, barbiturates, booze, and even heroin, junior high sex and dope orgies, hundreds of back-alley abortions, and the testimony of 300 teens at emergency hearings by the state legislature culminated in the 1953 murder of the legislature’s chief investigator by a 16 year-old gunman in front of a police station.

            Then, in 1958, 19 year-old Charles Starkweather and 14 year-old girlfriend Caril Ann Fugate embarked on a midwestern murder, rape, and torture spree, gunning down the girl’s parents, two year-old sister, a gas station attendant, and seven more before their arrest. Dubbed “rebel without a clue” and immortalized in movies like Badlands, Starkweather was executed for the 11 murders in 1959. Fugate was sentenced to a life term but paroled in 1977. No teen serial killer even approaching Starkweather’s body count has since emerged, though the Manson family very likely committed more killings as a collective (see Chapter 7).

            Though such spectacular youthful mayhem generated Fifties public fears (now forgotten), the fact is that as a generation, youth of the 1945-64 period showed the lowest rates of murder, suicide, serious crime, drug abuse (or even drug use), venereal disease, and other ills of any generation in the century. Compared to 1930s Greatest Generation teens, 1950s teens were 40% less likely to commit suicide and 50% less likely to be murdered, for example. Perhaps they were too busy having babies.

            Fifties kids had negligible rates of drug overdoses, but they did drink. A survey in (of all things) Better Homes & Gardens (March 1954) of 1,000 suburban teens ages 13-18 found 90% had imbibed alcohol, half drank at least once a week, half drank on dates, one-third drank at school events, one in six drank before age 11, four in five were drinking by age 14, one in 10 had a fake ID, one in three held teen drinking parties, and one in six reported problem behaviors such as fights, accidents, property destruction, sexual activity, and speeding after drinking. Other than traffic crashes, however, Fifties adolescents--the “Silent Generation”--showed remarkably low risks.

            Commensurate with the rising fear of girls, the war over adult-teen sex--or what the warriors euphemistically call “teenage sex”--was intensifying. Its popular myth was captured in a 1959 drama, Blue Denim, which depicted a promiscuous but naive 16 year-old knocked up by a 16 year-old boy-pretending-to-be-a-man and her subsequent rescue from a shady abortionist. This was not the larger reality of “teen sex,” as Chapter 6 will discuss. Very few 16 year-old boys caused pregnancies; 80% of the fathers of the record number of babies born to teen mothers in the 1950s were over age 20; one-fourth were over 25. But it was the fraction of “teen sex” that involved two teens that disturbed authorities the most, then and now.

            Things were about to blow--in every sense of the word. An early harbinger was California author Jessamyn West’s remarkable Cress Delahanty, published in 1949. West’s prim language and themes earned Cress, like her other popular novels, a prominent slot in literature and high school anthologies. But West’s subtexts of sexuality, violence, incest, and betrayal lurking below the surface life of a stable teenage girl growing up with wise and understanding parents on a pastoral 1940s Orange County ranch proved deeply troubling to confused critics. Especially the novel’s first chapter, which appears in no student readers. It finds 12 year-old Cress, left home alone at the ranch by her parents on a windy October evening, acting out erotic fantasies. “What do I here, alone, abandoned, hiding?” wonders Cress. Undressing, wrapping seductively in her mother’s black lace shawl, Cress imagines The Boy:

 

And because she regarded herself, thinking of him, he who was yet to come, it was as if he, too, saw her. She loaned him her eyes so that he might see her, and to her flesh she gave this gift of his seeing. She raised her arms and slowly turned and her flesh was warm with his seeing. Somberly and quietly she turned and swayed and gravely touched now thigh, now breast, now cheek, and looked and looked with the eyes she had given him.

 

To the adult world’s stifling conventionalities forcing false innocence on girls, Cress declares: “I hate them and I will dance them down:”

 

So she danced it, wrapped in the black shawl, with the dust motes dancing about her. She danced it until she trembled and leaning on bent elbows looked deep into the mirror and said, “There is nothing I will not touch... I will know everything.”

 

Cress’s eroticism, unheard-of in depictions of pubescent girls of the day, was made more disturbing by the chaste, good-girl image she presented to her parents--parents who understood they were being conned, who so earnestly wanted to be.

            West’s charming, bitter--and popular--novel of 1940s Orange County girlhood affirmed the generational chasm between child and parent that adults preferred not to think about. It revealed a subsurface of young girls hidden behind masks, protecting adults from knowing.

            “I can dance the word,” Cress lamented, “but I cannot say it.”

            Soon, they would be saying it.

 

Bearing the Boom

            The Baby Boom was being born. From depression and wartime lows, the birth rate soared after 1945, peaking in 1958. California was even more fertile, especially its young women. In the late 1950s, an astounding 1 in 9 California women ages 15-19 gave birth, as did more than 1 in 4 ages 20-24, every year.

            Zoologist Alfred Kinsey’s blockbuster reports on sexual behavior in human males and females, later lumped together as The Kinsey Report, created front-page sensations when issued in 1948 and 1953. Kinsey’s interviews with 12,000 subjects found that not only did Americans fail to follow prescribed codes of sexual morality, the failure was so widespread as to call into question whether such codes even existed. Kinsey found that fewer men were having sex with prostitutes than in previous generations--because so many unmarried women were taking up the slack.

            Young women in each new generation were much more sexually active, and at ever younger ages, than in previous ones, he found. Kinsey’s report on the rising routineness of male promiscuity did not raise much stir, but his subsequent report on women led to strong resentment, canceled speaking engagements, and Congressional threats to de-fund his research (Moran 2000). Kinsey’s report and concerns over delinquency led to a famous Commonwealth Fund study in 1951 that concluded that most parents lacked “affection, stability, and moral fiber” and were “usually unfit to be effective guides, protectors, or desirable models” for their children (Moran 2000, p. 137). Harsh words indeed for Ward, June, Robert Young, Jane Wyatt, Ozzie, and Harriet. The consequences of parental wimpiness were graphically illustrated in such films as 1953’s Teenage Devil Dolls and 1955’s Rebel Without a Cause (as well as Blue Denim), showing kids “from good families” variously driven to delinquency, drugs, drunkenness, and promiscuity by weak fathers and overbearing mothers. It was, after all, “Jim Stark, teenager--from a ‘good’ family,” causeless rebel, corrupted by evil, dark forces lurking at the fringes of polite society, that most worried Main Street America.

 

The Heartland: drugs, orgies, murder, rape... and amnesia

            During my undistinguished tenure on my high school’s tennis team in my sophomore year, 1966, I received a breathless late-night phone call from a girls’ tennis team member wanting to know if I’d heard the news. Six senior members of the boys’ tennis team, all from rich, prominent, conservative families, had been arrested and charged with gang rape.

            Their later guilty pleas admitted they had been drunk one and all at a drive-in hamburger stand on the high school street of dreams, North Western Avenue, where a 15 year-old runaway from Texas was unlucky enough to be begging food as they pulled up in their ‘65 Camaro. They either forced or cajoled the girl into the car, drove her to a remote oilfield, and gang raped her for hours before dumping her under a bridge. One boy refused to go along with the rape, later called the police, and served as chief witness for the prosecution.

            The trial was every feminist’s nightmare. The well-funded defense attorneys imported every past boyfriend to testify graphically as to what the girl would do (so they said) without being forced. But her doctor-attested injuries and the holdout boy’s testimony were too much. The six remaining rapists were forced to plead guilty and accept the stern sentence of exile to a rich relative’s custody in some faraway state where no one had heard of their crime. The case created a weeks-long sensation at my school-- Harding High, renamed “Hard-On High” by imaginatives from rival schools--and great citywide moralizing from Baptist preacher to newspaper editorialist.

            What amazes me is that at my 25th high school reunion in 1993, I was the only one who seemed to remember this savagery. The puzzled looks I got (usually in response to some fortyish classmate’s indignation about how violent “kids are today”) convinced me they had honestly blocked all remembrance of it. Their mostly-white high school of 1966 could not possibly have harbored tennis-team rapists.

 

            I encountered the same selective amnesia when I wrote a lengthy piece for the Oklahoma Gazette on Oklahoma City’s massive 1953 drug, sex, and murder scandals among its high school youth. No one remembered. I had to rely on 1950s newspaper reports and legislative and congressional testimony (Males 1991).

            Health authorities--led by the American Social Hygiene Association (ASHA)--argued that parents needed outside help, and the schools had to take over the job of teaching morals. The term “sex education” was too fearsome to mention; “family life” or “anticipating marriage” were the common class titles. Venereal disease films prepared for the Army in World War II, showing gross physical and brain damage, were dusted off and shown in high schools. Family life educators insisted that only a narrow range of sexual behavior emphasizing chastity was acceptable for presentation in schools, since “‘healthful, wholesome personal relations’ are pretty much the same for all decent people.”

            Still, authorities in the late 1950s warned that VD rates among teens were rising rapidly--as usual, failing to mention that most resulted from adult-teen, not teen-teen relations, and, incidentally, failing to mention that such claims were not true. U.S. Public Health Service figures actually showed VD rates dropping among teens during the late 1950s. Honesty has only a passing acquaintance with the emotional “teen sex” debate.

            ASHA and other family life educators--then as now--insisted their efforts stemmed teenage sexual activity but could provide little evidence beyond weak assertions. An unusually graphic three-part article on “one of our most shocking social evils” appeared in the Saturday Evening Post in May and June of 1961 revealing that “every day, thousands of American women risk their lives to be rid of unwanted, unborn children.” Author John Bartlow Martin’s research estimated 750,000 to 2 million illegal abortions in the U.S. every year, and teenagers figured prominently in his series. “The choices open to a pregnant high-school girl are abortion, disgrace, or reluctant and often disastrous marriage,” he stated. Many women seeking abortions “have suffered childhood deprivation, divorce, spontaneous miscarriage, severe emotional disturbance and other sociomedical traumas” as well as feeling “unwanted by their fathers” (Martin 1961, pp. 19, 21, 22). These are not traits we associate with the Fifties.

            Thus, family life educators argued that schools should stimulate “a more widespread demand for the social services that have come into being to help people in the modern world,” including psychiatrists, marriage counselors, child guidance specialists, community centers, mental health experts, and a variety of other consultants now deemed necessary to guide childraising in the increasingly confusing and anxious 1950s.

            “Well-adjusted families of the future would be like patients with a host of life-support tubes running from their bodies” (Moran 2000, pp. 154-55). And they needed it. There was considerable unease below the apparently tranquil surface of the 1950s. Beat poets such as Lawrence Ferlinghetti and Allen Ginsberg won large audiences. Paul Goodman, in Growing Up Absurd: Problems of Youth in the Organized Society, a relentless critique of all things Fifties, derided a gubernatorial call for volunteers and programs to lure New York City’s youth of 1955 away from gangs: “Does the governor seriously think he can offer a good community that warrants equal loyalty?” (1960, p. 42)

            Actually, America was about to make an earnest effort to try.

 

 

 

References

 

Barson M, Heller S (1998). Teenage Confidential: An Illustrated Story of the American Teen. San Francisco: Chronicle Books.

 

Feldman S, Elliott GR, eds. (1993). At the Threshold: The Developing Adolescent. Cambridge MA: Harvard University Press.

 

Friedenberg EZ (1959). The Vanishing Adolescent. Boston: Beacon Press.

 

Heins M (2001). Not in Front of the Children: “Indecency,” Censorship and the Innocence of Youth. New York: Hill & Wang.

 

Hine T (1999). The Rise & Fall of the American Teenager. New York: Avon Books.

 

Kett J (1977). Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books.

 

Males M (1991, February 12). Drugs, Sex, and Violence: Oklahoma City’s Fifties Youth Crisis. Oklahoma Gazette, pp. 1-4.

 

Martin JB (1961, May 20, 27, and June 4). Abortion. Saturday Evening Post, pp. 19, 21, 22.

 

McWilliams C (1968). The California Revolution. New York: Grossman Publishers.

 

McWilliams C. (2001). Fool’s Paradise: A Carey McWilliams Reader. Santa Clara, CA: Santa Clara University Press.

 

Mead M (1970). Culture and Commitment: A Study of the Generation Gap. New York: Doubleday.

 

Moran JP (2000). Teaching Sex: The Shaping of Adolescence in the 20th Century. Cambridge: Harvard University Press.

 

Rawls JJ, Bean W (1993). California: An Interpretive History, 6th edition. New York: McGraw-Hill Inc.

 

Spring J (2003). Educating the Consumer-Citizen: The History of the Marriage of Schools, Advertising, and Media. London: Lawrence Erlbaum Associates.

 

Strauss W, Howe N (1990). Generations: The History of America’s Future, 1584 to 2069. New York: Quill.