

People have always pushed back against the social expectations that constrain them.
Femininity
The Physical Education of Women is Fraught With Issues of Body, Sexuality, and Gender
A new book, ‘Active Bodies,’ explores the history.

Women who made a living as gym teachers—combining the “manliness” of a gym-centered life with the grace and maternal energy associated with femininity—were a puzzling sight to some, particularly before women’s professional sports existed. Gym class is now such an entrenched part of American education—and American comedy—that it seems like it’s always been with us. But in her new book, Active Bodies: A History of Women’s Physical Education in Twentieth-Century America, Bucknell University historian Martha H. Verbrugge traces the very specific history of this field, and its relationship to gendered ideas about health, bodily capabilities, and living an “active” life.
Before Woody Allen’s famous line—“Those that can’t do, teach; those that can’t teach, teach gym”—and before lowbrow dodgeball humor, there was the German, English, and American physical culture movement, a late-19th-century wave of health mania inspired by a zest for competition and a belief that calisthenics and fresh air were core civilizational pressure valves.
By the early 20th century, physical education emerged as a distinct discipline and a vocation in the U.S., and it drew a varied lot of young men and women. As Verbrugge writes, before 1915, only three states required physical education. By the end of World War I, that number had grown to 28. Just over a decade later, it was 46 states. In addition to its roots in physical culture, the rise in phys ed was part of a broader national push for compulsory education, which included the formalization of attendance policies and curriculum. Exercise programs at public schools grew sharply during the first half of the 20th century, and many of the teachers were young women.
The women who found their way to the nascent field were mostly “white, native-born and middle-class,” according to Verbrugge, and their routes to phys ed were often “indirect or unplanned.” Some were athletes turning their enthusiasm for sport into a profession; others had discovered phys ed through forms of physical therapy after an injury; and still others may have been drawn, at least in part, by the promise of community. As Verbrugge writes, “A significant number of unmarried teachers were lesbians…. Careers in recreation, physical education, and sports not only nurtured lesbians’ professional interests but also opened doors to nontraditional jobs and friendships.” Lesbianism was the field’s “open secret” (and best-known stereotype) for decades, but the fear of actually being accused of “deviance” was very real.
READ ENTIRE ARTICLE AT TIMELINE
How Training Bras Constructed American Girlhood
In the twentieth century, advertisements for a new type of garment for preteen girls sought to define the femininity they sold.

In the early twentieth century, it was still something of a novelty to encounter mass-produced undergarments at all—or clothing that was specific to children. Before then, middle-class children in Europe and North America essentially wore kid-sized versions of adult clothing. “From the seventeenth to the nineteenth century,” the museum curator Anaïs Biernat wrote in the 2015 collection Fashioning the Body, “like their parents, children’s bodies were constricted by a hidden frame consisting of whalebone stays or a corset that formed a rigid structure around the torso.”
According to the U.S. Patent and Trademark Office (USPTO), patents for softer, less restrictive alternatives to corsets were filed in the late nineteenth century and developed in the 1910s. The idea of separate undergarments for young girls developed in parallel. By the 1920s, a “junior” market for youth clothing was in full swing, as corset makers deliberately sought to turn young girls into lifelong consumers. According to the historian Jill Fields:
Manufacturers still maintained concerns that younger women in the 1920s might never wear corsets if they did not undergo the initiation into corset wearing that women had in previous generations. They looked closely at the circumstances of a young girl’s first corset fitting in order to find ways of luring young women to a corsetiere.
It wasn’t only corsets. Girdles and brassieres became part of the burgeoning youth market, with the Warner Brothers Corset Company selling a “Growing Girl” brassiere in 1917. The trend led a fashion buyer to report in the 1920s that “[s]mall sizes sell best—even the little girls wear brassieres now.”
READ ENTIRE ARTICLE AT JSTOR DAILY
The Real Calamity Jane Was Distressingly Unlike Her Legend
A frontier character’s life was crafted to be legendary, but was the real person as incredible?

‘This is the West, Sir,’ says a reporter in The Man Who Shot Liberty Valance. ‘When the legend becomes fact, print the legend.’ This is very much the advice that has applied to Calamity Jane over the years. She was the lover of ‘Wild Bill’ Hickok, avenged herself on his killer and bore his secret love-child. She rode as a female army scout and served with Custer. She saved a runaway stagecoach from a Cheyenne war party and rode it safely into Deadwood. She earned her nickname after hauling one General Egan to safety after he was unhorsed in an ambush. She was a crack shot, a nurse to the wounded, a bullwhacker and an elite Pony Express courier.
Not one of these things is true. In fact, as Karen Jones sets out dismayingly early in her book, the only things that the real-life ‘Calamity Jane’ can with confidence be said to have in common with her legend are that she wore pants, swore like a sailor and was drunk all the time. Martha Jane Canary spent most of her itinerant life in grim poverty and hopelessly addicted to alcohol. She worked not as an army scout but as a camp-follower, laundress, saloon girl and occasional prostitute. As one 20th-century biographer put it crisply, her true story is ‘an account of an uneventful daily life interrupted by drinking binges’.
Karen Jones’s book, then, is a sort of dual biography: it’s the biography of Martha Canary (who checks out halfway through this book in 1903, at 47, from alcohol-induced inflammation of the bowels), and it’s the biography of the legend that grew up around her, much of it during her own lifetime and with her encouragement and collusion, and how it changed over the years that followed. She was, writes Jones, a ‘multi-purpose frontier artifact’.
The main point that Jones makes, and makes rather a lot, is that by dressing like a man and drinking in saloons and swearing and shooting things (aka ‘female masculinity’) Martha disrupted the ‘normative feminine behavior’ of the Old West. Non-academic readers might be warned that there’s a good deal of social studies jargon woven through this story. Jones is forever going on about normativity and gender performance and ‘the frontier imaginary’ (in an apt piece of linguistic cross-dressing, ‘frontier’ here serves as an adjective and ‘imaginary’ as a noun). But the material is all here, and very interesting material it is, too. There can be pretty much no reference to Calamity Jane that this diligent researcher does not find space to note: the Beano character ‘Calamity James’; My Little Pony’s ‘Calamity Mane’; ‘“Calamity” Jane Kennedy’ in the Devon-set BBC comedy drama The Coroner; and even — I was impressed by this — a trash-removal company in Margate that ‘sports an advertising insignia of a horse-drawn stage and a name founded on a stupendous use of badinage: WhipCrapAway’.
READ ENTIRE ARTICLE AT THE SPECTATOR WORLD
“All the World’s a Harem”
How masks became gendered during the 1918–1919 Flu Pandemic.

Carlotta, with the drooping mouth; Esther, with the too-tilted nose, and Mary, the colleen with brown freckles, are taking full benefit of this temporary masked delusion…Carlotta, Esther, and Mary, and many other Carlottas, Esthers, and Marys, are making the most of the opportunity to show their eyes to advantage, forgetting the while about pug noses, large mouths, and freckles.
During the influenza epidemic that ravaged the United States in the fall and winter of 1918 and 1919, cities across the country advised or required masks. Soon, discussions of masks took center stage across American media. Newspapers were filled with articles explaining how to make, wear, and purchase masks. From their inception, these discussions were focused on gender, and women in particular: how were women adjusting to the new normal? What was the public’s perception of women wearing masks? Readers of the October 29, 1918 edition of The Seattle Star got a peek into this new narrative in the quote above, implying women with less-than-desirable facial traits should be grateful for the temporary reprieve flu masks provided.
This stereotype of women wearing masks found expression in a cartoon published in the Muncie Evening Press on October 23, 1918. In this cartoon, a male patient pretends to have the influenza “just to get that pretty little nurse around and here she is wearing a mask.” Yet in the final frame, when the nurse removes the mask, which she is “sick of wearing,” she turns out to be less attractive than the male patient has imagined — leading him to announce, “Me? I’m cured!” This cartoon reinforced stereotypes as it sexualized the nurse’s appearance, ignoring her professional role. In the context of the 1918 influenza epidemic, however, this cartoon also illustrates how the image of the masked nurse became part of daily experience.
READ ENTIRE ARTICLE AT NURSING CLIO
Nevertheless, She Lifted
A new feminist history of women and exercise glosses over the darker side of fitness culture.

Danielle Friedman’s new book Let’s Get Physical traces the recent history of women’s fitness in the United States, primarily through a series of biographical sketches of women who, like Bonnie Prudden, have shaped that history. Each chapter centers on a woman, or several, who pioneered a major fitness trend in a given decade. The first chapter focuses on Prudden, detailing her research on children’s fitness, her many popular books promoting exercise for women, her prison program and its accompanying New York Times coverage, and her TV show. Later chapters focus on Lotte Berk, the originator of the barre workouts that emerged in the 1960s, and, of course, Jane Fonda’s ’80s aerobics.
Friedman’s lively anecdotes about these women and their early followers are fun to read. The book presents a dynamic cast of characters, many of whom are little-known now, like Lisa Lyon, a bodybuilder who posed for Playboy and was profiled by Eve Babitz in Esquire (Babitz wrote that she had a “perfect little Bardot-Ronstadt face”); or Janice Darling, an instructor at Jane Fonda’s original workout studio who was one of the few Black fitness instructors of the era and who credited her fitness routine with helping her recover from an accident that broke both her legs and severed a muscle in one eye. It’s fun, too, to gawk at the now obviously ridiculous sexism many of them faced. Friedman writes, for instance, that for decades women were discouraged from exercising because it was believed that too much exercise could make their uteruses fall out.
Friedman clearly views the telling of this history as a feminist project. As she writes, “American women’s fitness history is more than a series of misguided ‘crazes.’ It’s the story of how women have chosen to spend a collective billions of dollars and hours in pursuit of health and happiness. In many ways, it’s the story of what it has meant to be a woman over the past seven decades.” She is right that exercise is a significant concern for many women, and so there is a feminist stake in refusing to dismiss or overlook its history. But the fact that there is a compelling feminist argument for studying the history of women’s fitness does not make that history, in and of itself, feminist—something Friedman’s book struggles to grasp.
READ ENTIRE ARTICLE AT THE BAFFLER
The Fitness Craze That Changed the Way Women Exercise
Fifty years after Jazzercise was founded, it is still shaping how Americans work out—for better or for worse.

While exercise spaces for women existed at the time, they often assumed that women valued prettiness and poise over feeling powerful. As early as the 1930s, a Chicago “figure salon” invited women to “soothe the nerves and control the curves,” according to a 1936 piece in the Chicago Tribune. For decades, these businesses were largely owned by men, whose rationale for sex segregation—such as having “ladies’ days” at the bodybuilder Vic Tanny’s chain of clubs—was more about maintaining proper distance between the sexes than enabling women to freely enjoy exercise.
But ideas about women’s bodies and who should have agency over them, at the gym and elsewhere, were changing. New research touted the benefits of aerobic exertion, expanding the popular understanding of exercise to include arenas outside of smelly weight rooms. Many proponents of women’s liberation sought to obliterate old ideas about female frailty and celebrated what women’s bodies could do, whether breastfeeding or playing basketball. Along with Missett, women such as Jacki Sorensen, who developed the competing “aerobic dancing,” and Lydia Bach, who imported Lotte Berk’s barre workout from London, infused this philosophy into exercise.
Jazzercise, with its mostly female clientele and high-energy vibe, was of this moment that Missett seized and helped shape. Her family relocated to San Diego in 1972, where a body-conscious health culture was kicking up. Military wives packed Missett’s classes, which she said she taught so frequently that she nearly permanently lost her voice. When her students’ husbands were reassigned, many of these women were so heartbroken imagining life without Jazzercise that Missett created an official certification program, and then a franchise system, turning exercise into employment for thousands of women and creating global brand ambassadors before such a term existed.
Thousands of letters Missett has saved relay how Jazzercise moved women not only to lose inches, but also, in some cases, to leave abusive husbands, demand raises, and generally find joy in their bodies and lives. Jazzercise’s empowerment effect could be especially intense, because enjoying classes could become a career (more than 90 percent of franchisees begin as students; even more are women). I’ve interviewed women whose first solo travel, in their 30s, was to a Jazzercise convention. They found in the franchise a rare opportunity for employment and camaraderie that fit in with the demands of child-rearing. Missett relishes such stories of how Jazzercise has enabled women’s economic independence, including her own: She gleefully recounts a triumph in 1975 over a sexist Parks and Rec bureaucrat who balked at writing a big paycheck to a “little exercise girl.”
READ ENTIRE ARTICLE AT THE ATLANTIC
“The Wizard of Oz” Invented the “Good Witch”
Eighty years ago, MGM’s sparkly pink rendering of Glinda expanded American pop culture’s definition of free-flying women.

Delving into the provenance of Glinda’s character reveals a lineage of thinkers who saw the witch as a symbol of female autonomy. Though witches have most often been treated throughout history as evil both in fiction and in real life, sentiments began to change in the 19th century as anticlerical, individualist values took hold across Europe. It was during this time that historians and writers including Jules Michelet and Charles Godfrey Leland wrote books that romanticized witches, often reframing witch-hunt victims as women who’d been wrongfully vilified because of their exceptional physical and mystical capabilities. Per Michelet’s best-selling book, La Sorcière of 1862: “By the fineness of her intuitions, the cunning of her wiles—often fantastic, often beneficent—she is a Witch, and casts spells, at least and lowest lulls pain to sleep and softens the blow of calamity.”
The ideas of Michelet and like-minded writers influenced Matilda Joslyn Gage, an American suffragist, abolitionist, and theosophist. She posited that women were accused as witches in the early modern era because the Church found their intellect threatening. “The witch was in reality the profoundest thinker, the most advanced scientist of those ages,” she writes in her feminist treatise of 1893, Woman, Church, and State. Her vision of so-called witches being brilliant luminaries apparently inspired her son-in-law, L. Frank Baum, to incorporate that notion into his children’s-book series about the fantastical land of Oz. (Some writers have surmised that “Glinda” is a play on Gage’s name.)
Like Gage, Baum was a proponent of equal rights for women, and he wrote several pro-suffrage editorials in the South Dakota newspaper he owned briefly, the Aberdeen Saturday Pioneer. Although his book The Wonderful Wizard of Oz, published in 1900, is titled after a man, it is fundamentally a female-centric story: a tale about a girl’s journey through a land governed by four magical women. There are actually two good witches in Baum’s original version: Glinda is the witch of the South, not the North, in his telling, and she doesn’t appear until the second-to-last chapter. The book states that she is not only “kind to everyone,” but also “the most powerful of all the Witches.”
READ ENTIRE ARTICLE AT THE ATLANTIC
An Investigation Into the History of the ‘Ditz’ Voice
How pitch, tonality, and celebrity imitation have portrayed cluelessness.

In January, Saturday Night Live aired a sketch spoofing The Bachelor, one of many the show has done throughout the reality juggernaut’s time on air.
Bachelor contestants adhere to a long-established archetype in the public consciousness: the vapid gold-digger who needs a man to make her life complete. Most of the cast members depicting the contestants adopted a certain speaking style: monotonous, with elongated ending syllables and a lot of vocal fry, in line with the voice associated with “ditzy” girls today. But host Jessica Chastain’s interpretation was slightly different: her voice had a higher pitch and a little more musicality—more AMC than ABC. Though it sounded old-fashioned, it was clearly recognizable as part of a library of voices women have pulled from over the years to play silly, sappy, or simpering women.
A version of this voice has existed since sound met film and, in a way, since a little before that. Actresses of early film played mostly damsels in distress or wide-eyed young women, and by the time talkies took over, women were still portrayed as less headstrong, more head-in-the-clouds. “The 1920s had a serious case of the cutes,” notes Max Alvarez, a New York-based film historian. “There is a prevalence of childlike women in the popular culture [at the time] … Girlish figures, girlish fashion, girlish behavior.” Along with these girlish figures came a girlish voice—high-pitched, a bit breathy, and a little bit unsure, evident in Clara Bow’s pouty purr, and even Betty Boop’s singsong.
Shortly after the advent of sound in cinema, the scrappy, spunky flappers of the ’20s were relegated to supporting characters—“the gangster’s moll, the cocktail waitress,” says Alvarez. Musicals of the era, he says, were bastions of these kinds of wise-cracking, wacky sidekicks. “Anything with a backstage Broadway setting, you’re gonna find these women.” The speaking voices filling these films’ chorus lines were still childlike, as in the decade prior, but started to show signs of the modern-day “sexy baby voice”: a little bit breathy, a little bit nasal, and with fewer harsh consonant sounds.
READ ENTIRE ARTICLE AT ATLAS OBSCURA
The Failed Promise of the Aerobics Revolution
A new dramedy on Apple TV+ explores the roots of America’s fitness craze.

Revolutions don’t always happen in the streets. In the early 1980s, a seismic shift took place in strip-mall storefronts that smelled of sweat and Enjoli. Pulsing to the beat of Donna Summer and glistening with spandex, these fluorescent-lit rooms vibrated with the energy of career women and housewives bouncing in unison.
Aerobics was liberation. It offered a way for millions of women to feel proud of what their bodies could do, not just how they looked.
My mom was one of these women. She was intrigued at first by aerobics’ promise to help her lose the weight she had gained during her pregnancy with me, and then she found that she loved the music, the energy, the adult camaraderie.
One particularly challenging day at home caring for my older sister and me, she told me later, she was counting down the hours until my dad’s return from work, when she could leave for aerobics. She was already in her leotard, tights, and sweatband when he called to remind her he had to stay for a meeting that night — she’d have to miss her class. She sat down at the kitchen table and wept. Taking in the ridiculous combination of her Lycra and tears only made her cry harder. Aerobics, she realized, had become integral to her identity as a woman, independent from her roles as wife and mother.
READ ENTIRE ARTICLE AT THE NEW YORK TIMES
Sluts and the Founders
Understanding the meaning of the word “slut” in the Founders’ vocabulary.

“A lady who has been seen as a sloven or slut in the morning will never efface the impression she then made with all the dress and pageantry she can afterwards involve herself in,” Thomas Jefferson wrote to his daughter Patsy, then eleven, in December 1783. “Nothing is so disgusting to our sex as a want of cleanliness and delicacy in yours. I hope therefore the moment you rise from bed, your first work will be to dress yourself in such a stile as that you may be seen by any gentleman without his being able to discover a pin amiss, or any other circumstance of neatness wanting.”
It’s safe to say that “gentle parenting” enthusiasts wouldn’t approve of his approach, but Martha, Patsy’s mother, may have. When Jefferson wrote those words, he was less than a year into grieving her. Martha’s own mother died when she was two, and by the time she was twelve she had buried two stepmothers. Historians know little about her relationships with them, but we do know she wanted to protect her children from a similar fate. As she lay on her deathbed, witnesses—including Sally Hemings, Martha’s enslaved half-sister who would go on to have six of Jefferson’s children—would remember her making an astonishing request:
When she came to the children, she wept and could not speak for some time. Finally she held up her hand, and spreading out her four fingers, she told him she could not die happy, if she thought that her four children were ever to have a stepmother brought over them. Holding her hand, Mr. Jefferson promised her solemnly that he would never marry again. And he never did.
As Annette Gordon-Reed writes in The Hemingses of Monticello, “That Jefferson at age thirty-nine promised not to do so was extraordinary.” But as Gordon-Reed brought to light, he didn’t spend the rest of his life alone.
READ ENTIRE ARTICLE AT STUDY MARRY KILL
Casimir Pulaski, Polish Hero of the Revolutionary War, Was Most Likely Intersex
Disputed remains were the right height and age and showed injuries consistent with the general’s life. There was just one catch: The skeleton looked female.

He is called the “father of the American cavalry,” a Polish-born Revolutionary War hero who fought for American independence under George Washington and whose legend inspired the dedication of parades, schools, roads and bridges.
But for more than 200 years, a mystery persisted about his final resting place. Historical accounts suggested the cavalryman, Casimir Pulaski, had been buried at sea, but others maintained he was buried in an unmarked grave in Savannah, Ga.
Researchers believe they have found the answer — after coming to another significant discovery: The famed general was most likely intersex.
READ ENTIRE ARTICLE AT THE NEW YORK TIMES
Ida Lewis, “The Bravest Woman in America”
In her thirty-two years as the keeper of Lime Rock Lighthouse, Ida Lewis challenged gender roles and became a national hero.

Ida Lewis, the namesake of Arlington National Cemetery’s Lewis Drive, was once known as “the bravest woman in America.” Lewis served as an official lighthouse keeper for the U.S. Lighthouse Service (later absorbed into the Coast Guard) from 1879 until her death, at age 69, in 1911. As the keeper of Lime Rock Light Station off the coast of Newport, Rhode Island, Lewis performed work that was critical to national security: lighthouses, administered by the federal government, aided navigation and helped protect the nation’s coastlines.
Lewis also performed personal acts of heroism by rescuing people from drowning in the turbulent, cold waters off Newport. According to Coast Guard records, Lewis saved the lives of 18 people, including several soldiers from nearby Fort Adams; unofficial accounts hold that she saved as many as 36. Until 2020, she was the only woman to receive the Coast Guard’s Gold Lifesaving Medal, the nation’s highest lifesaving decoration.
When ANC dedicated its 27-acre Millennium site in 2018, Lewis became the first woman honored with a road in the cemetery named for her. Ida Lewis Drive runs between Section 29 and Sections 77–84, the new sections created with ANC’s Millennium Project expansion, in the northwest of the cemetery. Lewis herself is buried at Island Cemetery in her birthplace of Newport, near the lighthouse she once managed.
READ ENTIRE ARTICLE AT ARLINGTON NATIONAL CEMETERY
Valentina Tereshkova and the American Imagination
Remembering the Russian cosmonaut Valentina Tereshkova, the first woman in space, and how she challenged American stereotypes.

The first woman in space was the cosmonaut Valentina Tereshkova, who launched on June 16th, 1963. Her craft, Vostok 6, orbited the planet forty-eight times over three days. Tereshkova’s achievement was one of great pride and propaganda value for the U.S.S.R.—and confusion and consternation for the U.S.A.
For one thing, she didn’t fit Americans’ Cold War-era stereotypes of Soviet women. One such stereotype, as historian Robert L. Griswold reveals, was the “graceless, shapeless, and sexless” Russian working-class woman. Many Americans imagined female Soviets as miserable and shabby, suffering from bad clothes and makeup, thanks to their inferior form of government. According to Griswold, by the late 1950s, the “American conception of Soviet working class femininity became a way to reassert the boundaries of proper womanhood,” which, after World War II in the US, no longer had a place for “Rosie the Riveter.”
Then there was the stereotype of the apolitical matron, informed by Nina Khrushcheva, wife of Nikita Khrushchev. Practically everybody liked “Mrs. K.” when she toured the U.S. in 1959. Although she was in fact “a revolutionary in her own right,” in the eyes of the American media she “became a kind of world grandmother who focused on her family and had little interest in Kremlin intrigue.” Griswold writes that in this case, the era’s conservative maternal ideology was more powerful than anti-Communism.
READ ENTIRE ARTICLE AT JSTOR DAILY
Jenny Zhang on Reading Little Women and Wanting to Be Like Jo March

From the moment I learned English—my second language—I decided I was destined for genius and it would be discovered through my writing—my brilliant, brilliant writing. Until then, I had to undergo training, the way a world-class athlete might prepare for the Olympics; so I did what any budding literary marvel desperate to get to the glory and praise stage of her career would do—I read and read and read and then imitated my idols in hope that my talents would one day catch up to my tastes.
At age ten, I gave up picture books and took the leap into chapter books, but continued to seek out the girly subjects that alone interested me. Any story involving an abandoned young girl, left to survive this harsh, bitter world on her own, was catnip to my writerly ambitions. Like the literary characters I loved, the protagonists in my own early efforts at writing were plucky, determined, unconventional girls, which was how I saw myself. They often acted impetuously, were prone to bouts of sulking and extreme mood swings, sweet one minute and sour the next.
I always gave my heroines happy endings—they were all wunderkinds who were wildly successful in their artistic pursuits and, on top of it, found true, lasting love with a perfect man. I was a girl on the cusp of adolescence, but I had already fully bought into the fantasy that women should and could have it all.
On one of my family’s weekly trips to Costco, I found a gorgeous illustrated copy of Little Women by Louisa May Alcott, a book I had seen and written off every time I went to the library, repelled by the word women. Unlike the girl heroines I loved, a woman was something I dreaded becoming, a figure bound up in expectations of sacrifice and responsibility. A woman had to face reality and give up her foolish childish dreams. And what was reality for a woman but the life my mother—the best woman I knew—had? And what did she have but a mountain of responsibilities—to me, to my father, to my younger brother, to her parents, to my father’s parents, to her friends, to my father’s friends, to their friends’ parents, to her bosses, to her coworkers, and so on?
Her accomplishments were bound up in other people, and her work was literally emotional, as she was expected to be completely attuned to everyone else’s feelings. She worked service jobs where she was required to absorb the anger of complaining customers and never betray any frustration of her own. Her livelihood depended on being giving and kind all of the time, suppressing her less sunny emotions into a perpetually soaked rag that she sometimes wrung out on my father and me.
READ ENTIRE ARTICLE AT LITERARY HUB
Amelia Earhart’s Last Flight
The aviation pioneer was many things before—and after—her career as a pilot was cut short.

There were, in fact, other famous female aces in the early decades of aviation. All of them were daring—some were said to be better pilots than Earhart—and many of them were killed and forgotten. If Earhart became an “icon,” it was, in part, because women who aspired to excel in any sphere, at a high altitude, looked upon her as their champion. But it was also because the unburied come back to haunt us.
Earhart had already tried to circle the globe once in 1937, flying westward from Oakland, but she had crashed taking off in Honolulu. Determined to try again, she coaxed additional funds from her sponsors, and “more or less mortgaged the future,” she wrote. The plane, hyped as a “flying laboratory” (it wasn’t clear what she planned to test, beyond her own mettle and earning power), was shipped back to California for repairs, and, once Putnam had renegotiated the necessary landing clearances and technical support, she and Noonan set off again, on June 1st, this time flying eastward—weather patterns had changed. A month and more than twenty-two thousand miles later, they had reached Lae, New Guinea, the jumping-off place for the longest and most dangerous lap of the journey. The Electra’s fuel tanks could keep them aloft for, at most, twenty-four hours, so they had almost no margin of error in pinpointing Howland, about twenty-five hundred miles away. Noonan was using a combination of celestial navigation and dead reckoning. They had a radio, but its range was limited.
Early on July 2nd, on a slightly overcast morning, about eighteen hours into the flight, Earhart told radiomen on the Itasca, a Coast Guard cutter stationed off Howland to help guide her down, that she was flying at a thousand feet and should soon be “on” them, but that her fuel was low. Although the Itasca had been broadcasting its position, so that Noonan could take his bearings and, if necessary, correct the course, they apparently couldn’t receive the transmissions, nor apparently could they see the billows of black smoke that the cutter was pumping out. Earhart’s last message was logged at 8:43 A.M. No authenticated trace of the Electra, or of its crew, has ever been found.
After twenty-five years of research, the Longs concluded that “a tragic sequence of events”—human error, faulty equipment, miscommunication—“doomed her flight from the beginning,” and that Earhart and Noonan were forced to ditch in shark-infested waters close to Howland, where the plane sank or broke up. There is, however, an alternative scenario—a chapter from Robinson Crusoe. It is supported with methodical, if controversial, research by Ric Gillespie, the author of “Finding Amelia” (2006), which has just been republished.
READ ENTIRE ARTICLE AT THE NEW YORKER
The Sorry History of Car Design for Women
A landscape architect of the 1950s predicted that lady drivers would want pastel-colored pavement on the interstate.

In 1958, landscape architect A. Carl Stelling tried to calm the fears of a public that would soon be connected by the interstate highway system. It wasn’t just anxiety about what these new roadways would mean for communities that was on people’s minds; there was also concern about exactly who would be using the highways. As Stelling wrote, “Say what you will—and all of us have—you are going to see more and more of the woman driver.” He added, “This prospect is not as catastrophic as it may appear.”
Stelling predicted that this new crop of “timid” and “panicky” drivers would spark changes to current roadways. Feminine pastel colors would “replace the drab, monotonous tones of present-day pavement”; designs would include an extra-slow, truckless lane for “women who become nervous at high speeds”; and wider travel lanes would allow women “a greater margin of error in their maneuvering.”
Stelling was hardly the first—or the last—to ask about the role of gender in the car industry. As sociologist Diane Barthel-Bouchier writes: “From their beginnings at the end of the nineteenth century, automobiles were defined as masculine.” The electric car, for example, was initially seen as a “ladies’ car for gadding about the city,” compared to the more “manly” gas-powered option.
READ ENTIRE ARTICLE AT JSTOR DAILY
We See You, Race Women
We must dive deeper into the intellectual artifacts of black women thinkers to support the evolution of black feminist discourse and political action.

When I was in graduate school, whenever a black woman scholar presented her work, a peculiar phenomenon emerged: peers always and only remarked on the speaker’s person, rather than on her ideas. At almost no time did auditoriums or classrooms pour out with chatter about the scholarly intervention we’d just witnessed. Instead, fellow students talked about the often senior and sometimes advanced mid-career expert’s hair, clothing, impressive physical presence, or overall beauty.
The issue at hand is not that these speakers—visiting and core faculty from our university and leaders in their fields—were not attractive women, but that the intellectual production of these women did not provide the foundation for my colleagues’ attraction. Fellow graduate students could only observe these black women’s bodies. My peers were accidentally or willfully blind to black feminine scholarly production. In other words, these black women were intellectually illegible to my colleagues.
It is precisely this phenomenon of black women’s political, intellectual, and social illegibility that Brittney C. Cooper takes up in Beyond Respectability: The Intellectual Thought of Race Women, published last May. Cooper begins her critical intellectual history of black women thinkers and activists from the turn of the century through the 1970s by making the stakes of her study quite clear. She is not interested in charting only the biographies of the core figures in her work: Fannie Barrier Williams, Mary Church Terrell, and Pauli Murray. Instead, Cooper moves beyond biography to call for and demonstrate an in-depth analysis of literary production, philosophies, and direct political actions of these public intellectual “race women” and the women with whom they built robust proto-black-feminist discourses, frameworks, and blueprints.
READ ENTIRE ARTICLE AT PUBLIC BOOKS
Voices in Time: Epistolary Activism
An early nineteenth-century feminist fights back against a narrow view of woman’s place in society.

In the summer of 1837, Angelina and Sarah Grimké were traveling in eastern Massachusetts. The white, South Carolina–born sisters were working on behalf of the American Anti-Slavery Society, lecturing and organizing to end slavery immediately in the southern United States. During their travels in New England, they drew the ire of the conservative wing of the Massachusetts Congregational ministry, which opposed the immediate abolition movement. In July a pastoral letter was read from Congregational pulpits across the state condemning women who entered what was called the “public sphere” to give speeches supporting “reform.”
The attacks did not surprise the sisters. When they first began speaking in public about slavery six months earlier in New York City, many privately told them that they should stop. The sisters’ actions were indeed groundbreaking. While Frances Wright and Maria Stewart had given reform lectures earlier, neither had taken to the road on behalf of a single controversial reform. Catharine Beecher, a prominent female educator, did not approve. Like the conservative Massachusetts Congregational clergy, she and her famous father, the Congregational minister Lyman Beecher, supported sending African Americans to Africa as a solution to slavery and opposed the immediate abolition movement.
In May 1837 Catharine published a pamphlet criticizing that movement as seriously misguided and Angelina Grimké, with whom she had been friends in the 1820s, for violating woman’s God-given place as subordinate to men. In response to Beecher’s pamphlet, Angelina published her own, in the form of a series of letters. The first letter was published in June in several reform newspapers. Two of the last three letters dealt with women’s equality and were as much a response to the pastoral-letter controversy as to Beecher’s limited vision for women. In the second-to-last letter, the first half of which is reprinted here, Angelina sets aside Beecher’s arguments and forthrightly states her own regarding women’s full human equality.
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
American Women’s Obsession With Being Thin Began With This ‘Scientist’
Greta Garbo and Marlene Dietrich were hooked on his diet.

Bengamin Gayelord Hauser wasn’t the first diet guru to worm his way into Western women’s collective consciousness. The dieting advice of William Banting, an English undertaker turned anti-fat crusader, was so influential in Victorian-era London that his surname became a verb, synonymous with dieting (i.e., “I’m banting”). Hauser also wasn’t the first to count celebrities among his followers. John D. Rockefeller and Franz Kafka were both devoted “Fletcherites,” convinced that chewing a mouthful of food 100 to 700 times resulted in improved health and a slimmer body shape, while Henry Ford was a Hay man (à la Dr. William Hay) who never ate starch and protein at the same meal.
What Hauser managed to do that Banting, Fletcher, and Hay couldn’t was capitalize on the fears and desires of women in postwar America. Unlike women of previous generations, those in the first half of the 20th century had fewer children, better health, longer lives, and more disposable income. Middle age, in particular, no longer meant retreating into a housecoat and waiting to die. Now women “could afford to have a new sense of optimism about what life over fifty could be,” writes Catherine Carstairs in her 2014 article for the journal Gender & History, “‘Look Younger, Live Longer’: Ageing Beautifully with Gayelord Hauser in America, 1920–1975.”
Hauser’s approach to diet and nutrition emphasized that living a healthful life meant travel and dancing and enjoying small pleasures. He gave women of a certain age permission not just to exist publicly but to be the center of attention. To be in the spotlight, though, was a privilege that only those who were beautiful and slim and took special care to adhere to a healthy diet deserved. “There is real tragedy in fat,” Hauser wrote in 1939’s Eat and Grow Beautiful.
How to get and stay slim? Hauser came up with a number of approaches in the 50 years of his career, including juicing, eating according to one’s “type” (potassium, phosphorus, calcium, and sulphur), avoiding white bread, sugars and over-refined cereals, and preparing “healthful” recipes like the “pep breakfast”: two raw eggs beaten in orange juice to create, as he writes in his most famous book, Look Younger, Live Longer (1951), a “creamy drink fit for a King’s table.” Most important of all, don’t skimp on the “wonder foods,” advocated Hauser, including yogurt, powdered skim milk, brewer’s yeast, wheat germ, and blackstrap molasses.
READ ENTIRE ARTICLE AT TIMELINE
As Swimsuit Season Ends, Pursuit of the ‘Bikini Body’ Endures
The “bikini body” is out. But the pressure to maintain the ideal female physique lives on.

I was 16 when I bought my first bikini at the mall. It was a reward. That morning, when I knew my stomach was flattest, I had lain on my bed, taken a deep breath and rested a ruler across my hips. Just barely, but sure enough, there was space between the plastic and my flesh. I deserve this, I thought, as I proudly plunked down the money I had earned scooping ice cream to pay for the few inches of shiny purple fabric. I had finally achieved the “bikini body” plastered on the pages of the teen magazines I pored over.
More than 20 years later, the anecdote makes me cringe. Openly aspiring to attain a “bikini body” has become shorthand for a slavish, self-hating fealty to a set of beauty standards decidedly out of step with the “empowerment branding” that today predominates in the fitness industry. Women’s Health banned the phrase from its cover in 2015, and a ubiquitous meme on the body-positive Internet reminds women that a “bikini body” is, well, any body that happens to be wearing a two-piece.
Yet ditching the phrase hasn’t destroyed its underlying ethos. Consider that the most popular global fitness celebrity is arguably Australian Kayla Itsines, whose “bikini body army,” armed with old-school before-and-after photos, is over 10 million strong.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
Masculinity
When Wilde Met Whitman
As he told a friend years later, “the kiss of Walt Whitman is still on my lips.”

The 19th century was fixated on manhood. Much has been written about the constraints on Victorian women but gender expectations for men were no less real, although less pronounced. The debates swarming around Wilde were personal, but they also touched on fundamental questions about what made a man a man. Poetry was a battleground for masculinity, and Wilde had entered the fray.
“What is a man anyhow?” a then little-known poet called Walt Whitman asked at mid-century. His reply came in the form of Leaves of Grass, an 1855 poetry collection that sought to establish the nobility of the American working man. Whitman’s inclusive spirit and comprehensive range made his poetry nothing short of revolutionary. When he pictured seamen and horsedrivers, gunners and fishermen, he praised their blend of “manly form” with “the poetic in outdoor people.” Likewise, he assured readers that the ripple of “masculine muscle” definitely had its place in poetry. In Whitman’s book, a working poet could be as manly as marching firemen, and wrestling wrestlers could be just as poetic. Every working man could represent what he triumphantly called “manhood balanced and florid and full!” He redefined who counted as a real man.
It wasn’t long before the essayist Ralph Waldo Emerson was writing to congratulate Whitman. Emerson had given much thought to these matters. Decades earlier, in his celebrated 1837 “American Scholar” speech, he had observed that society rarely regarded a man as a whole person, but reduced him to less than the sum of his parts. Now Whitman’s poetry had restored men to their whole potential. Leaves of Grass “meets the demand I am always making,” Emerson told Whitman in 1855, praising his exceptionally brave handling of his materials. Here, finally, was an American poet who embraced the totality of man, and celebrated him as a fully embodied individual. “I greet you at the beginning of a great career,” Emerson wrote him.
For a long time, sexuality had been excluded from literature. No more. “I say that the body of a man or woman, the main matter, is so far quite unexpressed in poems; but that the body is to be expressed, and sex is,” Whitman replied to Emerson. The place to do it, he said, was in American literature. And the way to do it was by writing the truth about men’s appetites, and rejecting the fiction known as “chivalry.” At one time, chivalry designated medieval men-at-arms, but in Wilde’s lifetime, it meant idealized gallantry, especially towards women, and a willingness to defend one’s country. To Whitman, the notion felt clankingly old-fashioned. “Diluted deferential love, as in songs, fictions, and so forth, is enough to make a man vomit,” he thought. Replace it with a truer picture of love and human nature, Whitman said, and “this empty dish, gallantry, will then be filled with something.”
READ ENTIRE ARTICLE AT LITERARY HUB
How John Wayne Became a Hollow Masculine Icon
The actor’s persona was inextricable from the toxic culture of Cold War machismo.

In the long working “friendship” between the two men, unless I missed it, Ford never spared a kind word for his protégé. In fact, Ford was savage in his mistreatment of Wayne, even though—or because?—Wayne worshipped him. (“My whole set up was that he was my mentor and my ideal! I think that deep down inside, he’s one of the greatest human beings that I have ever known.”) From Stagecoach through Liberty Valance, their last Western together, Ford rode Wayne so mercilessly that fellow performers—remarkably, given the terror Ford inspired—stepped in on Wayne’s behalf. Filming Stagecoach, Wayne revealed his inexperience as a leading man, and this made Ford jumpy. “Why are you moving your mouth so much?” he demanded, grabbing Wayne by the chin. “Don’t you know that you don’t act with your mouth in pictures?” And he hated the way Wayne moved. “Can’t you walk, instead of skipping like a goddamn fairy?”
Masculinity, says Schoenberger, echoing Yeats, was for Ford a quarrel with himself out of which he made poetry. Jacques Lacan’s definition of love might be more apt: “Giving something you don’t have to someone who doesn’t want it.” Ford was terrified of his own feminine side, so he foisted a longed-for masculinity on Wayne. A much simpler creature than Ford, Wayne turned this into a cartoon, and then went further and politicized it. There was an awful pathos to their relationship—Wayne patterning himself on Ford, at the same time that Ford was turning Wayne into a paragon no man could live up to.
Of all the revelations in Schoenberger’s book, none is more striking than this: After Stagecoach, a critical and commercial success, Wayne disappeared into mostly unmemorable films for another nine years. It was only in 1948, in the film 3 Godfathers, that John Wayne at last began to resemble the image we have of him in our heads. He was the apotheosis of a Cold War type—unsentimental, hard, brutal if necessary, proudly anachronistic, a rebuke to the softness of postwar affluence. He was turning, in other words, from an artist into a political symbol. “Unlike Ford,” Schoenberger says, “he ended up making propaganda, not art.” Wayne was an unyielding anticommunist; by binding up his screen image with his “ultra-patriotism,” as Schoenberger calls it, he posed himself against a liberal establishment that was feminized, and therefore worthy of populist disgust.
READ ENTIRE ARTICLE AT THE ATLANTIC
When Salad Was Manly
Esquire, 1940: “Salads are really the man’s department… Only a man can make a perfect salad.”

The differences between “women’s tastes” and “men’s tastes” are long entrenched in American cultural history. As the stereotype goes, meat is manly and women love salad.
Or is it that simple? A look back at the food writing directed at men in the period following the Second World War reveals a different relationship between men and salad. In this era, making and eating a salad wasn’t frilly and feminine, but was actually one of the most masculine things a man could do.
In a 1940 installment of his long-running Esquire cooking column “Man the Kitchenette,” Iles Brody writes: “salads are really the man’s department… Only a man can make a perfect salad,” which was “never sweet and fussy like a woman’s.” Given the right context, self-confidence, and ingredients, a man could transcend the girly boundaries of vegetables, reasserting his dominance in the kitchen and in his relationship to women. A growing body of instructional literature aimed to teach him how to do it.
READ ENTIRE ARTICLE AT JSTOR DAILY
The Masculinization of Little Lord Fauntleroy
The 1936 movie Little Lord Fauntleroy broke box office records, only to be toned down and masculinized amid cultural fears of the “sissified” male.

“With men and women at a late hour forming long lines in front of the box office, the management of the world’s largest theater late tonight announced the initial production from Selznick International would be held over for a second week by popular demand,” The Los Angeles Times reported in 1936. The production in question was Little Lord Fauntleroy, and by the newspaper’s account, crowds were braving terrible spring weather to see it. New York City moviegoers muddled through a “slashing rainstorm,” while Buffalo audiences tunneled out of “record-breaking snowstorms.” In Denver, too, there was heavy snow. Perhaps the most devoted fans were in Philadelphia and Cincinnati. Both cities were hit with floods, which did nothing to stop the ticket sales.
No matter where Little Lord Fauntleroy opened and no matter the forecast, there were broken box office records and “standing room crowds.” It was a testament to the enduring popularity of the character, an aristocratic young boy created by author Frances Hodgson Burnett. Yet, curiously, even as men and women were racing through the rain to see the production, there was backlash building. Little Lord Fauntleroy not only entered the slang lexicon as an insult, but inspired a panic among certain parents, who feared their sons might turn out “priggish,” “sugary,” or, as all these coded words seemed to suggest, emasculated.
Little Lord Fauntleroy debuted as a serialized children’s story in St. Nicholas Magazine in 1885. It was later collected into a book, “which vastly outsold masterpieces such as Tolstoy’s War and Peace” and became “a trans-Atlantic best-seller prized by a dual readership of adults and children,” Princeton professor U. C. Knoepflmacher writes. The story concerned a boy named Cedric Errol, born to an American mother and a British nobleman. Shunned by the snobby English side of the family, Cedric and his mother struggle to make ends meet in New York City after his father’s passing.
READ ENTIRE ARTICLE AT JSTOR DAILY
The Right Worries Minnie Mouse’s Pantsuit Will Destroy Our Social Fabric. It Won’t.
Of mice and men.

The recent announcement that Minnie Mouse has joined Pantsuit Nation, or at least Pantsuit Magic Kingdom, temporarily swapping her red dress for a polka-dot pantsuit designed by Stella McCartney, triggered a mini-meltdown on Fox News. Conservatives tend to be, by definition, change-averse. Minnie’s new clothes — coming on the heels of culture-war skirmishes over the green M&M’s “progressive” sneakers and Hasbro’s dropping of the “Mr.” from Mr. Potato Head — may have been the straw that broke Candace Owens’s brain. The right-wing pundit went on “Jesse Watters Primetime” to slam Disney for making Minnie “more masculine” in an attempt to “destroy fabrics of our society” — an interesting Freudian slip, conflating “fabrics” (that is, textiles) with the social “fabric,” or structure.
Of course, as many pointed out on social media, Minnie has worn pants (and shorts) in the past. And at least she’s fully clothed, unlike some pantsless male Disney characters. (Looking at you, Winnie the Pooh and Donald Duck). But the move still riled the right because the politicization of women’s pants is an American tradition. Critiques of women’s fashion have often served as thinly veiled attacks on women themselves, and wearing pants — in the West, reserved for men from the late Middle Ages until just recently — is a convenient metaphor for appropriating historically masculine privileges, from voting to running for president.
In the 19th century, early suffragists like Susan B. Anthony, Amelia Bloomer and Elizabeth Cady Stanton experimented with wearing voluminous pants — also known as “the freedom dress” — but suffered so much mockery that they ultimately rejected them as unhelpful distractions from their cause. The 19th Amendment actually preceded women’s right to wear trousers in public, which was granted by the U.S. attorney general on May 29, 1923.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
Mr. and Mrs. Talking Machine
The Euphonia, the Phonograph, and the Gendering of Nineteenth Century Mechanical Speech

In the early 1870s, a talking machine contrived by the aptly named Joseph Faber appeared before audiences in the United States. Dubbed the “Euphonia” by its inventor, it did not merely record the spoken word and then reproduce it, but actually synthesized speech mechanically. It featured a fantastically complex pneumatic system in which air was pushed by a bellows through a replica of the human speech apparatus, which included a mouth cavity, tongue, palate, jaw, and cheeks. To control the machine’s articulation, all of these components were hooked up to a keyboard with seventeen keys—sixteen for various phonemes and one to control the Euphonia’s artificial glottis. Interestingly, the machine’s handler had taken one more step in readying it for the stage, affixing a mannequin to its front. Its audiences in the 1870s thus found themselves before a machine disguised to look like a white European woman.
By the end of the decade, however, audiences in the United States and beyond crowded into auditoriums, churches and clubhouses to hear another kind of “talking machine” altogether. In late 1877 Thomas Edison announced his invention of the phonograph, a device capable of capturing the spoken words of subjects and then reproducing them at a later time. The next year the Edison Speaking Phonograph Company sent dozens of exhibitors out from their headquarters in New York to edify and amuse audiences with the new invention. Like Faber before them, the company and its exhibitors anthropomorphized their talking machines, and, while never giving their phonographs hair, clothing or faces, they did forge a remarkably concrete and unanimous understanding of “who” the phonograph was. It was “Mr. Phonograph.”
Why had the Euphonia become female and the phonograph male? In this post, I peel apart some of the entanglements of gender and speech that operated in the Faber Euphonia and the phonograph, paying particular attention to the technological and material circumstances of those entanglements. What I argue is that the materiality of these technologies must itself be taken into account in deciphering the gendered logics brought to bear on the problem of mechanical speech. Put another way, when Faber and Edison mechanically configured their talking machines, they also engineered their uses and their apparent relationships with users. By prescribing the types of relationships the machine would enact with users, they constructed its “ideal” gender in ways that also drew on and reinforced existing assumptions about race and class.
Of course, users could and did adapt talking machines to their own ends. They tinkered with their construction or simply disregarded manufacturers’ prescriptions. Still, the physical design of talking machines, as well as the weight of social sanction, threw up non-negligible obstacles to subversive tinkerers and imaginers.
Born in Freiburg, Germany around 1800, Joseph Faber worked as an astronomer at the Vienna Observatory until an infection damaged his eyesight. Forced to find other work, he settled on the unlikely occupation of “tinkerer” and sometime in the 1820s began his quest for perfected mechanical speech. The work was long and arduous, but by no later than 1843 Faber was exhibiting his talking machine on the continent. In 1844 he left Europe to exhibit it in the United States, but in 1846 headed back across the Atlantic for a run at London’s Egyptian Hall.
That Faber conceived of his invention in gendered terms from the outset is reflected in his name for it—“Euphonia”—a designation meaning “pleasant sounding” and whose Latin suffix conspicuously signals a female identity. Interestingly, however, the inventor had not originally designed the machine to look like a woman but, rather, as an exoticized male “Turk.”
READ ENTIRE ARTICLE AT SOUNDING OUT!
Diners, Dudes, and Diets
How gender and power collide in food media and culture.

It took me six months to dream up the title Diners, Dudes, and Diets (University of North Carolina Press, 2020). For anyone who’s ever watched Guy Fieri’s show Diners, Drive-Ins and Dives, my inspiration is likely pretty clear. As I researched gender and power in contemporary American food media, I spent years analyzing Fieri’s polarizing persona (and watching dozens upon dozens of episodes of his shows), but I’d been thinking about what and why we eat for much longer.
I wrote my undergraduate thesis on the language of the weight loss industry and how it crafts a dieting theology based on “guilt-free” and “sinfully delicious” eating. Although I loved that research and writing, my younger self worried that such bookish activities were not enough to make a difference in the world, so I went to graduate school to study public health nutrition. I was convinced that fresh fruits and vegetables could fix everything, or at least put a dent in “diet-related diseases.” I’m now less interested in how food directly affects our health (though it does) or how it might change the world (though it can). I’m more invested in understanding why we believe this, why we continue to find in food such profound power, and why food remains such an anxious arena within our consumer culture and popular media.
In Diners, Dudes, and Diets, I focus on a particular kind of anxiety about our identities, which marketers call “gender contamination.” Marketing scholar Jill Avery explains this concept as “consumer resistance to brand gender-bending”—that is, how consumers react, sometimes quite negatively, when a brand’s perceived gender changes. In her study, Avery researched men’s reactionary, hyper-masculine responses when the Porsche Cayenne SUV became popular among women drivers, since these men considered Porsche a masculine brand—heck, one of the poster children for male midlife crisis! This idea of gender contamination can work in multiple directions. For example, a woman drinking a brand of whiskey marketed to men can create perceptions of empowerment. But for male consumers, sipping a diet soda marketed to women can lead to a perception of social stigma because the brand is feminized.
READ ENTIRE ARTICLE AT NURSING CLIO
Boys in Dresses: The Tradition
It’s difficult to read the gender of children in many old photos. That’s because coding American children via clothing didn’t begin until the 1920s.

Exploring the biographies of men as disparate as Tsar Nicholas II (b. 1868), Franklin Delano Roosevelt (b. 1882), and Ernest Hemingway (b. 1899), you’re apt to come across pictures of them as young boys looking indistinguishable from young girls. Their hair is long and they’re wearing dresses.
Scholar Jo B. Paoletti examines the changing fashions in children’s wear at the turn of the twentieth century, as a long tradition transitioned to more overtly gender-coded clothing. As she notes,
Until World War I, little boys were dressed in skirts and had long hair. Sexual “color coding” in the form of pink or blue clothing for infants was not common in this country [the US] until the 1920s; before that time male and female infants were dressed in identical white dresses.
Paoletti writes that young children’s clothing became more “sex-typed” as “adult women’s clothing was beginning to look more androgynous.” Before that transition, clothing styles for children followed a predictable progression.
READ ENTIRE ARTICLE AT JSTOR DAILY
Fear of the “Pussification” of America
On Cold War men’s adventure magazines and the antifeminist tradition in American popular culture.

Of all the responses to the COVID-19 pandemic in the United States—ranging from debates over mask wearing to school closings—perhaps the most bizarre is the suggestion that this deadly disease can be avoided simply through manliness.
Nowhere was this made more explicit than when former US Navy SEAL Robert O’Neill shared a photo of himself, unmasked, on a Delta Airlines flight. “I’m not a pussy,” declared O’Neill on Twitter, as if to suggest that potent, masculine men, like those on SEAL Team 6, would not be cowed into wearing cowardly protective gear. (Never mind that a passenger sitting one row behind O’Neill, in a US Marine Corps baseball cap, was wearing his mask.)
O’Neill’s use of the “P-word” was far from an outlier; in fact, it has been employed near and far in recent months. Adam Carolla stoked public outcry only weeks later when he maintained, incorrectly, that only the “old or sick or both” were dying from the virus. “How many of you pussy’s [sic] got played?” the comedian asked.
Nor were these remarks limited to COVID-19. Not to be outdone by such repugnant rhetoric, President Donald Trump—who elevated the word during the 2016 presidential campaign for other reasons—reportedly lambasted senior military leaders, declaring that “my fucking generals are a bunch of pussies.” On the opposite end of the military chain of command, 2nd Lt. Nathan Freihofer, a young celebrity on TikTok, recently gained notoriety for anti-Semitic remarks on the social media platform. “If you get offended,” the young officer proclaimed, “get the fuck out, because it’s a joke…. Don’t be a pussy.”
What should we make of these men, young and old, employing the word as a way to shame potential detractors? Perhaps the most telling, and least surprising, explanation is that sexism and misogyny are alive and well in Trump’s America. Yet it would be mistaken to argue that the epithet has regained popularity simply because the president seemingly is so fond of the word. Rather, such language—and more importantly, what it insinuates—is far from new.
In July, after Alexandria Ocasio-Cortez (D-NY) was verbally accosted on the Capitol steps by fellow representative Ted Yoho (R-FL), the congresswoman delivered a powerful speech on the House floor. The problem with Yoho’s comments, Ocasio-Cortez argued, was not only that they were vile, but that they were part of a larger pattern of behavior toward women. “This is not new, and that is the problem,” she affirmed. “It is cultural. It is a culture of lack of impunity, of accepting of violence and violent language against women, and an entire structure of power that supports that.”
She’s right. This “violent” language—calling women “bitches” and men “pussies”—and the understandings that accompany it have a long history in American popular culture. And few cultural artifacts depict such sexist notions more overtly than Cold War men’s adventure magazines.
READ ENTIRE ARTICLE AT HISTORY NEWS NETWORK
Frances Clayton and the Women Soldiers of the Civil War
Notions of women during the Civil War center on self-sacrificing nurses, romantic spies, or the home front. However, women charged into battle, too.

Popular notions of women during the Civil War center on self-sacrificing nurses, romantic spies, or brave ladies maintaining the home front in the absence of their men. This conventional picture of gender roles does not tell the entire story, however. Men were not the only ones to march off to war. Women bore arms and charged into battle, too.
DeAnne Blanton and Lauren M. Cook, They Fought Like Demons, p. 1.
Minnesotan Frances Louisa Clayton (sometimes spelled Clalin; born ca. 1830) was purported to have disguised herself as a man under the alias Jack Williams in order to enlist and fight in the United States army during the Civil War, at a time when women were barred from service. Some historians question the veracity of accounts of Clayton’s military service. Her story, however, would not have been as rare an occurrence as one might think. In They Fought Like Demons (2002), historians DeAnne Blanton and Lauren M. Cook note that they discovered evidence of some 250 women soldiers who adopted male personas in order to fight in the Civil War. Moreover, Blanton and Cook suspect there are hundreds more women whose stories have gone undocumented: lower literacy rates, as well as the private nature of their soldierly subterfuge, meant these women were less likely than their male counterparts to write letters or diaries detailing their experiences. “Unless women were discovered as such … or unless they publicly confessed or privately told their tale of wartime service, the record of their military career is lost to us today.” As the authors acknowledge, Black women in particular are underrepresented in this history because the mid-nineteenth century’s racist, white-centered mass media largely ignored the biographical stories of Black soldiers serving in the United States Colored Troops. What is certain, however, is that “more women took to the field during [the Civil War] than in any previous military affair [in the United States’ history].”
What we know of Clayton comes from newspaper reports and men’s eyewitness accounts. Interviews with Clayton and with witnesses were featured in many newspapers when her story broke in 1863. One witness’s account lauds her service: “She stood guard, went on picket duty, in rain or storm, and fought on the field with the rest and was considered a good fighting man.” However, only sparse details about Clayton’s military service are documented, as “most reporters found the story of the faithful wife more appealing than the details of Clayton’s life as a soldier.” Reports say she enlisted alongside her husband, John, in a U.S. Missouri regiment in the fall of 1861. She fought in eighteen battles between 1861 and 1863, including the Battle of Fort Donelson in Tennessee (February 11–16, 1862), in which she was wounded. During the Battle of Stones River (December 31, 1862–January 2, 1863), Clayton reported having witnessed her husband’s death “just a few feet in front of her. When the call came to fix bayonets, [she] stepped over his body and charged.” Clayton was discharged in Louisville, Kentucky, in 1863.
READ ENTIRE ARTICLE AT UNIVERSITY OF VIRGINIA LIBRARY
Manly Firmness: It’s Not Just for the 18th Century (Unfortunately)
The history of presidential campaigns shows the extent to which the language of politics remains gendered.

The references to “manly firmness” are everywhere in late-18th-century political sources. For example, Edward Dilly wrote to John Adams from London in 1775 to praise the men in the Continental Congress, “for the Wisdom of their Proceedings — their Unanimity, and Manly firmness.” In the Declaration of Independence, Thomas Jefferson listed the crimes of the King against the North American colonists. He pointed a finger at George III for dissolving representative governments in the colonies because those governments had opposed “with manly firmness” the King’s “invasions on the rights of the people.” After the war, as James Madison and Alexander Hamilton pushed for a more centralized form of government, they used the adjective “manly” in three of the numbers of the Federalist Papers. In Number 14, Madison crafted a version of the American Revolution in which colonists had “not suffered a blind veneration for antiquity, for custom, or for names,” and praised “this manly spirit,” to which “posterity will be indebted.”
None of this is news to anyone who has studied the United States’ founding era over the last four decades. In 1980, Linda Kerber told us on the very first page of the very first chapter of Women of the Republic: Intellect and Ideology in Revolutionary America that the “use of man” by Enlightenment writers “was, in fact, literal, not generic.”1 It’s not that these writers and thinkers completely ignored women and women’s roles, but rather that they did not challenge the roles of women in society in the same ways that they challenged their ideas about taxation, monarchy, male citizenship, and other formerly accepted constructions in the world around them.
In the founding era, both male and female writers acknowledged that women could understand politics, but when these same writers approached the idea of a woman’s formal political participation, they tended to do so in a mocking and dismissive manner. Decades of college students have now studied Abigail Adams’s 1776 plea to her husband that the Continental Congress should “Remember the Ladies” as they created a “new Code of Laws.” These students have also read John Adams’s maddening response, calling his wife “saucy” and proclaiming, “We know better than to repeal our Masculine systems.”
READ ENTIRE ARTICLE AT NURSING CLIO
What Maketh a Man
How queer artist J.C. Leyendecker invented an iconography of twentieth-century American masculinity.

He stands on a staircase behind his paramour. His hand, hovering in the void where tuxedo blends with room, holds a cigarette as though at any moment he might toss it like a pair of cards into the muck. His bowtie is exquisitely knotted, collar stiff and starched, boutonniere like a white heart newly blossomed on his breast, everything tailored to perfection. He has a strong jaw and a dimpled chin. He looks off into the distance, away from you and away from his lover, too, alluringly unattached. His eyes are lowered, melancholic but without a trace of self-pity.
Or: He teeters precariously—one foot on the ground, the other extended in front of him, mid-punt. His hands act as counterweights, balancing him amid the movement. His football pants are baggy at the knees and the hips. His blue shirt’s brown shoulder pads turn his toned if unexceptional shoulders almost Herculean. His eyes remain on the football that he’s just sent sailing through the air. There’s not a hair on his head that’s out of place, despite the movement. His cheeks are rosy—out of red-blooded exertion rather than jejune innocence—yet the whole affair feels unlabored, amusing, fun. The corner of his mouth curls up slightly into an almost grin. He hopes he has just impressed his high school sweetheart, who must be sitting in the stands with her parents. A sea of white crosshatched brushstrokes form a textured abstract backdrop.
Or: He stands upright, in profile, the posture of an educated Jesuit. His military uniform is tan, recently laundered. A satchel hangs off his shoulder, resting at the base of his buttocks. His head is strapped into a Brodie helmet. That brown leather strap is flush to his cheeks and chin, contorting his face uncomfortably. He tries to show no emotion as an older man pins a medal to his breast, but his eyes betray both a sense of pride and a measure of sadness. Yet another honor his fallen brethren will not receive.
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
Slut-Shaming, Eugenics, and Donald Duck
The scandalous history of sex-ed movies.

After excusing herself from the dinner table, the 13-year-old girl begins to shout, her excited voice ringing through her family’s Mid-Century Modern home, “I got it! I got it!!” Her mother, in a Donna Reed-type dress, beams, while her 10-year-old brother looks up quizzically and asks, “Got what?” The boy’s father turns to him and says, brusquely, “She got her period, son!”
I saw this film in a middle-school sex-education class in 1988, and even though I’d read “Are You There, God? It’s Me, Margaret,” the movie seemed embarrassingly old and this scene particularly laughable. How uncool did you have to be to announce the arrival of your period to the whole house? Is it really something you want your dad and brother discussing over potatoes? After all, our school felt girls had to be separated from the boys in our class just to watch this movie.
Today, most American adults can call up some memory of sex ed in their school, whether it was watching corny menstruation movies or seeing their school nurse demonstrate putting a condom on a banana. The movies, in particular, tend to stick in our minds. Screening films at school to teach kids how babies are made has always been a touchy issue, particularly for people who fear such knowledge will steer their children toward sexual behavior. But sex education actually has its roots in moralizing: American sex-ed films emerged from concerns that social morals and the family structure were breaking down.
READ ENTIRE ARTICLE AT COLLECTORS WEEKLY
When Dieting Was Only For Men
Today, we tend to assume dieting is for women, but in the 1860s, it was a masculine pursuit.

It’s New Year’s resolution season, which means it’s time for many of us to try once again to stick to a diet. Today, we tend to assume dieting is mostly for women, but Katharina Vester explains that when Americans first started following diets in the 1860s, it was a masculine pursuit.
The notion of dieting hit the American shores as a British import in the mid-nineteenth century. This was an era of new individualism, when the idealized middle-class man rose through society by virtue of ambition and self-control. Reverend Sylvester Graham, inventor of the graham cracker, was already promoting an austere diet as a way to curb sexual impulses. Now, with more men working in sedentary professional jobs, many worried that their bodies were becoming soft and feminine. Men began looking to lose some weight.
In 1863, a British undertaker named William Banting published a diet book, A Letter on Corpulence, that quickly became popular in the United States. Banting’s Atkins-esque plan called for limiting starchy foods and sweets while consuming plenty of meat and liquor, a diet with a distinctly masculine slant. He also emphasized that the plan was rational and scientifically grounded and did not call for self-denial or sacrifice, which were understood as feminine attitudes in Victorian culture.
READ ENTIRE ARTICLE AT JSTOR DAILY
Feminism in the Dock
Can (and should) conservatives reclaim feminism from the radicals?

I am often asked by young women if we should take back the term “feminism.” It still carries the cachet of caring about women, and most of us are quite happy with the achievements of first-wave feminism: equality before the law, voting rights, and property rights. But feminism in its current form is a radical devolution: divorcing sex from gender, vilifying all masculinity as toxic, and warring against nature and the family. How do we take back a feminism that has become so distorted? And do we want to? It’s worth considering some of the reasons feminism resonated with women, the questions that remain unanswered, and the challenges women will potentially always face.
The debate over taking back feminism is complicated by the fact that no cohesive and consistent definition of feminism exists. It seems the only thing that unites feminists is that they care about issues that are related, and sometimes only tangentially related, to women. Underlying that are strong disagreements about what is good for women and human beings and what encompasses a life well lived. Feminists can’t even agree on what defines “women.” For example, trans-exclusionary feminists rebel against the encroachment of trans women into female sports and spaces, while other feminists do not—at least publicly—voice such objections.
Often, feminism is discussed as coming in three or more waves. First-wave feminists wanted to be treated as equal citizens, a project that culminated in the ratification of the 19th Amendment, guaranteeing women the right to vote. Second-wave feminism of the mid-twentieth century focused on eliminating discrimination in the workplace and expanding educational opportunities. The movement soon allied itself with abortion advocates, connecting it to a Sexual Revolution ethos that saw marriage and family primarily as impediments to women’s personal goals and ambitions. Third and subsequent waves of feminism are even more difficult to describe and delineate, as factions within the movement have grown, leaving no clear breaks from one wave to the next. The element that labeled gender a social construct has taken off and advocated for positions with radical implications, including transgender ideology.
READ ENTIRE ARTICLE AT LAW & LIBERTY
The Huckster Ads of Early “Popular Mechanics”
Weird, revealing, and incredibly fun to read.

When I’m bored and worried about falling into a Twitter hole, there’s one thing that can always divert my attention.
Going over to the Internet Archive and perusing the incredibly weird ads of early-20th-century Popular Mechanics.
What, you haven’t already discovered this yourself?
You’re in for a treat. Popular Mechanics was a curious beast back in those days. Like the name suggests, it included tons of stories about inventors around the world, and stuff they were getting up to. So in the April 1920 issue, for example, you had articles like these …
George Washington Would Have So Worn a Mask
The father of the country was a team player who had no interest in displays of hyper-masculinity.

The genre “What would X do?” – where X stands for a noted figure in history, say Jesus or Dolly Parton – is silly. And yet, as a scholar writing a new biography of George Washington, I can’t help making a bold declaration: The Father of this country would wear his mask in public.
Face masks have become something of a political statement in the U.S. They are seen by some as a line in the sand between “effeminate” Democrats and “masculine” Republicans.
Opponents see them as a symbol of tyranny. More than a few men have a problem with wearing masks. To them, these face coverings are for the weak and sick and communicate insecurity.
READ ENTIRE ARTICLE AT THE CONVERSATION
Rachel Carson’s Critics Called Her a Witch
When Silent Spring was published, the response was overtly gendered. Rachel Carson’s critics depicted her as hysterical, mystical, and witchy.

When Rachel Carson’s Silent Spring came out in 1962, many Americans were horrified to learn about the dangers to humans and other life posed by pesticides. Critics quickly pushed back, and, as environmental historian Maril Hazlett writes, they did so in tremendously gendered terms, depicting Carson and the women who were moved by her messages as over-emotional and irrational.
Carson, a biologist by training, made the case against the pest-control methods used at the time. Her book presented the relationship between humans and nature in a groundbreaking way. Hazlett notes that many readers cited a passage describing how the same man-made chemicals were found in “fish in remote mountain lakes, in earthworms burrowing in soil… in the mother’s milk, and probably in the tissues of the unborn child.” Given such immense complexity, Carson pointed out that contemporary scientific knowledge was far too limited to justify enormous chemical interventions in the natural world.
READ ENTIRE ARTICLE AT JSTOR DAILY
Enjoy My Flames
On heavy metal’s fascination with Roman emperors.

In The Twelve Caesars, the Roman biographer Suetonius wrote of the emperor Nero, “When someone in conversation said, ‘Let the world go up in flames when I die,’ he said, ‘Not when you die, but while I live,’ and he clearly made it so.” Nero’s later successor Julian the Apostate described Caligula as an “evil beast” the gods abhorred. And in 2015 the French death metal band Autokrator released its debut album, which features lyrics that continue Suetonius’ and Julian’s quest to denigrate Roman monarchs infamous for megalomania and moral depravity. As Autokrator’s mastermind Loïc Fontaine put it, “I don’t admire rulers I speak about. Most of them are fucking bastards.”
Roman emperors have enjoyed a prolific reception in metal music around the world—Caligula and Nero most of all, with not only hundreds of individual songs but also entire concept albums dedicated to them, such as the Belgian band Paragon Impure’s 2005 album To Gaius! (For the Delivery of Agrippina) and the Russian band Neron Kaisar’s 2013 album Madness of the Tyrant. The year 2021 saw the release of two separate records about Nero: the UK band Acid Age’s Semper Pessimus and the Canadian band Ex Deo’s The Thirteen Years of Nero. The extent of certain emperors’ popularity can even be quantified, thanks to the online database Encyclopaedia Metallum. Entering each emperor’s name into the advanced search for their appearance in lyrics and song titles, and after eliminating duplicates and false positives (e.g., nero being Italian for “black”), led me to create the following bar graph, which went semi-viral on Twitter in April 2021:
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
Brett Kavanaugh Goes to the Movies
A film scholar reflects on the image of masculinity depicted in “Grease 2,” released the same summer as Kavanaugh’s alleged assault.
I’m a film studies professor, so when I first saw an image of Supreme Court nominee Brett Kavanaugh’s June 1982 calendar, I immediately noticed his movie plans.
In between exams, a beach trip, basketball camp and workouts, Kavanaugh, like millions of other Americans that year, went to the movies. In fact, 1982 was, at the time, Hollywood’s most lucrative year at the box office.
In June alone, Kavanaugh scheduled three trips to the movies: “Rocky III” on June 13, “Grease 2” on June 16 and, on June 26, “Poltergeist.”
READ ENTIRE ARTICLE AT THE CONVERSATION
Identity on a Spectrum
A History of Transgender Health Care
As the stigma of being transgender begins to ease, medicine is starting to catch up.

An estimated 1.4 million Americans, close to 0.6 percent of the population of the United States, identify as transgender. And, today, the topic of transgender health care is more widely discussed than ever before. Despite this, the history of this branch of modern medicine has been lost in the shuffle between conversations about equal access to bathrooms and popular-culture icons, and it should no longer remain so elusive. To embrace the future of this pivotal area of healthcare, it is imperative to understand the piecemeal roots and evolution of transgender medicine.
Magnus Hirschfeld, a German physician who could easily be considered the father of transgender health care, coined the term “transvestite” in 1910 and went on to found the Institute for Sexual Science in Berlin. Defining transvestism as the desire to express one’s gender in opposition to one’s assigned sex, Hirschfeld and his colleagues used this now antiquated label as a gateway to the provision of sex-changing therapies and as a means to protect his patients. Going against the grain, Hirschfeld was one of the first to offer his patients the means to change sex, whether through hormone therapy, sex change operations, or both.
In a time when his contemporaries aimed to “cure” transgender patients of their alleged mental affliction, Hirschfeld’s Adaptation Theory supported those who wanted to live according to the gender they felt most aligned with, as opposed to the gender that their sex obligated them to abide by. Much of the record of the institute’s early work was destroyed in the Nazi book burnings of 1933, but as far as surviving history can prove, Hirschfeld’s institute was the first to offer gender reassignment surgery.
READ ENTIRE ARTICLE AT SCIENTIFIC AMERICAN
Beyond the Binary
The long history of trans.

Sometimes amateurish politicians say the quiet part out loud. So it was in April 2023 with Florida state Representative Webster Barnaby. Speaking in favor of anti-trans legislation, he told his chamber that trans people make him feel “like I’m watching an X-Men movie…. It’s like we have mutants living among us.” Besides dramatically missing the point of the X-Men, Barnaby displayed a common misconception: the idea that trans people are something new.
As the historian Kit Heyam reminds us, “Genders other than male and female have always existed,” even as far back as ancient Sumer. Some things about us really are new: The word “transgender” was likely coined in 1965. Specific regimes of medical treatment, like the estradiol valerate I inject every two weeks, are just decades old. Even newer is the phenomenon in which so many American young people choose “they/them” or other, less familiar nonbinary pronouns. But these new medical and social practices belie the long historical and cross-cultural span of other-than-binary, other-than-cisgender, more-or-less-trans existence. To borrow a phrase from the photographer, writer, and activist Samra Habib, “We have always been here”—or, at least, people somewhat like us have always been here.
That’s the point that resonates throughout Heyam’s fast-moving study Before We Were Trans. In it, Heyam offers proof that across many places and periods, people have lived outside of—or by violating—modern Western gender norms. “Anti-trans campaigners,” Heyam writes, claim that “trans people are new, and that means they’re not real.” Heyam shows that the campaigners are twice wrong: Unless you define us both narrowly and tendentiously, trans people are not new at all. The 18th-century Chevalier d’Eon, who lived part of his or her life as a woman; the British “‘female soldiers’ or ‘female sailors’” whose “recognition and treatment as men was often dependent on passing” as cisgender men; and the Samoan fa’afafine and fa’afatama (traditional third and fourth genders) are not trans in just the same way that I am trans, and yet their lives give evidence for mine. As Heyam writes, “It matters, for people who have been persistently told we have no history, when we find historical figures who feel like us.” Also, Heyam notes, “Many trans histories are inextricable from histories of other experiences”: If we attend to those experiences “from a place of care for people in the past,” we will find precedents for our lives today.
READ ENTIRE ARTICLE AT THE NATION
The Sissies, Hustlers, and Hair Fairies Whose Defiant Lives Paved the Way for Stonewall
In 1966, the queens had finally had enough with years of discriminatory treatment by the San Francisco police.

The queens had finally had enough: In August 1966—fifty years ago this month—transgender and gender-nonconforming customers at Gene Compton’s Cafeteria stood up to years of abusive, discriminatory treatment by the San Francisco police. The all-night restaurant in the city’s impoverished Tenderloin neighborhood was an unwilling haven for queer residents, and after its management called law enforcement to remove a noisy table of diners, patrons frustrated with the constant profiling and police harassment started throwing plates, cups, trays, and silverware at the officers. While police waited for backup, customers tore the cafeteria apart and the riot spread onto nearby Turk and Taylor Streets, damaging a police car and burning a newspaper stand to the ground.
Three years before the Stonewall riots in New York City, which most Americans consider the watershed moment for gay rights, transgender citizens of San Francisco took to the streets to demand better treatment and to hold their harassers accountable. As often happened when marginalized queer people fought back against oppression, their voices were silenced and their existence was criminalized. Although the conflict at Compton’s was mostly ignored by the media, including publications run by the nascent gay community, 1966 would prove a major turning point in the battle for transgender civil rights, a year when cultural shifts aligned to begin improving the trans community’s access to healthcare and its relationship with law enforcement.
In 2005, Susan Stryker shed light on this moment with her excellent documentary, “Screaming Queens: The Riot at Compton’s Cafeteria.” Soon after, Stryker completed a book entitled Transgender History, which placed the rebellion at Compton’s Cafeteria in context among other moments of resistance to state-sanctioned violence and milestones in the march towards transgender acceptance. Several gender-nonconforming residents who spent time in the Tenderloin and experienced abuse firsthand have since spoken out, including two women who spoke with us for this story.
Half a century after the riot at Compton’s, transgender issues have finally become part of our national political conversation, and yet trans history is still overlooked, or conveniently ignored. For many Americans, the current panic over public bathrooms is the first time they’ve really considered the experiences of trans people, even though the debate over gender presentation dates back at least to the mid-19th century.
READ ENTIRE ARTICLE AT COLLECTORS WEEKLY
In the 1940s, a Trans Pioneer Fought California for Legal Recognition. This Is How She Won.
Barbara Ann Richards designed—and then demanded—the life she deserved.

On July 1, 1941, a 29-year-old interior decorator walked into a Los Angeles courthouse and filed a request that vanishingly few law clerks would have processed before: As a part of her gender transition, she wanted to change her name on her legal documents.
The applicant, named Barbara Ann Richards, was a woman, but the state of California still classified her as a man, and her birth certificate listed a traditionally masculine name she no longer claimed as her own. Richards explained to the court that she had undergone, by her own description, a physical metamorphosis. She had “always lived and worked as a man until about two years ago,” at which point, she said, “I realized that some vital physiological change was taking place.”
Without warning, her beard—which she used to shave twice a day—stopped growing. Her voice changed pitch. “I began to observe that my skin had become smoother, that the shape of my face was different, my waist was smaller, my hips heavier, my throat smaller,” she said in her testimony. Accompanying those unexplained physical changes were psychological ones as well: She said she started gravitating more toward cooking and housework. She stopped reading Esquire, she said, and switched over to Ladies’ Home Journal.
The previous October, as World War II loomed, Richards had gone to register for the draft, but the Selective Service System turned her away. The government classified her as “4-F,” a designation for people “unfit” for the military that was commonly applied to homosexuals. Richards told the court that, after the selective service rejection, “I decided that I was, in every essential way, a woman, and I was determined that in justice to myself I should petition the court for feminine status.”
Her lawyer, a newly minted family law attorney named Chester B. Anderson, paraded out experts who offered medical explanations for Richards’ story. An endocrinologist named Marcus Graham, who claimed to have examined Barbara for an American Psychiatric Association conference, speculated that a childhood illness might have been the cause of her so-called metamorphosis. “Such diseases as mumps have been known to cause destruction of important male characteristics,” Graham said. Although rare, “it is possible through an illness to lose the predominating male characteristics.”
How “Gender” Went Rogue
Debating the meaning of gender is hardly new, but the clinical origin of the word may come as a surprise.

Gender is having an anxious moment. Several states, including Texas and Florida, have recently introduced laws to restrict or outlaw medical access to gender-affirming procedures. Many states also exclude transgender women athletes from competing on girls’ and women’s sports teams. Others track female athletes’ menstruation, simultaneously policing gender assignment and female reproductive rights. School boards across the country are banning books that feature (or sometimes merely mention) the LGBTQI+ community. Some male politicians have mocked the fact that gender might not match a person’s sex by claiming that they themselves are now women—to great guffaws of laughter among their audience members. Supreme Court Justice Ketanji Brown Jackson was asked to define what “woman” means in her confirmation process and was scorned when she refused to give a seemingly “common sense” answer.
These developments may seem novel, but debating the meaning of gender is hardly new. Nor is deploying so-called common sense to adjudicate such matters. Gender trouble, as the phrase goes, has a history. The meaning of gender and of categories like “woman” have always been historically contingent—and saturated with politics too. To believe otherwise is to ignore the past. Historically, the rights, roles, and privileges of womanhood have been attached to a particular type of woman—white, middle- to upper-class, heterosexual, and so on. Just ask Black suffragists if all women got the right to vote in 1920. The history of the word gender to refer to social sex is admittedly more recent. Perhaps counterintuitively, it was not the feminist movement that made the word. The term was introduced around 1954 in a very specific context: the Johns Hopkins Pediatric Endocrinology Clinic in Baltimore, where it was used as part of the treatment of children born with intersex traits (reproductive or sexual anatomy that does not quite fit our categories of male and female). Intersexuality (or hermaphroditism, an outdated term used at the time) was a problem in search of a solution—and that solution was the idea of gender. The term was meant to denote a sober “scientific” solution rather than being, say, the product of superstition or prejudice or politics. It was seen as a solution that would need to be routinized, and that would then perforce enter the domain of medical expertise.
The clinic “made” gender, in short. But here we need to backtrack. In the late 19th century, the gonads were understood to determine masculinity and femininity. Physicians claimed that they could easily tell whether someone was female or male based on the appearance of their gonads. Anyone who possessed ovaries was thought to be a woman, no matter their more general appearance or character, and anyone with testes was a man, regardless of how they looked, felt, or behaved. By the 1910s, however, the idea of gonads as agents of sex difference gave way to the concept of sex hormones as chemical messengers of masculinity and femininity. Making the right decision in assigning sex became more fraught. Practitioners sometimes felt compelled to recommend letting certain patients remain in the sex that they were living in, even if it contradicted their gonadal sex. In the following decades, psychiatrists and psychoanalysts published cases in which a person’s psychological sex did not match their sexual anatomy.
READ ENTIRE ARTICLE AT LOS ANGELES REVIEW OF BOOKS
Who’s Afraid of Social Contagion?
Our ideas about sexuality and gender have changed before, and now they’re changing again.

Earlier this year Gallup published some incredible statistics, showing that gen Z is our queerest generation yet, with nearly 20 percent identifying somewhere under the broad LGBTQIA+ umbrella. (Some two-thirds of this group—13 percent of gen Z—identify as bisexual; about 2 percent of zoomers identify as trans.) When I was first coming out in the early 1990s, by contrast, “1 in 10” was a common gay slogan. How did we get here, with such wide differences in identification between generations? Are there actually more queer people now, or just more out queer people? Or are those the wrong questions to ask?
Conservatives have been pushing two related theories to explain this uptick. First, there’s the “social contagion” theory, which holds that in a world drowning in representations of heterosexuality and cisgenderness, meeting a single trans person, reading a book with a bisexual character in it, or encountering nonbinary pronouns on TikTok can totally destabilize the identity of an otherwise “normal” child. It’s amazing how fragile heterosexuality and cisness are in this formulation—almost like they’re socially manufactured identities, backed by huge amounts of ideological infrastructure, peer pressure, media recruitment, and social policing. Well, I guess conservatives aren’t wrong about everything.
Another theory, sometimes offered in tandem with the contagion idea and sometimes in slight opposition to it, is the “snowflake” theory: the idea that zoomers are confused, or pretending, or signaling solidarity, or just want attention, and thus their “identities”—pansexual, ace (as in asexual), genderflux, enby, and so on—aren’t even “real.” In part, this is just another version of the contagion fear. But there’s something else going on, something a little more interesting, which—in a roundabout way—can help to explain why I think we are asking some of the wrong questions about this uptick in queer identification. This particular queerphobic explanation adds additional requirements to clear the bar of “queerness”: it has to last this long, or you have to be attracted to this many people of the same sex, or you have to have felt this way from birth.
In other words, this is an argument about what it means to be queer: what factors matter in terms of defining sexuality and gender, and who gets to decide. In some ways, this is a discussion humanity is always having, and these ideas are constantly shifting over time.
Take the concept of being “transgender.” In 1996 groundbreaking trans author and activist Leslie Feinberg wrote Transgender Warriors, one of the first mainstream books written by a trans person looking at trans history. The book’s original subtitle was “Making History From Joan of Arc to RuPaul.” Today there’s pretty firm agreement that RuPaul Charles, the star and creator of RuPaul’s Drag Race, is not transgender. So did Feinberg make a terrible mistake? No, obviously not. What it means to be transgender has shifted over the last twenty-five years, from an umbrella term uniting people who exhibit gender-crossing or gender-confounding behaviors to an umbrella term uniting people who have gender-crossing or gender-confounding identities.
READ ENTIRE ARTICLE AT BOSTON REVIEW
The Many Lives of Pauli Murray
She was an architect of the civil-rights struggle—and the women’s movement. Why haven’t you heard of her?

The wager was ten dollars. It was 1944, and the law students of Howard University were discussing how best to bring an end to Jim Crow. In the half century since Plessy v. Ferguson, lawyers had been chipping away at segregation by questioning the “equal” part of the “separate but equal” doctrine—arguing that, say, a specific black school was not truly equivalent to its white counterpart. Fed up with the limited and incremental results, one student in the class proposed a radical alternative: why not challenge the “separate” part instead?
That student’s name was Pauli Murray. Her law-school peers were accustomed to being startled by her—she was the only woman among them and first in the class—but that day they laughed out loud. Her idea was both impractical and reckless, they told her; any challenge to Plessy would result in the Supreme Court affirming it instead. Undeterred, Murray told them they were wrong. Then, with the whole class as her witness, she made a bet with her professor, a man named Spottswood Robinson: ten bucks said Plessy would be overturned within twenty-five years.
Murray was right. Plessy was overturned in a decade—and, when it was, Robinson owed her a lot more than ten dollars. In her final law-school paper, Murray had formalized the idea she’d hatched in class that day, arguing that segregation violated the Thirteenth and Fourteenth Amendments of the United States Constitution. Some years later, when Robinson joined with Thurgood Marshall and others to try to end Jim Crow, he remembered Murray’s paper, fished it out of his files, and presented it to his colleagues—the team that, in 1954, successfully argued Brown v. Board of Education.
READ ENTIRE ARTICLE AT THE NEW YORKER
May We All Be So Brave as 19th-Century Female Husbands
Far from being a recent or 21st-century phenomenon, people have chosen, courageously, to trans gender throughout history.

One summer night in 1836, police found George Wilson drunk on the street in the Lower East Side in New York City. An officer took Wilson to the station. The officer believed that Wilson was a sailor, and also suspected that Wilson might not have been a man. Wilson had been legally married to a woman for 15 years, and living and working as a man for even longer. They told the police that their masculine gender expression was a temporary disguise for safety and ease of travel while they pursued the man they loved who had abandoned them.
The best defence against a hostile police force was to emphasise heterosexual romance and minimise the significance of gender nonconformity in one’s life. The truth came to light, however, when Wilson’s wife stormed through the police station to retrieve her husband. In an interview, Elisabeth disclosed that 15 years earlier she was not at all disappointed when she learned of her husband’s sex, and that they were happily married. Like the policemen who detained and harassed George and Elisabeth, the journalists who would later report on the incident were derisive. But George and Elisabeth were released without formal charges.
Female husbands were people assigned female at birth who ‘transed’ gender, lived as men, and entered into legal marriages with women. The phrase ‘female husband’ was first used to describe such a person in 1746 by the British playwright and novelist Henry Fielding. It circulated for nearly 200 years before losing meaning in the early years of the 20th century. It was never a self-declared identity category. No one was known to walk up to someone and say: ‘Hello, my name is George Wilson and I’m a female husband.’ Rather, it was a term used by others – usually male writers, policemen, judges and doctors – in reference to people whose gender expression was different from their assigned sex. Far from being a recent or 21st-century phenomenon, people have chosen to trans gender throughout history. ‘Female husband’ was a label predominantly used to refer to white working-class people.
In 1856, Miss Lewis of Syracuse in New York state fell in love with Albert Guelph, a charming newcomer. After a brief courtship, they wed in an Episcopal church the same year. The bride’s father soon became suspicious of Guelph and called the police. Together, the policeman and the father interrogated and examined Guelph on the suspicion that Guelph was a woman disguised as a man. They arrested and imprisoned Guelph. Justice Durnford sentenced Guelph to 90 days imprisonment in the penitentiary for violating the vagrancy statute – a very vague ‘catch-all’ crime applied mostly to impoverished people for being poor, homeless, begging, drinking or simply existing in public spaces. Vagrancy laws were also invoked for minor social infractions against morals or order.
The Syracuse Daily Standard took great interest in the case and provided regular updates. When the judge asked Guelph directly: ‘Are you a male or female?’ Guelph refused to answer, instead deflecting the question back to the judge, stating ‘your officers can tell you’ or ‘have told you’. Neither Guelph nor their lawyer made any attempt to explain or justify the status of Guelph’s assigned sex or gender expression. Instead, the lawyer noted that there was no New York state law prohibiting ‘a person to dress in the attire of the opposite sex’. This was true. Guelph was soon released.
It was typical in such cases for people like Guelph to offer an explanation or excuse as to why they were presenting as male. As long as the accused spun a convincing tale, assured authorities that they were not threatening, and begged for forgiveness, they might be let go without further punishment or harassment. Those who worked as soldiers and fought in a war were the most sympathetic of such cases, as patriotism was deemed their core motivation. Others who were poor or alone and explained that presenting as male offered them safety while travelling and/or a higher wage than they could earn as women were also treated with a degree of compassion and understanding – provided that they were willing to change their clothes and resume moving through society as women. Guelph was different: they assumed male attire because they wanted to and because they could. They refused to offer any kind of explanation or justification – sympathetic or otherwise.
What is Trans History?
From activist and academic roots, a field takes shape.

While in graduate school at the University of California, Berkeley, history department in the 1980s, Susan Stryker wrote a dissertation on the development of Mormon identity and community. Just as she was finishing up her PhD, she transitioned. “Let’s just say that the employment prospects in the historical profession as an out transsexual person doing early 19th-century religious and cultural history were zero,” she says dryly.
By necessity, she turned to writing about transgender history: “I did it out of formal academic training, and I did it strategically and tactically out of conditions of employability as an out trans person a quarter century ago.” What followed was a 17-year career publishing articles in academic journals, producing exhibitions and public history programs, serving as executive director of the GLBT Historical Society in San Francisco, and making an Emmy Award–winning documentary film—Screaming Queens—about the 1966 transgender riot at Compton’s Cafeteria in San Francisco. In 2009, Stryker joined Indiana University Bloomington as a tenured professor in the Department of Gender Studies, and from that year till 2013 she sat on the LGBTQ Historians Task Force of the AHA. Stryker had effectively helped establish an academic field she would finally get hired into.
Today, transgender studies and transgender history are legible fields of academic study: there are two volumes of The Transgender Studies Reader, a journal (TSQ: Trans Studies Quarterly), a number of new or forthcoming books from major university presses, and a spate of recent job ads seeking people with research and teaching interests in trans studies or trans history.
Unfortunately, the advances in academia paralleling the increased visibility of trans people in the public sphere have been accompanied by political efforts to regulate the lives of trans people. This includes the ability to access public spaces and basic necessities such as healthcare and economic security. More than a dozen states considered “bathroom bills” in 2017, and violence against trans people, especially trans women of color, continues to rise, according to the Human Rights Campaign and the Trans People of Color Coalition. With so much at stake and an audience that’s finally paying attention, trans history is a field filled with a sense of urgency and potential.
Prior to the emergence of trans history in the academy, much of the writing on the lived experiences of trans people was written either by medical professionals and psychologists, or by trans people themselves as autobiographies. Many trans people encountered themselves as historical subjects through such popular works as Leslie Feinberg’s Transgender Warriors: Making History from Joan of Arc to Dennis Rodman (1996)—“I couldn’t find myself in history,” Feinberg wrote. “No one like me seemed to have ever existed.”
In terms of scholarship, most academics cite Yale historian Joanne Meyerowitz’s How Sex Changed: A History of Transsexuality in the United States (2002) as foundational. “When that book came out, it went a long way toward creating a field,” says Elizabeth Reis (Macaulay Honors Coll., CUNY), author of Bodies in Doubt: An American History of Intersex (2009). It was important, says Reis, “to have a history written where then other historians could start looking up the footnotes and seeing where to even go to find material.” In another seminal moment, Stryker co-edited The Transgender Studies Reader (2006, with a second volume in 2013) and soon published Transgender History, an accessible book for the general reader on the history of transgender people in the United States.
According to Stryker, there are two ways scholars today approach trans history. In the first, historians analyze people in the past as trans, whether or not they used the label for themselves. In Transgender History, Stryker uses transgender to “refer to people who move away from the gender they were assigned at birth.” Even though the term only emerged in the mid-20th century, many scholars find this definition useful and methodologically liberating. Emily Skidmore (Texas Tech Univ.), author of the recently published True Sex: The Lives of Trans Men at the Turn of the Twentieth Century (2017), says, “Even though the term transgender is modern, people have moved from one gender to another for a very long time. And transgender history looks at that movement.”
READ ENTIRE ARTICLE AT THE AMERICAN HISTORICAL ASSOCIATION
All Americans Have the Right to Dress Exactly How They Want
The key to a successful legal campaign is realizing that gender freedom benefits everyone.

Transgender Americans are under attack. Across the country, Republicans have introduced an avalanche of legislation to restrict access to gender-affirming health care, censor how gender is discussed in schools, prevent trans people from using public bathrooms and even ban drag shows and cross-dressing onstage. In March, Tennessee criminalized drag performances where children are present. In April, Montana Republicans barred Representative Zooey Zephyr from the floor of the state House in part for her vocal opposition to a similar bill, which is now headed to the governor’s desk.
Attacks on gender nonconformity — and cross-dressing in particular — have a long history in America. Anti-drag laws similar to the one passed in Tennessee and even more restrictive cross-dressing bans were part of municipal criminal codes for most of the 20th century. But just as the laws aren’t new, neither is the fight against them. Over the course of the 1960s and 1970s, gender nonconforming activists argued that sartorial censorship harms anyone who deviates from rigid gender norms. These activists won in court. Looking back on their victories can inspire trans people and their allies today, not just by highlighting effective legal strategies but also by reminding us that state-mandated gender conformity is an affront to everyone’s right of self-expression.
Legal attacks on gender expression like those being passed today in Florida, Iowa, Montana and elsewhere have disturbing similarities to those that were on the books throughout most of the 20th century: Then, cities across the country criminalized appearing in public “in a dress not belonging to his or her sex.” Others prohibited “female impersonators” or “masquerade.” These laws were routinely used to harass and discredit anyone who transgressed gender norms, including feminists who wore men’s clothes to protest gender inequality, sex workers signaling that they were available to be engaged, drag performers, cross-dressers and people who today might identify as transgender. Arrests could have major consequences. Many people arrested under these ordinances lost their jobs and families.
READ ENTIRE ARTICLE AT THE NEW YORK TIMES
Who Lost the Sex Wars?
Fissures in the feminist movement should not be buried as signs of failure but worked through as opportunities for insight.

In an illuminating retelling of this period of American feminist history, “Why We Lost the Sex Wars: Sexual Freedom in the #MeToo Era,” the political theorist Lorna N. Bracewell challenges the standard narrative of the so-called sex wars as a “catfight,” a “wholly internecine squabble among women.” For Bracewell, that story omits the crucial role of a third interest group, liberals, who, she argues, ultimately domesticated the impulses of both antiporn and pro-porn feminists. Under the influence of liberal legal scholars such as Elena Kagan and Cass Sunstein, antiporn feminism gave up on its dream of transforming relations between women and men in favor of using criminal law to target narrow categories of porn. “Sex radical” defenders of porn became, according to Bracewell, milquetoast “sex positive” civil libertarians who are more concerned today with defending men’s due-process rights than with cultivating sexual countercultures. Both antiporn and pro-sex feminism, she argues, lost their radical, utopian edge.
This sort of plague-on-both-their-houses diagnosis has gained currency. In a 2019 piece on Andrea Dworkin, Moira Donegan wrote that “sex positivity became as strident and incurious in its promotion of all aspects of sexual culture as the anti-porn feminists were in their condemnation of sexual practices under patriarchy.” Yet the inimitable Maggie Nelson, in her new book, “On Freedom: Four Songs of Care and Constraint,” sees a “straw man” in such dismissive depictions of sex positivity. She says that skeptics forget its crucial historical backdrop—the feminist and queer AIDS activism of the eighties and nineties. For such activists, Nelson writes, sex positivity was a way of “insisting, in the face of viciously bigoted moralists who didn’t care if you lived or died (many preferred that you died), that you have every right to your life force and sexual expression, even when the culture was telling you that your desire was a death warrant.”
Both Bracewell and Nelson raise an important question about how disagreements within feminism are seen. Where the famous rifts within the male-dominated left—between, say, E. P. Thompson and Stuart Hall over Louis Althusser’s structuralism—are regarded as instructive mappings of intellectual possibility, as debates to be “worked through,” feminists tend to picture the great “wars” of their movement’s past as warnings or sources of shame. This is not to deny that feminist debate can have a particular emotional resonance. Sheila Rowbotham, though not averse to relitigating old arguments (especially with Selma James, a founder of the Wages for Housework campaign), admits that “connecting the personal with the political” could pose a particular problem for the movement: “when ruptures appeared these proved all the more painful.” She explains, “Theoretically I did not hold with the notion that because we were women we would wipe away political conflicts, but emotionally, like many other feminists, I was attached to a vision of us birthing a new politics of harmony.”
READ ENTIRE ARTICLE AT THE NEW YORKER
Pop Music Has Always Been Queer
Sasha Geffen’s debut book reveals that the history of pop music is a history of gender rebellion.

Sasha Geffen’s first encounter with a trans person was listening to Wendy Carlos, the A Clockwork Orange composer who helped develop the synthesizer as a musical instrument. The first time Geffen (who uses “they/them” pronouns) remembers hearing the word “androgyny” was in reference to Annie Lennox, and they were, like the other gay kids in high school, a big fan of the punk band Against Me!. Transition (or in Geffen’s words, “figuring shit out”) became possible through music. Glitter Up the Dark is not just a chronicle of the transgressive possibilities of pop music but also a history of Geffen’s listening and a demand that we regard pop culture in explicitly political terms.
As they trace how music acts as a vessel for gender transgression, Geffen connects Gertrude “Ma” Rainey, the early blues singer who they say “set the stage for pop music’s tendency to incubate androgyny, queerness, and other taboos in plain view of powers that would seek to snuff them out” to a long history of musical expression—from Kurt Cobain and Courtney Love to Brooklyn rapper Young M.A. and Björk collaborator serpentwithfeet. As in much queer writing, origin is precious material, and the past informs Geffen’s reading of contemporary mainstream cultural production as well as today’s conversation surrounding gender identity. (“The history of American music is the history of black music,” they write, “and since the gender binary is inextricably tied to whiteness, pop music’s story necessarily begins slightly outside its parameters.”)
The book is also a collage of voices that quietly unsettle the status quo, nimbly linking artists one would expect to see in this sort of anthology (Prince, David Bowie, Missy Elliott, Grace Jones) and others who are more surprising (the Beatles, DJ Kool Herc, Klaus Nomi). If Geffen’s thesis is that “music shelters gender rebellion from those who seek to abolish it,” then the book operates in a similar way, circulating clues about Geffen’s realizations about gender alongside each artist they profile.
READ ENTIRE ARTICLE AT THE NATION
LGB and/or T History
“Transgender” has gone from an umbrella term for different behaviors, to an umbrella term for different identities.

Any line we might try to draw to separate lesbian/gay/bisexual history (LGB) from transgender history (trans) would be imprecise, blurry, and certainly not straight. Because of the incomplete archives we have, attempts to separate histories of sexuality and gender transgression often resort to pure guesswork. Worse, just trying to find that line forces us to consider all of history through two modern categories – gender and sexuality – making it harder to understand the very thing we are researching.
Even in just the last few decades, the meaning of “transgender” has shifted radically. To take one example: in 1996, Leslie Feinberg’s Transgender Warriors (one of the first popular histories of transgender people) was published with the subtitle “From Joan of Arc to RuPaul.” But today, RuPaul is considered a cisgender drag queen, not a transgender person. This shows how the conversation around transgender history has moved away from focusing on practices that cross gender norms (like drag), and towards focusing on people who self-identify as transgender.
While this is a triumph for self-identification, which allows people to express how they want to be understood regardless of how they might look to an outside viewer, it is not a stable ground from which to conduct historical research. Identity can be invisible, or unstated, and it shifts over time, making it impossible to definitively state the self-identification of people in the past. The importance of self-identification over behavior is, itself, a recent phenomenon worth investigating.
Therefore, we do not see “LGB” and “T” histories as discrete categories; rather, they are a collection of shared roots and overlapping fields of investigation. Many figures have a place in both histories, regardless of what terms they used to describe themselves. How can that be?
READ ENTIRE ARTICLE AT DIGITAL TRANSGENDER ARCHIVES
The Forgotten Trans History of the Wild West
Despite a seeming absence from the historical record, people who did not conform to traditional gender norms were a part of daily life in the Old West.

From 1900 to 1922, Harry Allen was one of the most notorious men in the Pacific Northwest. The West was still wide and wild then, a place where people went to find their fortunes, escape the law, or start a new life. Allen did all three. Starting in the 1890s, he became known as a rabble-rouser, in and out of jail for theft, vagrancy, bootlegging, or worse. Whatever the crime, Allen always seemed to be a suspect because he refused to wear women’s clothes, and instead dressed as a cowboy, kept his hair trim, and spoke in a baritone. Allen, who was assigned female at birth, was actually far from the only trans* man who took refuge on the frontier.
Despite a seeming absence from the historical record, people who did not conform to traditional gender norms were a part of daily life in the Old West, according to Peter Boag, a historian at Washington State University and the author of Re-Dressing America’s Frontier Past. While researching a book about the gay history of Portland, Boag stumbled upon hundreds and hundreds of stories concerning people who dressed against their assigned gender, he says. He was shocked at the size of this population, which he’d never before encountered in his time as a queer historian of the American West. Trans people have always existed all over the world. So how had they escaped notice in the annals of the Old West?
Boag expanded his research beyond the Northwest, but limited it to towns west of the Mississippi, and the period of time from the California Gold Rush through statehood for all the Western continental territories. It wasn’t that this time and place was more open or accepting of trans people, but that it was more diffuse and unruly, which may have enabled more people to live according to their true identities, Boag says. “My theory is that people who were transgender in the East could read these stories that gave a kind of validation to their lives,” he says. “They saw the West as a place where they could live and get jobs and carry on a life that they couldn’t have in the more congested East.” Consider Joseph Lobdell, born and assigned female in Albany, New York. When he surfaced in Meeker County, Minnesota, he became known as “The Slayer of Hundreds of Bears and Wild-Cats.”
READ ENTIRE ARTICLE AT ATLAS OBSCURA
The Life of Pauli Murray: An Interview with Rosalind Rosenberg
The author of a new biography explains how Murray changed the way that discrimination is understood today.

Alyssa Collins: Tell us a bit about how you came to this project. What are some of the factors that motivated you to write a biography of Pauli Murray?
Rosalind Rosenberg: I had never heard of Pauli Murray, much less considered writing her biography, before I met Ruth Bader Ginsburg in 1974. Ginsburg was then the first tenured female faculty member at Columbia Law School; I had just taken a job in the Columbia history department. In addition to teaching law, Ginsburg worked on women’s rights cases at the ACLU. By the time I met her, she had persuaded the Supreme Court that the 14th Amendment to the Constitution, passed after the Civil War to protect the rights of African Americans, should also be understood to guarantee the rights of women. As a young feminist, I was impressed by Ginsburg’s success. As a historian, I was curious to know what had inspired her to argue that gender discrimination was analogous to race discrimination. The answer was Pauli Murray.
Murray became a staple of my courses in women’s and legal history, and I included her as a central figure in my book, Divided Lives: American Women in the Twentieth Century (1992), but I did not consider writing her biography until the mid-1990s when her papers were opened to researchers at the Schlesinger Library, Harvard University. Historians often lament the dearth of archival material on Black lives, so I could hardly contain my excitement when I discovered that Murray had left more than 135 boxes of private papers and photographs, mostly her own but also of family members going back to the Civil War.
Collins: Tell us more about the process of conducting research for this book. What were your sources, methods, and methodologies? What did you find most interesting or surprising during the research process?
Rosenberg: When my career began in the 1970s, young historians avoided writing biographies. We were looking for social patterns and structures. In this search, we were reacting against the “great man” theory of history – the idea that presidents and generals moved history. We were more interested in social movements than in individuals.
One other thing delayed my undertaking Murray’s biography. Murray wrote two memoirs, one, Proud Shoes, the history of her maternal ancestors going back to the early 19th century, and a second, Song in a Weary Throat, an autobiography. Murray seemed to have said everything worth saying in these books. But as I made my way through her personal papers, I realized that she had left a great deal out of those memoirs: most importantly her struggles over gender identity. Here was a person whose life, written with full attention to historical context, could illuminate the intersection of race and gender, as well as the challenges faced by someone growing up a century ago who felt more male than female. These were years when the term transgender did not exist and there was no social movement to support or help make sense of the trans experience. Murray’s papers helped me to understand how her struggle with gender identity shaped her life as a civil rights pioneer, legal scholar, and feminist.
READ ENTIRE STORY AT BLACK PERSPECTIVES
Transgender Men Who Lived a Century Ago Prove Gender Has Always Been Fluid
In her new book, ‘True Sex,’ historian Emily Skidmore looks at their lives and how society has treated them

In 1914, Ralph Kerwineo, a self-assigned man from Milwaukee, had a dalliance with a woman who was not his wife, prompting his actual wife to report to the authorities that her husband wasn’t biologically a man at all. Kerwineo was arrested for disorderly conduct, but later freed. He was told by the judge he ought to dress as a woman while in Milwaukee if he wanted to stay out of trouble.
The case of Kerwineo, born Cora Anderson, captured the nation’s attention, but in her new book, True Sex: The Lives of Trans Men at the Turn of the Twentieth Century, historian and Texas Tech professor Emily Skidmore identifies a surprisingly wide range of responses Americans had to the scandal. As Skidmore says in describing the case to the hosts of the Backstory history podcast, the national response was “automatically Ralph Kerwineo is a deviant, is someone who is pathological, and how terrible he took advantage of this poor woman.” (He was referred to as the “girl-man” in some articles.) “But what’s fascinating,” Skidmore continues, “is that what the Milwaukee papers really do is they interview Ralph Kerwineo’s former bosses and they’re trying to understand what his life as a man was like. And if his life as a man was respectable, then his foray into masculinity was understood as something that was, you know, kind of OK.”
To listen to today’s politicians—prudish and prurient alike—one might think that gender categories were once perfectly stable and that transgender individuals are somehow an invention of the late modern age. But history shows that as a social category, gender has always been constructed, subject to debate, and to one degree or another, fluid. In True Sex, Skidmore explores the varied histories of American trans men long before that designation even existed. Reviewing newspapers and the literature of the field then known as “sexology,” as well as census data, court records, and trial transcripts, Skidmore weaves a tale of American gender that’s far more complex than many might think, one that reveals that it has never been a fixed reality.
READ ENTIRE ARTICLE AT TIMELINE
The Rightness of the Singular ‘They’
Merriam-Webster added a new definition to the word “they”: “used to refer to a single person whose gender identity is nonbinary.”

The venerable Merriam-Webster dictionary has declared the word “they” its 2019 word of the year. The singular they and its many supporters have won and it’s here to stay. Despite what many language skeptics think, the use of they as a gender-neutral singular pronoun is no mistake; it goes back to the 14th century.
For decades, transgender rights advocates have noted that literary giants Emily Dickinson, William Shakespeare, William Wordsworth, and Geoffrey Chaucer all used singular they in their writing. In a letter dated Sept. 24, 1881, Dickinson wrote: “Almost anyone under the circumstances would have doubted if [the letter] were theirs, or indeed if they were themself — but to us it was clear.” In “Hamlet,” Shakespeare used “them” in reference to the word mother: “‘Tis meet that some more audience than a mother — Since nature makes them partial — should o’erhear the speech.”
Even the most strident grammarians pause when faced with evidence that singular they has long been a tool of the trade. Activists have also emphasized that singular they is used all the time in speaking and writing when we don’t know or don’t want to specify the gender of the subject — as in recent commentary about the whistleblower in the Trump impeachment inquiry.
The current usage of the singular they, however, has expanded beyond the historical precedent. A few months before declaring they the word of the year, Merriam-Webster added a new definition to the word: “used to refer to a single person whose gender identity is nonbinary.” Some might view this change as happening too quickly, but it has been a long time coming.
READ ENTIRE ARTICLE AT LOS ANGELES REVIEW OF BOOKS
What Quakers Can Teach Us about the Politics of Pronouns
In the 17th century, they also suspected that the rules of grammar stood between them and a society of equals.

Pronouns are the most political parts of speech. In English, defaulting to the feminine “she/her” when referring to a person of unspecified gender, instead of the masculine “he/him,” has long been a way of thumbing one’s nose at the patriarchy. (“When a politician votes, she must consider the public mood.”)
More recently, trans, nonbinary and genderqueer activists have promoted the use of gender-inclusive pronouns such as the singular “they/their” and “ze/zir” (instead of “he/him” or “she/her”). The logic here is no less political: If individuals — not grammarians or society at large — have the right to determine their own gender, shouldn’t they get to choose their own pronouns, too?
As with everything political, the use of gender-inclusive pronouns has been subject to controversy. One side argues that not to respect an individual’s choice of pronoun can threaten a vulnerable person’s basic equality. The other side dismisses this position as an excess of sensitivity, even a demand for Orwellian “newspeak.”
Both sides have dug in. To move the conversation forward, I suggest we look backward for an illuminating, if unexpected, perspective on the politics of pronouns. Consider the 17th-century Quakers, who also suspected that the rules of grammar stood between them and a society of equals.
READ ENTIRE ARTICLE AT THE NEW YORK TIMES
The Untold Story of Queer Foster Families
In the 1970s, social workers in several states placed queer teenagers with queer foster parents, in discreet acts of quiet radicalism.

When Don Ward was a child, in Seattle, in the nineteen-sixties, his mother, each December, would hand him the Sears catalogue and ask him to pick Christmas gifts. By the time his parents filed for divorce, the catalogue had become a refuge, for Ward, from their shouting matches. Eventually, instead of looking at toys, he began turning to the men’s underwear section, fascinated by the bodies for reasons he didn’t really understand. Soon, he started noticing the tirades that his father occasionally launched into about gay people. “I think all those queers ought to be lined up and shot,” Ward remembers his father saying.
“It was a bit of a lonely childhood,” Ward told me. After the divorce went through, he saw his parents under the same roof only twice. The first time was in court, when they fought over custody of Ward and his two brothers. (Ward’s mother won.) The second time was at a youth services facility, after a close acquaintance outed Ward to his parents, and, unable to tolerate a gay son, they mutually signed him over to the state of Washington. Ward was barely fifteen.
It was 1971, and the state of Washington didn’t know how to handle an openly gay teen-ager, either. The Department of Social and Health Services tried sending Ward to an all-male group home run by Pentecostals who were committed to exorcising the “demon of homosexuality” out of him. Ward didn’t get along with his roommates. The state placed him with a religious couple, who gave him a basement room that had only three walls; the lack of privacy, the parents said, would help keep his homosexuality in check. Ward left that home six months later, after a fight with his foster mom about chores. Then he was placed with a childless married couple, who seemed perfect, and who accepted his sexuality. Within a few months, they began to physically abuse him, Ward told me.
At Christmas, Ward would call his father. Every time, after recognizing his son’s voice, Ward’s father would hang up. Ward spoke with his mother from time to time, and he began visiting the Seattle Counseling Service, which had been established to “assist young homosexuals in meeting their personal, medical and social problems.” There, Ward met Randy, a volunteer counsellor with a distaste for gender conventions—Ward remembered him pairing red lipstick with combat boots. (Randy is a pseudonym, to protect his identity, as he never came out to his family.) Randy had a close friend, Robert, a more strait-laced gay man who was in his twenties and a reverend. Robert, who asked me not to include his last name, had been on good terms with many local church leaders until the spring of 1972, when he came out. “The situation is enough to gag a maggot,” a member of one church group then told a local newspaper. Robert moved to the Metropolitan Community Church, a network of gay-friendly churches that was founded in 1968, and he became prominent in Seattle’s gay-rights movement.
READ ENTIRE ARTICLE AT THE NEW YORKER
The 19th Century Lesbian Made for 21st Century Consumption
Jeanna Kadlec considers Anne Lister, the central figure of HBO’s Gentleman Jack, and the influence of other preceding queer women.

When we call Anne Lister, the 19th century British diarist and adventurer reimagined in HBO’s hit series Gentleman Jack, the “first modern lesbian,” what do we mean, precisely? Critics don’t seem to know. The catchy tagline coined by Lister’s devotees and perpetuated by the show’s marketing is good branding, but makes for a slightly confusing moniker: what is it, exactly, that makes Anne Lister a “modern” lesbian, let alone the first?
The answer goes beyond a casual Wikipedia-esque list of Lister’s propensities and accomplishments that most coverage of the show has thus far relied on. To understand what makes Anne Lister unique, you have to understand how lesbianism and identity were understood in the 1830s — and it’s far too simplistic to say that women with women was simply “unimaginable” for the time, that Lister was completely solitary in her pursuit of as public a commitment as would have been socially acceptable.
Lesbian content was not unfamiliar to 17th, 18th, and 19th century audiences. From lesbian eroticism in pornographic texts such as the pseudonymous Abbé du Prat’s The Venus in the Cloister: or, the Nun in Her Smock, published in 1683, to the trope of a “Female Husband” (which had historical grounding in famous figures like Mary Hamilton) to the romantic friendship of the Ladies of Llangollen, who were contemporaries of Lister’s, the idea of women loving (and fucking) women was hardly new, if deeply socially unacceptable. Among women of the upper class with means, Lister was hardly alone in forging her own kind of life. The “first”? No.
Lister was ahead of her time, but not in the obvious way: not because of her desire, or even her willingness to throw off norms. Rather, her desire to live what we would identify as an “out” life (or, as “out” a life as possible) was shaped by a distinctly Enlightenment conception of her individuality and her psychosexual identity that would have been more at home in 2019 than 1839. In Lister’s time, lesbian wasn’t the distinct identity category it would later become. Lister’s prescient insistence on a cohesion between her public and private personas — an insistence on her sexuality as a vital component of her identity — was remarkable. Thanks to her diaries, we also have unprecedented access to how she herself thought of her identity and sexuality, as well as an explicit record of sexual activity. Ultimately, this means that Lister is a historical figure made for 21st century consumption, onto whose life we can easily project (if anachronistically) ideas like that of the closet and the difficulty of living an “out” life in Regency England.
READ ENTIRE ARTICLE AT LONGREADS
Work
Midwestern Exposure
Zooming in on the places where early women photographers could build a career.

How do you study the Library of Congress’ collection of nearly sixteen million photographs when the objects themselves are out of reach? It’s an esoteric riddle, perhaps, but one that Adrienne Lundgren, a conservator of photographs at the library, was forced to solve when lockdowns began in 2020. She decided that using data science to understand the early history of photography and the context of her institution’s collection might prove interesting, and she was right.
As Lundgren digitized records of the country’s first photographers, an unexpected pattern emerged. During the medium’s first two decades in the United States, the largest concentration of women photographers wasn’t in the Northeast, the country’s most populous region. Women photographers were most active in the Midwest. Between 1840 and 1860, more than half the country’s women photographers were working in just nine states: Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Ohio, and Wisconsin. Lundgren didn’t identify any new photographers in her research; she just looked at the existing records in a new way. “The data was all there,” she told me. “The geographic trends just couldn’t be seen in the original analog format.”
Feminist historians often ask, metaphorically, where were all the women? By this they mean: Where do women appear in history? Where can we find them in the historical record, which rarely includes their names? By taking this question literally, Lundgren has not only found an answer (they were in the Midwest) but also unsettled the early history of photography in the United States, which has long privileged Northeastern cities.
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
Will Covid-19 Lead to Men and Women Splitting Care Work More Evenly?
History shows that men have always been able to handle care work — when they have to.

In 2020, the centennial year of the 19th Amendment that secured women’s right to vote, covid-19 is threatening to deny or abridge women’s economic and professional advancement. The pandemic has hit the female-dominated service economy especially hard. Moreover, inside hospitals, health-care workers are struggling as never before to care for the sick — with inadequate PPE and pressures to physically separate from their families — which has also disproportionately hit women: In the United States today nearly four out of five health care workers are women.
In many states, covid-19 has sent children home for the rest of the school year, forcing parents to bear the double burden of a full-time job and full-time child care — something evidence suggests has affected women’s productivity significantly more than men’s. In this pandemic then, the face of care inside and outside hospitals is overwhelmingly female.
History tells us why that is. Today’s female care force is a product of persistent gender stereotypes that declare women to be more caring than men. These stereotypes have helped to funnel women into essential but undervalued jobs in fields such as health care and education and they have helped to sustain women’s disproportionate burden of care in heterosexual couples, even when partners have verbally agreed to a “50/50” split. Such inequities have replicated across generations. In many ways, our 2020 gendered world of care is little different from that of 1920 or even 1820.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
The Factory in the Family
The radical vision of Wages for Housework.

Wages for Housework helps to recover a movement that had modest origins but spread around the world within several years. From the gathering in Padua, Italy, that launched the international campaign in 1972 to the spin-off groups like the New York Committee, the women of Wages for Housework made arguments and demands that were well ahead of their time, helping to fill in the gaps overlooked by the mostly male left and the mostly liberal mainstream feminist movement, both of which have long excluded the home and the processes of social reproduction from their activism and thinking.
As the IFC’s launch statement (which served as a founding document for the New York Committee) put it:
We identify ourselves as Marxist feminists, and take this to mean a new definition of class, the old definition of which has limited the scope and effectiveness of the activity of both the traditional left and the new left. This new definition is based on the subordination of the wageless worker to the waged worker behind which is hidden the productivity, i.e., the exploitation, of the labor of women in the home and the cause of their more intense exploitation out of it. Such an analysis of class presupposes a new area of struggle, the subversion not only of the factory and office but of the community.
To demand wages was to acknowledge that housework—i.e., the unwaged labor done by women in the home—was work. But it was also a demand, as Federici and others repeatedly stressed, to end the essentialized notions of gender that underlay why women did housework in the first place, and thus amounted to nothing less than a way to subvert capitalism itself. By refusing this work, the Wages for Housework activists argued, women could help see to “the destruction of every class relation, with the end of bosses, with the end of the workers, of the home and of the factory and thus the end of male workers too.”
READ ENTIRE ARTICLE AT THE NATION
This Is Helen Keller’s 1932 ‘Modern Woman’
In 1932, Helen Keller offered some advice for the “perplexed businessman.”
In my childhood, even before my education had been begun, I was allowed to take part in the elaborate ritual which, in those days, marked the making of a fruitcake at Christmas time. Although I was blind, deaf, and speechless, the thrill of the occasion communicated itself to me. There were all sorts of pungent and fragrant ingredients to collect and prepare — orange and lemon peel, citron, nuts (which had to be cracked), apples, currants, raisins (which had to be seeded), and a host of other things. The family encouraged me to assist in these preparations, for they discovered that this was one means of keeping me, at least temporarily, out of mischief; and I, for my part, was just as eager to help, because I was always permitted to claim my wages in raisins.
All in all, this concoction of a fruitcake was a long and complicated task. If there had been some oversight in the preliminary planning and an important ingredient was missing, someone had to make a trip to town to fetch it. While the mixing process was being carefully attended to, a roaring fire was built in the stove. At last, when everything was ready and the fire was giving off just the right degree of heat, the great pan was placed reverently in the oven. The climax of the ritual was now at hand. The temperature had to be maintained for several hours with the utmost precision, and everybody had to walk about on tiptoe lest some unguarded step shake the floor and cause the precious batter, swelling with the heat, to fall. In the end, if all went well, we were rewarded with a very miracle of a fruitcake, without which Christmas would not have been Christmas.
READ ENTIRE ARTICLE AT THE ATLANTIC
The New Deal Program that Sent Women to Summer Camp
About 8,500 women attended the camps inspired by the CCC and organized by Eleanor Roosevelt—but the “She-She-She” program was mocked and eventually abandoned.

During the Great Depression, thousands of unemployed men picked up saws and axes and headed to the woods to serve in the Civilian Conservation Corps, a New Deal program that employed about 3 million men. But men in the CCC weren’t the only ones to take to the great outdoors on the New Deal’s dime. Between 1934 and 1937, thousands of women attended “She-She-She camps,” a short-lived group of camps designed to support women without jobs.
The program was the brainchild of First Lady Eleanor Roosevelt, who wanted an option for the 2 million women who had lost work after the stock market crash of 1929. Like their male counterparts, they looked for work, but stigma against women who worked and women who took government aid made finding a job even more difficult. Many women were forced to seek dwindling private charity or turned to their families. Others became increasingly desperate, living on the streets.
Their plight deeply concerned Roosevelt, who wondered if they might be served by the CCC. The program, which sent men to camps around the country and put them to work doing forestry and conservation jobs, was considered a rousing success. But Roosevelt encountered resistance from her husband’s cabinet, which questioned the propriety of sending women to the woods to work.
READ ENTIRE ARTICLE AT HISTORY
Factory Made
A history of modernity as a history of factories struggles to see beyond their walls.

By the middle of the 19th century, political economists agreed that slavery was archaic as well as brutal, unprofitable as well as unconscionable. They were inspired by an idea of “the market,” an idealized economic space governed by natural laws as immutable as entropy or gravity. The market treasured the right to freely sell one’s labor; the market abhorred slavery. But on the eve of the war of the rebellion in the United States, Southern slave owners argued that slavery was the future. In the industrial areas of the northern United States and western Europe, battles for fair wages and better conditions had been contests over the definition of “freedom” within the strictly regulated and heavily capitalized confines of factory work. “Wage slavery” was the opposite of a good working-class job. And yet, although most enslaved African Americans did agricultural labor, tobacco processing plants in the Upper South, iron foundries in Alabama, and a wide range of other industries employed more and more unfree people.
In 1851, De Bow’s Review, a leading journal of the slave-owning class, published a lengthy essay on the “Future of the South” that set out a plan to accelerate this marriage of industrialization and slavery. Cotton, the essay noted, was Britain’s most important manufacturing sector, the cornerstone of the most advanced industrial economy on earth. Cheap cotton cloth had become a staple across Europe and the Americas. Easy to clean, easy to replace, more comfortable than wool, and much cheaper than silk, it had made life easier for millions. The essay posited that although the factories that turned raw cotton into finished cloth relied on free labor, the raw material itself had “bound the fortunes of American slaves so firmly to human progress, that civilization itself may almost be said to depend” upon the preservation of slavery. The essay acknowledged that the Atlantic and the Mason-Dixon separated the slave power from the mills of Manchester and Massachusetts. However, mechanization seemed to promise a dramatic reduction in the number of people required for agricultural labor, freeing up an enslaved industrial workforce. As cotton refineries and cloth factories sprouted in the South, the entire Mississippi Valley would become a giant factory and New Orleans would become Liverpool, “communicating by the father of waters with that vast region which is to be the Manchester of the world.”
The marriage of slavery and factory work proposed in De Bow’s Review might seem incongruous. After all, the story that many white Southerners told themselves after the defeat of the slaveholders’ rebellion was one of a gallant agrarian South crushed by a relentless, soulless industrial North. But on the eve of the war, Northern and Southern businessmen were more frank about the intimacy of free labor and slave labor in the United States. In 1860 Edward Bean Underhill, a prominent British missionary, visited Cuba, where men from both sides of the sectional divide were staying at his hotel. A Southerner told him, “The North depends on the cotton growers of the South”; a Northerner told him, “The South depends on the North for capital, and even for existence.” The meeting itself says something more: Investors from free and slave states alike were keen to sink money into profitable Cuban sugar plantations, worked nearly exclusively by enslaved people.
READ ENTIRE ARTICLE AT THE NEW INQUIRY
Women at Work: A History
Women in the workplace, from 19th century domestic workers to the Rosies of World War II to the labs of Silicon Valley.

ED: So here’s a surprising statistic from 1999. That’s the year that employment figures for American women peaked at 74%. Since then, the percentage of women in the workforce has fallen off, dropping from fourth highest in the world in 1999, to eleventh today. And looking farther backward in time, we learn from people like Betty Soskin that the story of women entering the workforce stretches back many generations, long before World War II and Rosie the Riveter.
PETER: All of which points to the fact that the history of women in the American workforce is hardly a straight line, but rather one that zigzags through time and is very much shaped by class and race. For the rest of the hour today, we’re going to explore some of those zigs and zags. In what ways are things better today for women than they were in the past? And in what ways have we fallen behind?
BRIAN: But before we do that, let’s take a few more minutes to consider the story we just heard. Peter? Ed? You know, maybe I’ve read too many 20th Century history textbooks. But in general, when we think about Rosie the Riveter, we think about that heroic moment when women broke out of the home and served their nation by doing industrial work, often literally on the wings of bombers that they were building. And Betty presents such a different story of the women who worked during World War II. And what really strikes me is she puts that story in the context of a very long history of women working outside the household. So Peter, I wish you would tell me a little bit more about that history of women working inside the household and how they came to work outside the household.
PETER: Well, women have been working for millennia, Brian. That’s the news here. We think of the household or the home as a refuge for sentimental family life, for nurturing, and all those good things, but the household is the primary unit of production throughout American history. And it’s hierarchical. There are apprentices, and servants, and slaves, and family members all working under the leadership of the planter, farmer, patriarch. That’s the basic unit of production, and women are integral to that work in the household.
READ ENTIRE ARTICLE AT BACKSTORY
Reading Betty Friedan After the Fall of Roe
The problem no longer has no name, and yet we refuse to solve it.

But just as a corporate solution didn’t fix the problem for Friedan’s generation, neither will fleeing the office to be a trad wife with chickens save ours. As Moira Donegan, one of my favorite working feminist thinkers and writers, reminds people, “Housewives have bosses too.” And in a society that makes it harder to divorce than marry, those bosses are harder to escape. It’s also worth pointing out that if that version of womanhood was so satisfying, then why did a generation of women reject it in one of the largest, most organized feminist movements in America?
The answer is that being a housewife is not easy work, and it’s not fulfilling. One thing that struck me as I read Friedan is her observation that women who had something to do other than their chores actually got through their chores faster. I recalled how I had seemingly spent my entire past life as a housewife trapped in an endless cycle of chores. And how now, as a divorced woman, I was freer and my house was a lot cleaner. And not only because I had fewer chores or because I was cleaning up after one less person. But because I was no longer expected to perform gender in the same way. I was, to put it simply, free like a husband. This is backed up by studies showing that, despite the image of the harried single mother, single mothers actually have more free time and spend more time sleeping than their married counterparts.
While studies show that married couples are more financially stable, single women are happier. Even divorced women, who are more likely to suffer economically from a divorce, are less likely to remarry. It’s almost as if the money and the stability aren’t enough. And that freedom is not something you can put a price on.
Valenti sums it up, writing, “In addition to being more economically, professionally and socially vulnerable, stay-at-home moms are also much more likely to be depressed and anxious. We live in a country that is notoriously unsupportive to mothers and families, with a culture that tells moms they should be grateful to have the ‘most important job in the world’ even though it doesn’t pay and doesn’t come with time off. Studies also show that women are more likely to initiate divorces than men, that women tend to be happier than men post-divorce, and that marriage benefits men more than it does women.”
I read Friedan last winter in big desperate gulps, sometimes listening to the audiobook on walks with my dog. It was depressing how relevant Friedan was then. And how much more prescient she became after the Dobbs ruling.
Much like the world in which Friedan published The Feminine Mystique, we are living in an era where women have fewer rights than our mothers did. America has the highest maternal mortality rate of the developed nations, and we are forcing women into birth. Which will in turn force them into marriages and lives of limitation — not limited because of children and a husband, but limited because they weren’t given a choice. This choiceless life can seem appealing. But the reality is it wasn’t fulfilling for the women of Friedan’s generation and definitely is not for us. Behind every trad wife influencer with chickens is a husband throwing his laundry on the floor and a housecleaner.
READ ENTIRE ARTICLE AT MEN YELL AT ME
How Literature Became Word Perfect
Before the word processor, perfect copy was the domain of the typist—not the literary genius.

“As if being 1984 weren’t enough.” Thomas Pynchon, writing in The New York Times Book Review, marked the unnerving year with an honest question about seemingly dystopian technology: “Is It OK to Be a Luddite?” The Association of American Publishers records that by 1984, between 40 and 50 percent of American authors were using word processors. It had been a quarter-century since novelist C.P. Snow gave a lecture in which he saw intellectual life split into “literary” and “scientific” halves. Pynchon posited that the division no longer held true; it obscured the reality about the way things were going. “Writers of all descriptions are stampeding to buy word processors,” he wrote. “Machines have already become so user-friendly that even the most unreconstructed of Luddites can be charmed into laying down the old sledgehammer and stroking a few keys instead.”
The literary history of the early years of word processing—the late 1960s through the mid-’80s—forms the subject of Matthew G. Kirschenbaum’s new book, Track Changes. The year 1984 was a key moment for writers deciding whether to upgrade their writing tools. That year, the novelist Amy Tan founded a support group for Kaypro users called Bad Sector, named after her first computer—itself named for the error message it spat up so often; and Gore Vidal grumped that word processing was “erasing” literature. He grumped in vain. By 1984, Eve Kosofsky Sedgwick, Michael Chabon, Ralph Ellison, Arthur C. Clarke, and Anne Rice all used WordStar, a first-generation commercial piece of software that ran on a pre-DOS operating system called CP/M. (One notable author still using WordStar is George R.R. Martin.)
READ ENTIRE ARTICLE AT THE NEW REPUBLIC
How the Personal Computer Broke the Human Body
Decades before ‘Zoom fatigue’ broke our spirits, the so-called computer revolution brought with it a world of pain previously unknown to humankind.

Late in 1980, Henry Getson of Cherry Hill, New Jersey, wrote in to his favorite computer hobbyist magazine, Softalk. Getson described himself as a computer user of “less than expert status,” and expressed his appreciation for Softalk’s introductory tone and accessible articles, especially for someone like him, who had recently bought a personal computer and was just learning to program. His letter closed with a short question, a stray thread dangling from the hem of heaping praise: “P.S. Have any remedies for tired eyes?”
Softalk’s editors knew exactly what Getson meant, and responded at length to this “problem that many computerists share.”
“Some relief comes from double folding a washcloth, saturating it with warm water, and holding it against your eyes for several minutes,” they wrote. In later issues, fellow readers volunteered their own tips for dealing with eye strain. A reader from Texas recommended Getson modify his screen with a piece of plexiglass covered in “the sun screen material found in auto stores.” Another reader, from Malibu, California, suggested buying light green theatrical gel sheets, the kind used to color stage lights, and taping one over the monitor. We don’t know how Getson resolved to treat his tired eyes, but certainly he had no lack of homespun options volunteered by computer users negotiating similar issues.
What Getson was discovering, like all the rest of the personal computer early adopters of the 1980s, was just how much using computers hurt. Turns out, monitors caused eye strain. Or, to put it more accurately: living with computers routinely strained eyes. Vision problems were the embodied human residue of natural interactions between light, glass, plastic, color, and other properties of the surrounding environment.
The Feminist History of “Child Allowances”
The Biden administration’s proposed “child allowances” draw on the feminist thought of Crystal Eastman, who advocated “motherhood endowments” 100 years ago.

Joe Biden’s new stimulus package includes provisions for a “Child Allowance” that economists estimate could cut child poverty in the United States by half. The allowance—paid out in monthly installments of $300 per month for each child under the age of 6, and $250 per month for older children—has champions on both the Left and the Right. The policy takes its cues from an even more generous proposal drafted by Mitt Romney, known as The Family Security Act.
Despite bipartisan interest in reducing child poverty, Republican lawmakers, including Mike Lee and Marco Rubio, have dismissed Child Allowances by claiming that “an essential part of being ‘pro-family’ is being ‘pro-work,’” and warning that the monthly allowances will discourage parents from seeking paid employment.
That fear, however, is substantially unfounded: the allowance is neither enough to live on, nor is it tied to wages, so the benefit is not depleted by earned income. But the fact that such a fear exists is telling. It is a fear that categorically separates “family” and “work,” and revolves around the assumption that the only forms of valuable labor deserving of compensation are those performed outside the home. That is, the only forms of valuable labor are those performed in spheres not traditionally associated with women—and women of color in particular—as care work is in the US.
By offering monetary benefits to parents of young children, the Child Allowance has the potential to help challenge assumptions around the meaning and value of work. “One of the bigger symbolic purposes of the child allowance is to say the work a parent does is valid—it’s valid as work,” Samuel Hammond, director of poverty and welfare policy at the center-right Niskanen Center, told the New York Times. “I do think it’s a market failure in capitalist economies that there isn’t a parenting wage.”
READ ENTIRE ARTICLE AT JSTOR DAILY
The Inner Life of American Communism
Vivian Gornick’s and Jodi Dean’s books mine a lost history of comradeship, determination, and intimacy.

The communist stands at the crossroads of two ideas: one ancient, one modern. The ancient idea is that human beings are political animals. Our disposition is so public, our orientation so outward, we cannot be thought of apart from the polity. Even when we try to hide our vices, as a character in Plato’s Republic notes, we still require the assistance of “secret societies and political clubs.” That’s how present we are to other people and they to us.
The modern idea—that of work—posits a different value. Here Weber may be a better guide than Marx. For the communist, work means fidelity to a task, a stick-to-itiveness that requires clarity of purpose, persistence in the face of opposition or challenge, and a refusal of all distraction. It is more than an instrumental application of bodily power upon the material world or the rational alignment of means and ends (activities so ignoble, Aristotle thought, as to nearly disqualify the laborer from politics). It is a vocation, a revelation of self.
The communist brings to the public life of the ancients the methodism of modern work. In all things be political, says the communist, and in all political things be productive. Anything less is vanity. Like the ancients, the communist looks outward, but her insistence on doing only those actions that yield results is an emanation from within. Effectiveness is a statement of her integrity. The great sin of intellectuals, Lenin observed, is that they “undertake everything under the sun without finishing anything.” That failing is symptomatic of their character—their “slovenliness” and “carelessness,” their inability to remain true to whatever cause or concern they have professed. The communist does better. She gets the job done.
In their heyday, the communists were the most political and most intentional of people. That made them often the most terrifying of people, capable of violence on an unimaginable scale. Yet despite—and perhaps also because of—their ruthless sense of purpose, communism contains many lessons for us today. As a new generation of socialists, most born after the Cold War, discovers the challenges of parties and movements and the implications of involvement, the archive of communism, particularly American communism, has become newly relevant. So have two commentaries on that archive: Vivian Gornick’s The Romance of American Communism, originally published in 1977 and reissued this year, and Jodi Dean’s Comrade: An Essay on Political Belonging.
READ ENTIRE ARTICLE AT THE NATION
When Women Demanded Pay for Housework
The women of “Wages for Housework” didn’t want to split the burden of housework with men—they wanted to get paid for it.

In her 1970 essay “The Politics of Housework,” Pat Mainardi, a member of the radical feminist group Redstockings, lays out a step-by-step guide to persuade men to do their fair share of “dirty chores” around the house. Mainardi explains that her husband “would stop at nothing to avoid the horrors of housework.” Instead, she writes, he offered her excuses: “I don’t mind sharing the housework, but I don’t do it very well,” and “You’ll have to show me how to do it.” So Mainardi provides a list, a way to train her husband, which also serves as a primer for women to put men to work. “Keep checking up. Periodically consider who’s actually doing the jobs … Alternate the bad jobs. It’s the daily grind that gets you down.”
Despite the practical advice, a sense of futility pervades the piece. (“He won’t do the little things he always did because you’re now a ‘Liberated Woman,’ right?”) At the end of the 2,000-word essay, Mainardi signs off with the following: “I was just finishing this when my husband came in and asked what I was doing. Writing a paper on housework. ‘Housework?’ he said. ‘Housework? Oh my god, how trivial can you get?’”
By the mid-20th century, housework had undergone a dramatic revolution. Many of the tasks that had once consumed a homemaker’s day were transformed by technological innovation. Items once crafted by hand could now be bought off the rack; meals that once required extensive preparation were replaced by processed and prepared foods. Dishwashers, washing machines, and vacuum cleaners rendered clean-up a cinch.
Yet while a woman could become an accomplished homemaker and mother, she’d never receive the same level of praise, adulation, and respect as a husband who worked outside the home. For some second-wave feminists in the 1970s, this gap in respect was an obvious place to begin their fight for equality.
According to more mainstream feminists like Betty Friedan and Gloria Steinem, the answer was to put women on the same footing as men, liberate them from the shackles of their kitchens, and send them into the workforce, where they’d be able to compete with men on a level playing field. But for a small group of Marxist feminists, this strategy completely missed the point. The problem, they argued, was never with housework itself. The problem was that housework had never been truly respected as work.
We’ve Got the ’70s-Style Rage. Now We Need the ’70s-Style Feminist Social Analysis.
Amid all the stories about harassment and abuse, there’s been hardly any discussion about how we got here.

Amid the present flood of accusations of sexual harassment and abuse, there’s been a strange scarcity of broader social critique. We are accumulating copious evidence of wrongdoing, without looking deeper for a diagnosis. “Boy, all men really are pigs!” isn’t close to radical enough, because the sentiment invites criticisms like the one Masha Gessen recently made: that we’re on the brink of a “sex panic,” an epidemic of puritanism that will take down innocent men out of sheer inertia. A focus on these individual incidents of harassment, and not the structure that spawns them, is a weak strategy for change. Such an approach makes it much easier for naysayers and supporters alike to combat claims of harassment one by one, by casting aspersions on accusers, or condemning individual men for their actions.
This moment is not about sex, and the problem is anything but personal. As feminists have been saying for years, sexual harassment and abuse are about power—specifically, the enormous power men still wield over women. That’s why women (and men) are naming male offenders of all political stripes, religious and atheist, rich and poor, famous and obscure. “Sexual harassment” is not an illness in and of itself; it’s a symptom of patriarchy’s persistence.
We need to be unafraid to tie sexual harassment to other forms of violence against women—to see the connections between harassment and the pay gap, the lack of good child care, and the persistence of the second shift. We need to recognize how sexual harassment and racial injustice exist in symbiosis, and to think about how workers’ tenuous position at this particular moment in the history of capitalism has enabled sexual harassment to thrive. What we need right now to go along with our ’70s-style radical rage (as Rebecca Traister recently dubbed it) is some ’70s-style feminist social analysis.
The Story Behind the First-Ever Fact-Checkers
Here’s how they were able to do their jobs long before the Internet.

At TIME Magazine’s 20th anniversary dinner, in 1943, the magazine’s co-founder Henry Luce explained to those gathered that, while “the word ‘researcher’ is now a nation-wide symbol of a serious endeavor,” he and co-founder Briton Hadden had first started using the title as part of an inside joke for a drinking club. “Little did we realize that in our private jest we were inaugurating a modern female priesthood — the veritable vestal virgins whom levitous writers cajole in vain,” he said, “and managing editors learn humbly to appease.”
Luce’s audience nearly 75 years ago is not the only group to wonder about the origins of fact-checking in journalism, though the casual sexism of the 1940s would no longer fly. Today, especially amid concern over so-called “fake news” and at a time when it may seem inconceivable that checking an article would be possible without the Internet, it remains a natural question: How did this journalistic practice begin?
And, as it turns out, that story is closely linked to TIME’s past.
Play
Toys Are Ditching Genders for the Same Reason They First Took Them On
Why the Potato Heads are the latest toys becoming more inclusive.

Hasbro announced in tentative and confusing language last week that it was essentially expanding its iconic Potato Head toy brand to include a gender-neutral option. Mr. and Mrs. Potato Head will remain available for purchase separately, and the family pack of the toy will include diverse body parts and clothes to empower children “to imagine and create their own Potato Head family.”
This decision accelerates a recent trend toward wider representation and inclusivity in consumer products for children. Producers of toys, books, movies and television programs for children have striven to portray people with different races, ethnicities, sexual orientations and body types. Some of these products, including mainstream blockbusters such as 2019’s “Captain Marvel” movie and the newest Star Wars trilogy, also promote women’s empowerment. Yet with the exception of Mattel’s Creatable World gender-neutral dolls, few offerings for younger children have acknowledged a fuller spectrum of gender identities, despite the increasing gender diversity among younger generations of Americans.
Hasbro’s hesitant step into this territory ignited a predictable response of sarcasm and anger from right-wing media. These highly dramatized reactions reflect anxiety about the social influences of children’s products that stretches back to the late 19th century. From Anthony Comstock’s labeling of dime novels as “traps for the young” in 1883 to the Boy Scouts’ chief librarian worrying that series books were “Blowing Out the Boy’s Brains” in 1914 to Fredric Wertham’s 1954 anti-comic-book diatribe “Seduction of the Innocent,” vehement opposition to change in children’s consumer cultures is a recurring pattern in American history. Yet this opposition typically ends up losing out to market incentives and profit motives.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
Making Music Male
How did record collecting and stereophile culture come to exclude women as consumers and experts?

According to the Recording Industry Association of America, a trade organization representing the music industry, 2022 was the first year that vinyl outsold CDs since 1988, and sales overall increased by 20 percent from the previous year. The resurgence could be part of an overall return to simplicity, as typewriters, ’90s-’00s-era digital cameras, and board games have also seen a spike. But it could also be something beyond nostalgic appeal.
As media studies scholar Roy Shuker found in his survey of record collectors, there’s no one reason that people are drawn to vinyl albums, but “the thrill of the obsession, linked to accumulation and completism; at times a preoccupation with rarity and economic value; and a concern for cultural preservation” are some of the most commonly shared ones. But for a hobby with so many reasons propelling it, there’s one stereotype about it that persists—that most record collectors are men. Most of Shuker’s respondents agreed that record collecting is seen as “largely a male activity.” And if that’s true, why?
The answer may be in how stereo equipment was marketed originally.
Before World War I, there was nothing inherently masculine about recorded music, explains cultural historian Keir Keightley. But by the 1960s, “home audio sound reproduction equipment had hardened into masculinist technologies par excellence.” The high-fidelity stereo, soon to be known as “the hi-fi,” had become popular with the introduction of the long-playing record (LP) in 1948. And while LPs weren’t marketed as masculine, hi-fis were. As Keightley notes, a 1953 Life magazine article explained that “[o]ne of the strangest facts about both [hi-fi] bugs and audiophiles is that they are almost exclusively male…women seldom like hi fidelity.”
READ ENTIRE ARTICLE AT JSTOR DAILY
How DIY Home Repair Became a Hobby for Men
It was only in the 20th century that toolboxes became staples in the homes of middle-class men.

Many office workers stuck at home right now are finally getting around to long-neglected home repair projects (hopefully only in cases where they don’t need to go out shopping for non-essential supplies). Although today this kind of work is embraced by people of different genders, for men it might seem like a welcome throwback to old-fashioned self-reliance.
But, as historian Steven M. Gelber writes, the idea of do-it-yourself projects as a mark of manhood is not as old as you might think. It emerged only in the twentieth century.
By the last decades of the nineteenth century, Gelber writes, “industrialism had broken the farmer/artisan tradition of manual competence. Men could, if they wished, take up tools around the home, but very few seem to have wanted to, and there was no general expectation that they should.”
READ ENTIRE ARTICLE AT JSTOR DAILY
The Gilded Age In a Glass: From Innovation to Prohibition
Cocktails — the ingredients, the stories, the pageantry — can reveal more than expected about the Gilded Age.

In the early 20th century, bartenders at the world-famous Waldorf-Astoria memorized 271 concoctions. Scores of signature drinks were dreamt up in honor of people and events: the “Arctic” to celebrate Peary’s discovery of the North Pole, the “Coronation” to commemorate King Edward’s ascension to the throne, the “Commodore” and “Hearst,” honoring business tycoons, and even the “Charlie Chaplin.” Imbibing at the mahogany bar aligned one with the wealthy and the tastemakers of America; crowds of Wall Street bankers like J.P. Morgan, celebrities like Buffalo Bill Cody and Mark Twain, and the high-society elites all enjoyed more than a few of the bar’s signature cocktails.[1] Yet, for many, these spaces of inclusion and identity were off-limits.
Cocktails — the ingredients, the stories, the pageantry — can reveal more than expected about the Gilded Age. From the 1870s through WWI, America took its place among the industrial and political powers of the world. Great fortunes were amassed by railroad tycoons and oil magnates such as the Vanderbilts and Rockefellers. A search for identity among the nouveau riche upper class grew out of the nation’s rapid modernization and unimaginable wealth. Identity — who you are and where you belong — became a major theme in the Gilded Age.
The era gave us some of our most famous cocktails as well: the Martini, the Daiquiri, the Old Fashioned, and more. Not every Gilded Age concoction was a hit, however. Some — such as the “Waldorf Astoria” (Benedictine and whipped cream), the “Bradley Martin” (crème de menthe and raspberry syrup), and the “Black Velvet” (champagne and porter) — we are more than glad to leave in the 1800s.[2] Yet those names — Bradley Martin, an infamous society host; Waldorf Astoria, the most elite hotel in New York; Daiquiri, alluding to America’s global ambitions — all hint at identity and who is worthy of being remembered — and cheered, too! These signature drinks (and countless more) served at the Waldorf-Astoria or at Fifth Avenue dinner parties reveal the loyalties of New York’s elite class and the familiar haunts of the millionaires.
Coinciding with this Gilded Age in cocktails, signature concoctions — rich with ingredients and anecdotes — populated the landscape and secured the identity of a person, place, or idea for generations. Imbibing cocktails became at once a signal of inclusion for some groups and an act of defiance for others.
READ ENTIRE ARTICLE AT THE GOTHAM CENTER FOR NEW YORK CITY HISTORY
Jane Addams’s Crusade Against Victorian “Dancing Girls”
Jane Addams, a leading Victorian-era reformer, believed dance halls were “one of the great pitfalls of the city.”
It was, wrote Jane Addams, “a canker that the community must eradicate to save its future generations.” She wasn’t talking about the machines that mangled Chicago’s child laborers, or the teeming tenements where the city’s endless poor lived, worked, and suffered. Addams, perhaps the Progressive Era’s greatest reformer, was talking about dance halls—a menace that another author called “the world’s greatest tragedy.”
Addams could—and did—focus on other tragedies during her lifetime. There was the garbage that piled up in the 19th Ward’s alleys and on streets, uncollected and ignored by city officials. There was illiteracy and starvation and exploitation and crime. Addams tackled them all. But her attempts to eradicate the American dance hall were among her most concerted and perplexing.
To Addams, dance halls meant tragedy. But to young women in Chicago or New York, these “cankers” meant something much more. And that was precisely the problem.
READ ENTIRE ARTICLE AT JSTOR DAILY
From Boy Geniuses to Mad Scientists
How Americans got so weird about science.

In her 2016 book, “Innocent Experiments: Childhood and the Culture of Popular Science in the United States,” published by the University of North Carolina Press, historian Rebecca Onion explores American ambivalence toward science education over the last two centuries. As she delved into her research, Onion observed that even during the times that adult scientists have been eyed with suspicion, Americans have always loved the idea of the child scientist—specifically little white boys—exploring and experimenting with untainted hearts. Once the boys grow up, however, their love of science is viewed as eccentric, dorky, and possibly a bit unsavory.
That’s how we end up with a public that, on one hand, gets excited about topics touched on in grade school—like news about Saturn’s rings or robot cars—and, on the other, fails to support important long-term research about climate or disease. When it comes to science, it’s as if Americans revert to their collective childhoods, rejecting research data that either conflicts with their worldviews—the theory of evolution, the safety of vaccines—or just doesn’t seem, well, fun.
While researching her American Studies master’s thesis, Onion noticed that at the turn of the 20th century, children were portrayed as having a particular affinity for animals and the natural world in general, whether catching fireflies, climbing trees, or digging in the dirt. At other times—say, 2017—children are thought to intuitively understand very unnatural modern technology like smartphones and laptops.
“At different times in our history, people were invested in the ideas of children as being modern or as being anti-modern, which is a weird paradox I find fascinating,” Onion says. “And then it came to me: Science is the link that connects man-made technology and the primitive natural world.” After all, scientists have to use microscopes to view and fully understand organic cells and microbes.
READ ENTIRE ARTICLE AT COLLECTORS WEEKLY
Inequality and Injustice
How Men Muscled Women Out of Surfing
Why is surfing still stuck in the 1960s when women have always done it?

When I first started surfing, as a teenager in Honolulu in 1966, my uncle would clear a way for me through the big, intimidating men on long, heavy surfboards—men who vastly outnumbered women in the fabled waves at Queen’s surf break in Waikiki.
Back then, I didn’t see the irony in men dominating a break named for a powerful woman—Queen Lydia Lili‘uokalani, whose cottage had once stood on that very beach. As far as I knew, surfing had always been a man’s sport, one that girls like me were just breaking into. Only later did I learn that women had been surfing since the very beginning, but had been driven out of the sport as it became popular.
READ ENTIRE ARTICLE AT THE ATLANTIC
Women, Men, and Classical Music
As more women embraced music as a profession, more men became worried that the world of the orchestra was losing its masculinity.

In the early twentieth century, many women were entering the workforce and agitating for the right to vote, while many men were trying to figure out what masculinity meant in a world where the importance of physical strength was declining. As historian Gavin James Campbell writes, these questions played themselves out, among other places, in classical music.
Throughout the nineteenth century, Campbell writes, many middle- and upper-class women learned music as a domestic art, but women who made a career of performance were a rarity. Over time, though, exceptions like singer Jenny Lind, pianist Teresa Carreño, and violinist Maud Powell inspired imitation. By the turn of the twentieth century, many young American women were looking at music as a possible profession. Some became part of the opera, which was growing in popularity and required talented female voices. Others reacted to existing orchestras’ refusal to include female musicians by forming all-women musical organizations.
READ ENTIRE ARTICLE AT JSTOR DAILY
Just Wear Your Smile
Few who encounter Positive Psychology via self-help books and therapy know that its gender politics valorize the nuclear family and heterosexual monogamy.

As mask mandates eased across the United States, many women bemoaned the inevitable return of one of the more insidious banalities of misogyny: men telling them to smile. COVID-19 masking had offered a kind of consciousness-raising for many women, the absence of the requirement to smile in public making stark their habitual, constant emotional labor. One woman told a reporter for the Daily Beast, “Best thing about the masks is that men can’t tell me to smile when I’m out in public.” Another said she planned to continue wearing masks despite changes to the rules in her community, because “it’s just so nice and freeing to be able to decide whether to smile or not, just based on how I feel personally.”
These women’s comments were reminiscent of remarks made by Women’s Liberation activist Shulamith Firestone, who explained in her foundational 1970 book The Dialectic of Sex: “My ‘dream’ action for the women’s liberation movement: a smile boycott, at which declaration all women would instantly abandon their ‘pleasing’ smiles, henceforth smiling only when something pleased them.” Firestone’s use of the term “pleasing” remains machete-sharp, slicing through both sides of the compulsory smile interaction. A woman is “pleasing” to look at because she is smiling, and she is “pleasing” the man because he expects her to. At base, Firestone argues, the woman’s smile “indicates acquiescence of the victim to her own oppression.” And, if a man doesn’t get it—on the subway, at work, in the cereal aisle at the grocery, in class, at a club, walking down the street—he demands it. “You should smile more.” “Come on, lady, smile!” “Lighten up!” “You have Resting Bitch Face.” “Why are you so angry?” “Your clients/coworkers/boss would find you more approachable if you smiled more.” “Smile, bitch!”
Fortunately, our popular culture is finally starting to rally behind the position that men must stop telling women to smile. At the same time, however, a prominent subfield of psychology known as Positive Psychology, which purports to be the science of the good life, continues to insist that people—and especially women—should smile.
In 2001 psychologists LeeAnne Harker and Dacher Keltner published the findings of their study on smiling in the Journal of Personality and Social Psychology. The question the study sought to answer was simple: Was it possible to look at women’s college yearbook photos and from them make predictions about their future happiness? Yes, the Berkeley psychologists concluded, it was. Their predictions hinged on whether the women were smiling. But not just smiling; they had to be giving the camera (and the photographer behind it) an authentic smile—what supermodel Tyra Banks would call a “smize,” a smile that reaches the eyes. This “true” smile, the researchers contended, indicated that the subject was experiencing positive emotions like happiness or joy. And what proved that these smiling women went on to experience lives of true happiness and well-being? In addition to their self-reports, the women hadn’t stayed single beyond the age of twenty-seven and had divorce-free marriages.
READ ENTIRE ARTICLE AT BOSTON REVIEW
The Comforts of a Single State
Thomas Jefferson imagines an unequal gender utopia.

Thomas Jefferson’s attitudes about women were remarkably conventional, so conventional, for the most part, that to describe them is essentially to provide an account of the standard views about sexual difference and women’s roles held by men of his time, place, and social standing.
In the letters Jefferson exchanged with his friends, filled with chatter about young women, he was trying out and trying to make sense of the world of heterosexual relations. As historian Joanne B. Freeman has noted, Jefferson was an avid gossip, and, if the chitchat about who was courting whom did not serve as “conduits for other people’s aggression” in the way that political gossip did, it certainly revealed an underlying anxiety about who was getting ahead and who was being left behind. And, as he would later, when insisting that he was sick and tired of politics and wanted nothing more than to retire to his mountaintop, from time to time Jefferson proclaimed himself weary of the whole enterprise of courting the ladies. In one of his more gossipy letters (“You have heard I suppose that J. Page is courting Fanny Burwell. W. Bland, and Betsy Yates are to be married thursday se’nnight”), Jefferson denied not only that he was a gossip (“Who told you that I reported you was courting Miss Dandridge and Miss Dangerfeild?”) but that he even planned to marry. “Many and great are the comforts of a single state.” True, “St. Paul…says that it is better to be married than to burn. Now I presume,” Jefferson joked, “that if that apostle had known that providence would at an after day be so kind to any particular set of people as to furnish them with other means of extinguishing their fire than those of matrimony, he would have earnestly recommended them to their practice.” Who needed to court young heiresses when more willing women could be had for a considerably lower price? In this gossipy letter, Jefferson’s anxieties about the mercenary nature of the relationship between the sexes came through.
Once he married, Jefferson’s anxieties about sex and gender abated, and he soon made his stunning entrance on the political scene. By marrying a beautiful and affluent young woman, Jefferson was able to settle down, in several senses of the term. Whatever his earlier anxieties about marrying, he found great joy in domesticity, and it became for him, even after his wife’s death, the measure of human happiness. Jefferson moved, then, from a kind of conventional misogyny to a kind of conventional valorization of domesticity. We see this appreciation for the domestic life not only in Jefferson’s reminiscences about his married life but also in his subsequent contrast of the satisfactions of private life and the pure and unabated miseries of the public sphere. Not only was his marriage one of “unchequered happiness,” but his wife’s death in “a single event wiped away all my plans and left me a blank which I had not the spirits to fill up.” Twenty years earlier, the anxieties of courtship had left Jefferson depressed and listless; now it was the death of his beloved wife. Jefferson had grown up.
Jefferson carried this image of happy domestic life with him for the rest of his days; it became a touchstone for him. Jefferson’s comparisons of the miseries of public service with the rewards of family life are too well known to require more than a brief mention. “It is in the love of one’s family only,” he once told his younger daughter, “that heartfelt happiness is known.” In government, he found “every thing which can be disgusting,” at home with his family, “every thing which is pleasurable to me.” His protestations notwithstanding, Jefferson spent most of his adult life in public service; his family—and his idea of family, as those who knew his real self and as his only source of “heartfelt happiness”—anchored him, making his public service possible. Jefferson’s views about family life were wholly conventional, and a happy marriage enabled Jefferson to shed his early misogyny, or at least rub off its sharp edges.
Jefferson’s misogyny percolated to the surface when he was in France, between 1784 and 1789, serving as the new nation’s minister. He was single, engaged in an intense flirtation with the unhappily married artist Maria Cosway and perhaps beginning what would turn out to be a long liaison with his slave Sally Hemings. Whether owing to his own unsettled domestic life, or because he was living in a foreign culture, or some combination of the two, Jefferson had a lot to say about women, in particular about their appropriate roles in society. Some of these comments reveal, if not a streak of misogyny, at least an anxiety about female sexuality and the risks that it posed for both the male body and the body politic.
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
No Girls Allowed
How America’s persistent preference for brash boys over “sivilizing” women fueled the candidacy of Donald Trump.

Donald Trump is a baby, a child. Like a child, he whines, seeks attention, and throws tantrums when he doesn’t get what he wants. It’s appropriate that the Access Hollywood tape takes place on a bus, since it captures Trump and Billy Bush acting like pubescent boys making their way to the seventh grade. Addressing her husband’s comments on that tape in a recent interview, Melania Trump dismissed the Trump-Bush conversation as “boy talk.” She joked that she sometimes feels like she has two children at home: Barron, age 10, and her husband, age 70.
While the tape cost Trump in the polls, and lost him endorsements from mainline Republicans, it doesn’t seem to have fazed his hardcore supporters, who don’t mind his petulant debate performances or flashes of paranoid anger, either. Why doesn’t any of this evidence of puerility hurt his image among his base? It’s because Trump is a boy child. He’s Dennis the Menace, Bart Simpson, the scamp with a chemistry set who will blow up your basement; he’s snips and snails and puppydog tails. In our culture, we have long associated boyishness with freedom and personal authenticity. A boy is a man whose essential male spirit has yet to be crushed by the world.
Female Trouble
Clinton’s memoir addresses the gendered discourse and larger feminist contexts of the 2016 presidential campaign.

The title of Hillary Clinton’s memoir of her unsuccessful campaign for the presidency, What Happened, has no question mark at the end, although many people around the world might reflexively add one. Clinton’s defeat surprised—stunned—many, including, as is clear from her recollections, Clinton herself. The majority of polls of the likely electorate indicated that she was headed for a nearly certain win, although her prolonged struggle for the Democratic nomination against a wild-haired, septuagenarian socialist from Vermont was a blinking sign of danger ahead. A significant number of voters were in no mood to play it safe, and the safe choice was what Clinton far too confidently offered in both the primaries and the general election.
It was not only the polls that led observers astray; their instincts did as well. Many, within and outside the country, had trouble imagining that the American people would elect to the most important position in the land, perhaps in the world, a man who had never been elected or appointed to public office, nor sworn an oath of allegiance to the United States as a member of the armed services, nor shown any serious interest in public service. From the time candidates can be said to have openly campaigned for the office, those aspiring to the presidency have touted their possession of one or more of these extraconstitutional qualifications, and all presidents before Donald Trump had at least one of them. They are “extraconstitutional” because none is a requirement for the presidency. Article II, Section 1 of the Constitution lists only three qualifications relevant to modern times: that the president be a “natural born citizen,” at least thirty-five years old, and a resident “within the United States” for fourteen years.
The historian Jack Rakove observed that the American presidency “posed the single most perplexing problem of institutional design” that the framers faced. “The task of establishing a national executive on republican principles,” Rakove says, “puzzled” them. What kind of executive would be suitable in post-Revolutionary America? What type and amount of power should the Constitution give to a person who, in the absence of a vigilant citizenry, might begin to act like a king? And what if the people came to love the president so much that they did not mind—perhaps even welcomed—deviations from republican principles?
READ ENTIRE ARTICLE AT THE NEW YORK REVIEW
How New York’s Postwar Female Painters Battled for Recognition
The women of the historic Ninth Street Show had a will of iron and an intense need for their talent to be expressed, no matter the cost.

The photograph of Jackson Pollock that appeared in Life in August, 1949, didn’t look like anyone’s idea of an artist. Although he stood in front of an enormous painting, a fantastic tracery of loops and swirls that most readers would have found perplexing or ridiculous, the man himself was something else: rugged, intense, with paint-splattered dungarees and a cigarette dangling, with a touch of insolence, from the corner of his mouth. A rival painter, Willem de Kooning, said that he looked like “some guy who works at a service station pumping gas.” But the image was sexy, too—notably similar in type to the working-class stud made famous by Marlon Brando on Broadway the previous year. The subtitle of the accompanying article read, “Is he the greatest living painter in the United States?” The answer was presumably affirmative: why else was a little-known artist being featured in the biggest mass-circulation magazine in the country? The editors, however, were too skittish to render judgment on his mysterious new art. Instead, they offered the phenomenon of Pollock himself: a conspicuously modern artist without a trace of European la-di-da, an artist born in Wyoming, no less, who did his painting in a barn, using not a palette but cans of aluminum paint, into which he occasionally mixed (how much more macho could it get?) nails and screws. The big news was that it was safe, at last, in America, for a real man to be an artist.
Allowing Life to do the article, despite Pollock’s hesitation, was Lee Krasner’s idea. Otherwise known as Mrs. Jackson Pollock, Krasner was a fervent booster of her husband’s work, outspoken in her conviction that he was, as she liked to say, numero uno. She claimed to have believed in his genius from her first visit to his studio, in 1941, and she’d seen him through years of alcoholic turbulence, when he was selling so little that he couldn’t afford to heat their ramshackle house, on the outer reaches of Long Island. Krasner had worn long johns and heavy sweaters to work in the freezing room that served as her own studio—for she, too, was a fiercely serious artist. She had trained at Cooper Union, in a section of the school reserved for women, and at the National Academy of Design, where she learned to draw and paint in a rigorously traditional style. After discovering modernism, she had gone on to become a star pupil of the revered teacher Hans Hofmann, who praised her work as good enough to pass for a man’s. In the late thirties, working for the W.P.A.’s Federal Art Project, a government program that promoted strictly nondiscriminatory policies, she had led a crew of ten men working on a giant mural, now lost, on the subject of navigation. As was true for many women artists of the time, the program gave her a professional start, hands-on experience, and enough confidence to think that she might make it as a painter, even after the war effort brought the W.P.A. to an end, along with all vestiges of an art world that viewed women as equal players.
It’s impossible to know how she might have developed on her own. By the early forties, she was committed to an upbeat style of geometric abstraction, brightly colored, that gave Cubism a rhythmic swing. But meeting Pollock, moving in with him (in 1942), and marrying him (in 1945) radically reset her course. Beginning in 1943—the year of Pollock’s first solo gallery show—she painted almost nothing but “gray slabs,” as she put it, for three despairing years, while she struggled toward his kind of deeply personal abstraction, attempting to paint not what she devised but what she felt and, even more psychologically daunting, who she was. The answer would once have been clear: she was an escapee from an Orthodox Jewish family in Brooklyn, an Artists Union protest organizer, a gutsy woman who took no guff, an ambitious artist. Now, though, she seemed to have been transformed, as in some cruel fairy tale, into a lowly creature known as an artist’s wife. She got past the gray slabs in 1946, and for the next few years kept trying out new approaches, working mostly on a modest scale—she called her best work the “Little Image” paintings—and pushing on with quiet resolve. In 1949, however, just a couple of months after Pollock’s appearance in Life, she decided to stop exhibiting, following a series of dismissive she’s-no-Pollock reviews of a gallery show titled “Artists: Man and Wife.” At the age of forty, she was a scarred veteran who stood for everything that younger women artists feared and rejected. She was even known to cook.
READ ENTIRE ARTICLE AT THE NEW YORKER
A New ERA for Women in the Navy
Admiral Elmo R. Zumwalt, Jr., Z-grams, and the All-Volunteer Force.

An article buried deep in the April 11, 1974, issue of the Press-Telegram of Long Beach, California, visually captured the divergent expectations American women navigated by the mid-1970s. Above the fold, editors contrasted the dual roles of the modern woman: the left image showed a grinning woman pilot in the cockpit of a C-1A Grumman twin-engine transport aircraft, while the image on the right showed the same woman smiling beside her fiancé as they opened wedding gifts in advance of their nuptials. When Barbara Allen Rainey applied for a commission in the United States Navy upon graduation from Whittier College in 1970, she had no intention of becoming a pilot, as naval aviation training remained closed to women. Rainey entered the Navy at an opportune time, however, as the social movements of the previous decade had shifted the consciousness of many Americans toward racial and gender equality. As the war in Vietnam drew to a close, prompting the Department of Defense to shift from the draft to an All-Volunteer Force (AVF), and with the Equal Rights Amendment (ERA) gaining traction, the Navy responded by altering its personnel policies to open restricted billets to women. These changes not only reflected broader social and economic trends but also had the potential to improve public opinion of the Navy at the end of a deeply unpopular war. This, in turn, could make the Navy more appealing to future recruits as the services competed with one another for the most qualified candidates, regardless of race and sex.
For nearly a decade, American women had voiced their discontent privately in surveys, anonymously in opinion columns, and, at times, publicly at rallies and protest marches across the United States. Beginning in 1963 with the publication of Betty Friedan’s The Feminine Mystique—in which Friedan identified the siren song of labor-saving appliances and conveniences that lured a cadre of predominantly white women to the comforts of middle-class suburbia—countless women expressed their personal bouts of ennui. The social and economic mobility that came to define much of the 1950s was generally unattainable for women of color, especially Black women who often left school early to work as laundresses or maids in white households to supplement the meager incomes of their families. Despite articles in Ladies’ Home Journal, Redbook, Good Housekeeping, Southern Living, and Women’s Home Companion telling middle-class American women that personal fulfillment could be found in their dual role as housewife and mother, much of these magazines’ target audience remained dissatisfied, skeptical, or wholly unconvinced. “Their words,” Friedan remarked of her survey respondents, “seem to indicate that housewives of all educational levels suffer the same feeling of desperation.” Friedan identified “a stunting or evasion of growth” among American housewives who simultaneously suffered from a malaise she called “the problem that has no name.” The following year, Congress passed the Civil Rights Act, which included Title VII, explicitly prohibiting employment discrimination based on race, color, religion, sex, and national origin.
On the heels of the Civil Rights Act, a group of women committed to equality between the sexes formed the National Organization for Women. The group’s founders—who included Friedan and Pauli Murray, an androgynous Black civil rights activist—sought to reform social, demographic, and economic aspects of American life, including transforming the institution of marriage into a partnership of equals. One of the ways parity between the sexes could be achieved, they determined, was through a constitutional amendment. Initially proposed to Congress by suffragist Alice Paul in 1923, the revised text of the ERA now read: “Equality of rights under the law shall not be denied or abridged by the United States or by any state on account of sex.” After forty-nine years, feminists saw early 1972 as the ideal moment to push for congressional approval and state ratification. Although women accounted for only two percent of the seats in Congress, the ERA garnered support from both parties. At 1638 on 22 March 1972, the Senate passed the amendment with an 84-to-8 vote. Thirty-two minutes later, Hawaii became the first state to ratify the amendment. “Confidence that ratification would be achieved swiftly was expressed by a number of supporters of the amendment,” the New York Times reported the following day.
READ ENTIRE ARTICLE AT U.S. NAVY HISTORY
How the ‘Girl Watching’ Fad of the 1960s Taught Men to Harass Women
In name, ‘girl watching’ is long gone. In practice, the trend lives on.
In the spring of 1968, 21-year-old Francine Gottfried began working as an IBM machine operator at a data processing plant in lower Manhattan. Gottfried walked past the New York Stock Exchange to get from the subway to work each day, and she soon attracted a group of Wall Street workers who gaped at her large breasts and verbally harassed her. Over the following months, men circulated the details of her daily schedule—she typically emerged from the Broad Street subway at 1:28 pm for her afternoon shift—and the crowd grew.
By September, Gottfried’s body and the men’s aggressive behavior had become national news: “Boom and Bust on Wall Street,” read one New York Magazine article. According to the Associated Press, the crowd of stalkers exceeded 5,000 men on a single day; another news outlet claimed it hit a record of 10,000. Police became concerned about the rowdy crowds of men who clogged traffic and crumpled car roofs as they clambered for a better view of Gottfried’s breasts. Soon, Gottfried required a police escort and had to take new routes to work to avoid the street harassment.
But if Gottfried’s treatment seems horrific, she was just one of many women subjected to “girl watching,” a two-decade trend popularized by an ad executive and broadcast through major ad campaigns. Girl watching guides and girl watching clubs of the midcentury (alongside cartoons and advertisements) taught American men how to sexualize and harass women on the street, among other public places. These activities even had a soundtrack: At the very moment Gottfried was being harassed on Wall Street, the song “Girl Watcher” by the American pop band The O’Kaysions was climbing the charts. The history of this girl watching phenomenon speaks volumes about how the “male gaze” was made and then resisted.
READ ENTIRE ARTICLE AT JEZEBEL
Building a Mystery: An Oral History of Lilith Fair
In the mid-1990s, Sarah McLachlan set out to prove a woman’s place was center stage.

In 1996, singer-songwriter Sarah McLachlan was tipping from alternative icon into something more like traditional pop success. At 26 she had garnered serious momentum—and 2.8 million albums sold in the United States—after her 1993 crossover Fumbling Towards Ecstasy. But as she ascended through the music industry, she kept hearing “no.” No, we can’t play your song—we already have another woman artist in rotation. No, you can’t put two women on the same concert bill—it’s box office poison. Sexism was passed off as age-old industry logic—logic that forced her into competition with other women artists to be the sole exceptional woman allowed opportunity.
McLachlan was not alone.
So she presented a challenge to her team—let’s prove them wrong—and in 1997, they assembled a bill of women to play massive outdoor venues across the States. McLachlan headlined, and they stacked the show with superstars like Tracy Chapman, Aimee Mann, Suzanne Vega, Emmylou Harris, Paula Cole, Patti Smith, and Lisa Loeb. The backlash to what would come to be called Lilith Fair was swift. From the beginning, Lilith frequently encountered skepticism from the industry and withering hostility from the media (“It is, in short, a total crock of shit,” wrote critic Gina Arnold), and was attacked by the Christian right. The Los Angeles Times reported music-industry insiders deriding it as “Lesbopalooza.”
But during its three-year run, Lilith Fair made over 130 tour stops in North America, featured roughly 300 women artists, drew more than 1.5 million people, and grossed over $52 million—more than $10 million of which was donated to women’s charities. The festival helped establish the careers of Missy Elliott, Erykah Badu, Dido, Nelly Furtado, Christina Aguilera, and Tegan and Sara, and brought the Dixie Chicks to a mainstream audience. The bills featured out, queer artists onstage, new artists alongside veterans, and an entire village of progressive activist causes. But more than anything else, it was visionary. Lilith Fair was glimpsing a possible future in which women were rightfully placed at music’s center and not its margins.
Yet two decades later, even as artists like Haim and Brandi Carlile cite Lilith as inspiration, it’s just as often a cultural punch line. This is the real story of how Lilith came to be—and what it ended up meaning to a generation that had never seen anything like it before.
READ ENTIRE ARTICLE AT VANITY FAIR
Why It’s Shocking to Look Back at Med School Yearbooks from Decades Ago
They offer jaw-dropping examples of the sexism and racism that shaped professional cultures.

Although they may appear to be innocuous collections of school memories, yearbooks have fueled major political controversies in recent months. Whether it be the racist photograph of a student in blackface and another in a Ku Klux Klan costume on Virginia Gov. Ralph Northam’s medical school yearbook page or Supreme Court Justice Brett M. Kavanaugh’s high school yearbook jokes about drinking and sex, decades-old school publications have come under renewed public scrutiny, and Northam’s will surely not be the last.
But these shocking pages aren’t the outliers they may have seemed. During my research on women in medicine in the 20th century, I came across the seemingly peculiar incident of a Playboy centerfold in a medical school yearbook. I soon discovered similar pages in yearbooks from this time across the United States. The books — as yearbooks always do — reflected the contemporaneous culture of the institutions that published them. In medical school in the 1960s and ’70s, that culture was often roiled by a backlash against women and minorities, as the medical world increasingly opened for people other than white men.
My research found that editors at that time deliberately deployed sexism in yearbooks as women fought to enter coeducational medical schools in higher numbers. In 1965, women made up less than 10 percent of medical college matriculants. By 1975, that figure had increased to nearly 25 percent. To wrestle with the significance of this change, predominantly male yearbook editors drew upon the template and vocabulary of Playboy magazine.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
How Flight Attendants Organized Against Their Bosses to End Stereotyping
The marketing of stewardesses’ bodies was long an integral part of airline marketing strategies.

In 1971, National Airlines introduced a new slogan: “Fly Me.”
The first print advertisement featured a photographic close-up of Cheryl Fioravante, a freckle-faced stewardess with a boyish coiffure, smiling innocently. “Hi, I’m Cheryl. Fly Me,” the ad read in large, black boldface. Subsequent ads invited travelers to “fly” other fresh-faced stewardesses, like Jo and Laura.
The innuendo-laden campaign cost National a small fortune—the carrier spent over $9 million a year on the ads—but it paid off. The National Organization for Women objected to the ads, calling them sexist: they presented flight attendants as a “flying meat market” and invited passengers to make sexual advances. Nevertheless, the “Fly Me” series raised the carrier’s profile and won a handful of advertising awards.
Advertising the bodies of women employees was good for business. Though the “Fly Me” campaign is now displayed on a number of websites, often as a glaring example of outdated sexism, at the time, “Fly Me” reflected the widespread sexualization of flight attendants, which belied the harsh working conditions that these women negotiated while flying unfriendly skies. That began to change in the mid-1960s when flight attendants mounted an organized push against their employers—and ultimately, improved their workplaces.
The bodies of women flight attendants have long been an integral part of the airlines’ marketing strategy. In the postwar period, government regulations ensured that fares, routes, and planes were nearly indistinguishable. To stand out, airlines marketed their flight attendants’ looks and promised an exciting or erotic in-flight experience. At the dawn of the commercial aviation industry, airlines introduced formal and informal policies to ensure that their flight attendants were uniformly young, slender, unmarried white women, running job ads with explicit requirements for height, weight, and marital status. “Charm farms,” as stewardess schools were often called, taught women how to exercise, walk in high heels, and fashionably style their hair and makeup. They often trained fashion models, as well.
READ ENTIRE ARTICLE AT JEZEBEL
An Enduring Shame
A new book chronicles the shocking, decades-long effort to combat venereal disease by locking up girls and women.

There are currently at least 2.3 million people detained and confined in the United States and its territories. Many thousands are held without having committed a crime, including nearly 11,000 children locked away for “technical violations” or “status offenses” such as running away or skipping school. And since at least 1978, the number of women and girls removed from society and locked up for extended periods of time has been growing at more than double the rate of men and boys.
According to The Trials of Nina McCall: Sex, Surveillance, and the Decades-Long Government Plan to Imprison “Promiscuous” Women, a new book by the law student Scott Stern, a shocking number of American girls and women were also locked up beginning in the 1910s as part of the now completely forgotten “American Plan,” a governmental effort to combat venereal disease. Stern happened upon this unnerving piece of history largely by accident when he was an undergraduate poking around the stacks of the Yale libraries. His curiosity piqued, he spent almost a decade digging into archival collections, visiting decaying rural towns, and interviewing people in their living rooms, trying to understand what this program was and what its human cost might have been.
READ ENTIRE ARTICLE AT THE NEW YORK REVIEW
Lampooning Political Women
For as long as women have battled for equitable political representation in America, those battles have been defined by images.

In the mid-19th century, female reformers faced an impossible task as they advocated for rights and aimed to maintain a high social standing. Women who had the means to live up to ideal femininity, but chose not to, prompted anxiety. Ultimately, one of the main tasks of the women’s rights movement was to justify their steps outside accepted gender roles and, eventually, change these roles. But without power, money, or organizational strength, activists could not change the way Americans conceived of political women. Cartoons that derided reformers proliferated during the years after the Seneca Falls Convention. Artists continued the tradition of representing political women as ugly, masculine threats to American values, including gender norms, white supremacy, and heteronormativity.
The women’s rights movement grew out of women’s activism on behalf of other people. In the 1830s, women, especially white middle- and upper-class women, participated in and led antislavery and moral reform societies. They signed petitions, raised money, and read the Liberator. Abby Kelley Foster, the Grimké sisters, and Lucy Stone traveled the country to deliver public lectures. Rather than leading organizations, they hoped to leave behind towns of supporters who organized on their own. By the end of the decade, their outspokenness angered even their fellow antislavery supporters. In 1840, the World’s Anti-Slavery Convention in London, for example, refused to seat female attendees. Discussions of women’s rights were part of local conversations years before women brought them to the national stage. In 1844, a group of men in New Jersey petitioned the state constitutional convention to grant women the vote. Two years later, three groups petitioned the New York constitutional convention to enfranchise women. One, a group of women from Jefferson County, asked for more than the ballot. They wanted “to extend to women equal, civil and political rights with men.”
In July 1848, about three hundred people, including prominent abolitionists Lucretia Mott and Frederick Douglass, met in Seneca Falls, New York. They sparked national conversations about women’s rights and set an agenda. At the end of the two days of proceedings, one hundred attendees signed a statement of their aims, the Declaration of Sentiments. The main author was Elizabeth Cady Stanton, a thirty-two-year-old, highly educated mother. Modeled after the Declaration of Independence, her Declaration of Sentiments affirmed that “all men and women are created equal.” Attendees called for rights ranging from women’s education and employment opportunities to the right to own property and control their money. Reformers included the right to the ballot, which they viewed as a tool to gain other rights and enact temperance and antislavery laws. In 1851, the attendees of the national convention in Worcester, Massachusetts, resolved that “the Right of Suffrage for Women is, in our opinion, the corner-stone of this enterprise, since we do not seek to protect women, but rather to place her in a position to protect herself.” They addressed their opponents’ arguments, which contended that fathers, husbands, and sons represented their female counterparts.
Unlike petitions to local governments, the Declaration of Sentiments established specific goals to launch a movement. Stanton’s document asserted, “We shall employ agents, circulate tracts, petition the State and national Legislatures, and endeavor to enlist the pulpit and the press in our behalf.” They wanted their meeting to prompt “a series of Conventions, embracing every part of the country.” Between 1848 and 1860, they held national conventions every year (except in 1857). Organizers concentrated their national meetings in the Northeast and Midwest, but local groups held gatherings throughout the United States.
READ ENTIRE ARTICLE AT HUMANITIES NEW YORK
Where Gender-Neutral Pronouns Come From
We tend to think of “they,” “Mx.,” and “hir” as recent inventions. But English speakers have been looking for better ways to talk about gender for a long time.

Until relatively recently, gender-neutral pronouns were something people used to describe others—mixed groups, or individuals whose gender was unknown—not something people used to describe themselves. But even though people did not, at the time, personally identify as nonbinary in the way we understand it today (though some identified as “neuter”), neutral pronouns existed—as did an understanding that the language we had to describe gender was insufficient. For more than three centuries, at least, English speakers have yearned for more sophisticated ways to talk about gender.
Likely the oldest gender-neutral pronoun in the English language is the singular they, which was, for centuries, a common way to identify a person whose gender was indefinite. For a time in the 1600s, medical texts even referred to individuals who did not accord with binary gender standards as they/them. The pronoun’s fortunes were reversed only in the 18th century, when the notion that the singular they was grammatically incorrect came into vogue among linguists.
In place of they, though, came a raft of new pronouns. According to Dennis Baron, a professor emeritus of English at the University of Illinois at Urbana-Champaign who wrote the definitive history of gender-neutral pronouns in his book What’s Your Pronoun?, English speakers have proposed 200 to 250 pronouns since the 1780s. Although most petered out almost immediately after their introduction, a few took on lives of their own.
Thon—short for that one—has resurfaced frequently since an attorney named Charles Converse first introduced it as a more elegant way of writing he or she. Converse claimed to have coined the word as far back as 1858, but it didn’t actually appear publicly in a magazine until 1884. The word made a splash in grammarian circles, and more than a decade later the publisher Funk & Wagnalls added thon to its dictionaries. “There was a sort of band of followers” for the word, Baron told me. “Through the 1950s and into the 1970s, there were prominent people in the U.S. who every once in a while would promote thon.”
The Sacramento Bee used the gender-neutral hir from the 1920s to the ’40s. Mx.—the gender-neutral equivalent of Mr. or Mrs.—was first recorded in an April 1977 edition of the magazine The Single Parent.
Many of these early pronouns were created either for linguistic simplicity or to include women, but none that Baron tracked from before recent decades had the explicit goal of encompassing a larger diversity of genders. “I’m sure there were people who said, ‘Hey, these pronouns aren’t me,’ but we don’t have a record of what they did after that,” Baron said. But “that doesn’t mean people weren’t talking about it.”
READ ENTIRE ARTICLE AT THE ATLANTIC
The Confederate Project
What the Confederacy actually was: a proslavery anti-democratic state, dedicated to the proposition that all men were not created equal.

The story of the Confederate States of America, the proslavery, antidemocratic nation created by white Southern slaveholders to protect their property, has been told many times in heroic and martial narratives. Stephanie McCurry tells a very different tale of the Confederate experience. Confederate Reckoning: Power and Politics in the Civil War South tells the real story of what the Confederacy actually was— a proslavery anti-democratic state, dedicated to the proposition that all men were not created equal.
Something stunning — epic even — transpired in the American South between 1860 and 1865. There, in a gamble of world historical proportions, a class of slaveholders, flush with power, set out to build an independent proslavery nation but instead brought down the single most powerful slave regime in the Western world and propelled the emergence of a new American republic that redefined the very possibilities of democracy at home and abroad. In the process, too, they provoked precisely the transformation of their own political culture they had hoped to avoid by secession, bringing into the making of history those people — the South’s massive unfranchised population of white women and slaves — whose political dispossession they intended to render permanent. The story of the Confederacy is a story of intentions, reversals, undoing, and unlikely characters that form an arc of history rarely matched for dramatic interest and historical justice.
The short-lived Confederate States of America was a signal event in the history of the Western world. What secessionists set out to build was something entirely new in the history of nations: a modern proslavery and antidemocratic state, dedicated to the proposition that all men were not created equal. Confederates were fully caught up in the turbulent currents of history that roiled the hemisphere in the age of emancipation; their proslavery experiment was part of a far larger struggle then being waged over slavery, democracy, and the powers of nation-states. Theirs was a nation founded in defiance of the spirit of the age. Emboldened by the “failure” of emancipation in other parts of the hemisphere, convinced that the American vision of “the people” had been terribly betrayed, Southern slaveholders sought the kind of future for human slavery and republican government no longer possible within the original Union. Theirs was to be a republic perfectly suited to them as a slaveholding people, a republic of white men, defined by slavery and the political exclusion of the mass of the Southern people.
READ ENTIRE ARTICLE AT THE CONFEDERATE PROJECT
How Women Got Crowded Out of the Computing Revolution
Blame messy history for the gender imbalance bedeviling Silicon Valley.

Who wrote the first bit of computer code? That honor arguably belongs to Ada Lovelace, the controversial daughter of the poet Lord Byron. When the English mathematician Charles Babbage designed a forerunner of the modern computer that he dubbed an “Analytical Engine,” Lovelace recognized that the all-powerful machine could do more than calculate; it could be programmed to run a self-contained series of actions, with the results of each step determining the next step. Her notes on this are widely considered to be the first computer program.
This division of labor — the man in charge of the hardware, the woman playing with software — remained the norm for the founding generation of real computers. In 1943, an all-male team of researchers at the University of Pennsylvania began building ENIAC, the first general-purpose computer in the U.S. When it came time to hire programmers, they selected six people, all women. Men worked with machines; women programmed them.
“We didn’t think we should spend our time worrying about figuring out programming methods,” one of ENIAC’s architects later recalled. “There would be time enough to worry about those things later.” It fell to the women to worry about them, and this original team of women made many signal contributions, effectively inventing the field of computer programming. But programming had no cachet or notoriety; it certainly wasn’t seen as inspiring work, as historian Janet Abbate’s account of this era makes clear.
Sexist stereotypes are used today to justify not hiring women programmers, but in the early years of the computer revolution, the stereotypes cut precisely the opposite way — and not without the encouragement of women themselves. Early managers became convinced that women alone had the skills to succeed as programmers. Still, the work was considered glorified clerical labor.
READ ENTIRE ARTICLE AT BLOOMBERG
Where Are the Women? Past Choices That Shaped the Historical Record
When women are missing from the history we tell, sometimes it’s because of how their stories were preserved and told in the past.

Women are missing from our history textbooks and public memory—and not necessarily because their stories haven’t been told. Sometimes it’s because of how their stories were preserved and told in the past. Understanding decisions earlier generations made that hinder our ability to find women’s stories can make it easier for us to rediscover and tell them today.
Like us, writers in the past believed that whose stories got told, and how, mattered. Those convictions motivated friends, family, and activists to preserve women’s written and spoken words. They also led authors to craft narratives and to edit women’s documents in ways that would be relevant or persuasive to their audiences.
Take the example of Isabella Graham. Born in Scotland in 1742 and married in 1765 to a British army physician, Graham followed her husband and his regiment to Canada, Fort Niagara, and, on the eve of the American Revolutionary War, Antigua. After her husband’s death there in 1773, she returned with her four children to Scotland. She moved for the final time to New York in 1789. There she opened a boarding school where George and Martha Washington and other prominent families sent their girls. Then she became a trailblazer in founding charities aiding women and children. Graham died in 1814, and over the following decades she was remembered as a philanthropist, an educator, and an evangelical role model. Well-known in the 19th century and remembered into the 20th, she is now unknown to most outside of early American historians.
READ ENTIRE ARTICLE AT AMERICAN HISTORICAL ASSOCIATION
Guess Who’s Going to Space With Jeff Bezos?
Wally Funk has been ready to become an astronaut for six decades.

In the beginning, the small group of Americans who aspired to become astronauts had to pass an isolation test. Spaceflight wasn’t going to be easy, and the country wanted people with tough minds.
For his test, John Glenn sat at a desk in a dark, soundproofed room. He found some paper in the darkness, pulled a pencil out of his pocket, and spent the test writing some poems in silence. He walked out three hours later.
For her test, Wally Funk floated inside a tank of water in a dark, soundproofed room. She couldn’t see, hear, or feel anything. She emerged 10 hours and 35 minutes later, not because she was done, but because the doctor administering the test decided it might be time to pull her out.
Glenn went on to become the first American to orbit Earth and one of the most recognizable names in American spaceflight. He died in 2016, the last member of NASA’s first class of astronauts. Funk never flew to space, and most people had probably never heard of her until today, when Jeff Bezos announced that Funk would join him on his journey to space, aboard his own rocket, built by his company Blue Origin. Three weeks from now, Funk will blast off into the sky, experience a few glorious minutes of weightlessness, and come back down. At 82, she is poised to become the oldest person to fly to space—a record currently held by Glenn, who went to space for the last time when he was 77 years old.
Glenn and Funk took their isolation tests during the exhilarating beginnings of the American space program, but their experiences didn’t overlap. Glenn was part of the Mercury program, NASA’s first attempt to send men—and, at the time, only men—to space, while Funk participated in a privately funded project meant to see how women held up to the pressures of spaceflight. Randy Lovelace, the doctor in charge of the effort, had worked with NASA’s male astronauts, and he suspected that women might fare just as well as, or even better than, men in the tiny, cramped spaceships that NASA had planned; women are, on average, lighter and smaller, and would require less food and oxygen, scientists suggested at the time. Lovelace recruited female pilots under 40, matching the age requirements of NASA’s real astronaut corps. In the early 1960s, Funk and the others underwent the same intense barrage of physical and psychological tests that Lovelace had developed for the NASA men. The screenings were meant to push participants to exhaustion; since no one yet knew the toll spaceflight would take on a human body, sending astronauts at peak fitness seemed like an important hedge for success.
READ ENTIRE ARTICLE AT THE ATLANTIC
Pearl Jam
In the twentieth century, the mollusk-produced gem was a must have for members of WASP gentility. In the twenty-first century, its appeal is far more inclusive.

Graduation and wedding season is upon us. In my family, a family of girls, that means pearls. Months ago, I opened the red leather case containing the set that belonged to my late mother. For the first time in a number of years I gazed at them, nestled in the soft suede lining. Even after so long tucked away on the top shelf of my closet, they seemed lit from within, almost alive. How my mother loved them. Not only were her pearls, a stunning thirty-inch “opera length” strand, a thing of beauty and expense, and a sensual pleasure to wear, they were a class signifier, subtly announcing to the world that she and my father had “arrived.” In considering her strand of pearls, I found myself preoccupied, not for the first time, with their organic beauty, history, and complex, often contested, always changing, social meanings—as layered and variable as the nacre from which every pearl is formed.
Archaeologists have excavated pearls, pierced and otherwise, from graves dating back as far as 5,000 BCE. For more than two thousand years, people have actively harvested or “fished” them, including in the Persian Gulf off the coast and islands around Bahrain. From there they were exported to cities all over India and China, as well as to Persia, Arabia, and Turkey. Well into the eighteenth century the pearling industry was the sole support of most Middle Eastern sheikhdoms; pearl fisheries and their associated markets scaffolded an emerging economy of luxury that reached its peak in early modern times, from roughly the fifteenth through the eighteenth centuries, when pearls were fished and traded the world over, across Asia, the Pacific, and the Americas.
READ ENTIRE ARTICLE AT JSTOR DAILY
Activism
Massachusetts Debates a Woman’s Right to Vote
A brief history of the Massachusetts suffrage movement, and its opposition, told through images of the time.

For over a century Americans debated whether women should vote. They wondered: was voting compatible with women’s traditional domestic work? If women participated in politics, would men continue as heads of the family? Would women remain virtuous and “feminine” or would they start to look and act like men?
In Massachusetts, suffragists were especially powerful. In 1850, Worcester hosted the first national women’s rights convention. Later, Lucy Stone led the nation’s largest suffrage organization and edited the longest-running women’s rights newspaper from her Park Street office. In 1895, fellow Bostonian Josephine Ruffin founded one of the first national groups to advocate for the rights of women of color.
Local anti-suffragists proved influential too. Their arguments against extending the vote to women dominated legislative debates and newspaper articles. In 1895, Massachusetts men and women formed the nation’s first organized anti-suffrage association.
This online presentation highlights the fight over a woman’s right to vote in Massachusetts by illustrating the arguments made by suffragists and their opponents. Women at the polls might seem unremarkable today, but these contentious campaigns prove that suffragists had to work hard to persuade men to vote to share the ballot. These century-old arguments formed the foundations for today’s debates about gender and politics.
Please note: This online presentation was derived from an exhibition, “Can She Do It?”: Massachusetts Debates A Woman’s Right to Vote, which was on display at the Massachusetts Historical Society between 26 April 2019 and 21 September 2019. This website does not show everything that was part of the exhibition.
READ ENTIRE ARTICLE AT MASSACHUSETTS HISTORICAL SOCIETY
Her Crazy Driving is a Key Element of Cruella de Vil’s Evil. Here’s Why.
The history of the Crazy Woman Driver trope.

Disney’s upcoming live-action film, “Cruella,” is a prequel to “101 Dalmatians.” Emma Stone plays the title character of Cruella de Vil, showing her backstory as a budding designer in 1970s London. Set in the anti-establishment punk rock era, the film reimagines the fur-obsessed puppy-napper as a sympathetic and even feminist figure. Or at least she appears to be a heroine worth rooting for. That is, until we see her behind the wheel.
The tale traces de Vil’s rise from scrubbing floors to fashion stardom and then to car theft, as she hot-wires a stolen neoclassic luxury car and skids through the streets of London.
This view of an unhinged de Vil driving madly through the city is a familiar one. Within the first seconds of the “Cruella” trailer, we see a Panther De Ville with a vanity license plate spelling out DEVIL. Likewise in the “101 Dalmatians” films, both the 1961 cartoon and 1996 live-action adaptation, de Vil’s car made its way down the road before she even appeared. It is the car that stands in for de Vil herself, in all of its glamour, excess and danger.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
When Lesbians Led the Women’s Suffrage Movement
In 1911, lesbians led the nation’s largest feminist organization. They promoted a diverse and inclusive women’s rights movement.

In 1911, a team of three women with “lesbian-like” relationships – Jane Addams, Sophonisba Breckinridge and Anna Howard Shaw – took control of the suffrage movement, leading the nation’s largest feminist organization. They promoted a diverse and inclusive women’s rights movement.
My research suggests that the personal lives of these suffrage leaders shaped their political agendas. Rather than emphasizing differences of gender, race, ethnicity and class, they advanced equal rights for all Americans.
Suffrage scholarship has long acknowledged a shift “from justice to expediency” – from an emphasis on natural rights to an emphasis on gender distinctions – in the movement at the turn of the century.
The 1848 Declaration of Sentiments, a founding document of the suffrage struggle, proudly insisted that “all men and women are created equal.”
READ ENTIRE ARTICLE AT THE CONVERSATION
Intellectual, Suffragist and Pathbreaking Federal Employee: Helen Hamilton Gardener
Gardener’s public service did not end with her lifelong advocacy for women’s equality, but continued even after her death.

Courageous, risk-taking women have long led the ongoing struggle for gender equality in the United States. While Arlington National Cemetery (ANC) is most widely known as the resting place of many male military heroes, it also includes the graves of numerous prominent, pioneering women who were heroes in their own right. One such woman was Helen Hamilton Gardener (buried in Section 3).
An intellectual, activist and champion of women’s rights, Helen Hamilton Gardener used her life experiences as inspiration for the social change she strongly advocated. Born Mary “Alice” Chenoweth, she sought independence early on by training at the Cincinnati Normal School to become a schoolteacher. At the time, teaching was one of the few acceptable paid professions for young women to pursue. She graduated in 1873 and took a position as a teacher in Sandusky, Ohio, where she quickly rose to become the principal of Sandusky’s new teacher training school.
However, Chenoweth’s success in Sandusky turned out to be short-lived, despite her excellent performance. After newspapers exposed her for having an affair with a married man, she resigned from her position. At the time, such accusations ruined a woman’s professional and moral reputation. Undeterred, she changed her name to Helen Hamilton Gardener. She refused to let her status as a “fallen woman” define her, and under her new name, she spent the rest of her life pushing against the social, sexual and religious norms that limited women’s independence.
As part of her evolution, Gardener immersed herself in the freethought movement, an intellectual movement that advocated for freedom of thought, secularism and the importance of science. She read widely, independently advancing her education, and she eventually became a protégé of Robert Ingersoll, a leader in the freethought movement who supported women’s rights. With Ingersoll’s support, Gardener emerged as a popular speaker. She steadily gained national prominence and notoriety, as well as criticism for espousing views that many found unseemly coming from a woman.
While Gardener’s speaking career included hardships and setbacks, she eventually flourished and also became a published writer of both non-fiction and fiction. Much of her writing focused on issues related to women’s rights. For example, in an 1887 letter to the editor of Popular Science Monthly, Gardener publicly sparred with William A. Hammond, former surgeon general of the U.S. Army (buried in Section 1 of ANC). She critiqued an article he had written, in which he claimed that physiological differences in the female brain made women intellectually inferior to men and unsuited to study subjects such as math. The issue escalated into a debate between Gardener and Hammond in Popular Science Monthly, and while the publication allowed Hammond the final say, the incident inspired Gardener to take a bold step. Upon her death, she willed her brain to Cornell University so it could be studied as an example of the brain of an accomplished female intellectual. According to the New York Times, Gardener believed that scientists had not had enough opportunities to study the brains of “the women who think,” and she hoped her brain would fill this scientific gap.
READ ENTIRE ARTICLE AT ARLINGTON NATIONAL CEMETERY
‘Free To Be You and Me’ 40th Anniversary: How Did a Kids Album By a Bunch of Feminists Change Everything?
Forty years ago this fall, a bunch of feminists released an album. They wanted to change … everything.

“Why are your toes painted like that?”
The question came from the neighbors’ kid Cam, a fourth grader friendly with my children, as a group of us parents sat in his living room drinking wine one afternoon in June. He was sprawled on the couch, sweaty and red-faced from wrestling with his little brother, and he’d noticed that each of my toes sported a different bright color of nail polish.
“I painted them!” my younger daughter exclaimed.
“It’s true, she did,” I said. “Harper really likes painting nails, so I let her do mine.”
I’ve modeled Harper’s salon skills for the past few summers. I like that she takes the task so seriously, choosing colors from a Ziploc bag of polish we keep on a high shelf in the bathroom and applying them carefully to my big, gross toenails.
“But …” Cam began, pausing to consider what his question really was. He seemed torn between viewing me as an object of pity and a key to unlocking life’s mysteries. “But don’t your friends make fun of you?”
“Oh,” I said, putting on a casual air, even though the conversation seemed unexpectedly important all of a sudden. “No, not really. When you get older, you have a different relationship with your friends than you do when you’re a kid.” Now I paused to consider. “Or maybe when you’re a grown-up, you just choose friends who understand the things you do.”
The ‘Undesirable Militants’ Behind the Nineteenth Amendment
A century after women won the right to vote, The Atlantic reflects on the grueling fight for suffrage—and what came after.

Some mornings, President Woodrow Wilson would shut his eyes as he rode past the women who had assembled once again outside the White House. Occasionally he tipped his hat to them. He didn’t want a confrontation, but by the spring of 1917 it was clear that they weren’t going away. Wilson had claimed earlier in his presidency that he wasn’t aware women even wanted the vote. Plausible deniability was no longer an option.
The women had first appeared in the chill of January, silently holding banners that said: “Mr. President, you say liberty is the fundamental demand of the human spirit,” and “Mr. President! How long must women wait for liberty?” Bouts of miserable weather and jeering passersby came and went. The protests continued.
June brought chaos. After months of fragile peace, police started loading the women into paddy wagons. By autumn, hundreds of women had been arrested for obstructing the sidewalk outside the White House. Many of them were sent to prison. Newspapers reported that women were tortured at Occoquan, the Virginia workhouse where several prominent suffragists served time. The idea was “to break us down by inflicting extraordinary humiliation upon us,” Eunice Brannan told The New York Times after her release, in November. Brannan and others described being beaten repeatedly, dragged down stairs, thrown across rooms, kicked, manacled to prison-cell bars, denied toothbrushes, and forced to share a single bar of soap. Drinking water came from a dirty pail that sat in a common area. The guards, some of them marines from nearby Quantico, warned the women they’d be gagged and put in straitjackets if they spoke. Bedding was never washed, and the beans and cornmeal served to prisoners were crawling with maggots. “Sometimes the worms float on top of the soup,” one woman wrote in an affidavit. “Often they are found in the cornbread.” It wasn’t until the following spring that the D.C. Court of Appeals deemed the arrests unconstitutional.
READ ENTIRE ARTICLE AT THE ATLANTIC
The Power Suit’s Subversive Legacy
Women have long borrowed from men’s dress to claim the authority associated with it. It hasn’t always worked.

Power dressing is usually associated with white-collar women of the 1970s and ’80s—women who dressed for success. This look was iconic in its caricature: architectural shoulder pads, big hair, double-breasted jackets, and sensible heels. In film and television, power dressing suggested that hard work and a little feminist ingenuity were enough to propel a woman to the top. She’s Dolly Parton in Nine to Five, Melanie Griffith in Working Girl, Faye Dunaway in Network. She’s Grace Jones on the runway, Elaine Benes with a cigar between her teeth, heels kicked up on J. Peterman’s desk.
But power dressing appeared, in different forms, long before it became popular near the end of the 20th century. Women entered the workforce during the two world wars, only to be expelled from it (or sexualized in it) by the mid-century. Women have returned to the workplace many times since, whether thanks to women’s lib, through small-business ownership, or, most recently, by “leaning in.” In each case, defying the male gaze in the workplace and public life took serious negotiating. That negotiation began, in part, with power suits.
READ ENTIRE ARTICLE AT THE ATLANTIC
What Betty Friedan Knew
Judge the author of the “Feminine Mystique” not by the gains she made, but by her experience.

What prods our feminist foremothers into the light of reappraisal? The condition of women in the world is a changing state of affairs, except of course, depressingly, in all the ways it isn’t. Accordingly, the relevance of any departed feminist thinker to the present moment takes one of two forms. First, she’s to be resurrected because the injustice she once delineated has been substantially overcome; we can beatify her for having slain the dragon of this or that historical inequality. In other words, we find in her a sort of relevance of irrelevance.
Second, and conversely, her claim to our attention could be based on the persistence of some particular problem. Sexual violence? Unequal pay? Or, perhaps most urgently, the assault on reproductive rights? As last year’s Dobbs decision demonstrated, when it comes to justice, the arc of history sometimes bends right back on itself. “Relevance,” in this second realm of persistent problems, often proceeds from twinned, faulty assumptions: that the only reason a person might be induced to revisit feminist thought from the ’60s or ’70s is for the way it speaks directly and digestibly to the now; and that the point of the exercise is only to confirm, in impotent resignation, that little has changed, things remain terrible, and we need so-and-so “now more than ever”! In this way, things collapse into gynopessimism—which holds that nothing can really alter the fundamentally subordinate status of women.
READ ENTIRE ARTICLE AT THE NEW REPUBLIC
How Second-Wave Feminism Inexplicably Became a Villain in the #MeToo Debate
Talking sexism, ageism, and progress with Katha Pollitt.

In a recent article for Jezebel, Stassa Edwards wrote that “[t]he backlash to #MeToo is indeed here and it is liberal second-wave feminism.” Her piece followed a number of stories from female writers in their 40s and older—such as Daphne Merkin—taking issue with some aspects of the #MeToo movement. In Merkin’s words, there has been a “reflexive and unnuanced sense of outrage that has accompanied this cause from its inception, turning a bona fide moment of moral accountability into a series of ad hoc and sometimes unproven accusations.” Merkin’s sentiment and others like it outraged a number of (often younger) feminists. (A particular source of ire was the anticipated publication of a Katie Roiphe story in Harper’s, which was expected to be critical of #MeToo and out the person who started the Shitty Media Men list; the creator of that list, Moira Donegan, ended up outing herself instead.)
To talk about all this, I spoke by phone recently with Katha Pollitt, a longtime poet and columnist for the Nation who often writes about feminism and whose most recent book is Pro: Reclaiming Abortion Rights. During the course of our conversation, which has been edited and condensed for clarity, we discussed whether many of #MeToo’s critics are really feminists, what the moment needs to be even more effective, and why differences between younger and older activists are so hard to bridge.
Isaac Chotiner: What have you made of the generational tensions or differences between different waves of feminism that have arisen lately?
Katha Pollitt: I’m a little bewildered by it, for several reasons. One is that second-wave feminist is being used as a synonym for woman writer of a certain age. I mean, Katie Roiphe is not a second-waver. Daphne Merkin, Andrea Peyser—these women are not feminists at all, in my view. And they are not old enough to be second-wavers. I mean Katie Roiphe was minus 5 years old when The Feminine Mystique was published. So I think I would wish that the young women who are making this claim would read a little bit of history. I found it very offensive when Katie Way, who was the author of that piece on Babe.net about Aziz Ansari, insulted Ashleigh Banfield by calling her a “burgundy lipstick bad highlights second-wave feminist has-been.” I mean, it’s at that point you want to say, “Hello, my pretties, soon you too will be wearing the burgundy lipstick.”
“Take Me Out to the Ball Game”: The Story of Katie Casey and Our National Pastime
The little-known story of one of the best known sing-along songs, and its connection to women’s suffrage.

Every summer, in ballparks across the country, a familiar refrain is heard during the seventh inning of every game:
Take me out to the ball game,
Take me out with the crowd.
Buy me some peanuts and Cracker Jack,
I don’t care if I never get back,
Let me root, root, root for the home team,
If they don’t win it’s a shame
For it’s one, two, three strikes, you’re out,
At the old ball game.
Although this memorable chorus of peanuts and Cracker Jack is part of our national consciousness, the song’s little-known verses tell a deeper story, about a woman and her desire to be part of the rooting crowd. Her name was Katie Casey, and in 1908 she was arguably baseball’s biggest fan.
Katie Casey was base ball mad,
Had the fever and had it bad;
Just to root for the home town crew,
Ev’ry sou Katie blew.
On a Saturday, her young beau
Called to see if she’d like to go,
To see a show, but Miss Kate said, “No,
I’ll tell you what you can do:[1]
“Take Me Out to the Ball Game” was Katie’s well known reply, but in 1908, a woman at the ballpark rooting and cheering was neither a common sight, nor was it fully accepted. “Take Me Out to the Ball Game” advertises just the opposite: that a woman’s place was indeed in the grandstand at the ballpark and not just safe at home.
Up Against the Centerfold
What it was like to report on feminism for Playboy in 1969.

I didn’t realize yet that the issue of Playboy’s distortion of the women’s rights battle and its casual byproduct—the refusal to allow me my own voice in an article I’d been assigned to write and which had been accepted by the editors who assigned it—was much bigger than my personal defeat. For starters, Playboy’s mistreatment of the women’s point of view had more relevance to the women who worked in Hefner’s empire than to me.
I withdrew my article: it would not be published. I soon learned Playboy hired Morton Hunt to write a piece expressing Hefner’s point of view.
Months later I heard a TV news broadcaster announce, “Playboy employee quits over exploitation of women by Hefner.”
I wasn’t sure I’d heard the words right, but sure enough, Shelly Schlicker, a Playboy secretary, had been caught late one night in Playboy offices xeroxing Hugh Hefner’s memo about my article. Playboy’s article on the new feminism, titled “Up Against The Wall, Male Chauvinist Pig!” had just appeared and Schlicker wanted to get Hefner’s revealing memo about my piece to the press.
Her story was picked up by many underground newspapers and by Newsweek, which praised my courage and quoted a Hefner spokesman that HH still stood behind his memo. Hunt’s article astonishingly concluded by relegating the women’s movement to “the discard pile of history”; in response, Newsweek called it a “long, rambling, and rather dull article.” (A year later Newsweek hired me as their second woman writer ever.)
I laughed. Playboy’s title for Hunt’s piece (“Up Against The Wall, Male Chauvinist Pig!”) was the kind of invective I was too “ladylike” to hurl at the magazine’s editors. But I would have loved to have had the chutzpah to have said, “Up against the centerfold, MCP.”
READ ENTIRE ARTICLE AT JEZEBEL
Let Us Mate
Proposal advice from Inez Milholland, originally published in the Chicago Day Book, 1916.

How will the leap-year girl propose? How will the leap-year bachelor act when she lays her heart at his feet?
The beautiful Inez Milholland Boissevain, who proposed three times, answers the first question, and her happy husband, Eugen Boissevain, replies to the second.
Everybody in America knows Inez Milholland, the beautiful suffragist who started her career by stirring up feminist agitation in staid old Vassar College, scorned New York society, practiced law in the police courts, and finally got herself jailed in England for rioting with Mrs. Pankhurst’s “wild women.”
There was amazement in the suffrage ranks when she suddenly became the wife of Eugen Boissevain. Feminists marveled that their dashing young general had consented to become the mere wife of a mere man. They wondered how young Mr. Boissevain did it.
Then came the most amazing act of Inez’s amazing career. She calmly announced that she did the proposing. This most beautiful of all suffragists further admitted that she had to pop the question three times before the man of her choice accepted.
READ ENTIRE ARTICLE AT LAPHAM’S QUARTERLY
How Mrs. Claus Embodied 19th-Century Debates About Women’s Rights
Many early stories praise her work ethic and devotion. But with Mrs. Claus usually hitting the North Pole’s glass ceiling, some writers started to push back.

Clement Clarke Moore’s 1823 poem “Account of a Visit from St. Nicholas” redefined Christmas in America. As historian Steven Nissenbaum explains in “The Battle for Christmas,” Moore’s secular St. Nick weakened the holiday’s religious associations, transforming it into a familial celebration that culminated in Santa Claus’ toy deliveries on Christmas Eve.
Nineteenth-century writers, journalists and artists were quick to fill in details about Santa that Moore’s poem left out: a toy workshop, a home at the North Pole and a naughty-or-nice list. They also decided that Santa Claus wasn’t a bachelor; he was married to Mrs. Claus.
Yet scholars tend to overlook the evolution of Santa Claus’ spouse. You’ll see brief references to a handful of late-19th-century Mrs. Claus poems – especially Katharine Lee Bates’ 1888 “Goody Santa Claus on a Sleigh Ride.”
But as I discovered when I began work on a class about Christmas in literature, the writers who created Mrs. Claus were not just interested in filling in the blanks of Santa’s personal life. The poems and stories about Mrs. Claus that appeared in newspapers and popular periodicals spoke to women’s central role in the Christmas holiday. The character also provided a canvas to explore contemporary debates about gender and politics.
READ ENTIRE ARTICLE AT THE CONVERSATION
What We Want Is to Start a Revolution
Formed in 1912 for “women who did things—and did them openly,” the Heterodoxy Club laid the groundwork for a century of American feminism.

Early in the twentieth century there lived in Greenwich Village a few hundred women and men who were bent on making a revolution not so much in politics as in consciousness. Among them were artists, intellectuals, and social theorists for whom the words free and new had achieved a reverential status. “Free speech, free thought, free love; new morals, new ideas, New Women”: these phrases had become catechisms, crusading slogans among those flocking to the Village in the early 1900s, many of whose names—Eugene O’Neill, Mabel Dodge, John Reed, Edna St. Vincent Millay, Max Eastman and his sister, Crystal—are now inscribed in the history of the time. To experience oneself through open sexuality, irreverent conversation, eccentricity of dress; to routinely declare oneself free to not marry or have children, free to not make a living or vote—these were the extravagant conventions of American modernists then living in downtown New York.
Most of these people considered themselves socialist sympathizers at the same time that they placed individual consciousness at the center of their concerns. They did not read Marx anywhere near as much as they read Freud, yet a major difference between them and European modernists was the expectation that in America social change would occur as much through progressive politics as through the arts. The push for labor reform especially provided drama enough for Greenwich Village radicals to feel themselves living an urgent life while in its service. There were headline-making strikes in those years—the shirtwaist makers’ strike in New York in 1909; the textile strike in Lawrence, Massachusetts, in 1912; the Paterson, New Jersey, silk workers’ strike of 1913. All saw Village theorists on the picket line, Village painters producing propaganda art, Village journalists feeding strikers’ children—and always among them a significant number of the women who belonged to the Heterodoxy Club. The life and times of this club is the subject of Joanna Scutts’s lively and absorbing new social history, Hotbed.
READ ENTIRE ARTICLE AT NEW YORK REVIEW
The Powerful, Complicated Legacy of Betty Friedan’s ‘The Feminine Mystique’
The acclaimed reformer stoked the white, middle-class feminist movement and brought critical understanding to a “problem that had no name.”

Is it possible to address a “problem that has no name”? For Betty Friedan and the millions of American women who identified with her writing, addressing that problem would prove not only possible, but imperative.
In her acclaimed 1963 book, The Feminine Mystique, Friedan tapped into the dissatisfaction of American women. The landmark bestseller, translated into at least a dozen languages with more than three million copies sold in the author’s lifetime, rebukes the pervasive post-World War II belief that women would find their greatest fulfillment in the routine of domestic life, performing chores and taking care of children.
Her indelible first sentences would resonate with generations of women. “The problem lay buried, unspoken, for many years in the minds of American women. It was a strange stirring, a sense of dissatisfaction, a yearning that women suffered in the middle of the twentieth century in the United States.” Friedan’s powerful treatise appealed to women who were unhappy with their so-called idyllic life, addressing their discontent with the ingrained sexism in society that limited their opportunities.
Now a classic, Friedan’s book is often credited with kicking off the “second wave” of feminism, which raised critical interest in issues such as workplace equality, birth control and abortion, and women’s education.
Friedan, who died in 2006, would have celebrated her 100th birthday this month. At the Smithsonian’s National Museum of American History, a tattered, well-read copy of The Feminine Mystique, donated by former museum curator Patricia J. Mansfield, is secured in the nation’s collections of iconic artifacts. It was included in the museum’s exhibition “The Early Sixties: American Culture,” which was co-curated by Mansfield and graphic arts collection curator Joan Boudreau and ran from April 25, 2014, to September 7, 2015.
READ ENTIRE ARTICLE AT SMITHSONIAN MAGAZINE
Suffrage Movement Convinced Women They Could ‘Have it All’
More than a century later, they’re still paying the price.

Recent studies on the impact of the novel coronavirus on American families reveal that women are being stretched very thin. During this pandemic, they are working more than men: caring for older or sick family members, teaching children, maintaining homes and keeping up with full-time jobs. Now parents are returning to work and scrambling to improvise child care, prompting many women to decrease their hours or even leave their jobs.
Although the global pandemic has dramatically exacerbated these problems, the reality is that women have always had to shoulder more than men. They have had to manage the affairs of the home as they juggle a wealth of responsibilities in society at large.
While this work is a necessity for most families today, there was a time when many Americans resisted the idea of women doing anything outside of the confines of home. The women’s voting rights movement radically transformed Americans’ views on this issue, enabling women’s greater participation in society. But, in doing so, suffragists — activists who fought for the vote — entrenched an impossible ideal of “having it all” that persists today.
READ ENTIRE ARTICLE AT THE WASHINGTON POST
Masher Menace: When American Women First Confronted Their Sexual Harassers
The #MeToo movement is not the first time women have publicly stood up to sexual harassment.

In the late 19th century, from the moment that American women were granted the freedom to leave their houses unescorted, they encountered a pest known as “the masher.” Generally a smarmy, mustachioed fop, this unfamiliar man winked at or brushed up against a shop girl on the streetcar, loomed over and stalked a working woman walking down the street, or called out “hey turtle-dove” to teenage girls. The most galling mashers groped, hugged, and kissed any girl or woman they declared irresistible.
Today—almost 150 years later—we’re realizing that the masher, who is now called a “sexual harasser,” never went away. Thanks to women and men like the ones Time magazine dubbed “The Silence Breakers,” we know the most prominent mashers can be found watching TV in the White House and glad-handing potential voters at state fairs. They’re in Hollywood casting calls, on print and radio editorial staffs, sitting in the anchor chair on TV news programs, and commanding tech company board meetings. We know they also badger women with very little power to speak out, those who work at restaurants and retail spaces, on cleaning staffs, and in the fields.
Thanks to the #MeToo Movement, women and men are slowly becoming empowered to expose decades of abuses they’ve suffered from powerful men—and to demand those abusers lose their jobs. Every day, more and more politicians, celebrities, journalists, and influencers are being outed as perpetrators of sexual harassment and sexual assault. The American media is finally taking victims’ claims seriously in a way that wasn’t imaginable just a year ago. For some harassers, the consequences have been swift; others remain unscathed.
READ ENTIRE ARTICLE AT COLLECTORS WEEKLY
Who Killed the ERA?
A review of “Divided We Stand: The Battle Over Women’s Rights and Family Values That Polarized American Politics.”

In the summer of 1968, George Wallace, in between terms as governor of Alabama, concluded that endorsing the Equal Rights Amendment for women would help his third-party presidential campaign. He declared his support in a telegram to Alice Paul, the head of the National Women’s Party, who had cowritten the first draft of the amendment in 1923 and had been campaigning for it for forty-five years. The pro-segregationist Wallace was hardly alone among conservative politicians in his position. Strom Thurmond, a Republican senator from South Carolina, likewise supported the amendment, saying in 1972 that it “represents the just desire of many women in our pluralistic society to be allowed a full and free participation in the American way of life.”
In fact, the Republican platform had supported the Equal Rights Amendment as far back as 1940; opposition had come mainly from pro-labor Democrats, who feared that equal treatment for men and women would mean an end to legislation that protected women from dangerous jobs. Labor opposition waned as the increasingly active feminist movement—frustrated that the Supreme Court had never interpreted the Fourteenth Amendment’s equal protection guarantee to apply to discrimination on the basis of sex—made passing the Equal Rights Amendment a top priority. In 1971 the House approved the ERA by a vote of 354–24. The Senate followed the next year by a vote of 84–8. The proposed amendment’s language was straightforward: “Equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.” The necessary ratification by three quarters of the states—the magic number of thirty-eight—looked eminently achievable.
READ ENTIRE ARTICLE AT THE NEW YORK REVIEW
Collected and curated by Bunk History