DMX Took a Trust Fall With His Music

He bore the kind of pain Black men rarely get to air in public, hoping that transparency would manifest the tenderness he desired.

Remembering some of the artists, innovators and thinkers we lost in the past year.

In “Sonnet 19,” the poet John Milton agonized over the loss of his vision, bemoaning the prospect that he would spend “half my days, in this dark world and wide,” bereft not only of his sight but of spiritual purpose. I hear an echo of Milton’s dark world not only in the title of the rapper Earl Simmons’s 1998 debut album, “It’s Dark and Hell Is Hot,” but also in the tortured substance of his concerns. The title suggests a man thrown into dangerous circumstances without the benefit of guidance from a higher power, where the only way to survive is to accept a devil’s bargain: Life here on Earth is possible, but only if he submits to a moral darkness that will condemn him to hell. The rapper was obsessed with how this quandary could hollow him out and ultimately consume him — thus his stage name, “DMX,” an acronym for “Dark Man X.”

The moral price of life in a fallen world was not a thought exercise for Simmons, who died this past April of a cocaine-induced heart attack. Born in Mount Vernon, N.Y., in 1970, he was the only son of Arnett Simmons and Joe Barker. Barker left, leaving Arnett — a teenager — to raise her child alone in Yonkers. She struggled with how to raise a Black son amid the poverty of the School Street projects, and subjected him to outrageous abuse. In a 2019 interview with GQ, Simmons recounted being beaten so badly by Arnett that she knocked his teeth out; he was 6 years old. The mistreatment cloaked his life in almost total, grinding fear. “You couldn’t be too confident in my situation,” Simmons said in the 2020 BET series “Ruff Ryders Chronicles.” “Confidence would get you beaten. Expression would get your ass whooped.”

The abuse begot criminal and antisocial behavior — Simmons once stabbed another kid in the face with a pencil — which in turn triggered more abuse. One summer, trying to discipline Simmons, Arnett locked him in his bedroom for months. He was allowed to leave only for bathroom breaks. In 1983, Arnett effectively severed their relationship when she took him to the Children’s Village group home on the pretense that they were just visiting. It was a trick: She left him there. “Right then and there,” Simmons remembered in “Chronicles,” “I learned to just put away, conceal, bury whatever bothered me. End of story. I think another side of me was born right there, that enabled me to protect myself.”

But a love of music was born at Children’s Village, too, and when he returned to Yonkers, two years later, he clicked up with a local rapper named Ready Ron. They would wander the streets, Ron rapping while Simmons beat-boxed behind him. Ron encouraged him to rap, but according to Simmons, he also betrayed the burgeoning 14-year-old artist by tricking him into smoking a crack-laced blunt. (Ron has denied this.) That incident initiated an addiction whose shadow would haunt his life. From the beginning, Simmons’s love of music was bound up with mistrust, dependence and aggression. He described wandering Yonkers, “looking for people to rob — and if I came across a rap battle, just as good.”

Between 1986 and 1990, Simmons shuttled between jail and the streets, writing songs all the while, until Joaquin (Waah) Dean, an aspiring music executive who had co-founded the record label Ruff Ryders, found Simmons through his demo tape. Simmons’s drug habit and criminal streak stalled his success, but he eventually secured a deal with Def Jam. He garnered a reputation as a battle rapper whose trademarks were an obsession with dogs, skillful modulation of speed and cadence and a snarling bark of a voice that conveyed a sense of lawless menace.

When Def Jam released “It’s Dark and Hell Is Hot,” it debuted at No. 1 on the Billboard 200 and went on to achieve quadruple platinum status. Simmons was a curious figure in an era still high on Puff Daddy’s luxurious vision for rap: an armed robber who rapped about crime’s corrosive spiritual effects in a voice that sounded as if it were coming from a serrated throat. His follow-up albums, “Flesh of My Flesh, Blood of My Blood” (1998) and “… And Then There Was X” (1999), each debuted at No. 1 and went multiplatinum. Between 1998 and 2003, in fact, each of his first five albums debuted at No. 1, making him the first artist ever to open a career with five consecutive chart-topping releases. DMX became just as popular as Notorious B.I.G. and Jay-Z by presenting himself as an instinctual but anxious bruiser with a sense that his sins were damning him. “When you do dirt, you get dirt,” he rapped on The Lox’s 1998 song “Money, Power & Respect.”

On other occasions, he thought of himself as a human sacrifice: The cover of “Flesh of My Flesh” depicted him bathed in blood, hands raised like Christ presenting his stigmata. Hollywood tried to turn Simmons into a movie star — he appeared in five films between 1998 and 2004, including Hype Williams’s visually seminal gangster morality play “Belly” — but the old miseries dogged him no matter the dizzying professional heights he reached. His success was followed by an equally dizzying fall from grace: continued addiction; arrests for animal cruelty, tax evasion, possession and a host of other crimes; and the complete squandering of his earnings.

On songs like “Ruff Ryders’ Anthem,” triumphant production obscures the way he wrestles with the demons that precipitated his fall. “Niggas wanna try, niggas wanna lie/Then niggas wonder why niggas wanna die/All I know is pain,” he proclaims in the first verse, positing dysfunction as a product of his brokenness. “How can I maintain with mad shit on my brain?” he asks. The song mixes images of criminal bravado with a shame and doubt that were DMX’s calling card. “Yeah, I know it’s pitiful,” he says of his behavior. On “The Convo,” he stages a dialogue with God about his wretchedness: “Here I am/Confused and full of questions/Am I born to lose/Or is this just a lesson?”

As littered with a truly shocking brand of misogyny and homophobia as his songs could be, they were also inventive in how they took the violent fantasies of subgenres like gangster rap and transformed them into music laden with vulnerability about Simmons’s own spiritual travails and mental-health struggles. On one song, he declares himself a “manic depressive with extreme paranoia.” In interviews, meanwhile, he was not shy in addressing his desire for an enduring intimacy, one that wouldn’t end in betrayal. In a recent interview with the rapper Talib Kweli, he recounted the story of Ready Ron with a frank confusion about how a man could do that to a child. It’s hard not to hear his music as a kind of trust fall, a hope that transparency regarding the pain he was in would manifest the tenderness he desired. In the director Christopher Frierson’s 2021 documentary “DMX: Don’t Try to Understand,” we see Simmons freestyling in a parking lot with a few younger rappers, weaving together stories of knotted frustration and resigned hopelessness: One of the younger rappers breaks down in tears, and DMX readily embraces him. “I barely know you,” he says. “But I love you.”

Ismail Muhammad is a story editor for the magazine.

The women who arrived at Rosalind Cartwright’s sleep laboratory in Chicago in 1978, carrying toothbrushes and pajamas, were in pain. They had left their husbands, or their husbands had left them. One morning, perhaps after a fitful night, they had turned to the classifieds in their local paper — many were now hard up — and seen an ad: Were they feeling blue over a separation or divorce? Were they willing to spend the night in a sleep lab?

In the past, Cartwright had had difficulty recruiting female volunteers. There was a stigma attached to sleeping outside the home in exchange for money; the women also had beehive hairdos that they didn’t want to mess up. These divorcing women, though, were undeterred. They submitted to the cold gel that technicians dabbed on their foreheads and scalps before attaching electrodes; they lay down in unfamiliar beds. Some were motivated mainly by the small payment being offered. But most weren’t there for that, Cartwright told an interviewer more than 30 years later. They wanted access to their dreams. Cartwright — who was married four times, twice to the same man — understood. “I felt bonded,” she said, “with that sample of women.”

Close to two decades earlier, Cartwright was in her late 30s when her second husband moved out. Depressed and sleeping poorly, she dreamed anxious dreams. The most practical solution to this problem, she decided, was to work through the night; her mother had loved recalling her dreams, and Cartwright had always been curious about their function. So she hired babysitters for her two young daughters and started her first sleep lab, at the University of Illinois College of Medicine, where she was a psychology professor. Using foam tiles, she converted the men’s bathroom of an empty psychiatric unit at the college into a bedchamber. “Right from the start, I felt at home watching the polygraph pens write out the sleepers’ patterns of brain waves, waiting for the dream indicators to begin,” she wrote in her 1992 book “Crisis Dreaming.” Over an intercom, she would then call the dreamer’s name and ask, “What was going through your mind just before I woke you?”

Dreams, Cartwright came to believe, weren’t random bursts of electrical activity, as some researchers had postulated. They weren’t memories being discarded to free up space in the brain (Francis Crick’s notion), nor were they manifestations of urges that people were too ashamed to admit even to themselves (Freud’s theory). Rather, she wrote, dreams were “designed not to erase experience but to highlight it, to help us monitor and update our internal picture of ourselves.”

To divorce is to have one’s self-image shattered, Cartwright knew, and that often leads to depression. By the time her inaugural group of divorcing women arrived at the Rush University Medical Center in Chicago, she was heading the department of behavioral sciences there — she insisted on being called “chairman,” not “chairwoman,” lest anyone think she wasn’t as powerful as her male counterparts — and had just founded one of the first sleep-disorder research and treatment centers. She would make major contributions to the understanding and treatment of sleep apnea; this included studying the snorers’ partners, who, she realized, were likely to be sleep deprived as well.

To be a female scientist leading a department, you had to be a rigorous investigator, persuasive, charismatic and, above all, tough. Cartwright was also a single parent. The recording on her home answering machine — she encouraged colleagues to phone after hours if they needed help — said that she would call back if she “wasn’t particularly cranky.” At the dining table, her daughters unfolded reams of EEG printouts, and she showed them where the spikes in activity meant a dream had begun.

More than half of those who experience a depressive episode recover without treatment. Cartwright wanted to see if the dreams of divorcing women who were depressed would predict whether this happened for them. (She would soon add male subjects.) If the dreams of those who recuperated and those who didn’t had different characteristics initially, dream reports might be a useful diagnostic tool. Dreams play a key role in regulating troublesome emotions, Cartwright and others observed. But how? Disparities between the dreams of those whose moods stayed low and those whose moods improved might offer clues.

One woman dreamed that she was a pencil being inserted into a pencil sharpener. Afterward, a researcher asked if that meant she was being ground up. “No,” the subject said, “I was getting sharp.” She had never balanced a checkbook or completed an income-tax return and had become an instrument to do so. This kind of nocturnal problem solving, Cartwright’s research showed, was a positive sign; people who remained depressed tended to be passive and unemotional in their dreams. Those whose depression abated also generally had longer dreams with more complex plots that seemed “almost like a rehearsal for recovery,” she wrote in her 2010 book “The Twenty-Four Hour Mind.” New images often mixed with those from the past: One woman ran from unseen threats through neighborhoods from her youth, now hung with barbed wire, dragging her children by the hand and banging on doors; another encountered her ex-husband at a high school party, and when he exposed himself, felt embarrassed for him and walked away. Their unconscious, it seemed, was stitching old memories together with feelings stirred up by recent events to create a new identity. The next day, these participants felt better.

Those whose dreams didn’t have such narrative qualities continued to struggle, and Cartwright sometimes offered to work with them after a study for free. In “Crisis Dreaming,” which she wrote with Lynne Lamberg, she tells readers who are divorcing and whose “repetitive” dreams leave them feeling “worn out and unhappy the next morning” how to employ some of the same techniques that she had her patients practice: recognize a disturbing dream in progress, identify what’s gone wrong, stop the action and take charge to change it.

Cartwright herself had a kind of recurring dream throughout her life, beginning in childhood: A harlequin in fanciful hats would heckle and trick her into acts of self-sabotage, like going into school on a Sunday. She could never remember what his face looked like. Near the end of her life, Cartwright’s grandson, when he came to visit, would lie beside her in bed in the evenings and they would talk. A month before she died, on one of these occasions, she told him that she dreamed that she had been giving a major research presentation to an auditorium full of her peers when she spotted the harlequin in the audience. She felt doomed, but there was nothing she could do. She kept lecturing. When she finished, the crowd gave her a standing ovation, and she realized that the harlequin was gone. “He caused no trouble, no harm, he just listened,” her grandson told me. “He had chosen to sit and be at peace, and therefore she could sit and be at peace. She didn’t have to worry or fight against him.”

Kim Tingley is a contributing writer for the magazine and the Studies Show columnist. Her last feature was about a superspreading event involving the Skagit Valley Chorale.

On a cool spring night in 1973, more than 1,000 people — students, activists, hippies, spiritual seekers — crammed into a ballroom at the University of California, Berkeley. They had come to hear Rennie Davis, then 32 and one of the most admired antiwar activists in the country, talk about changing the world. Davis was nothing short of a celebrity. Two years earlier, he had helped organize the massive May Day protests against the Vietnam War, and in 1969, he and six other men, who would come to be known as the Chicago Seven, were charged with conspiracy and inciting a riot outside the Democratic National Convention. Davis was one of only two defendants to testify during the raucous, highly publicized trial, which featured a parade of colorful characters, including an unhinged judge and the defense witnesses Allen Ginsberg and Timothy Leary.

Davis was known for being even-tempered and a relentless organizer, but he combined his seriousness of purpose with charisma and an infectious optimism. While he’s portrayed in the 2020 Aaron Sorkin film “The Trial of the Chicago 7” as a nerd who “couldn’t sell water to a thirsty man in the desert,” as his fellow 1960s activist Frank Joyce put it to me, Davis was actually one of the antiwar movement’s most captivating speakers.

Davis would need those skills in Berkeley, where he had come to deliver a stunning message: Activism, he now believed, had failed to fix a broken country. The new solution — to war, poverty, racism — was spiritual enlightenment. “I’m really blissed out with a capital ‘B,’” Davis told the crowd. “We are operating under a new leadership, and it is divine. It’s literally going to transform this planet into what we’ve always hoped and dreamed for.”

The “new leadership” had an unlikely frontman: a car-obsessed 15-year-old from India named Guru Maharaj Ji, dubbed the “perfect master.” (Writers and activists who struggled to understand his appeal preferred to call him other things, including “the fat kid” and “the paunchy preadolescent mystical magnate.”) Maharaj Ji, who now goes by Prem Rawat, was one of countless gurus who gained popularity in the West at the time; the teenager’s organization, called Divine Light Mission, had an estimated 50,000 followers along with hundreds of centers and ashrams across the United States. Acting as both devotee and spokesman, Davis insisted Maharaj Ji would bring peace to the world. “God is now on this planet,” he announced during a radio interview.

Davis’s message was catnip to Maharaj Ji’s followers in Berkeley, who danced and placed Easter lilies next to a picture of the boy on a linen-draped altar. But then came the catcalls. “We kept you out of jail, we came to Chicago, and now what are you doing to us?” someone yelled at Davis. “Kiss my lotus ass,” another sneered. Activists with “fury bleeding out of every wound,” as one writer put it then, hurled tomatoes at their former idol. A homeless man — or prophet, one couldn’t be sure — interrupted Davis with cheeky Buddhist riddles.

Things had not gone much more smoothly at a similar event in New York City. There, Davis tried in vain to convince the crowd that a spiritual focus was “totally consistent with the progressivism and values of political activist work,” according to Jay Craven, a young activist and filmmaker who was in attendance. Unlike others in that crowd, he wasn’t surprised by what Davis was now selling. Craven had recently returned from visiting Davis in India, where they had sat together on the banks of the Ganges while Davis, looking ethereal in a flowing white cotton tunic, spoke of “the intense white light he experienced when Maharaj Ji put his hands on his forehead and applied pressure to his eyeballs.”

Craven left India befuddled, a confusion shared by just about everyone who knew Davis. As the journalist Ted Morgan wrote in this magazine in 1973, summarizing the reaction to Davis’s conversion, “Nothing quite like this had happened since Augustine defected from Neoplatonism to Christianity.” But there had been signs that Davis was changing, especially after the May Day protests in Washington, D.C. “I never for a minute believed we would literally shut down Washington, but I think Rennie, who was always a grandiose thinker, truly did,” Craven told me. Disillusioned, Davis mostly stepped back from the fracturing antiwar movement. Instead, there were acid trips, New Age curiosities and talk of spending a year in the Sierra Nevada.

Davis wasn’t alone in abandoning political work for meditation and a belief in effecting social change through inner change. The early and mid-1970s saw “the wholesale transformation of many radicals and activists to new mystical religions,” the sociologist Stephen A. Kent writes in his 2001 book “From Slogans to Mantras.” The socialist newspaper Workers’ Power argued that Davis and others had “learned the wrong lesson and decided that politics doesn’t work. So, if you can’t change the world, change yourself.” One of the period’s loudest critics of the guru worship exhibited by Davis and others was the writer and biochemist Robert S. de Ropp, who lamented that one could train a dog “and have him presented as the perfect Master, and I honestly believe he’d get a following!”

Maharaj Ji’s following was growing by 1973, so much so that Davis hoped he could fill the Houston Astrodome for the guru’s appearance and kick-start “the greatest transformation in the history of human civilization.” The three-day event was poorly attended and, unsurprisingly, did not bring peace to Earth. When a reporter caught up with Davis in 1977, he had recently moved out of a Divine Light Mission ashram. He was no longer a public figure, he said, because he saw “the process of cleaning up the world as the process of cleaning up your own act first.” Davis was now selling insurance, as reflected in the headline: “1960s Activist Rennie Davis Now a ‘Straight.’”

But the rest of Davis’s life can hardly be described as conventional. After the failure of a company he co-founded to invest in ecologically transformational technologies, he dropped out of society to spend the better part of four years living and meditating at the bottom of the Grand Canyon. Eventually he teamed up with his third wife to teach meditation and build what they called a “new humanity” movement, one “larger than the Renaissance, the American Revolution and the ’60s combined.”

Still, Davis remained proud of the political activism of his younger years. In 2013, he flew to Vietnam with other antiwar leaders from the ’60s to celebrate the 40th anniversary of the Paris Peace Accords. According to Frank Joyce, who was on the trip, some of the long-simmering tensions between activists and Davis resurfaced. “But Rennie was completely comfortable in his own skin and really did have inner peace,” Joyce told me. “That can be tough for people to understand. To some leftists, inner peace can be pretty irritating.”

Until his death this year from lymphoma, Davis was still predicting an imminent revolution that would transform the world. But as he made clear in “The New Humanity,” his breathtakingly optimistic 2017 book, the revolution will need both an inward and outward focus. Though “some activists may want to stay consumed with anger,” he wrote, that alone won’t save us. “We must heal as a species — starting with ourselves.”

Benoit Denizet-Lewis is a contributing writer for the magazine, a National fellow at New America and an associate professor at Emerson College. He is at work on a book about transformation and identity change.

Beverly Cleary was put on academic probation after first grade. Her biggest problem was reading: It didn’t interest her. The assigned books were all bland educational stories about polite children. Why, she wondered, didn’t anyone write stories about real kids — funny, angry, joyful, unruly vortexes of love and chaos? Kids who felt anxious, broke the rules, threw tantrums, pulled one another’s hair? Kids like her and her friends? What was the matter with authors?

After college, in the 1940s, Cleary was forced to ask this question again. She got a job as a children’s librarian, and she found herself sympathizing, deeply, with patrons who couldn’t find anything good to read. How were these rowdy little rascals — ragtag kids who scattered their baseball mitts across the circulation desk — supposed to connect with the generic adventures of Dick and Jane and Sally? Why would a puppy ever say something as boring as “Bow-wow. I like the green grass”?

Cleary solved this problem by becoming an author herself. Today, we can measure her vast success in all kinds of ways: She published more than 40 books, sold in excess of 90 million copies and won dozens of awards. (Back in 2000, the Library of Congress declared her a “Living Legend.”) But Cleary’s most important achievement was unquantifiable. She helped children — real complex children with real complex lives — begin to find themselves in books.

Cleary’s signature character, Ramona Quimby, is exactly the sort of unwieldy child who would have been excluded from old-fashioned kids’ lit. Ramona is proud, loud, fiery, sloppy, creative and energetic — a geyser of trouble. Book by book, she barges her way through elementary school, vexing teachers and testing her parents and irritating her big sister, Beezus. (“Beezus” was young Ramona’s mispronunciation of her sister’s actual name, Beatrice.) Ramona squeezes an entire tube of toothpaste into the sink, cracks a raw egg on her head at lunch, gets her new boots stuck in the mud at a construction site and boings a classmate’s curly hair so relentlessly that she gets suspended. She loves her new pajamas so much that she wears them to school under her clothes, overheating herself.

This was Cleary’s great gift: the ability to map the strange Newtonian physics of childhood — its bizarro laws of proportion and gravity, its warped space-time. She loved, especially, the spots where kids’ inner worlds (urgent, intimate, self-evident) conflicted with the outer world of adults (cold, foreign, arbitrary). Cleary understood that, to a child, 30 minutes often feels like 30 years, and that small setbacks — e.g., failing to sew a perfect pair of slacks for your stuffed elephant, Ella Funt — can feel like an apocalypse. For Ramona, the grown-up world is loaded with logical inconsistencies. She is late to school one morning because, quite reasonably, she thinks that “a quarter past 8” must mean 8:25, in the same way that a quarter of a dollar means 25 cents. On the first day of kindergarten, her teacher tells her, offhandedly, “Sit here for the present.” The teacher means sit here for now, but Ramona misunderstands, and as the other kids get up to play games and sing songs, Ramona sits there dutifully, waiting for the gift she believes she has been promised.

Many of Cleary’s stories grew out of her own life. She was the only child of a distant, depressed, overbearing mother. (“You are the type that will fade quickly,” her mother once told her, out of nowhere, while they were washing dishes.) Accordingly, Cleary spent much of her life feeling naughty. She was, admittedly, a bit of a troublemaker. “A Girl From Yamhill,” the first of her two memoirs, contains epic catalogs of childish high jinks: On the family farm, she amused herself by tripping chickens with a long pole; she touched a hot stove after her father told her not to; she yanked her cousin off a chair after an argument over who drew better birds; she stood up at the very top of a Ferris wheel; she once tried to cut off all her hair so she could look like her Uncle Fred. Her grandfather used to pay her a nickel to sit still for five minutes. Sometimes, other adults would compliment Cleary’s parents by telling them they had “a lovely girl” — and she resented this mightily. “I did not feel lovely, not one bit,” Cleary writes. “I felt restless, angry, rebellious, disloyal and guilty.”

These are the kinds of feelings that Cleary preserved in her books. She wrote by hand, with cheap ballpoint pens, and as her fame grew, decade after decade, she always resisted publicity. She preferred to let the books speak for themselves. Which they do — still.

Certain details in the novels are, inevitably, dated. (“She amused herself by punching the buttons on the cigarette machine in time to the Muzak, which was playing ‘Tie a Yellow Ribbon ’Round the Old Oak Tree.’”) But the tone is as alive as ever. Ramona helped me, as a boy in the 1980s, learn to process the big complex world around me: jeering classmates, fighting parents, carsickness, economic swings. And Ramona remains, waiting to connect with future generations. After I learned of Cleary’s death, I went out and picked up a used copy of “Ramona the Pest” from my local bookstore. On the title page, written with clear pride of ownership, was a message from a child — one or 10 or 15 years ago, it’s impossible to say. In blue pen, she listed her phone number. “If lost,” the child wrote, “call Jessica.”

Sam Anderson is a staff writer for the magazine and the author of the book “Boom Town.” His most recent article was a profile of the artist Laurie Anderson.

“My father consumed me,” Larry King said in a 1997 interview. “He wanted a son so bad.” Eddie Zeiger was only 30 when he buried his first son, Irwin. The 6-year-old had been complaining of stomach pains, but by the time Eddie and his wife, Jennie, got him to the hospital, it was too late: Irwin died of a ruptured appendix. The couple, adrift, quickly conceived again, and Eddie prayed for a boy — a chance to do it all over. His prayers were answered in the form of Lawrence Harvey Zeiger. (It wasn’t until his first radio show, in 1957, that Zeiger became a King: less ethnic, easier to spell.) Eddie doted on his son, taking him everywhere — to Yankee games or the Brooklyn bar he owned, popular with cops around the neighborhood.

On June 9, 1943, King was walking home from the library when he saw three squad cars parked in front of his apartment building. He was 9; in his memory, he checked out nine books that day. As he approached the peculiar scene, he recognized his mother’s screams. One officer — a friend of his dad’s — pulled King aside and drove him to the movie theater, where he broke the news: His father had died of a heart attack. Eddie, too, had been complaining of pain but dismissed it, choosing to go to work anyway. He was buried next to Irwin. King, heartbroken and resentful, didn’t cry. “I never went back to that library again,” he writes in his memoir, “My Remarkable Journey.” “And from that day on, I was nervous if I saw a squad car in my neighborhood. If one parked by my apartment building, I’d start running home, in fear that my mother had died.”

King spent his life dodging death, resistant but haunted by its specter. Naturally, this meant he couldn’t stop talking about it. His neurosis was a familiar theme whenever he was interviewed (“I’m scared to death of death!”), as mundane a fact as his favorite sports team. King daydreamed about his funeral the way a betrothed might fantasize about a wedding — the speeches, the ceremony, the guests — mourning only that he wouldn’t be there to see it. At home, he discussed his death so often that his wife had to intervene, saying that it depressed their children. He read the obituaries competitively, comparing himself with the people who were just a few years older — or worse, younger — than he was.

King took four human growth hormone pills every day, hoping they would buy him more time; he saw “The Curious Case of Benjamin Button,” a film about a man who ages in reverse, and was inspired, even envious. For years, he maintained that he wanted to be cryogenically frozen upon his death, just in case scientists were eventually able to find a cure for whatever killed him.

Maybe this fear is why King crammed so much life in between those suspenders: eight marriages, seven wives, six kids; two bankruptcies and an arrest on larceny charges; a heart attack, quintuple bypass surgery, diabetes, lung cancer and what his doctors called an “indomitable spirit.” And, of course, all that airtime — 15 years of a national radio show (“The Larry King Show”), 25 years of a televised talk show (“Larry King Live”), then a high-profile cancellation and a revival (“Larry King Now”).

An avowed agnostic, King had no fantasies of the afterlife but always wanted to poke around in someone else’s. For decades, on his shows, he would ask guests — mediums, musicians, Marianne Williamson — what they thought happened after death. He deplored the idea of exiting the party while it was still going on, knowing he could never get back in. “Larry wanted to stay alive forever,” his best friend, Herb Cohen, told me. “He didn’t want to leave. He wouldn’t know who won the World Series.”

Only in his 80s did King finally decide that it might be time to go. In 2019, King suffered an aggressive stroke that left him in a brief coma. When he woke up and heard about his new life — dependent on others for everything — he immediately considered ending it. Then he saw his son Chance’s crying face by his bedside and decided to stay. His kids needed him, the way he had needed his own father.

But something had changed. No epiphanies, no newfound beliefs in a higher power. If anything, he felt lucky. He had lived twice as long as his father and outlived the average American male by a decade. Even if he had been trying to evade his own mortality, there was a blessing in how long he had been able to run from it.

So instead he did his usual: He went around talking about it. “I’m not afraid of it now,” he said in one of the interviews he gave after the coma, “because it’s the one thing all of us are going to face.” In another, he mentioned reading David Kessler’s book about closure as the final stage of grief. Maybe he found it: “I have less of a fear of dying now,” he said in yet another interview. “I’m 86, and it is what it is.”

Jazmine Hughes is a staff writer for The New York Times Magazine and a reporter for The Times’s Metro section. She last wrote about the musician Questlove.

On May 30, 1985, Brigitte Gerney was going to take a taxi home from the dentist, but it was such a beautiful spring day that she decided to walk. The dentist’s office was on East 69th Street, on the Upper East Side of Manhattan. Gerney lived near the United Nations, some 20 blocks south. She was walking by a construction site on the west side of Third Avenue, a little before noon, when she heard people screaming for her to get out of the way. She tried to run. But it was already too late. Gerney would later describe it as feeling like an earthquake. Her bag went flying out of her hands and the pavement cracked beneath her. She noticed how cold the 35 tons of metal felt on top of her.

When James Essig, a patrol officer with the 19th Precinct, arrived at the scene, he found a mobile crane tipped almost upside down over the edge of a foundation pit several stories deep. Pedestrians were pointing at it and screaming. It took him a minute to register what they were saying: that somewhere under there was a woman. Essig was 23 then, two years on the job. (He is now the N.Y.P.D.’s chief of detectives.) He and other officers took off their belts and formed a human chain. When Essig reached Gerney, he saw that her legs were pinned beneath the base of the crane. Her upper body was on a piece of plywood fencing suspended precariously over the pit. And she was conscious. Essig tried to reassure her. He told her that they would get her out. But in reality, he wasn’t sure. Any minute the crane could collapse into the pit, dragging Gerney down with it.

Gerney, who was 49 at the time of the accident, was born in Liechtenstein, a tiny German-speaking country between Austria and Switzerland. The crane wasn’t the worst thing that had happened to her since she arrived in New York in 1966. She lost her first son in 1973 when he fell into a pool and drowned. In 1980, she survived lung cancer. In 1982, a gondola she was riding in at a ski resort detached and plummeted to the ground. A year later, her husband died of colon cancer. Somehow, Gerney never collapsed under the weight of the misfortunes that befell her. “Something about her nature allowed her to bend in this howling wind and not break,” K. Ann McDonald, a family friend, told me. “She was sort of weather-beaten in a good way.”

McDonald was in a cab on the F.D.R. Drive that day in 1985, heading north, when she noticed the terrible traffic. The accident had brought Midtown to a halt. Hundreds of onlookers crowded the streets, roofs and windows of nearby buildings, listening to transistor radios for updates; others watched on TV, where Gerney’s body lay amid a mangled mess of metal, to see whether the “Crane Lady,” as she would become known, would live or die. Essig pushed the perimeter back to accommodate a growing number of personnel: police, fire, medical, heads of buildings and hospitals and at least two priests to deliver last rites. “It was a three-ring circus,” said Lewis Goldfrank, the head of the emergency department at Bellevue Hospital, who was rushed to the scene in a police car. In the middle of all of this was Mayor Ed Koch, who asked if Goldfrank could amputate Gerney’s legs. “I said I’d take a look,” Goldfrank said.

Paul Ragonese, of the N.Y.P.D.’s Emergency Service Unit, was now beneath the crane, administering first aid. “I’m going to die under here,” Gerney told him. Gerney also wanted to have her legs cut off. She had two young children, she said. They had just lost their father, and they needed her. Goldfrank saw that her legs were crushed below the knee. But they were still responsive. Her vitals were stable. The crane was effectively working like a tourniquet. He told Koch he wouldn’t amputate. “We haven’t done amputations in the field since probably the Civil War,” he told the mayor.

Rescuers deployed two other cranes to lift the fallen one. By 4 p.m., a third crane arrived from the South Bronx; weighing about 150 tons, it traveled at just a few miles per hour. Officers, meanwhile, dug broken concrete from beneath Gerney and used wooden planks to prop up the swaying plywood. Ragonese crouched into a two-foot-wide space, holding Gerney’s hand for so long that when he emerged, his legs gave out and he had to be hospitalized for muscle spasms.

Gerney would remain trapped beneath the crane for nearly six hours. When she was finally freed, at 5:53 p.m., the city closed a mile and a half of the F.D.R. so that she could be rushed to the hospital. Within an hour, Gerney was in the operating room. These were the early days of emergency medicine, and the trauma-care team at Bellevue worked most of the night repairing bones, vessels and skin. Gerney would undergo 13 operations in total. As she recuperated, President Ronald Reagan called. Nancy Reagan visited. As did Cardinal John O’Connor. The crane accident and its fallout remained in the news for over a year, as the crane operator pleaded guilty to second-degree assault. He was able to avoid prison because Gerney wrote a letter to the court calling for compassion.

In New York City, the Crane Lady reached superhero status. Taxi drivers recognized her in the rearview mirror. Strangers shared their struggles and asked for her advice. They told her how they had watched her on TV and how much she meant to them. Gerney had a sense of humor about the accident. “All this attention I’m getting for falling in a hole,” she’d say. She would warn people about boarding boats and planes with her.

A year after the accident, Gerney met and fell in love with Peter Rizzo, an orthopedic surgeon. They got engaged. But before they could marry, Rizzo was shot and killed in 1987 by a retired New York City firefighter, who was angry about a delayed medical-disability claim. If anything ever came close to breaking Gerney, it was Rizzo’s death. “That was just too much and unfair,” her son, Arkadi, told me. For years after the shooting, Gerney would close herself in her room and listen to a favorite film score by Ennio Morricone. “The Peter thing was just hard to make sense of,” Arkadi said. He imagined his mother would describe it as unnecessary. “Not that a crane falling on you is necessary.” Gerney never dated again.

The crane permanently damaged Gerney’s body. She walked with a limp, and chronic pain affected her hips and spine. Muscle had to be taken from her back to rebuild her calf. She never could flex one ankle. “She had to learn how to walk from scratch because they weren’t the same legs,” her daughter, Nina, said. Gerney died as a result of heart failure related to Alzheimer’s dementia. But in her last days, when she would still use those legs to take walks in her garden, Nina told me, the memories that flickered back never included the bad things that happened to her — only the good.

Irina Aleksander is a contributing writer for the magazine. Her last feature was about sweatpants.

In the tributes that followed the death of Janet Malcolm, a clear pattern emerged: a word-cloud of severity. Malcolm was described as “piercing,” “precise” and “unsparing”; her prose was “clear as gin, spare as arrows,” “merciless,” “pitiless.” Her admirers seemed to feel simultaneously instructed and rebuked: wishing to be more like her, anxious that she would find them wanting.

This is a signature tension in Malcolm’s work. She is luxuriously attentive but also ruthless in articulating what she sees. In one motion, she honors and critiques. Over the course of her career — in articles in The New Yorker and The New York Review of Books and in 10 books that encompassed biography, literary criticism, legal reportage and profiles — Malcolm helped invent the forms that came to dominate modern journalism. Yet she trained her keen eye on her own profession, uncovering its falsities, deflating its self-importance. Her most famous article, “The Journalist and the Murderer,” published in The New Yorker in 1989, tells the story of Joe McGinniss, a reporter who befriended the accused murderer Jeffrey MacDonald in order to write about him. Its opening is often quoted: “Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible. He is a kind of confidence man, preying on people’s vanity, ignorance, or loneliness, gaining their trust and betraying them without remorse.” Malcolm excoriates McGinniss for feigning belief in MacDonald’s innocence, and for mischaracterizing an essentially dull man as a wild narcissist — all to make his story better. Malcolm can’t stand this distortion — but at the same time, she coolly anatomizes the power games between journalists and their subjects, the ways that writing about someone necessitates playing on their vulnerabilities. Though she implicates herself, her omniscient tone rankled her peers. She seemed both apart from this fray and of it — a superior writer, perhaps, but no less a betrayer.

Yet the idea of Malcolm as cold and punishing toward her subjects is too limiting. Her body of work, as it evolved, is strewn with clues pointing to her complex view of the journalist-critic and her responsibility toward her subjects, and by extension her readers. She does not sit in judgment; her intelligence is more restless than that. Really what Malcolm asks of us is an alertness equal to hers.

Malcolm’s vision was rigorous and fascinated: What she was after was the kind of deep engagement that is ultimately a species of love. Take, for instance, her 1995 New Yorker essay on Bloomsbury. Malcolm writes so admiringly of Quentin Bell’s biography of his aunt Virginia Woolf that we can detect an identification, a hint of her own values. What makes Bell’s biography remarkable, she writes, is his intimacy with the family. He has “carefully studied each of them for years and has slowly turned their characters over in his mind, knowing their idiosyncrasies and weaknesses.” He sympathizes with them the way a 19th-century novelist might — with a “loving disapproval” that, like fiction, inspires a kind of “helpless empathy.”

These phrases — “loving disapproval,” “helpless empathy” — apply equally well to Malcolm’s work. Her careful attention, however cold it may seem, tends to generate unexpected comedy and warmth. Love and disappointment, in her work, are inevitably, sometimes frighteningly, mixed up. Malcolm was a devoted reader of 19th-century novelists — her favorites were Austen, Eliot, Trollope, Dickens, James, Hawthorne, Melville, Tolstoy and Chekhov — and she assimilated their qualities of compassion for human weakness. Her best pieces are really 19th-century novels disguised as 20th-century journalism. Consider, for instance, a 40,000-word profile, published in 1986, of Ingrid Sischy, who had become the editor of Artforum at 27. It’s an account of the New York art world of the 1980s, but it’s also an essay about criticism: about taste and who and what we value.

It contains some of her best — quickest, sharpest — portraiture. Here she is on the Greene Street loft of Rosalind E. Krauss, a fearsome former editor of Artforum (in Malcolm, rooms are often a mirror of personalities): “Its beauty has a dark, forceful, willful character. Each piece of furniture and every object of use or decoration has evidently had to pass a severe test before being admitted into this disdainfully interesting room.” And here on Rene Ricard, a poet-critic who is part of the new guard: “He is thin and wiry, his brow is deeply lined, his eyes are frightened, and his mouth is petulant. His voice is high-pitched, and in it there is spite, self-pity, self-parody, seduction, false innocence, anxiety.”

This is “unsparing,” yes, but it also grants each person their personhood, with the sureness and vitality of the best painted portraits. Perhaps most revealing is her portrait of Sischy — the honest, plain, morally sturdy, unfailingly curious editor whose magazine nurtures and nimbly responds to change. Malcolm occasionally finds herself annoyed by Sischy’s “shining rectitude,” but her momentary irritation is “swept away by the disarming agreeableness of her company. Her capacity for enjoyment is movingly large. She is a kind of reverse Jewish princess: she goes through life gratefully accepting the pleasures that come her way, and if they are not the particular pleasures she ordered — well, so much the better.”

Sischy worries at one point that Malcolm finds her boring and too upright, but the journalist detects in her subject qualities that it seems she herself strove for: the capacity to create worlds through the act of seeing, an incorruptible frankness. Throughout the piece, Malcolm’s most devastating observations are of self-satisfied people, those who rest too comfortably in their sense of authority. It’s striking that a writer known for the lashing finality of her judgments reserves her most severe ones for those who think they know. As Malcolm wrote in that same essay on Bloomsbury: “Life is infinitely less orderly and more bafflingly ambiguous than any novel.”

Malcolm didn’t give many interviews, but the deepest one, with the writer Katie Roiphe, appeared in The Paris Review in 2011. (Knowing better than anyone the perils of sitting and talking with a journalist, Malcolm insisted on writing out and editing her answers.) In one of the most revealing moments, Roiphe asked her whether coming to this country as a child — Malcolm emigrated with her parents from Czechoslovakia in 1939, in retreat from the Nazis — gave her any sense of otherness or affected her identity as a writer. Malcolm replied that she remembered feeling confused and out of place in the English language, often misunderstanding simple phrases. (When she heard a teacher say, “Goodbye, children,” she envied the girl whose name she assumed to be Children and hoped the teacher would someday say “Goodbye, Janet.”) She then reflected, “I have never connected these pathetic struggles with a language I didn’t know to later struggles with the language I tried and try not to disgrace myself in as a professional writer, but there may be a connection after all.” That “pathetic” is so harsh, that “tried and try” so moving — evidence of triumph, evidence of self-doubt. Of stringency and tenderness, above all, with herself.

Sasha Weiss is the culture editor of the magazine.

Fifty-six years before Colin Kaepernick took a knee during the national anthem and became a sensation, there was Jim (Mudcat) Grant.

One Friday night in September 1960, Grant’s Cleveland Indians were hosting the Kansas City Athletics. The stakes could not have been lower: Both teams were lousy, and with only two weeks left in the long season, there was little to play for but pride. Grant, a pitcher, was not in the lineup that night, but he had plenty of pride.

The 25-year-old was shaped by his hometown, Lacoochee, Fla. The Klan rampaged freely there, shooting into the homes of Black families. Grant’s father, James Sr., died of pneumonia when Grant was a baby, so his mother, Viola, had to fend for herself and her six children. She took to hiding the young Grant in a wooden box near the fireplace of their shack — it had no electricity, no hot water, no toilet — when the Klan came through.

In his 2006 book, “The Black Aces: Baseball’s Only African-American Twenty-Game Winners” (a group to which he belonged), Grant recalled: “You had to always watch where you were and know what you were going to do, because something was going to happen to you every day. You knew of the lynchings. You would hear it in the night, and if you didn’t, word came through the next town that somebody was hanged or castrated.” Grant’s mother nevertheless managed to infuse him with an unshakable self-confidence.

By his teens, Grant’s precocious baseball talent landed him on the local Black team, the Lacoochee Nine Devils, where he starred. As a 14-year-old, he struck out 19 batters during a game on the road in Georgia. (Fearing for their lives, the team had to run for the bus after the final strikeout.) When Grant was 18, a scout for the Cleveland Indians got word about a top-flight talent down in Florida who had dropped out of college and was working as a carpenter’s aide to help support his family. The scout sought Grant out and offered him an amateur contract. It was at the subsequent tryout that Mudcat was born. “In those days, they thought all Black folk was from Mississippi,” Grant told a reporter. “They started calling me Mississippi Mudcat. I said, ‘I’m not from Mississippi,’ and they said, ‘You’re still a Mississippi Mudcat.’” He protested, but the name stuck.

As the first notes of the anthem began to play on that September evening in 1960, Grant rose with his teammates. A talented vocalist, he loved singing the national anthem before ballgames. (Later in his career, in fact, he would become the first active player to sing the anthem before a game.) But on this night, he improvised his own ending: Instead of “O’er the land of the free and the home of the brave,” he sang, “This land is not so free, I can’t even go to Mississippi.”

Cleveland’s bullpen coach, a Texan named Ted Wilks, overheard Mudcat’s rendition and called him “an objectionable name,” as The Cleveland Plain Dealer put it at the time. Grant settled things with his fists and left the stadium without a word. He was suspended, and his pay was docked for the final two weeks of the season. He later apologized for leaving the stadium without telling his manager, but not for punching Wilks. When Grant returned to the team at the start of the 1961 season, Wilks was gone, demoted to the minors. Grant would go on to have the best season of his career to date, leading the team in wins, shutouts and innings pitched.

After being traded to the Minnesota Twins in 1964, Grant became an all-star, a 21-game winner and The Sporting News’s American League Pitcher of the Year in 1965. In the biggest game of his life, a do-or-die Game 6 of the World Series that year, he pitched his team to victory — on short rest — while also hitting a pivotal home run. It should have been a legend-making performance, but the Dodgers ended up winning the series as their star pitcher, Sandy Koufax, won Game 7 and was elevated to legend status. The following season, the Twins gave Grant, still at the height of his powers, a lowball offer on a new contract. He signed it, after contentious negotiations, and was traded to the Dodgers a year later.

Feeling underappreciated after his great 1965 season, Grant began to focus on a new path toward the wealth and career satisfaction he felt he deserved: singing. He formed a musical group, an R.&B.-based act with backup dancers — Mudcat and the Kittens. With his suave voice and movie-star looks leading the way, the group took off. It would continue touring widely long after Grant had retired from baseball, making appearances on “The Tonight Show Starring Johnny Carson” and Mike Douglas’s show.

“I made way more money in music than I did in baseball,” he said.

Rowan Ricardo Phillips is the author of three books of poetry, most recently “Living Weapon,” and two books of nonfiction and a book-length translation of fiction.

Yasuhiro (Hiro) Wakabayashi, the great Japanese American photographer, would do whatever it took to make a surprising image. Even when that meant hanging in the air, supported only by a thin wooden plank jury-rigged with a ladder and some rope — as he did while photographing the Italian actress Alberta Tiburzi for the February 1967 cover of Harper’s Bazaar. He made this weightless balancing act look as casual as standing on the ground. With his legs suspended out behind him, feet crossed elegantly at the ankles, wristwatch peeking out from under a white shirt cuff, Hiro’s hands — his entire body, in fact — held still as a tripod to capture Tiburzi’s heavily lined eyes, the folds of her groovy brown-and-white dress swirling around her.

In the resulting image, her body and face collapse into a white space without depth. If he’s the Moon Man, looking down on Earth, she’s the Space Princess dreaming of returning to her more stylish world. “Surreal” is an adjective often used to describe Hiro’s work, but searching for meaning in an image by Hiro can sometimes feel a bit like pinpointing the edge of a black hole. There’s more than surreality at work, and it’s rooted in a kind of metaphysical mystery.

In the 1950s and ’60s fashion world, finding new perspectives was the ticket to success. “If you look into the camera and you see something you recognize, don’t click the shutter,” was the mantra Hiro absorbed from one of his mentors, Alexey Brodovitch. Brodovitch held court at Harper’s Bazaar, and had a prompt he loved to give his disciples: Capture something so many times that you no longer recognize the thing you’re looking at. In 1957, he gave Hiro, who was in his mid-20s, his first assignment for the magazine: to shoot a shoe. (Bazaar had already featured the work of a young illustrator named Andy Warhol.) Hiro passed the shoe test, and served as one of the primary photographers at Bazaar until he took over the magazine’s lead photography job from Richard Avedon in 1965, during what was arguably the pinnacle of American fashion-magazine innovation. He held the job until 1975.

It was a meteoric rise for a young man who had immigrated to California only three years before that first shot. Taking a Greyhound bus across the country, Hiro came to New York City, where he studied at the School of Modern Photography before leaving to become an assistant to Avedon in 1956. It didn’t take long for Avedon to see he had a protégé on his hands, and the two remained lifelong friends and creative contemporaries. In 1999, after nearly half a century of friendship, Avedon edited a career-spanning monograph of Hiro’s work, with the mandate that every photograph chosen must be one that “only Hiro could have taken.” In the monograph’s foreword, Hiro wrote, “Richard Avedon, Alexey Brodovitch. They have merged in me and force me to look into the lens, look again, and for an instant, see myself peering back.”

Long before the invention of digital photography, Hiro found ways to create illusions using layers of film. By manipulating light in camera, he became a master of disorientation. Like Andrei Tarkovsky or Stanley Kubrick, he found ways to use earthly locales to suggest galactic travel, shooting familiar objects on beaches or barren deserts, to make us see the strangeness of our own planet. It’s possible to imagine that just outside his methodical frame, the horizon line was undulating.

Simple pictures turned uncanny very quickly. A woman in sunglasses veils her face in sheer blue fabric; a hand holds it tight under her jaw, evoking erotic asphyxiation. With very few props, she becomes an alien in her own world, a mind estranged from its body. In another shot from 1968, Hiro snakes a thin strip of metal around the head of the model Donna Mitchell, a frequent collaborator. The shape suggests a helmet fit for space-age travel — albeit one that would provide no protection.

“A Hiro image wasn’t just about precision — he wanted exactitude,” Mitchell recently recalled. “Not a spot of light, not an angle or a color value, nothing was an accident. Each strobe was timed within a nanosecond.” But once Hiro had landed where he wanted from a technical standpoint, he let go, floating in the imaginative space between his camera and his subject.

Like many iconic characters in the world of fashion, Yasuhiro Wakabayashi was known by a single moniker. As Hiro, the Japanese immigrant became an iconic American photographer. Personal rebranding is normalized in the fashion industry. The supermodel born in Somalia as Zara Mohamed Abdulmajid became simply Iman. The German countess Vera Gottliebe Anna Gräfin von Lehndorff-Steinort condensed into Veruschka. The Palestinian American model-influencer Jelena Noura Hadid is now more commonly known as Gigi.

Even as he condensed his unmistakably Japanese name — in an America that had been forcing citizens of Japanese descent into internment camps a mere nine years before he arrived in California — Hiro was familiar with displacement. The son of a professor, who Hiro suspected might be a spy, he spent his childhood in China, amid frequent upheaval. Various family homes in Shanghai were destroyed by fire. In 1936, as a small boy in the lead-up to the Sino-Japanese war, he was forced to flee his home with nothing but a backpack. As a teen he was drafted into the Japanese Army of the Occupation and sent to a remote camp in Beijing where he saw executions. By the time his family was sent back to Japan after the Second World War, Hiro was well acquainted with the devastation that men can inflict upon one another and the planet. He didn’t look away from the horror. On his enforced return to Japan, a country he’d never really inhabited, he visited Hiroshima to see the aftermath of annihilation for himself.

Traces of his tumultuous upbringing in Asia are seen throughout his work. Voluptuous smoke, curling from the mouth of a model, for instance, was inspired by the opium dens he witnessed as a child in Shanghai. One of his best-known bodies of personal work is a Kodachrome series he made in 1981 with Japanese fighting fish, based on pets he kept as a kid. The red and blue fish flitting through a tank, their saturated colors intensifying as they prepared to fight, provided the perfect moment to release the shutter.

Mark Holborn, in an essay from Hiro’s 1999 monograph, describes the first photograph in the book, of a tarantula crawling on a disembodied foot, as “a mark of Hiro’s humanity, like the stone footprint of Buddha.” Perhaps a certain Zenlike control can be found in his meticulous frames, or perhaps they are more like memento mori. Cropped black-and-white photographs of a naked infant lack all sentimentality, abstracting the child to sections of rolling flesh. Riders on a Tokyo subway pressed into the cars seem to be moving corpses. His photographs of the Apollo 11 spacecraft, taken with infrared film, capture the hallucinogenic space-age glory and the potential for human transcendence. But as with many of his images, the beauty is laced with an acid tinge — a green flash before a cataclysmic boom.

Many of Hiro’s pictures feel like attempts to choreograph chaos. As Donna Mitchell recalls, “I was capable of being very still, which he loved. He’d say: ‘Donna, don’t look at me. Don’t look at the camera.’ On most shoots I did with other photographers, the camera was very intrusive, but with Hiro we could forget it was there.”

Stella Bugbee is the editor of the Style section of The New York Times. She has written about many aspects of style and images in the last decade, including the way fashion photography needs to be reimagined in light of the #MeToo movement.

The image goes viral, or as viral as possible in the summer of 2007. We see the body of a gigantic silverback mountain gorilla hoisted high on crisscrossed branches carried aloft by at least 14 men through the bush. The dead gorilla is lashed with vines to secure his arms and legs. His prodigious belly is belted with vines, too, and his mouth is stuffed with leaves. The photograph seems like the end of a movie we don’t yet know the beginning to. He’s 500 pounds — a black-and-silver planet amid the green. Though we can’t see this part, some of the men are weeping.

The gorilla’s name is Senkwekwe, and he’s well known to the pallbearers, many of them park rangers who call him “brother.” He’s the alpha male of a family named the Kabirizis. (The American primatologist Dian Fossey was instrumental in studying the complex dynamics of these family units.) They’re a troop habituated to humans: gentle, curious, playful and often pleased to greet visitors, tourists and the rangers who protect them. Now, here on their home range, on the slope of the Mikeno volcano in Virunga National Park in eastern Congo, many of them have been murdered by armed militia members trying to scare away the rangers and gain control of the old-growth forest for charcoal manufacture. In a solemn procession, the dead gorillas are being taken to the rangers’ field station.

The photograph, shot by Brent Stirton for Newsweek, appears in newspapers and magazines around the world, awakening others to the issues the park rangers know so well: the need to protect the gorillas’ habitat, the bloody battle for resources (gold, oil, charcoal, tin and poached animals), the destabilizing presence of armed rebel groups as well as the Congolese Army inside the park’s borders. Though the park is designated a World Heritage site, more than 175 park rangers have been killed here in the last 25 years. What’s also not visible in this photograph is that only one gorilla survives the massacre, a baby found next to her slain mother, one of Senkwekwe’s mates, trying to suckle at her breast.

The baby — a 2-month-old female, five pounds and adorable — is dehydrated and near death herself, so a young park ranger named Andre Bauma instinctively places her against his bare chest for warmth and comfort and dabs her gums and tongue with milk. He brings her back to life and sleeps and feeds and plays with her around the clock — for days, then months, then years — until the young gorilla seems convinced that he, Andre Bauma, is her mother.

Andre Bauma seems convinced, too.

The baby gorilla, begotten of murdered parents, is named Ndakasi (en-DA-ka-see). Because no orphaned mountain gorilla has ever been successfully returned to the wild before, she spends her days at a sanctuary in the park with a cadre of other orphaned gorillas and their minders, swinging from the high branches, munching wild celery, even learning to finger paint, mostly oblivious to the fact that she lives in one of the most contested places on earth. She’s exuberant and a ham and demands to be carried by her mother, Andre Bauma, even as she grows to 140 pounds and he nearly buckles under her weight.

One April day in 2019, another ranger snaps a selfie with Ndakasi and her bestie, Ndeze, both standing upright in the background, one with a protruding belly and both with whassup expressions. The cheeky goof on humans is almost too perfect, and the image is posted on Facebook with the caption “Another day at the office. … ”

The photo immediately blows up, because we love this stuff — us and them together in one image. The idea of mountain gorillas mimicking us for the camera jumps borders and species. We are more alike than different, and this appeals to our imagination: ourselves existing with some fascinating, perhaps more innocent, version of ourselves.

Mountain gorillas exhibit dozens of vocalizations, and Bauma is always vocalizing with Ndakasi in singsong and grunts and the rumbling belches that signal contentment and safety. Whenever there’s gunfire near the sanctuary, Bauma makes sounds to calm Ndakasi. He himself lost his father to the war in Congo. Now he’s telling her it’s just another day inside their simple Eden.

“You must justify why you are on this earth,” Bauma says in a documentary. “Gorillas justify why I am here.”

Ndakasi turns 14 in 2021 and spends her days grooming Ndeze, clinging to Bauma, vocalizing back and forth with him. Mountain gorillas can live up to 40 years, but one day in spring, she falls ill. She loses weight, and then some of her hair. It’s a mysterious illness that waxes and wanes for six months. Veterinarians from an organization called the Gorilla Doctors arrive and, over the course of repeated visits, administer a series of medical interventions that seem to bring about small improvements. Just when it appears she’ll recover, though, Ndakasi takes a bad turn.

Now her gaze reaches only just in front of her. The wonder and playfulness seem gone, her concentration having turned inward. Brent Stirton, who has returned to Virunga roughly every 18 months since photographing the massacre of Ndakasi’s family, is visiting, and he shoots photographs judiciously. The doctors help Ndakasi to the table where they attend to her. She throws up in a bucket, is anesthetized. Bauma stays with her the entire time; eventually, she’s taken to her enclosure and lies down on a green sheet. Bauma lies on the bare floor next to her.

At some point, Bauma props himself against the wall, and she then crawls into his lap, with what energy she has left, rests her head on his chest and sinks into him, placing her foot on his foot. “I think that’s when I could almost see the light leave her eyes,” Stirton says. “It was a private moment no different from a person with their dying child. I made five frames respectfully and walked out.”

One of those last photographs goes viral, beaming to the world the sad news of Ndakasi’s passing. What do we see when we look? Pain. Trial. Death. And we see great love too. Our capacity to receive and give it. It’s a fleeting moment of transcendence, a gorilla in the arms of her mother, two creatures together as one. It’s profoundly humbling, what the natural world confers, if we let it.

Bauma’s colleagues draw a tight circle around him in order to protect him from having to talk about Ndakasi’s passing, though he releases a statement extolling her “sweet nature and intelligence,” adding, “I loved her like a child.” Then he goes back to work. In Virunga, death is ever-present, and there are more orphaned gorillas to care for. Or perhaps it’s the other way around.

Michael Paterniti is a contributing writer for the magazine.

One summer day in 1978, deep in the woods of Northern California, a group of lesbian feminists, tanned and shirtless, tool belts strapped to their waists, hard hats on their heads, began building a house on what they referred to as “the land.” The air smelled of evergreens, sweat, idealism. There was no running water, no electricity, no phones, no men. They vowed that they would own this place together until their final breaths.

Several of the women were pioneers in the lesbian feminist movement, but Sally Miller Gearhart stood out. She was 5-foot-9 with thick, short brown hair, warm, deep-set eyes and majestic hands that animated the air as she spoke. Her sonorous voice was laced with a Southern accent. Women said they could feel her charisma from yards away. They felt it when she strode into Maud’s, a lesbian bar in San Francisco, or when she placed her hand on their shoulder. Or when she spoke at lesbian and gay rights rallies or jumped on the classroom table to get her students’ attention at San Francisco State University, where she was a professor of communications, the first open lesbian hired there in a tenure-track position.

Gearhart and other radical lesbian feminists strove to create an alternate, self-sufficient, women-centered world: During the apex of the movement in the 1970s, they generated dozens of newspapers and magazines (The Furies; Purple Rage; Dyke, A Quarterly) and created women’s (or womyn’s) music festivals, food co-ops, bookstores and record labels. They organized rape hotlines and domestic-violence shelters. And some went further, turning away entirely from the patriarchy and forming back-to-the-land separatist communities (Rainbow’s End, Fly Away Home, WomanShare). They were inspired in part by Black separatists and the belief that to liberate yourself from the oppressor, first you had to join with your own people and strengthen your self-identity.

The community that Gearhart and others formed in Willits, Calif., about 140 miles north of San Francisco, was small compared with others. At its height, 10 women owned several connecting parcels totaling more than 100 acres. They usually lived there on weekends and during the summer, along with their partners, friends, families. Men weren’t invited. Gearhart espoused a separatist vision. She wrote and spoke about a hoped-for future in which biological techniques would allow two eggs to produce only females and men would slowly be reduced to 10 percent of the population. Her 1978 speculative-fiction novel, “The Wanderground: Stories of the Hill Women,” imagined a world in which women lived together in nature, teleported, used psychic powers to communicate among themselves and with animals and strove to keep violent men off their land.

Gearhart was thousands of miles and a political world away from where she grew up, in a conservative Christian family in Pearisburg, Va. That’s where she learned to recite passages from the Bible (she also could deliver soliloquies from Shakespeare and poems by T.S. Eliot and Emily Dickinson). As a theater and speech professor when she was in her 30s, she was a devotee of Ayn Rand and wore patent-leather heels, red lipstick and nail polish. But love for a woman and a hunger for change took her to San Francisco in 1970, where she threw open the closet door and strode out. “Hi, I’m Sally Gearhart — I’m a lesbian,” she would say, shaking strangers’ hands on the street. Within a few years, she helped found one of the first women’s-studies programs in the country at San Francisco State, where she taught popular classes like “Patriarchal Rhetoric” and “The Rhetoric of Sexual Liberation.”

In 1978, she helped change history when she and Harvey Milk, a San Francisco city supervisor, led a campaign against the Briggs Initiative, a state bill that aimed to ban gay men and lesbians from teaching in public schools. With calm confidence, Gearhart outargued State Senator John V. Briggs during a televised debate. And she and Milk traveled the state campaigning against the bill, which was ultimately defeated.

In the 1990s, she retired to the land full time. By then, many of the cabins had electricity, heat and plumbing. She and other women formed a barbershop quartet and performed in Willits, where she was also involved in community theater. And despite her earlier writings about a world largely devoid of men, she had plenty of male friends, along with politically conservative ones. She believed there was no person with whom she couldn’t connect.

But the vibrancy of the community diminished after 2010. That year, Jane Gurko, who owned and lived on the land from the beginning and whose house was the social hub, died. Years earlier, she and Gearhart had been romantic partners, and they considered each other life partners. Other women moved away from the land for jobs or for other reasons. Still, Gearhart remained.

Several years ago, a documentary director, Deborah Craig, and her camerawoman visited to film Gearhart. She was 83, wearing jeans covered in paint and sneakers. A sign that read “Wanderground” hung on the front window of her one-room cabin. Inside, it was full of books and her assiduously kept files. Her shoes hung from the rafters. Gearhart told the women about her chain-sawing skills (she abided by the community rule: Only dead trees and downed limbs could be cut for firewood). She offered to take them on a tour. Craig and her camerawoman climbed in the back of Gearhart’s rusted maroon S.U.V., the upholstery ripped and chewed by her dog, Bodhi, who had dibs on the front passenger seat. Gearhart called out, “You OK back there?” She pressed her foot on the gas pedal and headed up a hill and into the woods. “Hold on to each other’s hands. We are encouraging relationships among women. It doesn’t have to be sexual, girls. Are you listening to me?”

Maggie Jones is a contributing writer for the magazine and teaches writing at the University of Pittsburgh.

Until his final days, Colin L. Powell remained preoccupied with fixing things. The former secretary of state and four-star general tinkered endlessly in his garage — sometimes with his welder and sometimes on a succession of early Volvos, which were less complicated than the Corvette he used to whiz around the Beltway. (He took the Corvette to a track to race against Vice President Joseph R. Biden Jr. and his Stingray in the fall of 2016. “You want a head start?” Powell goaded Biden. “Go ahead.”) He was a regular at the neighborhood hardware store in McLean, Va., where he rummaged through parts for his house’s malfunctioning dishwasher or leaky faucets.

His plywood-and-wire fixes often left something to be desired aesthetically. But they satisfied his native frugality, his curiosity about how things worked and, perhaps above all, his compulsion to repair rather than discard what was broken. When he was fixing things, his longtime friend and deputy secretary of state Richard Armitage said, “there was a result at the end of the day. It’s why he was so happy as an Army officer: You take a platoon, and you make it better.”

At Powell’s memorial service in November, his son, Michael, recalled the time in 1982 when his father, then stationed at Fort Carson in Colorado Springs, bought a pallet of defective adding machines from a government surplus auction so that he could take each of them apart and make them work again. He did not mention that his father’s career at the time had hit a brick wall, after Powell received a lackluster annual efficiency report. Then and later, Powell refused to blame racism for the matter, though he might have had cause to suspect it. His Jamaican parents had taught him that the way to overcome bigotry was to “get over it and be better than them,” as Michael Powell recently told me.

He did: A decade later, Colin Powell was a four-star general, the chairman of the Joint Chiefs of Staff and arguably the most admired man in America. His swift ascent seemed to personify the military strategy that came to be known as the Powell doctrine: Establish precise goals, exhaust all diplomatic options, amass support from allies and the public, then defeat the adversary with overwhelming force.

Both the general and his doctrine became famous during the Persian Gulf war of 1991, an invasion of such brutal efficiency that it lasted all of 100 hours. The victory would not save the presidency of Powell’s friend and political benefactor George H.W. Bush. Yet Powell also seemed well suited to the center-left boomer triumphalism of Bush’s successor, Bill Clinton, whom Powell served for eight months of the new president’s first term. After all, Powell’s persona offered the tantalizing prospect of America moving past the two defining fault lines of the 1960s: race and Vietnam.

After Powell stepped down, Republicans swooned over the idea of the Black general as their standard-bearer. He and his wife, Alma, eventually decided that a life of electoral politics would not be to their liking. But he was still the most popular political figure in America five years later, when George W. Bush, in his first cabinet appointment, named Powell secretary of state.

By the beginning of 2003, Powell was faced with a problem that seemed beyond his ability to fix: The commander in chief was determined to go to war with Iraq, hastily and with threadbare support from America’s allies. Such a ground invasion flew in the face of the Powell doctrine. Alone among the members of Bush’s war council, the secretary of state enumerated to the president the many things that could go disastrously wrong. Still, when Bush asked in January 2003, “Are you with me on this?” Powell assured him that he was.

“What choice did I have?” Powell told me a decade and a half later. “He’s the president.” His decision reflected a career built on prevailing from inside the system, ever aware that quitting was exactly what the critics and bigots wanted to see him do. For once, however, the supremely self-confident Powell failed to appreciate his leverage with the American public. Had he resigned in protest, the likely succession of events might well have forestalled the war.

“They call me the reluctant warrior,” Powell told me, “but if you want to go to war, I know how to do it.” Bush tasked Powell not with overseeing the war but with selling it to the public. The secretary’s infamous speech to the United Nations on Feb. 5, 2003, with its multitude of claims about Saddam Hussein’s illicit weapons program that would later be proved false, amounted to an indelible stain on an otherwise remarkable career of public service.

Powell later matter-of-factly described the U.N. speech to his son as the biggest mistake of his career. But he refused to denigrate his former commander in chief — who, after all, had delegated the burden of that speech to the one man in America who had the credibility to deliver it.

After departing the Bush administration in January 2005, Powell would sit in the fire-lit home office that he called “the bunker,” haloed by TV and computer screens and photographs of himself with the most powerful men and women in the world, taking calls from foreign diplomats and heads of state seeking his counsel. He tried his hand at the private sector, joining the board of the cloud-computing company Salesforce in 2014. He continued to work with students, particularly at his alma mater, the City College of New York, with its Colin Powell School for Civic and Global Leadership, and attended dedication ceremonies for elementary schools across America that bore his name.

Powell also was a regular on the corporate speaking circuit. He relished the challenge of tailoring his monologues to obscure organizations. At one such appearance in October 2019, a keynote address at the Multiple Myeloma Research Foundation’s annual fund-raising dinner in Chicago, he told the audience: “Well, we have something in common.” He had just been diagnosed with multiple myeloma, or plasma-cell cancer.

Every other Friday for the next two years, as the disease inexorably advanced against him, he drove himself in the Corvette to Walter Reed National Military Medical Center for his cancer treatments. “To the last fricking day,” Michael Powell recalled.

Even when he was secretary of state, Powell would spend his few idle hours tinkering in the garage, to a soundtrack of calypso, Broadway musicals and Bob Marley, ABBA and the Mighty Sparrow. “It was therapeutic to him,” said Peggy Cifrino, his longtime assistant. “He said: ‘Going into the garage, I can see that the carburetor’s the problem and fix it — unlike foreign policy, where nothing gets resolved. You’re just spending four years doing the best you can.’”

Robert Draper is a contributing writer for the magazine. He is the author of several books, most recently “To Start a War: How the Bush Administration Took America Into Iraq,” which was excerpted in the magazine.

When “The Mary Tyler Moore Show” made its debut in September 1970, it caused a delayed tremor. The sitcom, about the very grown-up exploits of a single woman over 30, had so-so early ratings and reviews, and there was talk of cancellation. Network executives told the writers to “get her married” before the end of the first season. They didn’t, and the show went on to become one of the most groundbreaking and beloved sitcoms in the history of television. And not the least of its achievements was that it helped make Cloris Leachman a star.

Leachman’s career was at that point something of a delayed tremor itself. She’d already been in show business for almost 30 years, from the time she was 17 and had her own radio show back in her hometown, Des Moines. She’d been a beauty queen. She studied at the Actors Studio in New York City, where no less than Marlon Brando called her “the most talented one.” She played Shakespeare with Katharine Hepburn and sang Rodgers and Hammerstein on Broadway. But she never seemed to last anywhere very long, a kind of restlessness at odds with her talent. And her career began to take a back seat in 1953, when she married George Englund, an actor and Brando’s best friend.

The marriage eventually produced five children, and on the surface, it seemed ideal. “He was everything you could ever want, tall, handsome, glorious, a master of the English language,” Leachman’s daughter, Dinah Englund, told me. “But he was equally destructive.” She said that Leachman would stay home with the children while “he and Brando would go around [expletive] everything in sight.” Leachman got some work during this time, including a short stint as the mother on the “Lassie” TV show, but her career slowed down during what might have been some of her prime acting years. “He ran her down,” Dinah says of her father. “But she always defended him.”

By the early 1970s, with her children growing older, and more and more women publicly unhitching their lives from those of men, or at least feeling less constrained by their opinions, Leachman’s career finally caught a gear. (She was by this time separated from Englund, whom she finally divorced in 1979.) As Phyllis Lindstrom on “The Mary Tyler Moore Show,” Leachman was hilariously pretentious, meddlesome, often just plain mean, with, as Leachman said, “a runaway ego.” Yet Leachman found a way to make her sympathetic. And perhaps most important, Phyllis was always unabashedly 40-something. (From the overloaded highlight reel: “The Lars Affair” episode, in which Leachman gives new meaning to the notion of “a pie in the face”; the performance won Leachman one of her eight Emmy Awards. And this real-life outtake: She told her castmate Edward Asner, who also died this year, that she’d sleep with him if he lost 32 pounds; he got to 29. Leachman kept the offer open.)

Over on the big screen, Leachman was playing a character with an entirely different emotional temperature: Ruth Popper, an out-of-options housewife in a dead-end town in “The Last Picture Show,” one of the most acclaimed films of the 1970s. Leachman awakened the character’s long-subsumed sensuality with compassion and grit, and won an Academy Award. As different as they were, both of her breakthrough roles held up a corrective lens to the depictions of the unyoung onscreen, proving that they could be complex and garner large audiences, and suddenly Leachman, now in her mid-40s, was everywhere: Mel Brooks films, TV-movie tear-jerkers, a spinoff series of her own. Middle age had become her golden age. Valerie Harper, herself a decorated member of the “Mary Tyler Moore” cast, said, “We all ought to bow down to you, get on our hands and knees, because you’re the only one who’s doing it right.” Or as Dinah Englund told me, “She was a comet, and she just exploded.”

Leachman worked for almost 50 more years, winning Emmys into her 70s as the wicked grandmother on the sitcom “Malcolm in the Middle” and, at 82, becoming the oldest contestant on “Dancing With the Stars.” In the last decade of her life alone, she had dozens of screen credits. She sometimes made curious role choices (TV shows like “The Facts of Life,” movies like “The Beverly Hillbillies” and “Beerfest”), trifles compared with her Olympian work in the 1970s, because they were all that was available or because she needed the money. “She made millions,” Dinah Englund says, “but she also spent millions.” In her memoir, “Cloris: My Autobiography,” published in 2009, Leachman remembers practically the entire arc of her nearly 80-year career with a surprising equanimity. “Acting is make-believe,” she wrote. “Don’t make it a problem. It’s spontaneous. Have fun.” Or, quoting her former mother-in-law, the actress Mabel Albertson: “Make a good bluff. Then make the bluff good.”

The closest Leachman comes in the memoir to expressing regret or heartache is when she talks about her son Bryan, who died of a drug overdose at age 30, after years of struggling with addiction. “You use only one drug,” she wrote, “but it’s got higher lethality than all of his combined. Your drug is hope.” Yet when Dinah went to tell Leachman that Bryan had died, she “took the words and caught them midair and closed her hand. She said, ‘If I open it, it will kill me.’” That emotional detachment never leaked into her acting, though. Every role, big or small, had the same “clear, truthful reporting of human behavior,” as she wrote. Or as her son Morgan Englund says, “She just muted it all out and kept going.”

Rob Hoerburger is the copy chief of the magazine and the author of the novel “Why Do Birds.”

It is hard to conceive of a less crucial post in American diplomacy than the ambassadorship to Luxembourg. The country, which is smaller than Rhode Island and only slightly more populated than Wyoming, is the sort of cushy diplomatic posting typically reserved for generous but not terribly distinguished political donors. So when Bill Clinton tapped one such donor, James Hormel, for the post in 1997, there was little reason to think the decision would prompt a protracted and vicious battle with congressional Republicans and end by making history.

Hormel belonged to one of America’s most prominent business families. His grandfather George started the Minnesota-based meatpacking company that his father, Jay, later turned into a corporate juggernaut with the invention of Spam. But Hormel, who grew up on a 200-acre estate in a house with 26 bedrooms, did not want to follow them into the family business. After graduating from Swarthmore in 1955, he married his classmate, Alice Parker. He attended the University of Chicago Law School and later worked as a dean there.

Ten years into his marriage to Parker, with whom he had five children, the two divorced. Soon thereafter, Hormel came out to his family members as gay. “I tiptoed out of the closet,” Hormel later wrote — this was the mid-1960s, after all. But “the more open I was, the more confident I became,” he recalled, “and the easier it was to be out.”

After a decade or so of political and spiritual peregrinations — working in Washington for a left-wing third party that ran the comedian and activist (and vegetarian) Dick Gregory for president, moving to Hawaii and devoting himself to EST self-help practice — Hormel settled down in San Francisco in 1977. Though he did not have an interest in the family business, he did have some ideas about how to spend the family fortune. He became a philanthropist, with a specific focus on gay equality and rights, giving more than $15 million to L.G.B.T.Q. causes over his life and establishing himself as one of the most generous gay donors in U.S. history.

Hormel provided the seed money for the Human Rights Campaign Fund — now the Human Rights Campaign and the largest L.G.B.T.Q. advocacy group in the country — and the American Foundation for AIDS Research. He also made smaller donations to countless other groups and efforts, ranging from a documentary film that taught tolerance to elementary-school students to an annual L.G.B.T.Q. academic conference at his alma mater. Alongside the conference every year, Swarthmore students hosted a debauched, gender-bending party, where the silver-haired Hormel, in a business-casual uniform of oxford shirt and khakis, would dance awkwardly but enthusiastically alongside cross- and undressed college kids.

“The early ’90s were still a time when we’d come out to friends and family and were often met with rejection,” says Kari Hong, who came out as gay in her first year at Swarthmore and is now an immigration attorney. “Jim was just a source of joy. He was a terrible dancer, but he didn’t care. He showed us there’s a pathway to happiness and a pathway to having a very delightful life.”

But it was the ambassador appointment from Clinton, one of the many Democratic politicians to whom Hormel had donated prolifically, that cemented Hormel’s place in L.G.B.T.Q. history. Hormel was poised to be America’s first openly gay ambassador, and Senate Republicans objected to his nomination not because of his lack of foreign-policy experience — awarding ambassadorships to political contributors was a bipartisan practice — but because of his sexuality. Hormel, Senator James Inhofe of Oklahoma warned, was “a gay activist who puts his agenda ahead of the agenda of America.”

Suddenly, the issue of who served in a sleepy ambassadorship was transformed into an important struggle over gay rights. Clinton had been an unreliable ally in that struggle, caving to Republican attacks when he enacted the military’s “Don’t Ask, Don’t Tell” policy and signed into law the Defense of Marriage Act. But in a symbolic fight over a deep-pocketed donor, he went to the mat. Senate Republicans refused to put Hormel’s nomination to a vote, but Clinton would not withdraw it. Then in 1999, nearly two years after first nominating him, Clinton used a recess appointment, which doesn’t require Senate confirmation, to install Hormel as ambassador to Luxembourg.

The job was not without its downsides. In an effort to win Republican support for his nomination, Hormel had pledged that his partner at the time would not live with him in Luxembourg. He spent much of his 14 months in Luxembourg alone, attending commemorations of World War II events. He left the post shortly before Clinton exited the White House.

After returning to the United States, Hormel resumed his philanthropic endeavors. In 2006, he paid for a group of L.G.B.T.Q. Swarthmore students to attend a charity gala for a Philadelphia gay-rights group. At the event, Hormel met a student named Michael P. Nguyen Araque. Although Hormel was 52 years older than Araque, the two soon developed a romantic relationship. “We liked to joke that when I was a sophomore,” Araque says, “James was a senior.” After Araque’s graduation in 2008, he moved to San Francisco to live with Hormel. Gay marriage was legalized in California the same year, and in 2014, Hormel and Araque were wed in a ceremony officiated by Nancy Pelosi.

Although Hormel’s children initially disapproved of the relationship, they came to accept and appreciate Araque. “It was hard at first, but eventually it was like, ‘What are we bitching about? He makes Dad happy,’” Alison Hormel Webb, his oldest child, says. At Hormel’s memorial service in October at Grace Cathedral in San Francisco, Araque and Hormel’s ex-wife, Alice, took turns reading from the Book of Isaiah.

Jason Zengerle is a contributing writer for the magazine.

Michael K. Williams believed he would die young. By 25, he had a drug habit and had stolen a couple of cars, and though he wouldn’t label himself a “bad boy,” he said of his early 20s, “I had a way of always finding myself in trouble.” In a bar in Queens on his 25th birthday, an argument escalated to the point where a man spat a razor blade out of his mouth and sliced Williams’s face, leaving the scar that would become his unmistakable signature.

That Williams could have retaliated, and didn’t, matters. “I opted out,” he told The Hollywood Reporter in 2011. “I knew that I did not want blood on my hands. And I honestly believe that because I let it go ... it’s why people look at this and see a thing of beauty.” He continued, “Had I taken the other route, I think it would have made me ugly — from the inside.” It might also have led him down the paths of many of the characters he played, men whose lives were often ruined by the inability to resist the brutality and violence that defined their worlds.

But in Williams’s case, the scar that split his face in half led to unexpected opportunities. Raised by a strict Bahamian mother in Brooklyn’s Vanderveer Estates, he loved to dance. He went from getting down in N.Y.C. house clubs to touring with Missy Elliott, Madonna and George Michael, and choreographing Crystal Waters’s 1994 hit “100% Pure Love.” Williams danced like the last drink being poured into a glass, both urgent and unbelievably graceful, more in control than any man has a right to be. And when, a few months later, Tupac Shakur saw Williams’s face in a grainy Polaroid on some production company’s wall, the scar that had once threatened to ruin his life catapulted him into a career as a thespian.

After Williams appeared alongside Shakur in the 1996 film “Bullet,” his career took off. By 1999, he’d secured a role alongside Nicolas Cage in Martin Scorsese’s “Bringing Out the Dead” and had filmed a guest spot on “Law & Order.” And then, because getting steady work as a Black actor is effectively as difficult as being drafted by an N.B.A. team, casting directors stopped calling Williams. It would be two more years before he was cast in an episode of “The Sopranos.” By then, Williams was back in Brooklyn, working at his mother’s day care center and struggling to make rent.

Then one day, while posting up in his apartment with a cousin, staring at a television on mute, Williams watched a slightly younger version of himself walk across the screen. Maybe it’s not over, he thought. After borrowing money from his mother to produce portfolios of his past work, Williams began auditioning again and waited for a call.

The role that followed — Omar in “The Wire,” a gay Black man who wielded a shotgun against his enemies — gave visibility to a form of Black masculinity rarely seen on TV. “Omar’s coming” was both a warning and an admission: There are some of us who walk in this world unafraid of who we are. In one role, he managed to be a Black Robin Hood, a tender friend and lover and a ruthless avenger with a sardonic wit that challenged ideas of what is permissible in the lives of Black men on the screen. In doing so, he became the litany of us. The charisma and bravery of Michael K. Williams the actor allowed him to make the most fearless character on “The Wire” also the most vulnerable. The actor Wendell Pierce, who played Detective William Moreland, known as Bunk, on “The Wire,” said that Williams had opened up “a window to a world of men that we pass by or don’t know about.” More than portraying these men, Williams’s genius lay in his willingness to inhabit the lives that could have been his.

That he did it all while grappling with his own battle with drugs is a wonder. Having a habit is a hell of a thing. Many of us have struggled with drugs and alcohol, or know family members or friends or co-workers who have; Williams was not immune. Maybe he showed such sensitivity on the screen because he knew how precarious it all was. On the job, Williams would say that he always kept his mess at “shoe level,” but he also showed a willingness to talk publicly about that mess: addiction, sexual abuse, homelessness. That he was willing to portray men grappling with the very disasters he knew so well allowed him to turn his art into something groundbreaking. Omar and the roles that came afterward display the complexity and artistry of Michael K. Williams, who knew that among Black men, even in the same community, even in the same house, even in the same body, Blackness is not one or 1,000 things.

But it came at a cost. To play Montrose Freeman of “Lovecraft Country,” a character who lived through the Tulsa race massacre, Williams had to go to dark places of his own childhood to understand what this atrocity and its aftermath did to Montrose. “In that moment, I went home to the projects [where I grew up] in East Flatbush, Brooklyn, and remembered all the violence and the anger and the missed opportunities and the potential and the innocence lost and stolen.” While playing Freddy Knight in “The Night Of,” Williams got a glimpse of what his nephew, Dominic Dupont, experienced over his more than 20 years in prison. “That weighed on me,” the actor recalled. Such roles, along with playing Bobby McCray in Ava DuVernay’s “When They See Us” and producing “Raised in the System,” a documentary about juveniles in prison, suggested what would have been Williams’s next act. “This Hollywood thing that you see me in, I’m passing through,” Williams said, speaking at an event on criminal-justice issues. “I believe this is where my passion, my purpose is supposed to be.”

Williams was a man of many gifts, and his art was a levee against what addiction could do to him. Maybe that levee broke. I’m hesitant to say it, to suggest that how he died is how he lived. But those who’ve witnessed him bust a move remember the joy with which this man danced. During the last year of his life, there was a video that went viral: He danced in a New York City park with each of his limbs seemingly in a different borough. More than any character he played, those flying limbs and that joy were Michael K. Williams.

Reginald Dwayne Betts is a poet, lawyer and contributing writer for the magazine. He is a 2021 MacArthur fellow.

On the morning of June 28, I learned from Twitter that the literary critic Lauren Berlant had died. Over the subsequent days, remembrances ran through my feed in waves of grief more typical of the passing of a celebrity. This might be unusual for a contemporary academic, but the strangeness, even surrealness, of the occasion also felt appropriate: As a scholar, Berlant helped us understand how popular culture and everyday civic life are driven by some of our most private — and often painful — desires. Berlant was a critic and scholar of gender and sexuality whose remit stretched from 19th-century American literature to Monica Lewinsky to BoJack Horseman. Through it all, they taught us to think of mass culture as a site where the intimate and public merge.

Berlant, who used the pronouns “she” and “they,” was raised in the affluent Philadelphia suburb Penn Valley. Their mother was an interior designer and, later, a real estate agent whom Berlant once described as having “died of femininity.” We might read Berlant’s work as both an interrogation and a loving recuperation of an investment in a restrictive concept: gender. As an English graduate student at Cornell University in the early 1980s, Berlant absorbed the influence of theorists and cultural-studies scholars like Benedict Anderson, Raymond Williams and Michel Foucault and wrote a doctoral thesis on Nathaniel Hawthorne and “the romance of power” — or how stories of love are always also stories of domination. The entanglement between fantasies of love and power occupied Berlant for the rest of their career.

As a professor in the English department at the University of Chicago, where they worked for 37 years, and an editor of the influential journal Critical Inquiry, Berlant shaped generations of scholars, transforming the way we speak and write about gender and sexuality, in both academia and more public-facing criticism. Their work offered a different way of looking at why we desire what we know is bad for us: junk food, exploitative and unsatisfying jobs, reactionary politics, constrained sexualities — all the appetites that power the American dream machine. Their attention to the contradictory and messy emotional lives of those deemed minor or inconsequential is precisely what enabled their work to speak to so many.

Berlant articulated, with candor and compassion, how living under capitalism, racism, misogyny and homophobia meant mastering life as a series of compromises and concessions. But they were careful not to moralize. Instead, their work was organized around an abiding generosity and curiosity about the shameful inconsistencies driving people’s interior worlds. “There is nothing more alienating,” they wrote, playfully, “than having your pleasures disputed by someone with a theory.” Berlant returned again and again to the question of love and its disappointments, of why we pursue things and people who don’t love us back. In their 2011 essay collection “Cruel Optimism,” they posed the question, “Why do people stay attached to conventional good-life fantasies ... when the evidence of their instability, fragility and dear cost abounds?”

The road to utopia, they suggested, is paved with hopes whose cruelty lies in their impossibility. Berlant sought not to chasten us for our continued attachment to these hopes but to describe them fully, in order to explain why they might feel necessary to our thriving. In the face of global collapse, we have clung desperately to these fictions because we might not yet know how to live without visions of the good life. For Berlant, acknowledging this is the first step in narrating a shared sense of what our collective present looks like — and building alternatives to it.

To that end, Berlant’s scholarship was bound up with intimacy and friendship. They experimented with collaborative writing as a form of not just intellectual exchange but learning as well. “Other people’s minds are amazing,” they marveled in a 2019 interview. “There’s the complete joy of the ‘not me.’ Seeing somebody else at work, seeing somebody else’s generativity and seeing how, together, you can compose things that neither of you could have done yourself.” Berlant kept up a robust personal blog titled “Supervalent Thought,” where they riffed on everything from Henry James to eating disorders to sex scandals for a readership that reached beyond the confines of academic journals.

What I loved about Berlant’s work was how, amid the fraying of national fantasies like upward mobility, meritocracy, job security and equality, Berlant made it clear that feelings we assume are solely private — depression, bitterness, resentment — are anything but. Here, we might say, was Berlant’s theory of the “intimate public sphere” — a version of love — in practice. It was their vision of how we might come together and attach ourselves to people and ideas that might actually love us back.

Jane Hu is an English Ph.D. and a writer living in Oakland, Calif.

“He’d come out with this twinkle in his eye,” says Conan O’Brien about the comedian Norm Macdonald, who was a favorite guest on his various talk shows over the years. “And he’d sit down and I’d say, ‘What’s going on, Norm?’ And he’d say, ‘Well, Conan, I bought myself a farm.’” O’Brien laughed at the memory of a familiar Macdonald gambit. “I’d be thinking, You didn’t buy a [expletive] farm. But it was more fun to go: ‘Really? I didn’t know you had a farm, Norm.’ And he’d go, ‘Yeah, I got a farm for my three daughters.’ And again I’d be thinking, No, you don’t have three daughters. But the whole joy of it was to go along.” That’s because the subterfuge was the point. What made the comedy of Norm Macdonald so different from that of so many successful contemporary comedians, and what placed him profoundly at odds with our culture’s demands for how truth and authenticity are conveyed, was how tantalizingly little it gave away of its creator.

Norm Macdonald was a complicated, often inscrutable guy, one who (mostly) adhered to now quaintly old-fashioned codes of privacy and propriety, a rascally self-mythologizer and a levels-deep ironist. Those obfuscating qualities mean it’s probably easiest to define his comedy by defining what it wasn’t. And that can be summed up in a single word: confessional. “Nothing can be easier,” Macdonald said during one of our several interviews. “Confessional is bragging. That’s all it is.” For him, comedy that wore personal experience as a badge or was motivated by expressions of personal identity, politics or emotions was a symptom of the disease of conceit. Confession, believed Macdonald — who had an ex-wife and grown son, though you would never have known it from his material — is “something you do in a dark booth beside a holy man” and “doesn’t really even have a place in social intercourse.” The result of this belief was that his form of honesty, at least as it was expressed through his comedy, was the inversion of just about everyone else’s. “I’d always learned,” he said to me in another one of our interviews, “that concealing everything was art.”

Macdonald — whose moment of greatest stardom, a 1994 to 1998 stint anchoring Weekend Update on “Saturday Night Live,” represented merely a blip in a longer, more fruitful career as a stand-up — was just as wily about the truth offstage, and just as happy to play with it. He elevated tales of his gambling misadventures to the stuff of myth. He claimed to know Bob Dylan, another canny self-mythologizer, and shared unlikely stories about the two of them discussing scripture and sharing beef stew. Macdonald liked to portray himself as a rube from small-town Canada, yet could conjure opinions on such matters as the merits of competing Proust translations. The lone book he wrote, a minor classic called “Based on a True Story: A Memoir,” was a comedic novel dressed up as a celebrity tell-all, the costume so convincing that some readers missed the joke. For the paperback, its subtitle was changed to “Not a Memoir.”

That commitment to dissembling wasn’t always so larky. Macdonald’s comedy had recurring strains of seeming misogyny and homophobia that made you wonder whether it was what he actually thought. I asked him about this once and he said, and I’m paraphrasing some saltier wording, that if you believed he meant his jokes about women and gay men then you were a dimwit — and if he did mean them, then he was a dimwit and thus irrelevant. And still, the layers of slipperiness accrue: I think back to a moment when, before we were about to go onstage for a public Q. and A. about his book, he took me aside and said that sour comments he’d previously made to me about women comedians had been made “in character” — an oddly uncharacteristic clarification.

It was only in retrospect that we learned the profound depth of Macdonald’s commitment to concealment. It was one thing for him to tell me this: “People think things are tragedy. They’re not tragedy. If you get cancer, that’s not a tragedy. If your mother dies when she’s 30, that’s not tragedy. That’s life. You don’t yell it from the rooftops. It has no place in comedy.” It was another to belatedly realize that he’d said it while living with his own cancer diagnosis. Why didn’t he share this? Lori Jo Hoekstra, Macdonald’s longtime producing partner and close friend, who was with him when he died — he’d kept his illness (first multiple myeloma and eventually leukemia) almost entirely hidden for nine years — explains his reticence plainly: “He wasn’t an open book; certain things were just outside his comfort zone.” Macdonald’s older brother, Neil, a writer, editor and former journalist for the Canadian Broadcasting Corporation, surmises that Norm’s behavior was a natural product of his having grown up amid stoic farmers in Ontario’s Ottawa Valley, within a heritage of severe, old-fashioned Scottish Presbyterianism. (Not exactly an emotionally giving milieu.) Their father, Percy, a stern schoolteacher, was also a model, at least in one very specific regard. “He’d be clearly in agony,” Neil Macdonald says about his father’s struggle with the illness that led to his death, “and you’d ask him, ‘How’s it going, Dad?’ and he’d say, ‘Oh, all right, I suppose.’”

Today we could call that repressed. Or we could say that Macdonald had his cultural and emotional templates and we have ours, and his comedy’s verve flowed from the space between. “He was an eccentric guy, you know what I mean?” says another friend, the comedian David Spade. “Like, he lived in L.A. and didn’t even drive. He always did his own thing. That meant he was always hard to pin down, even if it was just to get dinner.”

Which is why over the pandemic, as Macdonald, never much for sticking to social plans, grew sicker and became even more elusive, his friends had no reason to suspect anything was wrong. The threat of Covid, Spade reasons, only “upped by about 20 percent how hard he was to meet with.” So his behavior never really changed, and he certainly wasn’t about to start turning what he saw as his commonplace suffering into material.

The only joke that anyone I spoke to who was aware of his decline remembers him explicitly making about his situation came after he was wheeled out into the sun-splashed atrium of a hospital where he was being treated. Isn’t it nice here, Norm? “Yeah,” he replied, “in the atrium of diminished expectations.” To say more about his plight, given that the great gift of a comedian is the imaginative freedom to say (or withhold) anything, would have been the stuff of a hacky, inauthentic routine. Norm Macdonald did something different. All the way to the end.

David Marchese is a staff writer for the magazine and the columnist for Talk. Recently he interviewed Brian Cox about the filthy rich, Dr. Becky about the ultimate goal of parenting and Tiffany Haddish about God’s sense of humor.

When Mary Wilson of the Supremes died in February, I found myself doing what I often do following the loss of a musician I hold dear: I dove into an archive of photos. In images of the Supremes in the 1960s, Diana Ross was often the scene-stealer, with her stunning wide-eyed gaze that suggested she was just about to share a long-held secret. Ross often stood in the center, with Wilson gamely at her side. But the thing about Wilson — who was with the group from its 1959 inception as the Primettes to its breakup in 1977 — is that she was always there.

The Supremes began as the Primettes when a Detroit teenager named Florence Ballard recruited Wilson, a friend, to help create a female counterpart to a group called the Primes (a predecessor of the Temptations). Ross also joined, as did a fourth member, Betty McGlown. Later, McGlown left, and the remaining members became the Supremes and had a luminescent run, producing chart-topping singles like “Baby Love” and “You Can’t Hurry Love” throughout the 1960s. They ended the decade as the best-charting female group in music history, a distinction they still hold.

Early on, Ross was the face of the Supremes — so much so that, by the end of the 1960s, the group was called Diana Ross and the Supremes. But by 1970, Ross had left, as had Ballard, and the group’s remaining members found themselves changing the lineup four times over 11 albums. While the group had a few more hits, several of their albums were poorly promoted and did not sell. Through it all, Wilson was a bedrock, nearly carrying Side 1 of the group’s 1975 self-titled album all on her own. As the only original member left, she was a familiar presence to whatever remaining fans the group had — a face and voice they’d come to rely on. By then, the Supremes were essentially Wilson’s group, and she refused to let them fall apart, even when the shifting of the musical times suggested that their moment should be up.

And then it was up. After the Supremes stopped recording together, Wilson released a self-titled solo debut in August 1979. The album had the misfortune of being released around the same time as a new Diana Ross record that received better promotion; it also came at the outset of a racist and homophobic backlash against disco music. Critics and the public paid Wilson’s album little attention, and it was essentially forgotten in the years after its release. (Its re-release was in the works at the time of her death.) But it’s one of the rare records I’ve held onto for more than a decade. Never lent out, never given away.

Wilson’s greatest gift was her ability to temper longing with a kind of optimism, which is clearly on display in this album. In its songs, love can be more than just an endless cycle of wanting — a cycle that I, like many people, can get wrapped up in. In the patient and tender “Pick Up the Pieces,” she presents a listener with not only the sadness of a diminishing love, but also a determination to keep the love alive: “There’s no reason why we can’t make it.” We must make it, she seems to be insisting. We have to.

Yet it’s not quite right to discuss Wilson’s life and career as one of only endurance and sacrifice. She was also magnetic, easy to fall in love with, endlessly charming. Wilson knew the secret that I have returned to, particularly during these past several months of ever-mounting anguish, anxiety and grief. She understood that there was a time to be heartbroken, and there was a time to dance. The two modes operated in service of each other.

My favorite Mary Wilson moment takes place in 1973. On a riser above the “Soul Train” stage, Wilson playfully chides Don Cornelius, the show’s host, begging him to dance with her in the famous Soul Train Line. She’d never gotten to do it before, and Cornelius, to that point, had never done it, either. He tries to divert her pleas with smooth flirtation. When an audience member eggs Wilson on, he gestures at them, playfully but anxiously, attempting to move past the moment.

In the clip I’ve watched over and over on YouTube, there is a jump cut. And the next thing you see is Cornelius dancing down the line with an ecstatic Wilson, her smile outshining every bit of regalia crowding the “Soul Train” set. When I think of Mary Wilson, that’s what comes to me: this endless desire to pull someone else along with her in her joy, to open it up so that Don Cornelius — and we — could feel it, too. It’s those small moments that must be stashed in the memory, in the limbs that feel heaviest on the sad days. You don’t know how good it is to shake off the grief until you’ve done it a few times. Until you’ve grabbed some people by the hand and dragged them along with you to perform a miracle.

Hanif Abdurraqib is a contributing writer for the magazine from the East Side of Columbus, Ohio.

Names, like all conventions of language, hold the tremendous power of creation — we are given them, but we are also able to give them to ourselves. They are invocations, especially when they command “full use of the tongue,” as the poet Warsan Shire once wrote. They help us remember who we are, and they also telegraph to the world who we are.

Kiér Laprí Kartier chose a name that anointed her with glamour, like the supermodels she was inspired by, and she also tethered herself to family. Kiér came from her mother, Arnitra Solomon-Robinson, who first heard the sweet-sounding name back when she was in high school. It felt unique, like her own first name, but more than that: It turned heads, and Solomon-Robinson wanted her firstborn to stand out. Laprí was her revamped middle name, and her last name, Kartier, came from her new community, whose house surname was inspired by the jeweler Cartier and its signature “love” bracelets, which require a screwdriver to lock the bonds into place.

Kiér and her mother were close. She taught her mother popular TikTok dances, and they liked to cook — usually seafood — while they sang along to their favorite artists: SZA, Saweetie, Ariana Grande. Even after Kiér moved out to live with her boyfriend, they tended to talk every day on FaceTime. “She wanted to make sure she had seen me and I had seen her,” her mother says.

Kiér met her boyfriend, Coty Gibson, when she was working at Walmart in Dallas — they caught each other’s eye and started DM’ing on Instagram. They moved in together in the spring of 2020 with “nothing but a TV and our clothes,” but quickly made their apartment a warm home. Friends often dropped by to visit with their puppy, Bella, or play Just Dance or Mortal Kombat on the Xbox. In the spring, after making a home with Gibson, Kartier started making herself at home in her body — wearing her hair long, so long that it sometimes brushed against her hips. She preferred a natural, classy look — pink lip gloss and pristinely polished white or nude acrylics. Not long after, Gibson recalls, Kiér began having difficulties at her new temp-agency job: “They were picking on her for every little thing.” Eventually, she was fired.

The act of self-realization is inherently so radical and so daunting that very few of us will ever fully achieve it in our lifetimes. For many, independence of self is conflated with the milestone of turning 21, which our culture views as the ultimate signifier of liberation. It’s the age when you can legally rent a hotel room by yourself, visit a casino and go to a bar. Kartier had big plans for her freedom year: season tickets to Six Flags and gender-affirming surgery. She and her best friend, Joshua Wilson, often made the three-and-a-half-hour drive to Houston, where they could hit clubs and imagine themselves on a trajectory of fame, fun and fortune, the kind of decadent lifestyle that fuels most of social media.

She resembled a baby Naomi Campbell and dreamed that one day her dimples, height and bone structure would get her work as a model. Wilson and Kartier had many heart-to-hearts about the violence, discrimination and heightened vulnerability she could face as a Black trans woman. But she wanted people to see, Wilson says, that this is “who Kiér has been this whole time.”

Over the summer, Kartier began acknowledging her true self publicly by updating her name across social media platforms. She was in the process of shaping her expectations for her life, her career, her family: the existential dilemmas that all 20-somethings are supposed to have the luxury of wrestling with. “She was figuring it out,” Wilson told me. “But she never got to finish figuring it out.”

On Sept. 30, Kiér called another friend, Josh Mack, to see if she could come over. She asked him sweetly if he would cook for her. Mack loved to lavish attention on her, so he went all out: salmon croquettes with smothered potatoes (“a Southern thing,” he told me), sweet rice and homemade biscuits using a family recipe. It felt like a celebration: The radio was on; there was flour everywhere. Kiér stepped out around 8 to run a quick errand. Mack made sure to set aside a plate for her. As hours passed, Mack’s heart grew heavy with worry. He covered Kiér’s plate so it would keep. Before he went to bed, he put it into the freezer, where it remains to this day.

At approximately 9:30 p.m., the Arlington Police Department found Kiér, fatally shot, in the parking lot of a nearby apartment complex. Her death, according to the Human Rights Campaign, made her at least the 38th trans or gender-nonconforming American to die by violence this year. By the time of this printing, a dozen more such deaths had followed, cementing 2021 as the deadliest year on record for trans people.

Jenna Wortham is a staff writer for the magazine and co-host of the podcast “Still Processing.”

Christopher Plummer claimed that accepting the role of Baron von Trapp in “The Sound of Music” arose out of “the vulgar streak in me.” Movie stardom was not something he had set his sights on; it was a classical actor’s stage career he always wanted, having fallen in love early with a style of acting he witnessed in the touring troupes passing through his native Montreal, a style he would later associate with Laurence Olivier: “that timeless, larger than life kind of performing that belonged to an unidentifiable golden age, when the actor reigned supreme.”

Already, by the time the von Trapp offer came, he had made his mark playing Henry V, Mercutio and Richard III at places like the Stratford Festival in Ontario, the American Shakespeare Festival Theater in Connecticut and the Royal Shakespeare Company. While still in his 20s, he turned down a seven-year contract offered by David O. Selznick in order to play Hamlet “for at least 25 cents a week,” as he put it in his memoir, “In Spite of Myself,” published in 2008.

Still, there must have been something irresistible about appearing in a big-budget spectacular in the mid-1960s. A number of Plummer’s peers, actors like Peter O’Toole, Albert Finney and Alan Bates, all of whom had played the classics in repertory, made the transition effortlessly. But though the movie he disparagingly referred to as “S&M” became the most popular film of its time, Plummer never achieved film success on the level of those others.

He seems to have intuited that the very qualities he brought so effectively to Baron von Trapp, a cold imperiousness, an emotional chill, were not exactly a ticket to mainstream success in the age of “What’s New Pussycat?” The actors filmgoers embraced in the 1960s were the ones who seemed able to open themselves fully to emotion, not to stifle it, and Plummer on film was never going to be a great liberating force. Nor did he try very hard to be one. Though he continued to accept film roles in the wake of “The Sound of Music,” there was always a kind of ambivalence to his choices — he had a habit of choosing films that seemed destined to fail — and a detachment in the performances themselves.

But the deeper reason Plummer remained a stage creature most likely has to do with what he called, in his autobiography, his “strange loyalties.” That is, to his original idea of himself, to the heroic ideal of the classical actor, to the example set by those who came before him. “He strove to have a career like Gielgud, Richardson, Redgrave, but on this side of the Atlantic,” the director Doug Hughes told me. (Hughes directed Plummer in his final Broadway appearance, in a 2007 revival of “Inherit the Wind,” as Drummond, the defender of Darwin.) Plummer seems to have understood that his greatest gift as an actor, a barely contained rage, was far better suited to the stage, where he knew how to work it to perfection.

Eventually, toward the end of his life, he was offered a slew of great character parts that let him channel that splendid rage onscreen — Mike Wallace, Leo Tolstoy, J. Paul Getty. But when he finally won an Oscar, in Mike Mills’s 2011 film, “Beginners,” playing a long-closeted gay father facing death, it was for a performance in which he allowed the chill to fall away entirely. In his tender scenes with his son (Ewan McGregor), he seems to have left behind every vestige of Baron von Trapp in favor of a new, and startling, emotional availability.

As satisfying as it must have been to at last triumph in films, that was not to be the end of it. At 80, he returned to Ontario for one last go at Prospero in “The Tempest.” For a man always tagged by his colleagues as deeply unsentimental, that theater, and its players, seem to have provided an emotional locus. Plummer would insist on being allowed to sit alone in the darkened theater, listening for the voices of “my actor friends,” departed colleagues who, like him, dedicated themselves to the tradition he revered. Sometimes one’s deepest loyalties are to the ghosts in the room.

Anthony Giardina is the author most recently of the plays “The City of Conversation” and “Dan Cody’s Yacht.”

Sometime in 1972, a 4-year-old boy slips away while his two older sisters are changing into their bathing suits in the bathroom of a local public pool in Washington, D.C. Frustrated, or afraid that they’ve left him, he leaves LeDroit Park and walks down Georgia Avenue, all the way downtown, wearing only a pair of swimming trunks, tennis shoes and a towel. He stops in front of a pawnshop and inquires about the guitar in the window — and makes it home safely with the help of a nearby security guard.

The boy had always gravitated to music. His parents, Charlotte and Carl Edward Thompson Sr., noticed their only son, Carl Jr., a.k.a. Chucky, making music out of wooden spoons, pots, pans and even windshield wipers from the time he was about 2. His sisters, Chrystal and Carla, remember Sears department store “wish books” with the musical-instrument ads ripped out. Thompson’s parents gave him a drum set when he was 4, and he eventually taught himself to play all the instruments on those torn pages.

Thompson quickly became steeped in the musical history of go-go, the city’s proprietary form of funk. When he was a teenager in the 1980s, he played congas in Chuck Brown’s band, the Soul Searchers, and became quite close to Brown, who is regarded as the Godfather of Go-Go. “As far as go-go goes,” said Thomas Sayers Ellis, a poet and photographer from Washington, “Chucky was the closest mixture of a suave James Bond and a maestro Quincy Jones D.C. had produced in years, a seer-hearer of the entire sound grid.”

Recently, I went to a “bounce beat” show at the Lincoln Theater, less than a mile from Thompson’s childhood home. During breaks, the host asked the audience trivia questions: “Who gave singing lessons to TJ in New Impressionz?” “Can you name three go-go venues that operated from 2007 to 2009?” The exchange felt like listening in on another language. To be among go-go lovers is to be among people who resist being fully ascertained — the genre itself a negotiation between popular music and Black insider knowledge. I spent two hours listening in a state of delightful confusion, smiling at the fact that Black people still have our own secrets.

Thompson’s career took off after his transformation from musician to superstar producer in the early 1990s. He was best known for producing hits for Mary J. Blige, Nas, Usher, TLC and Sean Combs’s Bad Boy Records. As a member of Bad Boy’s “Hitmen” production team, Thompson made soulful R.&B. and hip-hop smashes for the Notorious B.I.G. (“Big Poppa”), Faith Evans (“You Used to Love Me”) and Shyne (“Bonnie and Shyne”). The beat of Nas’s “One Mic,” the rapper’s 2002 comeback single, resulted from Thompson’s tapping on the back of a guitar. Blige’s “My Life” (1994), which he co-produced, has been hailed as one of the greatest R.&B. albums of all time. “I wasn’t even looking at Mary as this big artist,” he says in an Amazon Original documentary commemorating the album’s 25th anniversary. “I just wanted to make sure she had that royal, but yet still grounded, hood feeling about the songs that I was delivering to her.” By all accounts, navigating what it meant for a generation of Black artists to sound both “royal” and “grounded” was the mark of his career.

This was a man who found music everywhere, and talking with his loved ones, so did I. I heard music in the laughter of Thompson’s mother and sisters when they remembered his love of Gucci cologne “with the gold top,” and in that of his oldest daughter, Ashley, one of Thompson’s five children, when she recalled the time he took her prom-dress shopping. The minor-key melodies of his loved ones when they got choked up. The ghost notes in their pauses.

The celebrated producer James Harris III, who goes by Jimmy Jam, places “My Life” in his top five albums of all time but said that one of his favorite Thompson compositions was Faith Evans’s 1995 hit “Soon as I Get Home.” “That was just the prototypical gospel-chord anthem,” Harris told me. The producer Salaam Remi said, “It feels like everything that’s spiritual.” He added: “But it also feels sexual, sensual. The chord changes and the mood of it take me into my ’90s room when it’s dark. It’s like a slow-jam tape at its best moment.” Thompson intended the track to be an interlude, but he ended up creating a full-fledged saga, an Odyssey in 5 minutes and 24 seconds. Gina Rojas, Thompson’s companion at the time of his death, said that he dedicated the song to her two decades after he produced it. She recalls him telling her, “It wasn’t until I started coming home to you that I understood what the song meant.” She took a beat. There was that music again.

Niela Orr is a story producer for Pop-Up Magazine, an editor at large for The Believer and the writer of the Bread and Circuses column for The Baffler.

No show about life at the turn of the millennium — especially not one called “Sex and the City” — would be complete without the so-called gay B.F.F. Carrie’s male best friend, Stanford Blatch, competed with New York City itself for the title of “the fifth lady” on the show. Over the course of six seasons and two movies, he would come to define this stock character — a transitional role on the road from sissy villain to full-fledged protagonist.

Willie Garson, the actor who played Blatch, made a whole career playing stock characters, appearing in more than 70 movies and 300 episodes of TV. He debuted onscreen in 1986, with a small role in a TV movie about Ted Bundy. He’d go on to make his name playing nameless characters: “clerk,” “assistant,” “waiter,” “corporate guy,” “telephone operator,” “nitwit executive.” He had an arc as a suspected killer on “NYPD Blue.” He played Lee Harvey Oswald three separate times. At 5-foot-8, with Central Casting-pattern baldness, he had the kind of unassuming physicality that could blend into the background or be called forth to serve as a foil to square-jawed onscreen masculinity.

Born in Highland Park, N.J., in 1964, Garson had a family history that no doubt provided colorful reference for his work. His grandfather was an immigrant who went into the wine business, producing plonk for alcoholics. The winery gave way to a whole slew of crooked gambits that Garson described as “Jewish mafia” stuff. His father worked part of the week administering a fleet of pay-by-the-day televisions in New Jersey hospital rooms. Then, from Thursday on, he played blackjack in Las Vegas, flying home on Sundays. Garson was bar mitzvahed in a blue velvet three-piece Pierre Cardin suit. After that, he started taking the train into New York, working the youth audition circuit by himself. “He was already a raconteur,” says Sarah Jessica Parker, who first crossed paths with him when they were young adults. “It was very strange to me that someone with very little life experience could spin a yarn and hold court like that.”

Though Garson had memorable parts in big movies — most notably as Ben Stiller’s doctor in “There’s Something About Mary” — Stanford Blatch was his star-making role. (After “Sex and the City” premiered in 1998, he always had stacks of scripts on his desk inviting him to audition for gay roles.) The show was born into a different media climate, an era when men who had sex with men were portrayed as tragic martyr figures, flaming-but-sexless makeover bots or, rarely, ordinary dudes who just happened to date dudes. In Blatch, Garson found a playful middle ground, channeling the undefensive mannerisms of a man who camps mainly for his own pleasure. He was gentle, but savvy; romantic, but still pragmatic; and overflowing with wonder, but never saccharine. Though Garson himself was straight, he shared many of these traits. “He was Stanford through and through,” Cynthia Nixon says. “Although, I have to say, Stanford is sort of hapless, and Willie was anything but hapless.”

Garson’s friends knew him as omnivorous and worldly. He collected shoes and watches and eyeglasses (and sometimes pilfered from the costume department). Because he had worked with everyone, he always had good anecdotes to share on set. He loved poker. He invested in restaurants. In his free time he worked with foster-care causes, which led him to adopt his son, Nathen, in 2010. He was a single father.

This confident and idiosyncratic strain of masculinity often confused people. In promotional cycles for “Sex and the City,” interviewers regularly asked if he was gay — a question that can lead celebrities to say funny stuff. (Matt Damon: “Whether you’re straight or gay, people shouldn’t know anything about your sexuality.”) Garson often dodged the question on principle. The year before he died he told Page Six: “When I was on ‘White Collar,’ no one ever asked me if I was a con man, and when I was on ‘NYPD Blue,’ nobody ever asked me if I was a murderer. This is what we do for a living, portray people.”

Garson told Parker he had pancreatic cancer just before shooting began for the “Sex and the City” limited series, “And Just Like That ... ,” which premiered this month. At first, she was the only person on set who knew; Garson didn’t want people to treat him differently. “One of the hardest parts about witnessing the end,” Parker says, “was that I knew if Willie told me he had to go home, it was because he really had to go home.” Ultimately, he could not finish the season. In his final days on set, he told many colleagues individually. “I think it was really important for him finally to be able to tell people,” Nixon says. “It was a kind of coming out.”

Jamie Lauren Keiles is a contributing writer for the magazine. Their last article was about the Sturgis motorcycle rally.

Melvin Van Peebles made uncompromising films — most famously “Sweet Sweetback’s Baadasssss Song” from 1971, which speared boldly into the social and racial fissures of the day — and ignited the genre of “blaxploitation.” But he also wrote novels and plays, painted portraits and recorded spoken-word albums, and nowhere was his freewheeling creativity more evident than in the Blue Room, his treasured studio space inside his Hell’s Kitchen home.

“My dad got a kick out of taking something from everyday life and seeing it as worthy of being sculpture,” the filmmaker and actor Mario Van Peebles says. “Someone could have filing cabinets in their office, but why not get the back of a VW bus, cut it off, put it on the wall and use it as a filing cabinet?” To find just the right bus, the elder Van Peebles scoured salvage yards. Then he figured out a way to make real steam blow out of the tailpipe jutting from the wall. (The bird droppings on the skylight coffee table were fake.) “He had this fanciful, wily sense of humor, and a love of the everyday.”

Van Peebles, who always hungered for intensity, filled his apartment with bursting colors. The Blue Room was his favorite, and the space where so much of his art was conceived. For the 2003 biopic “Baadasssss!,” Mario — who directed the film and plays his father — hunted down the exact shade for the walls of the set. Melvin “would sit in the Blue Room and look out through the windows onto the wonderful view on the street and watch the light play across,” Mario says. “He passed away in that apartment — he wanted to be back in a space he had created and enjoyed, in which he’d given birth to so many of his projects.”

Amy X. Wang is a Beijing-born, New York-based writer and the assistant managing editor for the magazine. She is at work on her first novel.

Additional design and development by Jacky Myint.