Welcome to Strong Feelings! Essays by writers we love, in which they share their most impassioned opinions on a given subject. If you love our usual advice column — don’t worry, it’s not going anywhere. This month for Strong Feelings, writer and critic Terry Nguyen (she/her) unpacks “It Girls” and all their layers.
Over half a century ago, Andy Warhol popularized a prediction about celebrity, a statement so prophetic that it now sounds like a tired cliché: One day, everyone will have their 15 minutes of fame. There is no doubt that we are living in Warholian times. We take for granted the ease of fame — how any person can become an icon overnight, an idol on screen. We are oversaturated with an infinite scroll of images and videos, but there is no longer a singular muse at the center of it all, discerned by her frequent appearance on magazine covers. In the ‘90s, there was Kate Moss. I associate the 2000s with Gisele Bundchen, Sarah Jessica Parker, and Devon Aoki. But by the 2010s, my memory became crowded with the many “It Girls” I saw on Tumblr, then Instagram.
New York Magazine’s May cover story highlighted the decades-long lineage of Manhattan’s “It Girls.” It walked us through generations of muses, like Bianca Jagger and Tina Chow, whose star power was fueled, in part, by their association with pillars of culture, like Warhol. But the cult of the cool girl extended beyond the Big Apple to encompass models (Pat Cleveland, Naomi Campbell, Kate Moss), actresses (Molly Ringwald, Brooke Shields, Chloe Sevigny), musicians (Edwidge Belmore, Pat Benatar, Courtney Love), and socialites (Alexa Chung, Lucy and Plum Sykes, Nicky and Paris Hilton). The most visible It Girls became household names (and faces). They were praised for their beauty, style, talent, or some nebulous combination of the three. They had “It,” whatever it was.
Today, we have what New York dubbed “It Girl inflation.” Social media has dismantled the reign of the cool girl, a title once bestowed by cultural gatekeepers (fashion magazines, designers, street-style blogs) upon a select few. Now, anyone can be an It Girl on Instagram or TikTok. Fame is a ladder with many rungs; each ascending step can be leveraged for gain. Magazines, too, have been overeager in appointing new It Girls, like Addison Rae and Emma Chamberlain, to retain a semblance of influence over a new generation of readers. “The explosion of the internet changed the market for ‘It,’” according to the editors of New York, “dramatically increasing both demand and supply.” It’s worth asking, however, whether the ineffable “It” factor — a quality predicated on exclusion — can survive the inclusivity of our culture today, or whether its continued use is simply an old-guard grasp at relevance. The It Girl’s quiet retirement reflects the dwindling power of the tastemakers who made her, from magazines to entertainment studios, to tell young people who or what they should care about.
The It Girl is, to put it bluntly, old. The term formally entered the cultural lexicon about a century ago with the popularity of the 1927 film It, starring the actress Clara Bow. The movie’s success led the press to call Bow “the It girl,” a phrase originally coined by the English writer Elinor Glyn. Glyn described “It” as a warm and magnetic charm, an intrinsic “self-confidence and indifference as to whether you are pleasing or not.” Regardless of what “It” really meant, the term was catchy. In later decades, tabloids and magazines conferred the title upon the glamorous and highly visible — social darlings who were “famous for being out, famous for being young, famous for being fun, famous for being famous.” It Girls came in all shades and styles, although they were mostly white, rich, and thin. Crucially, the It Girl had to be crowned by the media. She couldn’t declare herself one. Her existence, in turn, gave the media its cachet.
The It Girl’s reign coincided with the golden age of the glossies, a bygone era in which magazines were the primary purveyors of taste and style. Consumers used to depend upon magazines like Vogue, GQ, Vanity Fair, or TIME to tell them what was “cool” or “in,” who was “hot” or “It.” The media chronicled and captured the zeitgeist such that it became the zeitgeist — a symbiotic relationship that slowly crumbled once the internet came along.
When I think about the decentralization of culture, I always return to Miranda Priestly’s cerulean monologue in The Devil Wears Prada. Her infamous speech revealed the fashion industry’s influence over something as commonplace as color — how an editorial or runway decision can trickle down into the wardrobe of a casual shopper. The opposite seems to be true today across all culture industries, not just media. Those of us who came of age in the early-to-mid aughts (i.e., millennials and cusp Gen Z-ers) are among the last to remember print magazines’ hold over pop culture; today, digital media outlets have only a minor impact on trends. Even storied magazines have resorted to drumming up trend discourse for clicks.
The Devil Wears Prada-era monoculture has fractured. We are siloed within our own algorithmic preferences, but still curiously nostalgic for some semblance of hierarchy. Gatekeepers of the fashion, music, and media industries have depended on certain old-fashioned notions of “cool” to stay relevant. Perhaps that is why we’ve clung to such dated descriptors, terms like “It Girl” or “indie,” to separate our tastes from the masses, even though the concept of “mass culture” is itself antiquated.
I often think about the viral Wendy Williams quote: “She’s an icon, she’s a legend, she is the moment.” Despite being a good soundbite, it doesn’t quite make sense in today’s culture, which is composed of many moments; each online platform has its own distinct lineup of stars. The moment is actually quite fleeting. This has, to a certain extent, affected the longevity of It Girls. Their time in the limelight is limited, and their relevance, much like ours, is predicated on posting for a brief blip of virality.
The critic Safy Hallan-Farah argues that Gen Z has “created a new kind of selfhood” called “hyperreal individualism.” Young people’s consumer identities are fluid and rooted in a variety of cultural references. Hallan-Farah writes: “Hyperreal individualism is where the original references are largely illegible or incoherent, but the individual wishes to define themselves and create an identity around their own disparate tastes and styles anyway.” This doesn’t mean we’re any closer to understanding ourselves; we simply have more options to pick from, more micro-trends to dress up as. The same goes for It Girls.
In this era of personalized style and beauty standards, Gen Z gravitates toward its own unique muses. Rather than collectively emulating one It Girl or a select few, each consumer has an individualized moodboard of influences and influencers. The modern It Girl has not gone extinct, but her sphere of influence has shrunk. She is more of a brand than a woman, a commercialized vessel for her own beauty and fashion line. Perhaps because the “It” factor no longer guarantees lasting success or relevance, today’s crop of influencer-It Girls is under undue pressure to capitalize on their identities, rather than simply exist in the spotlight. If everyone, as Andy Warhol once said, will have their 15 minutes of fame, then maybe we all have “It” now — whatever it is.
Edited by Amalie MacGowan and Mi-Anne Chan.
I think the "it" factor requires a sense of mystery and unattainableness that our current culture's objective to self-market, self-brand, and become an entrepreneur of the self all reduce any allure of curiosity to commerce.
This makes me think of a conversation I had with my fiance the other night. He said that everyone would know who Shohei Ohtani was because he's the most famous person in baseball.
I disagreed, arguing that universally famous people no longer exist outside of the Kardashians, Beyonce, LeBron James, and a few other key players in pop culture. The idea of "fame" has been diluted by the sheer number of people who are now considered famous within their own algorithmic worlds. I'm not sure if that's a good or a bad thing.