Nostalgia for Now
– Brandon Ambrosino
How digital and social media blur the lines of memory, history, and reality
It was late 1997, and America was experiencing a national retro crisis — according to the Onion, the nation’s leading satirical news publication. “If current levels of U.S. retro consumption are allowed to continue unchecked,” warned the fictitious Retro Secretary Anson Williams, “we may run entirely out of past by as soon as 2005.” The reason was simple enough: the “retro gap” — the distance between a historical trend and its revival — had been decreasing at an alarming rate, and if it kept shrinking, the past and the present would become one and the same, enmeshed in a giant “retro-present warp,” or “futurified recursion loop.”
As usual, the Onion, in its own way, was onto something.
Four years before the Onion article, Tom Vanderbilt had opined about the ever-shrinking “nostalgia gap,” comically positing scenarios that might come true if the shrinking continued at its dizzying speed: “The previous month’s Top 40 will appear in boxed-CD sets, as television commercials intone: ‘Do you remember what it was like in April, to be young and carefree, listening to the music that made you feel that way?’ Hey man, is that April Rock? Well, turn it up!”
As George Carlin put it in the late nineties, America has no now. So it seems.
It wasn’t always that way, though. Once upon a time, way back in the middle of the twentieth century, America was all about tomorrow. Consider the Jetsons, which premiered in 1962, just a few years after the launch of Sputnik opened the Space Age. Matt Novak positions the television cartoon within the golden age of American Futurism: “‘The Jetsons’ was the distillation of every Space Age promise Americans could muster,” he says. “It had everything our hearts could desire: jetpacks, flying cars, robot maids, moving sidewalks.” Or think about the Tomorrowland section of Disneyland, which Walt Disney described during its dedication in 1955 as “a step into the future, with predictions of constructed things to come.”
But all of that momentum started slowing down in the final third of the twentieth century, as Professor Andreas Huyssen notes in Present Pasts: Urban Palimpsests and the Politics of Memory: “Since the 1980s, it seems, the focus has shifted from present futures to present pasts.” Indeed, what began as a century on the edge of tomorrow ended with its eyes fixed anywhere but ahead.
The closer we got to the new millennium, the slower time seemed to pass. Sure, we were creeping ever forward, guided by our politics and medicine and technologies, but something was … off. The time felt, if not out of joint, then at least severely sprained — particularly in pop culture. And as Simon Reynolds says in his 2011 book Retromania, it feels especially wrong for culture-makers to have their gazes locked on yesterday.
Wrong or not, by the time the aughts arrived, pop culture seemed stuck. “Instead of being about itself, the 2000s had been about every other previous decade happening all at once,” writes Reynolds. This phenomenon, he argues, started to nibble away “at the present’s own sense of itself as an era with a distinct identity and feel.”
To wit: the hugely successful Nostalgia Entertainment-Industrial Complex. Take its most recent iterations in the form of reboots and reimaginings of media properties like The X-Files, Wet Hot American Summer, Baywatch, The Magic School Bus, Twin Peaks, Gilmore Girls, Legends of the Hidden Temple, the Power Rangers series, and MacGyver. The Backstreet Boys and Hanson are back on tour, and toys like Fisher-Price’s Chatter Phone (first introduced in the early 1960s) and Hasbro’s Furby (the smash hit of Christmas 1998) are experiencing a surge in sales. Then there are the period shows steeped in iconic eras — Stranger Things, Mad Men, The Goldbergs, The Americans, The People v. O.J. Simpson — which, while not remakes, still exploit our nostalgia impulse.
To be sure, looking back at past cultures is nothing new. Renaissance artists looked back to the classical Greeks and Romans, and Gothic Revival architects looked back to their medieval forebears. But today’s looking back seems fundamentally different. As Reynolds puts it, we seem to be the first “society in human history so obsessed with the cultural artifacts of its own immediate past.”
Think about the wildly popular I Love the [Decade] television series. Originally conceived and produced in Britain, the franchise aired in America on VH1. The first incarnation, I Love the ’80s, aired in 2002, and was quickly followed by retrospectives on the ’70s and the ’90s. Interestingly, the latter premiered in 2004, not long after the close of the decade being explored. Even more curiously, I Love the New Millennium premiered in 2008 — before the decade was even finished, and long before the new millennium was over.
These series proved the dictum that history isn’t merely what happened in the past: they were less interested in explaining historical periods than in reminding viewers that certain things existed at a particular time. Oh, remember Pogs? Oh, and Nickelodeon! The point wasn’t to understand or interrogate certain epochal phenomena, as Reynolds explains, but rather to nostalgize them, to get viewers to say, Aww, these things happened, or Cool, these things happened, or sometimes simply, Oh yeah, here are things that happened.
This is a particularly postmodern way of looking at the past — not reverently, but bemusedly. In his discussion of the concept of “retro-chic,” historian Raphael Samuel notes how postmoderns “make a plaything of the past” by cultivating an air of “detachment and ironic distance” from it. When they remake an original, they aren’t concerned with historical accuracy but with “decorative effect.” To them, the past is something like a flea market, filled with various cultural signifiers that can be bought on the cheap and displayed anew in whichever context the new owner desires. There’s a sense in which the ethos of the various I Love the series has become our default way of viewing history: as our own “plaything” (Samuel’s word), as a gigantic and ever-growing archive from which to borrow and repurpose the styles and slogans and entertainment that charm us.
Huyssen uses the term “memory boom” to describe the fact that “memory has become a cultural obsession of monumental proportions across the globe.” While there are certainly differing political and social reasons for this boom, the most obvious one has to do with how we remember. “One thing is certain,” writes Huyssen, “we cannot discuss personal, generational, or public memory separately from the enormous influence of the new media as carriers of all forms of memory.”
That is, we can’t talk about the recent shrinking of the nostalgia gap without discussing the concomitant explosion of the historical archive — the Internet, which has allowed the past to catch up with us. We have come down with what Reynolds calls a “delirium of documentation” and what philosopher Jacques Derrida calls Archive Fever. To have this fever, Derrida says, is to “burn with a passion” for archiving.
It is never to rest, interminably, from searching for the archive right where it slips away. It is to run after the archive, even if there’s too much of it, right where something in it anarchives itself. It is to have a compulsive, repetitive, and nostalgic desire for the archive, an irrepressible desire to return to the origin, a homesickness, a nostalgia for the return to the most archaic place of absolute commencement.
Fever does seem to be the right word for our obsessive need to catalogue every bit of our lives, no matter how trivial. As a result, Reynolds says, we have degraded the archive into an anarchive: “a barely navigable disorder of data-debris and memory-trash.”
Our past keeps growing, and as it does, it continues to crowd out our present, shortening the already narrow nostalgia gap. If Tom Vanderbilt thought treating last month’s music as classic was silly, think about various #TBT (“Throwback Thursday”) posts online, which celebrate historical events that happened a mere seven days ago.
The gap can shrink even further. Like many kids her age, my 20-year-old sister is obsessed with the 1990s. When Netflix announced that it was remaking the ABC television show Full House, she and her friends took to Facebook to share their delight that a show from “their childhood” was coming back. This reaction struck me as odd because my sister was born in 1996, a year after the original series ended. She does the same thing with other ’90s phenomena, taking to social media to share images and songs and neon colors from a decade that she describes not as her favorite, but as her own.
It is as if the gap between past and present shrank to almost nothing, but the present kept going, turning in on itself, collapsing ever more, until the present itself became a memory. We might describe this self-cannibalism with Fredric Jameson’s notion of nostalgia for the present: “We draw back from our immersion in the here and now … and grasp it as a kind of thing.”
The word nostalgia comes from two Greek words — nostos, which means “return home,” and algia, or “longing.” Nostalgia, then, is a longing to return home, or a kind of homesickness. In 1688, Swiss physician Johannes Hofer coined the term in his dissertation to refer to “the pain a sick person feels because he is not in his native land, or fears never to see it again.” Specifically, Hofer used it to describe the desire of Swiss mercenaries — soldiers fighting for other countries in foreign lands — to return to their native homes.
Nostalgia was not taken lightly, as it was believed to cause nausea, loss of appetite, fever, cardiac arrest, and suicidal thoughts. As Svetlana Boym notes in The Future of Nostalgia, nostalgics became single-mindedly obsessed with returning home, and became indifferent to everything else. Those afflicted confused the past and present, the real and imaginary, where they were and where they wanted to be. Good thing, then, that Hofer knew a cure: send the nostalgic home. Hofer’s “discovery” was made after treating a student from Bern who was studying in Basel. After the student came down with palpitations and fever, Hofer realized that there wasn’t much he could do for his patient, so he sent him home to die. Surprisingly, that prescription did the trick: returning to his homeland healed him.
Originally, nostalgia’s pain was “the ache of displacement,” as Reynolds puts it — the Swiss mercenary desperately longed to be somewhere else. But eventually, nostalgia became a temporal affliction — the desire to be in some other time. If the original disease had a cure (going home), temporal nostalgia does not (at least until we perfect the art of time travel). By the middle of the twentieth century, Boym explains, nostalgia was no longer seen as an individual sickness but as the default modern condition, marked by “displacement and temporal irreversibility.” And the condition has only been exacerbated since she wrote those words in 2001 — the same year that set a question mark against all of our twentieth-century present futures.
Our lives are increasingly lived online, where we are limited neither by time nor place. The difference between Millennials (those born between 1981 and 1997) and Generation Z (those born in 1998 and after) is that, while the former remember a time before the Internet, the latter are digital natives. Their place is both everywhere and nowhere. Their place is actually no place at all, but rather a space — cyberspace. With eyes focused on the refugee crisis in the Middle East, we can sometimes forget that even our own advanced societies are producing people who are increasingly rootless, unsettled, and in a constant state of wandering. Echoing Proust, we might say that our destination is no longer a place, but rather a new way of seeing.
Virtual reality and augmented reality are two obvious fulfillments of Proust’s prophecy. But a better example of this new way of seeing is social media.
Let’s start with the photo-sharing platform Instagram, which was founded in 2010 and bought by Facebook in 2012. Importantly, the app allows you to filter your images by fading them, adjusting the contrast, blurring specific areas, and the like. In other words, you can edit a photo you just took to look as if you took it a few decades ago. (The app Hipstamatic, which actually predates Instagram, makes the idea even more explicit.)
There are practical reasons, of course, for the popularity of this kind of retro-filtering, not least that it makes it seem as if a low-quality, grainy look — a very common effect with many smartphone cameras — was what the photographer was after. But the better answer probably has to do with our obsession with nostalgia. Filtering our images with a retro aesthetic, argues cultural theorist Nathan Jurgenson, fits the larger trend of social media, which “forces us to view our present as increasingly a potentially documented past.”
Think about your last Facebook status. Are you still doing, thinking, eating, or hating whatever you described? Probably not. Chances are, you weren’t even doing that activity when you were writing about it, and even if you were, you probably weren’t doing it exactly the way you described it.
How do social media affect our notions of temporality? We tend to warn ourselves that whatever goes into the cloud will stay there forever. True enough: the Internet makes it impossible to forget. But there’s another way that social media affect our notions of temporality: by shrinking the present moment.
The evolution of social platforms bears this out. Blogging allowed us to quickly write our little-edited thoughts as we had them. Facebook encouraged us to update our friends with brief descriptions of our moods and activities. Twitter came along and allowed us to be ever briefer about things, whittling down our moments to 140 characters. Eventually, words proved too tedious, so Instagram allowed us to capture our mundane experiences in images. But the present, we decided, could be ever briefer, and so Snapchat came along and made it so that our photos self-destructed after a few seconds.
Are there limits to how much we can shrink the present? Or is it possible that we can shrink a moment so much that we negate it?
In the first episode of the dark satirical 2011 British television series Black Mirror, the British prime minister is blackmailed by a kidnapper who has abducted a member of the royal family: either the prime minister has sexual intercourse to completion with a pig on live television, or the kidnapped princess will be murdered. Against the government’s wishes, news of the blackmail breaks on social media, and suddenly people the world over are glued to their smartphones and tablets and computers to watch the travesty take place in real time. The prime minister shockingly goes through with the demand, and the global audience is slightly mortified, but mostly gratified. In the end, the princess is released unharmed, so the bestiality seems to have been worth it.
But the audience soon learns that it wasn’t. The kidnapper had released the princess a full 30 minutes before the sex act transpired — but because everyone was glued to their screens watching history being made, no one saw her stumbling down Millennium Bridge in broad daylight. The moral of the story is that to watch history unfold is to shut your eyes to its very unfolding. To document the present is to erase it.
In our brave new world, all moments are now and all places are here, with two important exceptions: now and here.
Two recent experiences I’ve had will illustrate this. The first took place in Niagara Falls. I’d waited 30 years to experience the wonder, and finally I was there, standing what seemed like mere yards away from one of the falls, vainly wearing a yellow poncho that was somehow supposed to keep me dry from the spray of water. The roar of the thing causing my insides to pound, the smell of wet rock, the mist almost too thick to see through — the moment was overwhelming.
Suddenly, a hand tapped me on the shoulder, and a young man about my age asked me in broken English if I would move away from where I was standing. He and his family wanted to take a picture in front of the falls, and I, rudely enough, was in their frame.
I wanted to deny his request, to tell the family, in my own broken French, that they would have to wait until I was finished experiencing the falls, or, better yet, that they should turn away from their cameras and have real-life experiences of their own. But I considered that noncompliance would seem rude, and so, taking one final look at the waterfall, I nodded and left to find my partner. A few seconds later, I looked back at the family to see how their picture was turning out. They were gone. They didn’t want to see Niagara Falls. They wanted a picture so that they could be seen forever standing in front of Niagara Falls.
The other experience took place in Key West. The city is beautifully kitschy, and it is home to several historic buildings, including Ernest Hemingway’s Florida residence. Like most historic buildings, the Hemingway Home offers its own tour, but my partner and I decided to explore the residence on our own. As we made our way up the steps, we found ourselves caught behind a small woman who was keen on filming her entire ascent to the second floor. When we reached the top, we realized we’d be stuck behind her for the entire visit.
What irked me wasn’t that she was taking pictures of what she was seeing; it was that she was taking pictures before she was seeing. As she toured the home, she held her camera out in front of her at arm’s length, the effect of which was that her camera experienced Hemingway’s home before she did. Actually, it might be truer to say she didn’t experience the writer’s home at all.
A photograph is a cue to remember that I was really here. I photograph Niagara Falls; therefore I am really here at Niagara Falls, and the photograph offers definitive proof of it. But, of course, I am not really here unless I’m here. Here is where I exist at a given moment. If, in the smartphone age, our only experience of a place is through the lens of a camera, then in what sense are we ever truly here? To modify a line from Gertrude Stein: There is no here here.
To experience the present for the first time through the frame of a camera isn’t to experience the present as present, but to experience the present as if it’s already the past. When a moment and its memory become almost the same thing, remembering becomes pre-membering.
In an essay on Susan Sontag’s On Photography, novelist William Gass asks,
Suppose, for instance, we contrived to dimple up an image, by artificial means, created the picture of a person who never existed (doctored photographs do that for events). The photo would still “look like” a man, but it would not be the image of anybody, and so (without its of) would not be an image. Would it any longer be a photograph?
Though set as a parenthetical, Gass’s “without its of” is helpful for understanding what I’m describing. Social media is not the documentation of our experience because it doesn’t have an of. The documentation of my experience is my experience. As soon as I announce what I am presently doing, I cancel out my doing of the thing, and replace it with a sign that I am doing it. This sign, as Derrida would say, “represents the present in its absence. It takes the place of the present.”
On social media, we are never present. Rather, the very sign of our presence — a status update, a tweet, a picture of what we’re eating now — is the promise of our absence. My Facebook profile, like yours, is an eerie reminder that I don’t actually know where, or when, I am.
Our obsession with nostalgia is illuminating in many ways. It reflects a desire to return to a world before terror, our ever-increasing ability to document the past and store our present, and even the continued success of consumer capitalism. But perhaps the best explanation is also the simplest: looking back is our veto of the present.
And what of it? It’s not as if contemporary life hasn’t given us plenty of reasons to look away. Nor will the world come to a grinding halt because we’re all watching remakes of 1990s television shows and retro-filtering photos of our dinner. Yet there are reasons to be cautious, or at least aware, of how pre-membering is affecting our lived realities.
A fantastic short film called What’s on Your Mind? explores the gap between our experience and our documentation of it. The video opens with an unhappy couple sitting in front of the television and eating a microwavable dinner. “Sushi with my girl tonite!!!” writes the boyfriend on Facebook, and he instantly gets a like. He’s hooked! He realizes people will reward him with likes if he can write statuses about how fun his life is. The problem is his life isn’t very fun. But his social network connections don’t know that. Instead of writing that he put his coworkers to sleep with a boring presentation, he writes that the presentation went great!
Then the lies become even more egregious. He puts on workout clothes, drives somewhere scenic, and writes that he’s on a 20K run. Social media buys it, and he scores 27 likes. When his heart’s broken after he finds his girlfriend cheating, he writes, “Finally single!” When he pays for an evening with a sex worker, he updates his profile to show that he’s in an “open relationship.” 103 likes! Fired? “Quit my dead-end job!!! #followyourdreams”
Everything changes when the boyfriend, Scott, sees his ex-girlfriend post a selfie with her new boyfriend, which earns 1,638 likes. “My life sucks,” he writes — and the shot abruptly cuts to one of his friends adjusting their settings to “Hide all from Scott Thompson.”
Across all of our social platforms, we work hard to present ourselves authentically because we know that authenticity is rewarded. We want our selfies to appear candid and our jokes to sound effortless and off-the-cuff. But our existence on social media is always already framed, which makes it difficult to nail down any of our identities. I approach you online the same way the woman from my Key West story approached the Hemingway Home. In the digital world, all of our present interactions are pre-memberings.
French sociologist Jean Baudrillard coined the word “hyperreality” to describe “the generation by models of a real without origin or reality.” In other words, a copy of a copy, where no original exists. Contemporary philosophers apply the idea to many things, including virtual and augmented realities, and reality television. We could also apply the concept to pornography, which purports to be a digital copy of an original sex act. Yet many people who watch pornography model their real-world sexual encounters on the films. Which sex, then, is the original and which the duplicate?
“The authentic fake” is how Umberto Eco defined hyperreality. The Greek word for authenticity, authentikos, refers to a genuine original. An authentic work of art is made by the artist himself, as opposed to a duplicate made by another. But in hyperreality, fakes are the only originals, and memories the only presents. It’s copies all the way down.
“When the real no longer is what it used to be,” says Baudrillard, “nostalgia assumes its full meaning.” So we turn on Fuller House, and listen to Carly Rae Jepsen’s remake of the original ’90s theme song, and with our smartphone cameras in selfie mode, record ourselves as we sing: Whatever happened to predictability?
Brandon Ambrosino is a graduate student in theology at Villanova University. He has written for BBC Future, the New York Times, the Boston Globe, The Atlantic, Politico, The Economist, and other publications.