Returning home for the holidays can make one feel like a fish out of water. Or a fish in outer space. Well-conditioned habits are thrown to the wind. Sleep hygiene is performed with reckless abandon. Screen time minutes skyrocket. Nutritional concerns are obliterated. Nothing in my childhood home is where I remember it being when I was a kid. The spices are now in this drawer, the mugs are now in that cupboard. I’ve developed my own habits as an adult that are incompatible with the architecture of my upbringing.
Of all the household habits that differ from mine, none are as jarring as the television being constantly on. At all hours of the day, vivid colors and sounds blare from the family room, even when there’s no one sitting on the couch watching. The TV plays while I sip coffee, while I make lunch, while I brush my teeth. When I walk downstairs in the morning, I press the mute button, and when I exit the room to use the bathroom, I re-emerge to find that the television has been unmuted, playing for a phantom.
The white noise of the TV is a marked trait of a slightly older generation, one that saw the device as a status marker and purveyor of pleasure. Baby Boomers and Gen X have long held cable subscriptions, affording them the luxury of simply flipping the television on and playing a channel at random for the sake of it, the sounds of news anchors, laugh tracks, and advertisements flooding their living space as a kind of bright reminder of humanity. Millennials and Gen Z are far more likely to subscribe to multiple streaming services than to a cable package - a different model of entertainment, one in which the viewer selects what they want to watch in a single sitting, rather than allowing indiscernible programs to hum in the background all day long.
The television being a source of daily white noise was a prelude to the smartphone embedding itself into young people’s daily routines today. Both the TV and the smartphone share the capacity to numb. After a few hours of watching television, I stand up and see fuzziness in my field of vision. After an hour of scrolling on my phone, human interaction feels dull. In either scenario, my receptors are fried, requiring greater stimulation to light them up again.
If the television is on all day or if one scrolls on their phone during each of their free moments - if one’s true state of neutral is some kind of media-enabled stimulation - then it will take more than just watching an episode of a TV show to feel entertained at the day’s end. This is why people have begun overlapping their device usage - scrolling on Twitter while watching Netflix, for example.
Writing for n+1 magazine, Will Tavlin reports that among Netflix’s “thirty-six thousand microgenres” is “casual viewing” - programs one can watch while doing other things, background noise approximating the linear TV experience. Much of this content consists of sitcoms, reality television, and nature documentaries. One screenwriter who’s worked with Netflix explained that a common note from company executives is to “have [characters] announce what they’re doing so that viewers who have program[s] on in the background can follow along.” Tavlin provides an example from the Lindsay Lohan-led Netflix movie Irish Wish:
“‘We spent a day together,’ Lohan tells her lover, James… ‘I admit it was a beautiful day filled with dramatic vistas and romantic rain, but that doesn’t give you the right to question my life choices. Tomorrow I’m marrying Paul Kennedy.’
‘Fine,’ he responds. ‘That will be the last you see of me because after this job is over I’m off to Bolivia to photograph an endangered tree lizard.’”
I witnessed the need for this feature firsthand while watching TV with my family. Over our most recent holiday break, I decided to show my family members some of my favorite movies that I watched this year - namely Challengers (2024), Anora (2024), and The Holdovers (2023). Everyone struggled to stay off their phones for the duration of each two-hour film, opting to Snapchat and text people back, scroll on Instagram, or swipe around Candy Crush. The habit was worst during the dead space of the movies - transitional scenes, establishing landscape shots, or scenes free of dialogue. If nobody was talking onscreen, their eyes went elsewhere.
This is obviously frustrating for numerous reasons. I had to fill the group in every thirty minutes on what was happening onscreen. As the credits rolled, some would say that they “didn’t like” the movie, to which I would counter that they barely saw enough of it to develop an opinion. And finally, and most importantly, we left the viewing experience feeling less connected (or at least I did). Part of what makes co-viewing movies with loved ones enjoyable is the feeling that you’re experiencing something together. Like attending the ballet or visiting the aquarium, your memories of the movie become branded with the initials of the people you saw it with. Actively engaging in media fosters a sense of togetherness - even if the people you’re with didn’t like the film you shared, they at least cared enough to absorb it and develop an earnest opinion.
Perhaps some of this is my fault - maybe I could have picked films more tailored to my audience. But another part of this is surely related to a generation’s inability to keep their attention focused on one thing at a time - especially something that doesn’t grab them right off the bat. I fear that we’re losing our ability to grapple with challenging media - or to even ingest long-form media altogether.
When I consider this issue, my mind immediately jumps to traditional literacy. If people are unable to get through a movie or TV show without using their phones, how are they able to read books? A 2021 Pew Research survey reports that about 23% of Americans haven’t read a book in whole or in part in the past year, whether in print, electronic, or audio form. This figure has nearly tripled since 1978, according to a Gallup survey. When analyzing the two surveys, Professor Steven Mintz also points out that there’s been a decline in avid readers - according to the 1978 Gallup survey, about 40% of Americans had read eleven or more books in the last year. That figure dropped to 28% in the 2021 Pew Research survey.
Americans still understand the mechanics of how to read - what they’re reading is just shifting dramatically. For example, Mintz argues that people read more emails, text messages, and short-form articles on mobile phones and tablets than ever before. However, academics concerned about a literacy crisis argue that our collective desire to read sophisticated texts and comprehend authors’ ideas has diminished, as has our familiarity with cultural references, including religious, historical, and literary allusions.
As with many moral panics, anxieties about illiteracy aren’t new. In 1979, historian Harvey J. Graff coined the term “literacy myth” to describe the belief that the acquisition of literacy is a necessary precursor to economic development, democratic practice, cognitive enhancement, and upward social mobility. According to this “myth,” literacy transformed human life in fundamental and far-reaching ways, much as agriculture and the rise of cities did. Graff’s research contradicts this myth, arguing that the connections between traditional schooling and social mobility aren’t natural or inevitable.
Mintz, however, writes that the literacy myth doesn’t necessarily assert that there’s zero connection between literacy and socioeconomic development. It simply speaks to literacy being an “article of faith,” thus any perceived decline in literacy feels like a threat to the well-being of individuals and society at large. Today, however, “literacy” refers to more than just reading skills and comprehension - it’s become synonymous with proficiency at virtually any skill or competency (i.e. financial literacy, data literacy, media literacy, and so on). Moreover, our capacity for literacy hasn’t necessarily diminished but has changed, even as common reading material shifts further towards a shorter format.
Our increased distractibility and inability to engage with lengthy texts is still concerning, as it suggests a potential lack of “cultural literacy,” or fluency with the allusions and references that educated people are assumed to know and that largely grow out of broad and intense reading.
Most recently, cultural illiteracy was on display on Twitter after director Christopher Nolan announced plans to release a film adaptation of Homer’s The Odyssey in 2026. Many online expressed their unfamiliarity with or disinterest in the text - even by reputation. Others responded with shock that public schools no longer teach foundational Western texts like The Odyssey - even as a means to educate students on the “Hero’s Journey” archetype. Ironically, such fervent apathy - and even disdain - towards material like The Odyssey runs concurrently with a rampant anti-intellectualism movement on and offline, one that began sprouting up more boisterously alongside rural Trumpism in the United States. The rejection of science supporting climate change and vaccinations, and a general distrust of intellectuals, feels more adjacent to literary lethargy (and even much of BookTok) than it might have in a pre-2016 America.
Of course, rejecting science and rejecting ancient literature aren’t the same thing - each has its own function. From Galileo to Semmelweis to Fauci, scientists have long been defamed and persecuted by religious and political institutions that fear their power being undermined. It’s easy to weaponize people against systems of higher education, which have been kept exclusive with high tuition and discriminatory acceptance practices.
Despite the elitism, cultural literacy - and its adoption through challenging oneself with novel and complex ideas - remains important, as it frees us from a place of narrow-mindedness, allowing us to connect laterally across time and space. We become a part of what Mintz dubs “the great conversation,” engaging in dialogue with those who came before us. How splendid is it to know that people have long felt the same feelings we feel today, that we are not unique in our suffering or desire for pleasure and adventure?
And simultaneously, how great is it to engage with a third thing - a work of art, a song, a movie - with another person and spar or gush about it afterward? To develop your own opinion - even if it’s only in your head. To have proof of thought - proof of existence - rather than ingest media like a painkiller, washing away signs of illness.
Last year, I told the lady next to me at the ballet to please put away her phone after she scrolled through the second act of The Nutcracker. She huffed and gruffed, and rightfully so, as it was a pretty annoying thing for a stranger to ask. I didn’t like asking her to do it and probably won’t ask anyone to do so again in the future. How nice it would have been for the two of us strangers to have shared the experience of the ballet, instead of engaging in a silent battle of ideologies. Freedom of choice vs. distractibility politics. It’s a battle I won’t enter again. Unless, of course, it’s in the living room. In front of the TV. With my family.