
The Mixed Tape Anthology

~ an archive of a world of thoughts before and after the "digitalverse."


The Quests and Questions that Stories Inspire: A Case for the Humanities in the Time of Technology


Sharmaine E.D.S. Browne | 25 June 2019

After a particularly difficult week, a colleague asked how my class went that afternoon, and I lit up. “It was amazing!” And generally, that’s how it goes. I work hard to create lively, vibrant classroom environs to cultivate intellectual and creative risk-taking. The students, in turn, motivate me to work harder, achieve more. What first struck me about my job years ago is that the heaviness of life falls off when I walk into a classroom. I feel the responsibility I have to my students deeply, and their own efforts inspire me and give me hope for the future of our planet.

I love my work, and I appreciate all my colleagues, many of whom are, really, the students. Every day presents possibilities for transformation, for learning, and most of all, for discovery and a different process of making connections than the digital connectivity to which we have been culturally trained for the past decade. It’s difficult to describe the sensation of witnessing a class of students burst into conversation, each one eager to contribute, the excitement of critical thinking at work. But what I find most meaningful about teaching in the humanities isn’t just what we teach, but where what we teach takes us.

So, when I read article after article bemoaning the increasing irrelevance of the humanities, I realize that too many of us who study and work in the humanities and the arts too often fail to assert ourselves and the worth of the work. In fact, lately, it seems like too many of us in the humanities are apologizing or retreating, saddened but resigned to irrelevance. The death knell has been sounding for several years now from various corners of this digital world.

I’ve seen STEM friends gleefully giggle on social media whenever an article comes out about the low pay and lack of suitable working conditions for humanities majors. They openly chastise those who chose to work in the humanities whenever an article rolls through discussing the lack of basic benefits for those who are unable to find full-time work and must work as adjuncts in an ever-shrinking field which daily discards full-time jobs in favor of temporary or part-time positions. Scrolling through the smirks gets old. But I also read experts weighing in on the death of the humanities, their supposed diminishing purpose, and their inevitable end in a world overlaid with ever-multiplying digital platforms. And of course, The Chronicle of Higher Education seems to publish a new article on the demise of the humanities every time I look. It is certainly true that universities no longer value the humanities and arts as much as they should, but this has been the case for decades (by some accounts, centuries). Our culture’s failure to value what is important doesn’t make it unimportant. Just as morality and the law don’t always align, what has value and what is valued are not always in sync either.

Let me be clear: when I talk about the importance of the humanities in education, I am not referring to the hyperbolic clichés of out-of-touch scholars laboring for years on minuscule topics of irrelevance in the rarefied air of an imaginary ivory tower—not to say there isn’t an important role for intensive work on minute, obscure topics in all research and scholarship. There is, in fact, a crucial role for such intellectually and labor-intensive work in all fields. And nobody snubs scientists and programmers for most often being engaged in small, tedious studies which would be of little interest to the general public; they are not chastised for their research. In all scholarship, there is a cascading effect to research which may take place beyond the public eye which may have significant cultural effects sooner or, often, years later. Such work is necessary and should not need to be defended.

The humanities, though, also occupy a more expansive space in the cultural imagination because they are intrinsic to the broader business of humanity—critical thinking, empathetic understanding, remembering history, foreseeing the future, protecting individual liberties, creating imaginative possibilities. But that expansiveness means exposure, and under the weight of an ideological gravity which insists on a certain analytical (and profitable) direction, gross generalizations keep arising from the misguided prejudices of an ideology which values STEM over the humanities.

1

As is the habit of many of us in the humanities, the week has left me pondering the many ways corporations slip habits into our lives which seep into the minutiae of our days, transforming them, reified, invisible yet integral to our daily existence. I find myself asking what the social (read: “digital world”) world and the natural world now have in common. In both cases, the worlds in which we move through our lives changed, and we with them, and we really didn’t have a choice in the matter. We didn’t even know to make a choice or that there was a choice to be made. As now-digital natives living in a changing climate, I keep trying to look clearly at what carried us here, to this time, under these conditions, in the places we live, and the answer looms large: technology, or rather, the late twentieth- and early twenty-first-century corporate harnessing of our technologies—technologies which now feel so natural that we no longer notice their reified presences pressed into every aspect of our experiences and our days.

I call the where of how we live now the ‘digitalverse.’ We move through and interact with a fully integrated series of overlapping, intersecting networks now—while awake and asleep. As organisms, we have been fully assimilated into this digital, global grid, throbbing with imperceptible, seemingly endless electrical impulses. However, for millions of years, our ancestors were fully integrated into an entirely different series of networks—those of nature. It would be our particular species, Homo sapiens, who would deeply affect and change our world beginning with agriculture and the control of plants and animals.

Two-hundred thousand years ago, modern humans evolved in Africa. Homo sapiens were one of four species of humans as far as we know. We are the sole survivors. The other three became extinct: Homo erectus 70,000 years ago, Homo neanderthalensis, or Neanderthals, 28,000 years ago, and Homo floresiensis 17,000 years ago. Then, 12,000 years ago, civilization as we know it began when we stopped living nomadically and began living in villages, permanent settlements which emerged from the agriculture which would eventually give rise to cities. It would take another 11,800 years before we would reach the technological savvy of the Industrial Revolution and become bold enough to make a play for world domination—over Nature.

Our technological prowess has always marked our cleverness as a species, and it has been largely through our technologies that we have been able to survive or thrive over millennia. Over thousands of years, we became experts at our various technologies, our technê, which in Greek meant “craft” or “art.” Most basically, as a species, we use our technê or technology to interact with our environs—often to tame and control it, but always to manage it—or rather, ourselves within or in relation to it. First, fire enabled us to cook our food so that we would evolve to think, warm ourselves to thrive, transform elements to progress and control our environs, and fight off enemies to survive. While there is evidence that hominins were able to control fire 1 million years ago, evidence of hearths is only about 400,000 years old. The wheel wouldn’t be invented until 3,500 B.C. A rudimentary steam turbine was invented during the first century A.D., but the first steam machine and steam engine, developed in Spain and England respectively, would not arrive until the 17th century. These inventions would eventually allow for the collapse of time through space by train and transport. The printing press would be invented in the 15th century. Electricity, first harnessed by Benjamin Franklin in the 18th century and then applied to power in the 19th century, would further collapse the distances of space and the lengths of time through communication devices—the telegraph, telephone, cell phone, flip phone, and eventually the smart phone.

Our ability to transform and manipulate raw materials, sand into glass into windows, metals and stone into bridges and structures, and coal, oil, and gas into fuel (the list is long), seemingly enabled us to ‘overcome’ nature. However, climate change would lay bare the tragedy that our way of doing business in modernity has turned our strength into our weakness. Has our cleverness via the harnessing of technology by corporations turned out to be our species’ tragic flaw, our hamartia? This is the kind of question which emerges from fields in the humanities.

I think back over the digital years—to the first time a colleague complained with tears of frustration that he couldn’t get his students to put down their phones and focus. That was nine years ago, in 2010, but it feels like at least 20 years ago. It was 19 years ago, but seemingly longer, when Professor Ellen Willis, a mighty and unapologetic feminist critic and journalist, complained to us during our cultural journalism class that when you saw someone talking to themselves in the streets it once meant that they were, as she put it, “crazy.” “Now,” she said, “it means they are talking on a phone.” Cell phones were still flip phones then, not yet ubiquitous, certainly not addictive.

Today, smart phones are everywhere, and we are now wired into them because they are wired intrinsically into our days—our schedules, daily habits, weekly rituals, and our minds and bodies. To accommodate this intimacy with this new digital presence in our lives, we had to change how we sleep, wake up, meet up, study, move, friend, date, teach, work, relate, do business, and bank. Even the way we are entertained has changed. Now, movies are likely to have little dialogue—as the drama largely takes place on screens. Plots which are supposed to take place now must be wrapped around the widespread digital habits and infrastructures of our days—or the plots have to take place before the 90s. It is astonishing how completely we have changed every last corner, nook, and cranny of our lives. We didn’t miss a beat, a moment, or a spot on earth. All together now, planet earth, as far as its human inhabitants go, just changed – en masse – and fast. The change was so fast, generations rippled apart ….

We no longer share a knowledge of baselines—what it means—or once meant—to live. We have lost each other in this mighty shift. But somehow, we forget who (or what) drove the changes we now take for granted as our new normal—a handful of corporations moving in concert to accomplish, without resistance, a complete takeover of the globe—Google, Facebook, and Amazon. In the 1990s, such a sci-fi scenario would have been taken for fantasy. Yet here we are. It isn’t the tech that astonishes me as much as the unbridled, unchecked, unhinged power of today’s corporate tech masters over our lives, set against our complete complacency about the corporate good.

When referring to “tech,” it is easy to list technologies, but what technology actually “is” at any given time isn’t so simple to define. Our relationships to our technologies are even less simple than that because our understanding of what technology actually is changes from generation to generation, and that understanding can still, roughly, be calibrated to lifetimes. At a given time, and in a given place, a given group of people can generally agree about what they would designate “technology.” Today, for instance, most of us would first describe smart phones, tablets, and laptops as current technology. From there, we might move outward from ourselves, describing various technologies with which we are less intrinsically connected. However, the farther away we get from the invention of a technology, the less technological an innovation feels. An Uber driver recently said to me that he thought technology was changing people, but on second thought, he said, people used to say the same thing about television—as though TV were not a technology. This is a common way of thinking.

In our minds, technology means “new,” therefore recent, at least at first blush. Few people would name “trains” as a technology in 2020, but in 1820, trains would have been one of the first technologies to come to mind, a “shock to the status quo,” as Marc Greuther describes it in his review of Michael Freeman’s Railways and the Victorian Imagination (1999), a technology which first dazzled then transformed the nineteenth-century mind. And while generations used to be lifetimes, as life spans have grown longer, generations have been pared down to a handful of years because we measure generations, now, by current-time technology savvy, not milestones or lifetimes. We laugh about it, teasing each other based on levels of expertise and ways of doing life. And really, when one stops to think about it, it’s kind of absurd. When our tech savvy becomes the gold standard for respect, we have a problem. Frankly, STEM fields have a vested interest in ignoring the problem.

Children are nimble. They are quick to assimilate any technologies that adults make available to them, and our technologies are the invisible scaffolding of our lives. Of course they become tech-savvy college students and adults. However, everyone, Baby Boomers, Gen X, Millennials, and Gen Z alike, has cultivated a facility with technology, so that swiping, scrolling, and clicking have become natural enough to be mundane. But as with language, the younger you are when you are immersed in it, the easier it comes to you. Our facility with tech can be compared to accents in language, only with tech, what marks one’s fluency is speed and assimilation, the integration of the technologies into the fluid motions of our bodies and our lives. The youngest among us are perhaps most fluent with at least the surface aspects of tech—they can “speak” it easily. My parents are always surprised by my speed with various devices; if they saw my students in action, I imagine they would be floored. We measure our technologies by our gains, which we now, unfortunately, gauge by efficiency, speed, and productivity, but we rarely stop to consider our losses amidst technological progress, seduced by convenience and our own pleasure centers in the brain. Convenience, progress, and efficiency carry costs, often overlooked in culture but as often explored in literature.

In addition to technology, another uniquely human endeavor is storytelling, and this one is ancient. Stories began orally and pictorially, and they too have changed in sync with emerging technologies. What is quite special about stories, however, is that stories remember. They resist acceleration, demanding our attention, slowing our pace, inviting thought. We have told stories for as long as our collective memory reaches back. I imagine storytelling sliding back through generations, around fires, at bedsides, over tables, and along travels. Our plays, poems, epics, short stories, and novels have ridden the currents of culture over centuries. And I wonder: might storytelling preserve the foresight we traded for fire?

In fiction, we read stories such as E.M. Forster’s “The Machine Stops” (1909), Ray Bradbury’s “The Veldt” (1950), Ursula Le Guin’s “The Ones Who Walk Away from Omelas” (1973), and Octavia Butler’s “Speech Sounds” (1983). Forster manages to create a world in which every surviving individual is willingly solitary, only communicating with others, even family, through versions of Skype or other screen technology. When a son wants to meet with his mother in person, she is shocked. Bradbury creates a world in which a smart house becomes the primary caregiver, and the children, Peter and Wendy, no longer have any need for parents who would turn “off” the house and take away the children’s virtual playroom. Le Guin paints a dreadful picture of a “perfect” world in which the suffering of a child is necessitated and justified by the relative success of the many. And Butler depicts a post-apocalyptic world in which the loss of communication parallels a loss of intellect and an increase in violence. Each one of these writers creates a dystopian future out of their own creative musings, and each one, eerily, extracts vulnerabilities and tendencies in human nature to weave threads of stories into parables which capture so many of the nuances of our contemporary world. How can we allow the relevance and resonance of our stories, our imaginative play, to become irrelevant to education?

In some works of literature, as in Forster’s story written 110 years ago, the author’s foresight is downright bewildering. It isn’t just that he foresaw the dominance of machine technology, but that he was able to guess its general progression in relation to human behavior, psychology, and ideology. His guesses weren’t precise, but they were prescient. Forster imagined a world overlaid by a technological infrastructure through which human beings lived their lives in isolation, nonetheless fully integrated into the gridwork of a world by interfacing with others via screens, even teaching online. The adults tremble at the idea of going outside, and they dissuade their children from venturing into a world they believe remains terrifying and toxic. In many ways, “there” is here. In his wider work, Bradbury imagined tiny seashells for ears (earbuds), an increasingly accelerating world in which nobody walked, only sat in front of screens while attention spans grew shorter and shorter, books shrank into clips (before being burned as illegal), and relationships diminished into talking walls (screens). The value of this work isn’t just entertainment; it lends perspective, and it does so at a distance, illuminating without the heat. If a lightbulb is 5% light and 95% heat, literature is 95% light and 5% heat.

Literature stirs us to action and rumination because it emerges out of the world without being accusatory. It combines the fields of history, philosophy, drama, and anthropology. It teaches us about the past, yes, but more importantly, it teaches us about the present’s direction into the future. And most often, literature is able to articulate the feelings of individuals who are searching to express their own experiences. In essay after essay, and class discussion after class discussion, reading literature motivates students to share their thoughts on technologies, social media, isolation, loneliness, and an increasing inability to connect in a world of increasing digital connections. They are quick to discern the pitfalls alongside the boons, the dangers amidst the progress because they themselves are experiencing them. Contrary to what some of my older friends and relations expect to hear, younger generations are increasingly aware of diminishing gains—and individual losses sustained. Literature pries open the imaginative spaces that the world closed for them; it elicits their silent questions and provides them with the spaciousness to search for answers.

In a recent class in which many of the students were film majors, future game designers, and aspiring writers, we were studying the foundations of stories, and we were reading the thoughts and theories of various critics who have weighed in over the centuries, including Northrop Frye (1912-1991) and his ideas on archetypal narratives. In this case, I drew a circular diagram on the board depicting these narratives in four categories: Romances (usually involving quests; science fiction falls under this category), Comedies (usually involving obstacles to be overcome, resulting in “inclusion”), Tragedies (of course ending in disaster), and Satires (ironic, sardonic, and often gritty). What becomes abundantly clear while reading Frye is how we have kept telling so many of the same stories, in different variations, for thousands of years. By studying such work, we become aware of our own contexts and histories. Shared worlds of past and present materialize in our imaginations. Understanding where we come from often helps us decide where to go. Our stories emerge from our lives, and our lives make up our histories. Telling a good story can change an individual’s world. Telling a good story well can leave a lasting impression over generations.

One student observed, “so Eternal Sunshine of the Spotless Mind really brings all of these genres together.” Yes! A friend once said that this movie, in particular, made magnificent use of its medium rather than simply borrowing the literary medium (and badly) as many movies do. Whether we are aware of it or not, we still largely adhere to Aristotle’s standard in the Poetics that plot is primary; however, Aristotle was referring to plays, and stories were generally dramatic or oral. We did not yet have the technology for film or the novel then, and each genre would reveal its own unique possibilities as media emerged. A great story does not simply rely on plot. A great story draws upon the numerous story elements available to the particular medium which carries that story to an audience – whether it’s a stage, a screen, or a book. Eternal Sunshine of the Spotless Mind plays with its possibilities and makes use of its form in the delivery of its content. But I hadn’t considered the wide array of archetypal narratives it drew upon, and this student’s observation delighted me. More importantly, it inspired other students by bringing history into the present. Studying the history of philosophy, art, literature, music, and culture has a direct bearing on what we can learn and what we can create today. But this demands time, spaciousness, and thought. In this instance, the example is light-hearted, fun, and quirky, but imagine when it applies to graver topics.

In an era of so many digital natives, we complain a hell of a lot about diminishing attention spans, increasingly unfocused students, and multiplying addictions to social media platforms, gaming, and what-should-have-been-otherwise-mundane-methods-of-communication, emailing and texting. We often (falsely) believe that while the youngest among us are prone to the addictive tendencies that media giants like Google and Facebook prey upon, older generations are more immune; in fact, we are all vulnerable to the predatory algorithms, and like most addicts who are protective of their substances, the vast majority of us are protective of our content fixes. Our egos fiercely protect our basal ganglia’s circuits of triggers, needs, and fixes. We operate in an odd space of denial (not unlike decades of climate change denial), somewhat acknowledging that we can’t do without our devices while stubbornly or secretly looking forward to our next fix. This leaves us vulnerable in hard-to-discern ways.

I first assigned Nicholas Carr’s “Is Google Making Us Stupid?” (2008) soon after it was published in The Atlantic. I read in Carr’s article the truth of my own experience: my mind was changing. Carr states:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I brought the topic to my students. Only a few initially dismissed the possibility that Google was anything but good and benevolent, but by the end of the semester, we had all agreed: yes, our reliance and in some cases dependence on the internet was changing our minds. Since then, I have not even had to pitch my case. We all find the internet with its search engines and media platforms useful and indispensable. But students are becoming increasingly savvy and self-aware about the nefarious effects of our online addictions and attachments—able to weigh the good and the bad without going to extremes.

Today, my students are downright sophisticated in their self-reflection about the damaging fallout of our social media use. The effects are of very different natures. Our reliance on Google signals a loss of serendipity, individuality, and the kind of deep knowledge accessed only by deep drilling – into databases, books, archives, and even daydreams—the stuff that doing nothing is made of. In addition, we are actually losing the ability to drill down into ideas because we are losing the practices of research, browsing, questioning, and exploring the vast resources available to us over centuries. Google is a godsend in many ways, especially for students and writers, but it is only a tool of convenience while it is substituted for a well and wealth of knowledge—which it isn’t. I’ve always been skeptical when we defend our internet searches with the argument that we have the world at our fingertips—as though the wealth of knowledge available is somehow just right there and doesn’t demand work and investments of time and mental energy. Also, we think to ourselves, “I can look up anything I want!” The question is, do we? I’m not arguing against the internet. Rather, I am skeptical of it becoming the first and last word, the answer to it all. As any scholar knows, such a belief is rife with problems. As a friend once said, we are becoming a world of fingertips—a quip which carries multiple meanings. We are enclosing ourselves within ourselves; we are curating images of ourselves which play out in the world to be seen and read by others enclosed in themselves who reply to our curated selves with their fingertips. As we venture out of our homes less frequently, we venture out of our own opinions and perspectives less frequently still. New algorithms facilitate this—delivering the search results we prefer to see, based on our search histories and online behaviors, which are being recorded, stored, and disseminated back to us in the form of ads and information that corporations want us to see or opinions various interest groups want us to believe.

The effects are all around us now, and one would need to be wrapped into a cocoon of greed, opportunism, or denial to insist that we do not have a social crisis on our hands – from the disintegration of relationships and social skills, the rise in mental illness and loneliness, and the decline in empathy, to diminishing attention spans, the erosion of our privacy, the death of small businesses and with them whole neighborhoods, and the elimination of lives alive with serendipity. As another friend once lamented, who wakes up on a Saturday morning inspired by memory anymore, a memory that would motivate us to get out of pajamas and take a walk, stroll through downtown, browse through the stacks of an independent bookstore which has stood at a corner for 75 years, or thumb through vinyl albums at the same record shop that everyone in the neighborhood has been frequenting for decades? How many small hellos have we given away on any given morning? How many conversations haven’t we had? And all without being seen, recorded, or tracked by cameras, Alexa, and Google apps on our phones? Why would we give all that up? For convenience and efficiency, essentially, bargains and ease.

The digitalverse focuses our attentions narrowly, whether it’s to purchase, vote, or curate images of ourselves and our lives, and there are benefits, of course, but the conversations that emerge from the humanities classroom force us to engage with the world as it is. The tangential and inspired sparks which come from debate and discussion help us hopscotch to unexpected realizations so that we are better able to make unpredictable connections which relate to the world. During a lesson on creating atmosphere in stories, in addition to analyzing literary language, we also looked at cinematographic techniques. Horror movies are a particularly effective genre for this, and John Carpenter a particularly effective director. While watching a clip from a 1980 movie, The Fog, I was startled by a particular scene at a gas station. The aesthetics of the set were beautiful, and they made me nostalgic for our mechanical devices and the slower pace of life before the digital age. The lighting, the atmosphere, the colors were all gorgeously orchestrated. But what surprised me was the abundance of glass bottles and containers in the refrigerated displays of the gas station’s convenience store.

There was just a lot less plastic in circulation in 1980. I hadn’t really thought about it until noticing this detail in a movie during a lesson: the shift to plastic happened as recently as the past 30 years, not, as I had assumed, farther back in the 1950s, when the suburbs were being built and convenience was being sold at a furious pace via the appliances being advertised to populate the new suburban households. No, it was during the 1980s and the 1990s—when our awareness of the coming global crisis emerging from climate change was well underway. This was not, of course, in the lesson plan design, but materialized naturally—leading to further discussion about the deliberate choices that were made for profit at the expense of polluting the planet.

We chose single-use plastic over recyclable glass because plastic was easier and cheaper. For the consumer, it was simply easier to load up shopping bags (then paper, later plastic too) with plastic bottles and jars instead of glass ones. Plastic was lighter, it didn’t shatter, and it was (in the shortsighted view) easier to dispose of. For the corporations? Plastic was cheaper and far more profitable than glass. Plastic is now poisoning us and our oceans, all the inhabitants of the planet, and of course, the planet itself. Plastic is a problem. I don’t know many who would disagree. I bring it up as an example of shortsightedness, of an inability to foresee its devastating consequences. Unfortunately, the same can be said of many of our inventions of efficiency, progress, and profit. Making such connections, or “critical thinking,” is central to the humanities, and it can happen at any given moment, unpredictably, unexpectedly.

2

In a recent WSJ Saturday Essay, “Stop Worrying About the ‘Death’ of the Humanities” (2019), Adam Kirsch rightly argues that “[p]ursuits such as literature, art, and philosophy are fundamental expressions of human nature. While they have taken very different forms in different times and places, no civilization has been without them, and there is no reason to think that ours will be the first.” With respect to recent trends such as the soaring popularity of STEM majors and the drain on humanities majors, Kirsch notes that “[t]hese trends started to spark alarm around five years ago, when observers began to talk about the ‘death,’ ‘decline’ or ‘crisis’ of the humanities. Since then, alarm has turned into something more like panic. ‘Who is going to save the humanities?’ wondered Michael Massing in the New York Review of Books earlier this month, echoing last summer’s headline in the Chronicle of Higher Education: ‘The Humanities as We Know Them Are Doomed. Now What?’”

Massing attributes the huge numbers of students flocking to STEM majors to the 2008 financial crisis, looming student debt, and the disparity between starting salaries for STEM and humanities graduates. As he notes, “the median annual earnings for engineering grads is $82,000, compared to $52,000 for humanities grads.” Eric Hayot, in “The Humanities as We Know Them Are Doomed. Now What?,” notes with alarm that “the decline in the humanities majors since 2010 is over 50 percent.” And rightly, Hayot advises us to reconsider the ethics and wisdom of continuing to accept doctoral candidates into programs when we know that there won’t be any jobs waiting for them on the other side of harrowing years of study, research, debt, and effort. As he asks, “[w]hen an English department goes from 414 majors in 2005 to 155 in 2015 (as did the University of Illinois at Urbana-Champaign’s), or from 126 to 55 (as did Harvard’s), what department head can justify increasing the size of the tenure-line faculty? On what grounds can we even argue for hiring replacements for our retiring colleagues?”

The mindset which has drained the humanities of their perceived value in the larger culture comes, in part, from misunderstanding their role in education. Kirsch demonstrates this tendency when he argues that “[u]niversities are not responsible for, or capable of, creating a living humanistic culture” because while “scholarship is an important part of that culture” it is “not its engine.” While scholarship in the humanities, indeed, may not be the engine per se of the arts and humanities, the business of universities is not only to engage in scholarship but also, and quite obviously, education. Education may lay the foundation for the creation of the arts and a humanistic culture, and education may most certainly provide the atmosphere needed to facilitate and produce a humanistic culture writ large. In the university, there is room for both: tucked into the corners of databases, libraries, and labs, research and scholarship may blend with what happens in the classroom, teaching the foundational studies and creating the environs for creative invention, depth of understanding, and the kinds of critical skills which may facilitate a better, kinder, more vibrant world. Some institutions of higher education emphasize research; others emphasize teaching. This is common knowledge. In either case, there is room for both the study and creation of the critical and the creative—in the humanities as well as in the scientific, technological, engineering, and mathematical fields.

A point most often overlooked is this: in financially elite institutions, even drained of historically stable financial backing, the humanities will continue in some form. The danger is that the humanities may become less accessible to all but the most privileged. This is a problem because the education offered by the universities is for everybody, not only a financially elite few. With that education comes exposure because an education in the humanities is also about access—a point too often overlooked (which I suspect is a problem of the kind of ignorance born of privilege, a Marie Antoinette absence of knowledge). So while we may not have cause to worry about the death of the humanities by way of creating or accessing the arts and philosophies in the more privileged strata of American society, we certainly need to worry about access to literature, art, history, philosophy, and music at a time when culture is so one-sidedly valuing STEM and its potential profits for those who require more grist for their multi-billion dollar mills.

Yet, it is true, as Kirsch claims, that interest in the humanities does not seem to be waning in the American public. People are still reading, art exhibits and theatrical productions still have audiences, and movies continue to play a central role in American culture, perhaps more so than ever before. Contrary to Kirsch’s claim that “[m]uch of what students learn may not be directly applicable to their lives and careers,” citing “need[ing] to know how to identify a school of painting or interpret a poem” as terribly narrow examples of the skill sets that the humanities provide, the humanities directly apply to students’ lives in ways that are transformative, empowering, and irreplaceable: 1) We encourage students to take risks and enter into conversations while confidently demonstrating communication skills ourselves, kindly but strongly. A young woman, nearing graduation, approached me at the end of a class to tell me that she feels empowered to go confidently into a male-dominated field after taking my class. She is armed with tools for success. 2) We teach literature which, among other things, cultivates empathy. A young man told me that they all learned how to be more empathetic toward people. He will be well-equipped to succeed in team efforts and gauge the consequences of actions and policies on others. 3) We encourage them to advocate for themselves and to identify the connections between the work we do in class and their future fields of study. Another student wrote that they learned how to be adults. They are better prepared to take responsibility for their work and their goals. 4) We teach research and reasoned argument in every class. Another student told me that, for the first time, she learned how to converse about controversial topics without getting angry, armed with new skill sets, language, and historical precedents. The ability to keep a level head during conversation or conflict will serve her well professionally. Other comments, feedback, and evaluations attest to learning how to write clearly, persuade effectively, read deeply, organize efficiently, and research thoroughly, not to mention thinking critically and historically while analyzing thoroughly and fairly. I could cite, literally, hundreds of these student comments. What so many discussions about the humanities lack is a knowledge of the actual experience of actual students for whom education is most meaningful. And yes, this does and will affect American culture and even democracy for generations to come.

The humanities do teach us how to think critically which in turn may teach us how to feel empathetically. The humanities provide us with skill sets that include reading critically, writing articulately, analyzing astutely, and yes, creating inventively. Perhaps most importantly, the humanities empower us and enable us to climb out from the inner coils of the ideologies which bind us too tightly, to see the bigger picture, to recognize our own individual worth and value the worth of others, and to learn how to navigate and in some cases transform the current, cultural currents. This is indisputable although too easily overlooked because such effects are so diverse, distant, and difficult to measure.

Kirsch quotes Matthew Arnold, a nineteenth-century critic, as having “defined culture as ‘the best which has been thought and said in the world.’” Taken out of context, as it is here, it is true that “few humanities professors would now want to claim the authority to say what is best, or even agree that there is such a thing as ‘the best’ art or thought.” Arnold’s two finer points, however, in “The Function of Criticism at the Present Time,” had to do with the role of criticism in culture, what we would now think of as scholarship in the humanities. First, Arnold juxtaposes the critical with the creative, judging the critical to be of a “lower rank” than the creative, but he goes on to say that creativity is not limited to “producing great works of literature or art.” Critical endeavors may be creative in their own right. Second, he draws our attention to the role of the critical in relation to the creative—which is that the critical work creates the atmosphere in which the creative work may thrive:

for creative literary genius does not principally show itself in discovering new ideas; that is rather the business of the philosopher; the grand work of literary genius is a work of synthesis and exposition, not of analysis and discovery; its gift lies in the faculty of being happily inspired by a certain intellectual and spiritual atmosphere, by a certain order of ideas, when it finds itself in them; of dealing divinely with these ideas, presenting them in the most effective and attractive combinations, making beautiful works with them, in short.

In a myriad of ways, the cultural critic has the capacity to create the atmosphere in which the arts and humanities may flourish. Critics and scholars alike have played pivotal roles in bringing the public’s attention to one work of art or another, bringing works of art to the forefront or out of obscurity, and motivating individuals to creativity. To claim otherwise is stubborn and, tragically, an offshoot of a way of thinking endemic to the current climate. Shakespeare, for instance, was rescued from falling into disfavor by a scholar, Samuel Johnson.

As for the role of the humanities in culture, Kirsch echoes a claim made by George Steiner among others that the humanities do not humanize, saying that the idea that the “humanities humanize” is a “difficult argument to make,” given examples to the contrary, pointing to two of the most horrifying, depraved, inhumane instances of hatred, the genocides of Hitler and Stalin. Kirsch observes that “highly educated people … were, for instance, devoted Nazis or Stalinists and … used their learning to buttress their defense of inhumanity.” I am reminded of post hoc ergo propter hoc, or false cause. Although these tragedies marked a failure of the hope of progress for humanity, the great collapse of Enlightenment hopes for us as a species, these crimes against humanity were not perpetrated because of the humanities. The horrors and atrocities of human behavior of the twentieth century were a failure of humanity, not the humanities. Moreover, the humanities remember.

The humanities may not be able to save us from our destructive and sometimes evil tendencies as a species; however, in an albeit faith-based essay, Mark Watney makes a compelling observation: that “despite the humanities’ inability to ‘humanize’ us in the Enlightenment sense of the word, they have left us a remarkable record of our deep-rooted and mysterious sense of right and wrong.” He suggests that “the true ‘humanizing’ power of the humanities [is] not in protecting us from depravity, but in exposing its terrifying reality, and recording our despair and yearning for purity.” In another fascinating Atlantic article celebrating the life of a former professor in memoriam, “Humanizing the Humanities: John Rassias and the Art of Teaching,” Lara N. Dotson-Renta describes the situation this way:

The humanities and the “soft” skills these disciplines foster are pitted against the sciences when in fact they are a part of an ecosystem of knowledge, a balance in ways of thinking and seeing. There is value in debating the ethics of King Creon’s refusal to allow Antigone to bury her brother in Antigone, just as there is worth in delving into the mysteries of atoms. While society at large requires more “education” to advance, it has failed to see that it comes in many forms, arguably narrowing its meaning and application.

Dotson-Renta is right when she observes that “[a]s a culture, the country has come to place decreasing value on thoughtfulness, abstraction, and nuanced critical thinking that poses big (uncomfortable) questions rather than presuming answers.” I often ask my students to “follow the question” in search of answers rather than following the answers they hope to find to reinforce their existing opinions through research. Being able to rest in uncertainties, seek out uncomfortable truths, and shake ourselves out of our own ideological malaise to extricate ourselves from blind adherence to what we already believe is one of the true powers of a humanities education.

3

Anecdotally, I first realized how endemic the undervaluing of the humanities was to academia during my time at NYU twenty years ago. The disparity between the living accommodations offered to humanities versus STEM students would have been laughable were it not so deleterious. As an example, humanities graduate students were crammed as pairs into studio apartments built for one person while neuroscience graduate students enjoyed enormous apartments all to themselves. We paid $2,000/month for the pleasure of living in a small, dark studio tucked into one of the over-large, super-block Washington Square Village buildings built in 1959, though granted, they were well located in the Village, as we pathetically tried to create some semblance of privacy with folding screens we purchased at the family-owned corner hardware store (gone by now, I’m sure). By contrast, my neuroscientist friend paid $400/month for a single—a hip, inspiring, three-times-as-large apartment in a small, red-brick nineteenth-century building with character, an open floor design, and a sweet fire escape. He had pillars. We had cockroaches. He had large windows and hardwood floors. We had aluminum-framed windows (only one opened) and linoleum. For a student of cultural journalism and literature at the time, or in my roommate’s case, music therapy, the message was clear: theorizing a tinier part of a teeny tiny part (as my neuroscience friend described it) of a tiny specialized field of science which might one day be useful for a population, or not, was more valuable than an informed population or healing an abused child. The humanities address individuals and communities—made up of individuals—while STEM majors address populations, theories, and the possibilities of progress—irrespective of individuals. Where value is placed is reflected in the treatment of a field’s students and scholars. A professor of mine in graduate school, Ira Shor, once told us to look around. The rooms we inhabit, as tenants, workers, or students, tell us a lot about how we are valued, or not.

The maltreatment trickled down into a trailing condescension that infected even our social lives, once slyly swirling around the cocktail table one night at a favorite nightclub, The Fez in NoHo (gone since 2005). Caught in the middle of a debate which I look back on now as toxic masculinity in action, a young woman sitting between two neuroscientists (a would-be knight in shining armor and an aggressor), I sat there uncomfortably as they debated the merits of a graduate degree in the humanities. My friend was arguing a case for the humanities, having investigated what we humanities-types do with our critical thinking and scholarship. Literary criticism had passed the test, and he had become convinced that our intellectual pursuits were as intellectually rigorous as theirs. My friend’s colleague, on the other hand, snorted and disagreed, looking simultaneously amused and uncomprehending. He made it clear he thought the very idea ridiculous. It was only later that I would realize just how insulting that conversation had been on a personal level. But the attitude was endemic. The universities have been telling us in the humanities that we aren’t worth as much as our STEM counterparts for some time. Why?

Follow the money. It isn’t because the humanities aren’t worth as much culturally, artistically, individually, or intellectually; it’s because STEM is more profitable and can be harnessed for maximum power by elites and organizations, whereas the business of the humanities is actually, in part, to stem that flow of unchecked power and money through many means: cultivating critical thinking skills, questioning dominant authorities, challenging unquestioned ideologies, studying history, reading literature, fostering the kinds of perspectives through which, gasp, empathy emerges. As a result, in a market-based economy which has transformed into a full-blown market-driven culture, the humanities have been hobbled, trimmed, cornered, and harnessed—because the humanities have the potential to get people to think.

In 2014, seemingly while sipping an old fashioned, Benjamin Winterhalter wrote a wonderful article for The Atlantic on “The Morbid Fascination With the Death of the Humanities” in which he observes that “the humanities crisis is largely a positive feedback loop created by stressing out over economic outcomes.” After meeting and talking with Matt Langione, who was studying literature at Berkeley at the time, and whose interests were in Modernist literature by way of studying neuroscience, Winterhalter captures the irony: “The novelty of Matt’s studies, it seemed to me, encapsulated the craziest thing of all about the whole ‘crisis of the humanities.’ The conversation about funding for the humanities somehow manages to proceed in complete isolation from the actual practices of today’s humanistic scholars.” The fact is, the humanities are not irrelevant. They have never been irrelevant. They are, nonetheless, a problem for corporate America because the humanities are (or should be) engaged in the business of thinking. We may not be able to save the humanities in education, and we will see the destructive effects in culture, perhaps not with a bang, but a whimper.

 

Works Cited

Baker, Kevin. “The Death of a Once Great City.” Harper’s Magazine, July 2018, https://harpers.org/archive/2018/07/the-death-of-new-york-city-gentrification/. Accessed 20 June 2019.

Bosker, Bianca. “The Binge Breaker.” The Atlantic, Nov. 2016, https://www.theatlantic.com/magazine/archive/2016/11/the-binge-breaker/501122/. Accessed 20 June 2019.

Bradbury, Ray. “The Veldt.” The Illustrated Man, 1951, pp. 7-19, http://teachers.wrdsb.ca/jackmaw/files/2012/09/The-Veldt-Ray-Bradbury-pdf.pdf. Accessed 19 May 2019.

Butler, Octavia. “Speech Sounds.” Isaac Asimov’s Science Fiction Magazine, Dec. 1983, https://www.unl.edu/english/docs/englishweek17/engl200-speechsounds.pdf. Accessed 19 May 2019.

Carr, Nicholas. “Is Google Making Us Stupid?” The Atlantic, July/Aug. 2008, www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868/. Accessed 19 May 2019.

Dotson-Renta, Lara N. “Humanizing the Humanities.” The Atlantic, 17 Jan. 2016, https://www.theatlantic.com/education/archive/2016/01/humanizing-the-humanities/424470/. Accessed 25 June 2019.

Freeman, Michael. Railways and the Victorian Imagination. Yale UP, 1999.

Forster, E.M. “The Machine Stops.” Oxford and Cambridge Review, Nov. 1909, The University of Rhode Island, https://www.ele.uri.edu/faculty/vetter/Other-stuff/The-Machine-Stops.pdf. Accessed 19 May 2019.

Hayot, Eric. “The Humanities as We Know Them are Doomed. Now What?” The Chronicle of Higher Education, 9 July 2018, https://www.ewa.org/latest-news/humanities-we-know-them-are-doomed-now-what. Accessed 25 June 2019.

“Humans Change the World.” The Smithsonian Institution’s Human Origins Program, Smithsonian National Museum of Natural History, 14 Sept. 2018, humanorigins.si.edu/human-characteristics/humans-change-world. Accessed 19 May 2019.

Kerry, Cameron F. “Why Protecting Privacy is a Losing Game Today—and How to Change the Game.” Brookings, 12 July 2018, https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game/. Accessed 20 June 2019.

Kirsch, Adam. “Stop Worrying About the ‘Death’ of the Humanities.” The Wall Street Journal, 26 April 2019, https://www.wsj.com/articles/stop-worrying-about-the-death-of-the-humanities-11556290279. Accessed 25 June 2019.

Lawton, Graham. “Every Human Culture Includes Cooking – This is How it Began.” New Scientist, 2 Nov. 2016, https://www.newscientist.com/article/mg23230980-600-what-was-the-first-cooked-meal/. Accessed 25 June 2019.

Le Guin, Ursula. “The Ones Who Walk Away from Omelas.” New Dimensions 3, edited by Robert Silverberg, Nelson Doubleday, 1973, https://www.utilitarianism.com/nu/omelas.pdf. Accessed 19 May 2019.

Manney, PJ. “Empathy in the Time of Technology: How Storytelling is the Key to Empathy.” Journal of Evolution and Technology, Sept. 2008. https://jetpress.org/v19/manney.htm. Accessed 25 June 2019.

Massing, Michael. “Are the Humanities History?” The New York Review of Books, 2 April 2019, https://www.nybooks.com/daily/2019/04/02/are-the-humanities-history/. Accessed 25 June 2019.

Palermo, Elizabeth. “Who Invented the Steam Engine?” Live Science, 19 March 2014, https://www.livescience.com/44186-who-invented-the-steam-engine.html. Accessed 25 June 2019.

Rosin, Hanna. “The End of Empathy.” Civility Wars, NPR, 15 April 2019, https://www.npr.org/2019/04/15/712249664/the-end-of-empathy. Accessed 20 June 2019.

“The History of Electricity—A Timeline.” The Historical Archive, 13 Feb. 2007, http://www.thehistoricalarchive.com/happenings/57/the-history-of-electricity-a-timeline/. Accessed 25 June 2019.

“The Invention and History of the Printing Press.” PS Print, https://www.psprint.com/resources/printing-press/. Accessed 25 June 2019.

Twenge, Jean M. “Have Smartphones Destroyed a Generation?” The Atlantic, Sept. 2017, https://www.theatlantic.com/magazine/archive/2017/09/has-the-smartphone-destroyed-a-generation/534198/. Accessed 25 June 2019.

Watney, Mark. “Torturing Jews and Weeping Over Schubert: Have the Humanities Failed to Humanize Us?” Dappled Things, n.d., https://dappledthings.org/12338/torturing-jews-and-weeping-over-schubert-have-the-humanities-failed-to-humanize-us/. Accessed 25 June 2019.

Winterhalter, Benjamin. “The Morbid Fascination with the Death of the Humanities.” The Atlantic, 6 June 2014, https://www.theatlantic.com/education/archive/2014/06/the-morbid-fascination-with-the-death-of-the-humanities/372216/. Accessed 25 June 2019.

Wolchover, Natalie. “Why It Took So Long to Invent the Wheel.” Live Science, 2 March 2012, https://www.livescience.com/18808-invention-wheel.html. Accessed 25 June 2019.

 

The Lost Quiet

Sunday, 16 June 2019


When was the last time you walked into the world without filters? When was the last time you gazed out onto a beautiful landscape without thinking of taking a picture, or deciding not to take a picture? When was the last time a picture never entered your mind? When was the last time you worked in silence or sat inside a quiet house? I miss the quiet, lost now to the buzzing, blinging, swooshing, and scrolling.
