Bradley Garrett on “The Value of Trespass”

“This talk was given at a local TEDx event, produced independently of the TED Conferences. Urban exploration (UE) is the assertion of bodily freedom through the practice of trespass. Governments, corporations, and fellow citizens raise countless borders, and rarely do we discuss whether these borders are ethical, justified or even legal. Often, we’re not even aware they exist or how they shape our lives. By hacking the city, UE democratizes urban space and questions the legitimacy of the lines that divide us.

Bradley L. Garrett is a lecturer in the Economy, Governance & Culture Research Group at the University of Southampton with a passion for photography of off-limits places. His first book, Explore Everything: Place-Hacking the City (Verso Books 2013), is an account of his adventures trespassing into ruins, tunnels and skyscrapers in eight different countries. Details of his current research, media projects, publications and events can be found at bradleygarrett.com”


anna schuleit’s “bloom”

valeriamsouza:

Absolutely amazing project, courtesy of Sara Hendren’s blog, “Abler.”

Originally posted on Abler.:

Anna Schuleit’s Bloom is one of my long-held favorite installation works, and it’s a perfect Abler project. So why haven’t I featured it before now?

a view down an institutional hallway, whose floor is full from end to end and side to side with blooming bright orange tulips.

It was staged at the Massachusetts Mental Health Center in 2003, in the days before the center shut down, after nine decades of patient care. Schuleit covered the entire four floors with 5,600 square feet of sod and 28,000 blooming flowers.

Schuleit says the work addresses, in part, the strange lack of flowers in psychiatric settings, while they appear everywhere in other clinical environments. That distinction packs so many assumptions about who is sick, and why, and how.

an institutional basement, with peeling paint, whose floor is covered with soft green grass

And it’s impossible not to consider the modes of care, such as it was—restraint, or nurturance, or abuse—that would have transpired in a many-decades-old institution. These settings are always an index of the wider culture, revealing how we care for those whose maladies often can’t be seen.



An Education in Class


“Education” (Chittenden Memorial Window at Yale), 1890, by Louis Comfort Tiffany and Tiffany Studios.

Dedicatory preface: This post is dedicated to the anonymous steel worker from Worcester, Massachusetts, who helped me get to New York City in September 1998. You brought me to the bus station in Worcester and made sure I had cigarettes (I’ve since quit smoking—don’t worry!), snacks, and maybe $20 cash. You did not harm me. You could have taken advantage of me, and you did not. I asked you how I would pay you back and you told me: “Just dedicate your first movie to me.” Spoiler alert: I did not become what I thought I would become back when you asked that I dedicate my first movie to you. I am not a filmmaker. My journey, of which you were an integral part, is chronicled in the post that follows. You carried me on one leg of it, and eventually—after many twists and turns—I emerged a college professor. I dedicate this story—the story of my education—to you, Sir. Thank you.

***

Part I.

I am here today because a total stranger paid for my education, to the tune of nearly a quarter of a million dollars. That’s what I estimate it cost my patron—an individual unrelated to me by blood or marriage—to fund me through a B.A., an M.A., and all but the last two years of my Ph.D. Because of this financial support, I managed to complete an education with only around $10,000 in student loan debt—well below the national average in 2012. More importantly, I managed to actually complete my education.

But see, I’m one of the lucky ones.

I come from a lower middle class background and finished high school in the 90s—before prestigious Ivies like Harvard began offering free rides to students from families with annual incomes under $60,000. 

College should have been out of reach for me, and were it not for my patron I would not even have finished the Bachelor’s, challenging as it was to do manual labor (waiting tables, cleaning houses) full-time while attending college full-time and maintaining a GPA high enough to retain the modest financial aid for which I had been deemed eligible. In fact, I think my fate would have been quite similar to that of the working-class students interviewed by Jennifer Silva for her study Coming Up Short: Working-Class Adulthood in an Age of Uncertainty (Oxford UP, 2013). Most likely I would have floundered for years at low-skilled jobs offering poverty wages, barely able to make ends meet and completely unable to afford an education. Or perhaps I would have taken on a massive amount of student loan and credit card debt, only to find myself unable to keep up with the monthly payments. (Even so, it has taken me years to pay off the relatively small amount of debt I did accrue.)

When I think about what could have been, the alternate universe of my life looks pretty bleak. And it’s unfortunately not at all difficult for me to imagine this parallel universe because many of my friends—peers from similar lower middle class or working-class backgrounds—are living it as we speak. (There but for the grace of my patron go I…)

I’m not more intelligent than these friends. Nor am I more “deserving” than they. Nor harder working—no. What I am is unfathomably, improbably, unbelievably lucky, and this is what I remind myself of daily.

***

Part II.


“Taft Public Library and Mendon Town Hall, MA,” by John Phelan.

As far back as I can remember, the only thing I wanted was an education. I craved books like most kids crave toys or video games and would beg my parents incessantly for more, more, more. I’d spend hours every day reading and writing, and by junior high was easily devoting up to 9 hours per night studying in our basement. At parties and family functions I was reluctant to socialize, preferring instead to shut myself away in the silence of coatrooms and read.

My maternal grandmother, a secretary at Brandeis, always spoke in glowing terms about faculty and grad students at her institution, and though she died when I was only 13, her profound respect for the professoriat made a lasting impression. I wanted so very much to be like the professionals she admired.

My family’s background was and remains solidly lower middle class: my relatives are for the most part honest people who work hard—albeit at low-paying, unskilled jobs. We wait tables, sling retail, work in fast food chains. Some family members hold white collar office jobs or are employed in health care as nurses. To the best of my knowledge, nobody’s household income comes close to hitting the six-figure mark—including households with at least two adults working full-time.

The town I grew up in is mostly lower middle class as well. A small, rural community that has doubled in population from approximately 3,000 to 6,000 since my childhood, Mendon boasts one of the nation’s last remaining drive-in movie theaters. It’s one of those towns that nobody (including Massachusetts natives) has ever heard of and that requires speakers to geographically situate it relative to larger surrounding towns in order to communicate where, exactly, it is located: “You know: near Milford, Uxbridge, Hopedale….” 

“Oh, OK.”

Like most children from Mendon, I attended H.P. Clough elementary, Miscoe Hill Middle School, and Nipmuc Regional High School. Because our town was so small, students were grouped together beginning in the 5th grade with children from the neighboring and slightly larger town of Upton.

Many of us did odd jobs under the table in our spare time. I cleaned houses for cash in elementary school and started babysitting a local 2-year-old when I was 12. When I turned 14, my mother marched me into Nipmuc to have my work permit authorized and then dropped me off at a coffee shop called The Donut Hole—my first “real” job. Within a few months I was opening and closing the store and performing the duties of a manager. I was 15. I never questioned any of these arrangements. That I had to work—and work hard—was something my parents had always impressed upon me, and the environment in which I was raised left me with the idea that all children must, like me, work.

The only characteristic that distinguished me from my peers was my hunger for education. I was not the brightest student in my school—and in fact I am quite sub-par when it comes to math—but I probably did have the most intense work ethic. In 4th grade I would lug a giant red dictionary on the school bus and read word definitions, in alphabetical order, during free periods. Around this same time my mother returned to college at Framingham State in an effort to complete her B.A. I remember pilfering her Psychology 101 textbook, with its forest-green cover, and schlepping that to H.P. Clough as well. My mother didn’t finish the degree, but she kept that textbook, and when I close my eyes I can still visualize many of its pages.

When your parents aren’t doctors, lawyers, CEOs, or oil magnates, they tend to tell you that “education is important,” but other than that they’re fairly hands-off. Mendon, Mass.—unlike, say, Milton—is not some hotbed of scholastic and professional competition. Part of this is class-related, and part of it was the result of growing up in the 80s and 90s, before the onset of American culture’s feverish obsession with standardized testing and “prestige” [1]. What this means is that children weren’t pushed or coached and it was generally expected that each child would achieve according to his or her “own potential” and drive. As kids, we were allowed to be mediocre—even fail. I have mixed feelings about the culture in which I was raised. Because I was a bright and exceedingly stubborn child, the lack of deep parental involvement motivated (forced?) me to figure out how to navigate the system independently. It worked out in the end primarily because I am unreasonably strong-willed. Most children are not quite as headstrong; I watched a lot of people brighter than me give up when faced, over and over, with a lack of scaffolding to help guide them through the educational labyrinth.

Silva eloquently captures the kind of culture to which I am referring in the following passage of Coming Up Short:

The responsibility for deciphering the rules of the game […] fell squarely on Alyssa [one of Silva’s research subjects] and her family—none of whom had the knowledge to fill out the FAFSA. (88)

Silva’s point, and mine, is that when your parents don’t understand how to negotiate the college prep and admissions system, they cannot transmit those skills (which they lack) to you—their child(ren). Children of working class and lower middle class parents are therefore left struggling to navigate the maze by themselves, often cobbling together knowledge gleaned from high school guidance counselors and wealthier, savvier peers. In order to be successful in this context, a student must be both reasonably intelligent and unreasonably motivated. Most adolescents—indeed, most human beings—are not the latter.

***

Part III.


“Phillips Exeter Academy Panorama,” by E. Chickering & Co.

I’ve never claimed to be a reasonable person; I’m often contrary, willful, rashly audacious. I’ll never be described as someone “everyone loved.” I am just not that person. These character faults, though not laudable per se, were what enabled me to succeed. My parents and local culture may not have pushed education, but I most certainly pulled for it—kicking, screaming, and (when necessary) dragging everyone else along with me in my ferocious pursuit. At age 13 and without parental assistance, I applied for and was accepted into Phillips Exeter Academy’s summer program. Somehow I managed to persuade my parents to shell out the tuition—a cost which, in retrospect, we probably could not afford. [2]

Summer 1994 thus marked the beginning of my true education in class. I’m sure it must be the same for the wealthy kids: when you’re surrounded by wealth, you assume everyone is more or less in the same socioeconomic bracket as you. Or perhaps wealthy children are inculcated with a more nuanced understanding of class divisions than working and lower middle class children. I have no idea.

My life has been spent on the periphery of the wealthy—watching, listening, studying, and sometimes serving them, but never really being included as part of their social sphere. I suspect I’ve had more opportunities to rub elbows with the wealthy than most other members of my socioeconomic class, simply because I’ve always been able to get into their schools.

Yes, I said “their schools.”

I expected Exeter to be a place full of kids like me. I expected to be surrounded by geeks and dorks—kids enthralled by and enthusiastic about learning. I’m sure I enjoyed my summer at Exeter, if the four handwritten “Instructor’s Reports” are any indication. According to the official paperwork, I completed “Honors work” in “Worlds of Fantasy” and “Journalism” but merely “Satisfactory work” in “First Year Algebra.” [3] The journalism instructor, in particular, seemed pleased with my performance. 

The interesting thing about Exeter is that my memories of the school have exactly zero to do with any of the courses I took. Reading the evaluations of my performance is like perusing reports on some other person; I have no firsthand memory of anything discussed. I believe the journalism instructor when he indicates that I “wrote six articles on a variety of subjects, and four of them made page one [of The Exonian, the student newspaper].” He further describes some of the articles, including the topics with which they dealt. Yes, I am positive I did write them, but no—I have no recollection whatsoever of the experience. [4] This is unusual for me, but there is a reason for it. My only real memories of that summer are of the painful realizations that a.) socioeconomic class existed and b.) I was situated towards the bottom rung of the ladder.

The key player in my teenage imagination at Exeter was a young man named Fabian Basabe. Fabian has appeared semi-regularly in the press over the past decade, often bearing the dubious title of “the male Paris Hilton” and waxing lyrical about how he “doesn’t work” because it’s “not interesting.” Given that we don’t exactly run in the same social circles, I never had the opportunity to interact with Basabe during his New York “It boy” phase of the early 2000s, which began when he was “dismissed” from Pepperdine University for “submitting a paper he’d purchased on the Internet,” and ultimately reached its peak with a one-episode stint alongside Kourtney Kardashian on the failed E! reality show “Filthy Rich: Cattle Drive.” The press routinely describes Fabian as “a sweetheart,” and I have to admit that my memories of him as a person are not negative. Even in his teens he was charming and friendly—although he did not associate extensively with students outside his socioeconomic class while “studying” at Phillips Exeter. I would observe, transfixed, as Basabe paraded around campus with his entourage of fellow rich kids. We all liked Fabian—“Faby-Baby” as some of the girls called him—but we kept our distance, like commoners in the presence of royals.

Sometime during the summer, Basabe and his colleagues were all flying to some sort of luxury destination (the Hamptons, or something) to party for the weekend, and I called my parents to ask if they could send me some money so I could go. All of the kids had gotten together and worked out the total cost of the weekend, as well as the individual cost of attendance for each person’s transportation, food, and so forth. This information had been transmitted throughout campus via word of mouth. In theory, the party weekend was “public,” but in practice it was reserved only for those with enough parental cash to enjoy the festivities.

My phone call to my parents was not bratty or demanding—I swear. It was a call born of genuine ignorance and naïveté. It truly did not occur to me that we would be unable to afford this weekend. All of the other kids were going, so naturally I assumed that I, too, would go. My parents had to explain to me that we could not afford it. I asked, “How come all the other kids can afford it?” and I distinctly remember my mother having to spell it out for me, slowly and carefully: “The kids at that school are extremely wealthy, Valéria. We are not like them. We can’t afford this.” It was the first time I had been in an environment composed largely of people above my own socioeconomic class, and the knowledge came as a shock.

Another cataclysm involved my wealthier peers’ consistent abuse of drugs and alcohol while at Exeter. The weekend parties they held on and around campus were sustained by freely flowing alcohol, copious amounts of weed, and probably harder drugs to which I was mercifully not exposed. The eldest participants in the summer program were around 18—meaning that none of us was old enough to drink legally. Yet these kids seemed so casually familiar with drugs and alcohol. They weren’t even “experimenting”; they appeared well-versed in the recreational use of a wide variety of substances. Parties, social events, opulent displays of wealth, and substance abuse seemed to be their primary areas of focus while at Exeter.

These kids were not like me. They had not muscled their entrance into this school, intellectually or otherwise. They were not in awe of the beauty of the campus or our exposure to new forms of literature. Their parents had arranged for them to attend. Some resented it, while others dutifully complied with the arrangement. But they were not like me. The realization was both confusing and devastating.

Imagine, for a moment, that you are 14 years old. Imagine that Fabian Basabe and his ilk are responsible for inaugurating your education in class. Imagine coming to the understanding, over 5 weeks, that there is an entire stratum of society that both effortlessly possesses and seems totally indifferent to the one thing you so cherish—the very thing for which, even at your tender age, you have already sweated, begged, pleaded, and battled to obtain access. Imagine emerging from those 5 weeks knowing in your gut that you were to face many, many more years of struggle, just to get a taste of what Fabian Basabe had already been handed dozens of times for free. Imagine realizing that merit meant nothing, or almost nothing, and money everything, or almost everything.

***

Part IV.


The author in Fall 1998, commuting to or from NYC as described in Part V of this post.

Fast-forward several years. While Fabian Basabe was getting expelled from multiple boarding schools in Florida, I was still fighting for my education. I had applied for and been accepted into Phillips Exeter. Not the summer program. The regular program—the real deal. It meant so much to me that, nearly 20 years later, I still have the acceptance letter (as well as the envelope in which it arrived) preserved in almost pristine condition:


Accepted….sort of.

I was awarded a “half scholarship,” meaning that my parents were expected to pay 50% of the cost of tuition at Phillips Exeter. Currently, annual tuition hovers just under $50,000, and students whose parents earn $75,000 or less receive a free education. At the time I applied, tuition was probably about half what it is now, but admissions were far from need-blind. The percentage my family was expected to pay was too much: “If we pay for Exeter, we won’t be able to afford to pay for college,” explained my parents. I had been accepted on merit but barred from attending because I was not born rich.

But again, I am stubborn.

I made my parents a proposal: help me find a way to pay for Exeter, or let me go to college two years early. They refused. I argued. Bargained. Yelled. Fought. Enumerated lists of reasons why my proposal was both reasonable and worthy of consideration. They refused, then refused again, and again. But I did not budge. Over and over, I demanded my education. I would not budge.

And so it was that I entered college in Fall 1996, at the age of 16.

Supposedly I was at Framingham State College (now “Framingham State University”) as a Dual Enrollment student, but I never returned to high school, instead taking on a full-time course load and refusing to look back. [5]

This does not mean that my entrance into college was smooth. Because I was a minor, I was not allowed to live in the dorms. Because at the time my family was collapsing (parents divorcing, father violent and abusive, mother clinically depressed—I won’t bore you with personal details that would distract from the focus of this post), I ended up semi-homeless and homeless much of the time while attending FSC. I ran away repeatedly to Boston and slept on the street or in youth homeless shelters and drop-in centers. I slept on couches in unlocked campus buildings. Occasionally someone in Boston would pick me up off the street and take me home for a day or two, and sometimes older students at FSC—aware of my situation—would sneak me into their dorm rooms for a night or two. Most of the people who took me in were caring and did right by me. One memorable week, I stayed at the home of Cambridge Mayor Denise Simmons. [6]

Excluding the semester I intentionally failed one of my classes (long story, and I don’t recommend that anyone follow my example), I did well, earning mostly As and a smattering of B+s. To this day I remember an incident in a huge Biology I lecture that was team-taught by three professors: Dr. Beckwitt (unforgettable), Dr. Spence (I think?), and a third professor whose name I cannot recall. Dr. Beckwitt was one of those professors who reward hard work and displays of brilliance but seem to have limited patience for any kind of undergraduate bullshit. I’d picked up on this from the start of the semester and wanted to impress him. So I waited, waited for an opportunity. Finally, one day, Dr. Beckwitt was lecturing on sickle cell anemia and how it was caused at the genetic level by a substitution of amino acids. Suddenly he stopped, mid-sentence, and scratched his head. “Hmm….but I can’t recall exactly what the substitution involves.” He turned to his colleagues: “Dr. Spence, Dr. X, do you remember?”

None of them did. 

From the back of the lecture hall, I raised my hand. I looked exactly like the photo of myself posted above. I was a scrawny 16-year-old raver kid with oversized pants, fire-engine red hair, and seven facial piercings. I chain-smoked outside the campus buildings. No one took me seriously….until I opened my mouth. Dr. Beckwitt squinted at me, then (slightly exasperated), asked: “What?”

“Dr. Beckwitt,” I offered, “Valine is substituted for glutamic acid.” [7]

Dr. Beckwitt looked at me as though I’d descended from the back rows of the lecture hall and punched him square in the nose in front of over a hundred undergrads.

“Oh,” he said, “….how did you know that?”

Students in the rows in front of me turned around, squinting and craning their necks to see what was going on.

“It’s on page 387 of the textbook, Dr. Beckwitt. I memorized it. I memorized everything. If you want, I can get my book and show you the page. I know I am right.” [8]

“No, no—that won’t be necessary.”

He knew I was right, too. After that, I got regular tutoring gigs and was paid hourly by my fellow students in exchange for assistance with Biology, English, and Spanish. And yes, I memorized the entire Biology textbook. My exams from that semester reflect that I was quoting from memory in essay responses. It is a skill I can still deploy today if I so choose, though I use it less now that I am out of school. [9]

While at FSC, I met two people who were to change my life forever, for the better. One of them was a young man named Kyle Mercury, and the other was a slightly older fellow student named Deanna Angelo. Deanna was 19 and I was still 16 when we met. We’d seen each other around campus and were mutually intrigued. Both of us had fire engine red hair, tattoos, and piercings. Neither of us quite fit in. Deanna and Kyle, like me, came from working class backgrounds (although Kyle’s father was somewhat wealthy). One day Deanna asked me for a cigarette, and I gave her one. I assumed she’d take it and walk away, but instead she sat down. We began to talk. We talked for hours. We skipped all of our respective classes that day. She took me home with her. And since that day—to this day—we have been inseparable. 

Because I was a minor, I could not work full time. I did work part-time as a dishwasher in a kitchen. I helped Deanna out as much as I could in exchange for being allowed to live in her apartment. I remember that I did the dishes (she hates washing dishes). Deanna would cook us Kraft macaroni and cheese out of cardboard boxes. Deanna taught me how to do the laundry. Deanna cared for me, raised me during a time when I had been abandoned. I write this post now, in my mid-thirties, and I realize that Deanna was a 19-year-old raising a 16-year-old. And I realize now the magnitude of that, the responsibility. But we were working class kids making do with what we had, and this is the kind of thing that working class kids and poor kids do. Together with Kyle (my first boyfriend), we eventually formed a sort of trio.

***

Part V.


Accepted…temporarily.

I took the SATs twice, with no prep. My best score was a 1310 (out of a possible 1600). I applied to only one school, for entrance in Fall 1998: NYU (Tisch School of the Arts, Film & Television).

I was accepted. What I remember most about this milestone was my mother commenting that my Aunt Elvira’s friends’ daughter, who attended Milton Academy, “also applied and she didn’t even get in!” [10] By now I understood, to some extent, that the system was rigged. The fact that a wealthy child from a wealthy family who attended a wealthy prep school failed to get into NYU (but I got in!) was a delicious revelation. Everyone knew that acceptance to a “good school” like NYU was based on income rather than merit.

And yet, somehow, I got in.

Again, there was financial aid. Again, it covered about 50% of NYU’s annual tuition, which at the time was over $20,000 a year. I remember my mother co-signing on loans (but not being happy about it), and I remember being aware that I was not allowed to live in the dorms (I think because it would have prohibitively increased the cost of tuition) and would have to work full-time in order to support myself and pay my own rent, food expenses, etc. Despite these challenges, I remained unfazed. I was perfectly willing to work hard, and I wanted to attend NYU. Surely everything would be OK.

Once the paperwork was in order, though, the real obstacles quickly became apparent.

***

Part VI.

ButchAnniesPlaza_WORCESTER

Butch and Annie’s Plaza, Worcester, MA.

It was only a day—maybe two—before the start of classes. It was Fall 1998. I was 18 years old. I was 18 years old and, on paper, I was enrolled at NYU’s Tisch School of the Arts in New York, New York. Deanna and I lived in an apartment on Hooper Street in Worcester, Massachusetts—about two blocks from a small shopping area called “Butch and Annie’s Plaza.”

The details elude me, but it was a money issue. (It has always been a money issue.) I was a student at NYU. This, I knew. I had already fought ferociously for a number of years to get there. This, I knew. I did not have enough money to get to New York City, nor a way to get to New York City, and classes were about to start. This, too, I knew.

I got into NYU but I could not get myself from Point A (Worcester, MA) to Point B (Manhattan). I had turned 18 in April and it had just become legal for me to work full time, and, while I did work, I had no savings because Deanna and I were barely able to cobble together money for basic living expenses even with both of us working full time. I could not afford to get to New York because I had no money.

Undoubtedly I’d already spent days racking my brain, agonizing, trying to figure out what to do and how to get where I needed to be and panicking over the fact that classes were about to start and I wasn’t there. And somehow I ended up sitting on the curb in front of the convenience store (now a Dunkin’ Donuts) at the edge of Butch and Annie’s Plaza, and somehow I ended up sobbing in public because I could not figure out how I was going to pull this one off. I put my head down on my knees and cried. It was not the first time, and it would not—not by a long shot—be the last time I would openly bawl in public because of a desperate situation. People came and went. The sun got lower in the sky. I have to get to New York. I have to get to New York. Fuck.

It was then that I heard him.

“Hey—why are you crying?”

I looked up and saw a man emerging from a large truck. He was wearing jeans and a sweatshirt stained with what looked like black grease. 

“Because I’m supposed to go to NYU and my classes start in like 2 days and I have no way to get to New York.”

“Well, I can help you get to New York.”

I hesitated. 

“Look,” he said, “either you trust me or you don’t.” He opened the passenger side door of his truck.

For a fleeting moment I hesitated. I usually listened to my gut when it came to trusting people (or not), but with this guy I wasn’t sure. What if he murdered me and dumped my body in a ditch somewhere?

He touched the passenger side door again. Something in me said: go.

“Wait,” I said. “I’m coming.”

“Come on. I’ll take you to the bus station.”

“But I don’t have any money.”

“Don’t worry about it.”

On the way to the bus station we stopped at a gas station. The man put gas in his truck. “You smoke?” he asked me. I nodded. “What brand?” 

“Camel Lights,” I replied.

The man went into the gas station to pay for his gas and emerged with a plastic bag. In it were a couple of packs of cigarettes—my brand—as well as some snacks and drinks.

“What’s this?” I asked him.

“Well—it’s about a four hour bus ride, right? I figured you’d need these for the trip.”

He asked what I planned to do, and I told him I wanted to be a filmmaker. He bought me a one-way bus ticket to NYC and handed it to me along with the bag of snacks and smokes. He gave me around $20 cash. I didn’t even know his name. He never told me his name. I asked how I could repay him and he told me: “When you make your first movie, just dedicate it to the anonymous steel worker from Worcester who got you where you needed to go, OK?”

“OK.”

***

Part VII.

I got where I needed to go, and once I did there were additional challenges. Kyle and Deanna had decided to move with me to New York, but we didn’t have an apartment. For the first few months of my education at NYU, Kyle and I would stay in a cheap, roach-infested motel room in Chinatown during the week (for my classes) and then drive back to Massachusetts on the weekends to save money. I had no stable place to live, let alone to study. We were spending up to 15 hours per week commuting between Massachusetts and Manhattan.

Since I couldn’t afford my textbooks, I shoplifted the majority of them from a Barnes & Noble.

I made it to my first day of classes, having totally missed orientation and all of the other activities in which incoming freshmen typically partake at the start of their college experience.

Eventually Kyle—with the signature of his father, who agreed to act as our guarantor—managed to secure us an apartment in Brooklyn. We were all broke. Deanna and I, down to literally our last dollar, got hired as waitresses at a restaurant called Wilkinson’s Seafood on 84th and York. (It no longer exists). On our first scheduled day of training we realized we didn’t have enough money to ride the subway, so we jumped the turnstiles and got ticketed by a cop. We made it to work, though.

I spent three miserable semesters at NYU. Even with a stable place to live, the entire enterprise was a disaster. I worked full time at Wilkinson’s and maintained a full-time course load at NYU (as was required to retain my scholarship). I’d attend classes during the day, then head immediately to the restaurant to set up tables in the afternoon and wait on customers until around midnight. At midnight (give or take), we’d “break down” the dining room, cash out, and head home. Adding in the commute from the Upper East Side to Brooklyn, I’d usually get home by 2AM. I’d then catch 2-3 hours of sleep (Kyle would have to force me awake by physically propping me upright in bed, because my body was so exhausted that even multiple alarm clocks could not rouse me); get up around 5AM; and do homework for approximately 3 hours before heading to NYU for class. I did this every day, and on weekends I usually put in overtime at the restaurant. I worked 7 days per week. I did this for nearly two years before it finally broke me.

My situation was probably not unique, although at the time I felt like I was the only one at Tisch in these circumstances. I’ll never forget watching my colleagues in the introductory film and photography classes soar ahead of me with their projects (they didn’t have to wait tables) while I floundered and managed mediocre work at best. I was acutely aware of falling behind.

This was in 1998-2000, but students like me—that is, bright students who do not come from wealthy backgrounds—were evidently still facing the same dilemma I faced at least as of 2004, when the infamous case of the so-called “Bobst Boy” hit the national press. It would not surprise me to learn that, a decade later, there continue to be “homeless” NYU students. I admire Steve Stanzak (the “Bobst Boy”) for publicizing his experience and using that publicity to leverage himself a dorm room. Had I had internet access back in 1998, I likely would have been similarly vocal and perhaps then the outcome of my “education” at Tisch would have been different. But my situation occurred before the dotcom explosion—before anyone had PCs (let alone laptops) or regular internet access or blogs or the ability to draw national attention to one’s plight. So there I was, in pre-9/11 America: funded just barely enough to attend, but not enough to stay. Bright and driven enough to be accepted….sort of. Temporarily. With insufficient financial aid. Able to secure some student loans, but not enough to prevent me from also having to work full time. Drowning and yet thrashing mightily—violently—to remain afloat, enrolled, and in good standing.

At Wilkinson’s Seafood, on the monied Upper East Side, I was a “black tie” waitress (and eventually hostess). What this means is that I did formal waitressing in an upscale environment for wealthy customers. During my time at Wilkinson’s, my “regulars” included baseball players Keith Hernandez and Rusty Staub; one of the writers of Law & Order (never knew the guy’s name, just what he did for a living); and Barbara Feldon, who played Agent 99 on the TV show “Get Smart” (she is a vegetarian, or at least was when she frequented our restaurant). One memorable evening I waited on Martha Stewart, who is exactly as intimidating as one would expect. 

Life below stairs as in the Victorian era still exists, I can assure you. At the restaurant, the waitstaff had an entire universe of our own that stood parallel to—yet completely hidden from—the “front of the house” dominated by our upper-class customers. Since most of our days and evenings were similar, and since I was mind-numbingly sleep-deprived, I don’t remember many of the specifics of those two years. I recall quite a few shifts spent covering the dining room along with a young man named Marcelo, with whom I would aggressively compete for customers’ leftovers. The unspoken rule among waitstaff is that whoever buses a table once the customers have departed gets to eat any leftovers that remain on their plates. This may sound disgusting, but what you need to understand is that waitstaff spend 8 or more hours at a time on our feet, running back and forth, upstairs and downstairs, etc. without stopping. Often we are given very little time to eat, or no time at all. Add in the fact that Marcelo and I were in our teens and still growing, and therefore frequently very hungry. At Wilkinson’s the entire staff would eat together at around 5PM, before the evening’s work began, but by 10PM (after lots of running about), Marcelo and I would be famished. Other than the meal at the start of the shift, we were not allowed to eat food from the kitchen. Thus, our customers’ leftovers were the only food available to us. It was eat leftovers off their plates or wait until we got home (well after midnight) to eat again—so we fought fiercely to bus the tables and gain access to leftovers. A dessert for which we tussled with special intensity was the chocolate mousse cake with raspberry sauce. I think I “won” about half the time.

It is only in retrospect that I grasp the extent to which socioeconomic class divided me from the people on whom I waited. One evening an older man was eyeing me studiously from head to toe while he and his party awaited their table. After examining my legs, arms, torso, and so on, he looked me in the eye and asked: “Would you please open your mouth?” I thought the request odd but didn’t dare refuse; my boss was standing within earshot and, as waitstaff, we were trained to always comply immediately and unquestioningly with anything our customers might request. I opened my mouth. The man studied it, asking me to bite down so that he could more clearly see my teeth. Finally, he turned to my boss and gleefully proclaimed: “She has great breeding!” The man had examined me physically—as one would a pedigreed horse or dog—before deciding that he wanted me to serve him and his family dinner. To him, I was little more than chattel simply because I had not been born rich.

When I reached my breaking point, I made one final attempt to remain at NYU. I knew that I couldn’t go on working the number of hours I had been working and sleeping as little as I had been sleeping. I could feel that something was going to give. Yet I wanted, with every fiber of my being, to remain in school. My education, so hard won, was slipping through my fingers. I hoped that I could receive some guidance from administration—perhaps suggestions for additional financial aid options or scholarships, something that would enable me to cut down my work hours and allow me to focus more on my studies. I made an appointment with someone who was probably a Dean or similar at Tisch, and this is what I recall about the appointment: he would not meet with me in his office, but instead insisted he had to go and only had time to speak with me in the elevator. I thanked him for the limited time and walked with him to the elevator. Briefly, I explained my situation and asked if he could provide any guidance, suggestions, or even the name of someone else affiliated with my school with whom I could speak. 

As the elevator descended, he looked at me and flatly declared: “You can’t work and go to NYU film school.” [11]

The elevator stopped and made a characteristic “ding.” The doors opened. He turned and walked away.

An NYU administrator to whom I’d turned hoping for advice had just confirmed what I’d known deep inside all along: I was not welcome at NYU. I did not belong there. Sure, I’d gotten in—but it was clearly some sort of joke, or trick, or mistake. My classmates did not have to work; they belonged at NYU. Me? No—I did not belong.

Not working was not possible. Since the administrator had insisted I couldn’t “work and go to NYU film school,” the only course of action left to me seemed clear. I had to work. I had always had to work. There was no way for me to not work.

Two weeks later, I dropped out of NYU. 

***

Part VIII.


The author defending her Ph.D. dissertation in May 2013, with advisors Victor K. Mendes and Anna M. Klobucka (L–R) offering feedback during the defense.

A few months after dropping out of NYU, I took my life savings of $2,000 and a backpack and boarded a plane to Portugal. I was fluent in Spanish but did not speak any Portuguese. I did not know anyone in Portugal. I had never been there before. What I knew was that somehow, despite my best efforts, my dream had fallen apart. What I wanted was to leave the tatters of that dream behind and start over someplace new.

I ended up living abroad for 16 months, during which time I became fluent in Portuguese. I arrived back in New York in August 2001—a month before 9/11. I was in Weehawken, NJ on the day of the attacks and watched the towers fall from across the Hudson River. For two more years I continued working odd jobs, low-class jobs, whatever jobs I could find. I waited tables, washed dishes, cleaned houses. Deanna and I continued to eat Kraft macaroni and cheese out of cardboard boxes and struggle to make ends meet. I gave up on my dream of an education because it seemed unattainable. I resigned myself to the fact that I was never going to finish the Bachelor’s degree I’d started at 16.

What changed for me was nothing that I did. I did not suddenly work harder, or get richer, or become more intelligent. What happened to me was a stroke of luck rarer than being hit by lightning. My patron. Long after I had given up, my patron found me and offered to fund my education.

The problem with this story is that what happened to me does not happen, and that is why I am writing this post.

There have been a number of articles lately in the New York Times about how colleges and universities have done little (or nothing) to improve poor students’ access to higher education, and/or how the culture of higher education continues to be stubbornly “upper class,” leading poorer students to struggle academically and socially in college and university environments. Around the country, many prestigious universities—including the one where I currently teach—are making a greater effort to engage in meaningful conversations about “socioeconomic diversity.” Yet the fact remains that not just poor students, but even middle class students—and I don’t mean that euphemistic use of “middle class” that abounds at elite universities, where families making $250,000+ a year are regarded as “middle class,” but rather actual middle class students, as in those from families whose income falls in line with the median household income of around $50,000 per year in 2013—are still being shut out of higher education.

My point is that I was lucky in so many ways from the get-go: I was born reasonably intelligent, received a decent public school education in Mendon, and was blessed/cursed with an irrationally stubborn streak to my personality. Despite the many hardships I have endured in my life, I was not malnourished in childhood (for example), nor did I grow up in a neighborhood plagued by gang violence or drugs. Yes, I come from a lower middle class background, but I still enjoyed enough advantages early in life that I was able to dream of an education in the first place and take steps (however unsophisticated or faltering at times) towards achieving that dream.

And yet I still would not have made it without my patron.

That part is worth repeating: I still would never have made it without my patron.

There are not enough wealthy patrons in this country—nor are the vast majority of wealthy would-be patrons generous enough to consider funding the education of an underprivileged college student—to compensate for the degree of inequality that persists deep within our educational system. Barring a few exceptions, colleges and universities in the U.S. are staffed, run, and funded by the wealthy—people who came from upper-class families, have enjoyed lives of comfort and privilege, and now seek to pass those benefits on to fellow members of their own socioeconomic class, i.e. – the new generation of wealthy and upper-middle-class college students. The American academy has always been, and unfortunately remains, far more of an aristocracy than a meritocracy. Since education is associated with better prospects of upward mobility, access to it is critical in order to reduce socioeconomic inequality, but with the costs of college education rising at an obscene pace, all but the wealthiest families are being priced out.

The question I am compelled to ask, then, is this: why should it take an actual miracle for any bright, motivated, hard-working young person from one of the wealthiest nations in the world to have access to a quality education?

Make no mistake: the luck that befell me was a miracle. A miracle—not a success story. A success story would be if every bright young person from a disadvantaged background had a patron like mine. Better yet: a success story would be for such patronage to be unnecessary.

***

[***FIRST DRAFT: Wednesday, September 24, 2014. 03:44H CDT***]

***

Notes

1 – Note Milton’s median incomes compared with Mendon’s (from the towns’ respective Wikipedia pages) via links above.

2 – Tuition for summer 2014 is around $8,000, but I recall it being less than half that amount when I attended in 1994. Of course, that was twenty years ago. At today’s price tag, my parents would not have been able to send me to the summer program at all.

3 – See? I told you I’m sub-par in math.

4 – I went so far as attempting to pull archives of my work from The Exonian, but their online database does not go back far enough, unfortunately.

5 – For some reason this cost my family nothing. I cannot remember why, although I imagine there is some sort of program in place. I know that we either did not pay tuition to FSC or paid so little as to make it a token (and affordable) amount.

6 – One of her daughters is approximately my age and brought me home. I am pretty sure there are photos of me somewhere holding signs campaigning to re-elect Mayor Simmons. I have hazy memories of doing so during the time I spent at her home.

7 – Nope, I didn’t have to look this up. It is a fact I will never forget specifically because of this classroom incident.

8 – This is not the actual page number. That, I did forget.

9 – No, I do not have a photographic memory. I devoted hours to memorizing individual textbook chapters for my science classes. It was difficult and time-consuming, but worth it when I performed well on my exams.

10 – “Aunt Elvira” is not my blood Aunt but is close enough with my mother that my brother and I were raised to address her as “Auntie.” She lives in Milton and was one of the sources of my knowledge, growing up, that Milton was “where the rich people lived.”

11 – This man is fortunate that I remember neither his name nor his position, because I am no longer a powerless, timid undergrad, and I certainly would have something to say to him now regarding that episode in the elevator.

Works Cited

Silva, Jennifer M. Coming Up Short: Working-class Adulthood in an Age of Uncertainty. Oxford: Oxford UP, 2013. Print.


Complicating Disability Studies’ Relationship to Medicine

One of Disability Studies’ major hang-ups is its default position with respect to the field of medicine and—by extension—with medical practitioners. The adversarial stance of DS towards medicine (and doctors) stems largely from the former’s repudiation of the medical model of disability, according to which—as defined by Disability Studies scholars—individual disabled people are identified as “problems” to be “fixed” or “cured.” [1]

The graphic below, borrowed from the website of the Democracy Disability and Society Group, nicely illustrates the medical model of disability as theorized by DS scholars and activists:


Image credit Democracy Disability and Society Group (ddsg.org).

Before I dive into my discussion of DS’s positions vis-à-vis “the medical model,” I’d like to clarify that in my own work I make no distinction between “impairment” and “disability,” preferring instead to use “disability” to designate the complex matrix of physical/material and socio-cultural phenomena that together produce conditions of disablement for some people. [2]

The main issue that I have with Disability Studies’ framing of “the medical model” in its current incarnation is that it presumes the following:

  • Medicine and physicians are always paternalistic.
  • Recipients of “medical care” are always “passive” and “disempowered.”
  • There is no gray area between the extremes of “cure” and “do nothing” when it comes to medicine.
  • All “medical” care is bad.

It is worth noting that the definition of “medical model”—a term coined by psychiatrist R.D. Laing to describe the working model for training physicians, and from which the related “medical model of disability” derives—is one articulated in the 1970s. It therefore bears little resemblance to working models employed by physicians in the 21st century, especially newer generations of doctors who have moved away from paternalistic attitudes and tend to view them as outdated and ineffective. [3]

The Democracy Disability and Society Group graphic includes both “impairments” (aka “disabilities”) and “chronic illness,” but I’m puzzled as to why they occupy separate categories considering chronic illnesses are in fact disabilities. A disability (again, the graphic uses “impairment” to denote what I call “disability”) is quite simply a mode of functioning that differs from that of the majority of people. For instance: if the majority of people have 2 legs, then having only 1 leg is a “disability” because it involves a physical form (and consequently a mode of ambulation) that differs from that of the majority of the population. If most people do not perceive sights and sounds as overstimulating but someone with Autism does, then Autism is a disability because it involves sensory/cognitive processing modes that differ from those of the majority. It logically follows that if most people have immune systems characterized by a common baseline level of inflammation, people with immune systems characterized by higher-than-average inflammation levels (manifesting in a variety of conditions with names like MS, Rheumatoid Arthritis, Crohn’s, etc.) are configured immunologically in a way that differs from the majority of the population and consequently must operate differently from their immunologically “standard” counterparts. In other words: yes, chronic illness (defined as “ongoing immunological inflammation that differs from that found in the majority of the population”) is a disability.

A couple of factors contribute to the “classical” separation within DS between “chronic illness” and “disability.” As shown in the graphic, disability is traditionally viewed as a “physical, mental, [or] sensory” difference, but overwhelmingly “mere” physical differences are prized, with the “ideal” disabled person being an “otherwise healthy” individual with a motor impairment (i.e. – missing limb, spinal cord injury, war trauma, etc.) necessitating either a wheelchair or prosthesis. Within the hierarchy of disability—yes, there is a hierarchy—Deaf and blind people are also prized, since they are “otherwise healthy.” [4] A quick Google image search of the keyword “disability,” while admittedly not scientifically rigorous, provides a terrific example of the hierarchy of disability at play.

My proposal is that this emphasis on “health” as the standard by which people are included or excluded as “disabled” is as outdated as the paternalistic style of medical practice. By emphasizing the image of disability as “mere” physical variation in “otherwise healthy” individuals, Disability Studies is very problematically helping to enshrine the ideal of “health” as well as colluding in the over-arching cultural rhetoric of “health as morality,” wherein immunological variation is code for “immorality” and even “inferiority.” By clinging to mainstream ideals of “health,” Disability Studies works to achieve greater equality for some disabled people by actively oppressing others. For a field allegedly committed to social justice and equality, upholding this kind of hierarchy of oppression is unacceptable.

Because chronic illnesses are often imperceptible [5], they tend to be overlooked by the general public (including the DS community), and this lack of perception seems to be the second key determinant—besides the prevailing rhetoric of “health”—in their exclusion from disability and Disability Studies. Everyone knows when a paraplegic person enters the room: he’s using a wheelchair. The Deaf person, in signing, not only communicates but also performs his or her Deafness. The blind person with a cane or dark glasses is identifiable as blind. Being identifiable, even by laypeople, as disabled is important to the validation of “disability identity” precisely because of DS’s internalization of cultural ideals of “health.” Disability Studies’ idealization of “health” and its emphasis on perceptible forms of disability are inextricably intertwined.

In contrast with “classically” acknowledged forms of disability like Deafness, blindness, using a wheelchair or prosthesis, etc., chronic illnesses are often not perceptible to the general public. The crucial point here is that chronic illnesses are frequently only perceived (and perceptible) by *medical* professionals—and even then indirectly, via analysis of complex physical exams, blood work, and so forth. They are thus prone to being reflexively (if incorrectly) “medicalized” by default and rejected by DS scholars and activists as “something other than disability.”

It is both poignant and ironic that, while people with perceptible disabilities are more likely to suffer discrimination and exclusion by the non-disabled public by virtue of their disabilities being perceptible, people with imperceptible disabilities (such as chronic illnesses) are routinely excluded from Disability Studies as “other-than-disabled” or “non-disabled” for (in part) the opposite reason. [6]

Disability Studies’ rejection of “the medical model,” combined with immunologically disabled people’s configuration or placement within that model, contribute to conditions that foster the exclusion of chronically ill people from disability and from DS. DS “needs” to reject chronically ill people because it “needs” to reject “the medical model,” and chronically ill people are stubbornly enmeshed within that model. Chronically ill people are treated by the field as “the problem” in need of “cure” or “fixing”—and this “cure” or “fix” is accomplished through segregation, which takes the form of exclusion from the category of “disability.” Oh what a tangled web we weave when nearly an entire field uses the very same working model it claims to loathe as a virtual blueprint for casting off certain members of its own group! [7] 

Instead of rejecting chronic illness as “not disability” simply because it doesn’t fit into the established paradigm of “the medical model of disability” as formulated by Disability Studies scholars and activists, what if we flipped the lens? What if we asked what recognizing chronic illness as a disability could potentially do for our existing understanding of “the medical model of disability”?

One of the first shifts that would occur would pertain to our views on medicine, medical care, and physician-patient relationships. The experiences of people with chronic illnesses (aka “immunological disabilities”) in the realm of medicine often bear little resemblance to the invariably negative and fatalistic views of medicine propagated by leading DS scholars. For starters, since chronic illnesses are not “curable,” there tends to be minimal—if any—fixation on the notion of “cure” on the part of the physician. When and if an insistence on “cure” does occur, it is generally on the part of the chronically ill person, and my argument would be that it is because that particular person has been indoctrinated into the rhetoric of “cure” by organizations like the National MS Society, the Arthritis Foundation, etc. (and on a larger scale, by contemporary society’s worship of “health”). This is no different than an individual paraplegic person expressing his/her desire to not be paraplegic, or an individual blind person maintaining that they would prefer to be sighted. What is different is that chronically ill people receive far less support from the general public should they choose not to oppose the rhetoric of “cure,” coupled with far more (organizational and social) pressure to adhere to this harmful rhetoric. If charities and organizations such as the NMSS and the AF continue to foster the idea that chronic illness is an “evil” and that “cure” is the only solution, then many chronically ill people will continue to succumb to pressure to internalize these views, even if it proves disempowering and unproductive.

The relationships between chronically ill (aka “immunologically disabled”) people and their physicians are typically long-term ones that emphasize continuity of care, partnership, interdependence, and support. Far from being “passive recipients” of care, we are engaged participants in a dynamic that contributes to our own care and that of others. Far from having “cure” (or even “treatment”) imposed on us, we are empowered to provide input regarding how we would like to approach our disability (and how we would like others, including our doctors, to approach it). Notice that I deliberately use terms like “care” and “approach to” instead of “cure” or “fix.” The latter terms simply fail to describe my experience within the context of medicine, and so I avoid them.

An immunomodulatory drug—the type of drug most people with immunological disabilities use—is best viewed as a prosthesis. In The End of Normal: Identity in a Biocultural Era, Lennard Davis affirms: “A drug would be a prosthesis if it restored or imitated some primary state that appears to be natural and useful” (64). Davis makes this statement in the context of his argument that SSRIs are not “chemical prostheses” for depression, since happiness is not a “primary state” of being and since there is compelling evidence to suggest that SSRIs do not actually work (Davis 55-60). His assertion is relevant to my position in this blog post since, unlike SSRIs, immunomodulatory drugs do “restor[e] or imitat[e] some primary state” (levels of immunological inflammation and patterns of immunological behavior more consistent with those of people without autoimmune conditions) that “appea[r] to be natural and useful” (“natural” in the sense that these altered levels and patterns are consistent with those of people without autoimmune conditions, and “useful” in that they restore—to one extent or another—“normal” immunological function in individuals with altered patterns of immune activity). Like a paraplegic deciding which model of wheelchair to use or an amputee picking the perfect prosthesis, we with chronic immunological conditions have input into which (if any) immunomodulator to use. If the chosen prosthesis (wheelchair, artificial limb, chemical compound) turns out to be ineffective or uncomfortable, we can choose a different one.

Interestingly, because specialists who care for patients with a particular condition (like Multiple Sclerosis or Crohn’s) often maintain active research agendas that focus on the condition in which they specialize, their relationships with patients are best characterized as mutually interdependent. The physician needs the patient (or at least some patients) to consent to participating in clinical trials and providing data that will facilitate the physician’s own research, while the patient needs the physician to not only periodically assess his or her function, but also to prescribe (or provide access to) what are in effect chemical prosthetics that enable “normal” function.

The fact that these chemical prostheses are not accessible without recourse to a physician is arbitrary. By this I mean that it is not difficult to imagine an alternate capitalist universe in which 3D printers (with which wheelchair users can now print portable ramps) or even Braille are made for “limited use only” and controlled as tightly as immunomodulatory drugs are now. Wheelchair users got lucky in that they don’t require a new prescription every 30 days and a “co-pay” (imagine a monthly “user’s fee” for a wheelchair) to access the adaptive technology that is their wheelchair or 3D printer. Blind people got lucky in that they don’t require “prior authorization” to use Braille. There is nothing “special” about immunomodulatory drugs—meaning, nothing inherent in the drugs themselves or even the delivery system—that somehow makes them “medical” in contrast to so-called “non-medical” tech like 3D printers, Braille, and wheelchairs. It just worked out that groups of people figured out how to manufacture, control, and ultimately profit off of immunomodulatory drugs before they figured out how to do the same with Braille or 3D printers. Or maybe they figured out ways to make immunomodulatory drugs more profitable than Braille or 3D printers. It doesn’t matter. My point is that immunological prostheses are no more “inherently medical” than any other prostheses. They became medicalized because certain people figured out how to profit off of them by tying them into the established medical system. This is utterly random.

Given the randomness of the system in place; the evolving role of physicians (with shifts toward “patient-centered care” instead of “paternalistic medicine” and relationships of mutual interdependence between both parties rather than unilateral dependence running from patient to physician only); and medicine’s accepted position as an intermediary which, for some disabled people, controls access to certain types of chemical prostheses that have been arbitrarily classified as “medical,” it seems to me that it might be high time to question and, indeed, to complicate Disability Studies’ relationship to medicine. To move forward with such a paradigm shift, the field needs to stop medicalizing chronic illness. It needs to stop labeling people with chronic illnesses (immunological disabilities) as a “problem” in need of “curing” or “fixing” through exclusion from the category of “disability.” It needs to take another look at the so-called “medical model”—one it mimics in its treatment of the chronically ill while simultaneously decrying it as “undesirable” for all other disabled people. To do this, the field will need to confront its existing hierarchy of disability and seek to trouble the notion that a disability must be perceptible to laypeople in order to “count.” But most importantly, Disability Studies will need to acknowledge that its “medical model of disability” no longer corresponds to the outdated “medical model” of medicine on which it is based—and that the widening gap between the two threatens to quash the growth of the field.

[***FIRST DRAFT: WEDNESDAY, JUNE 11th, 2014. 23:01H EDT***]

Notes

1 – I specifically add the clumsy verbiage “as defined by Disability Studies scholars” to emphasize that medical professionals themselves would be unlikely to identify with this view of their own profession. As such, “the medical model of disability” needs to be understood within the context of its formulation by DS scholars and activists. The “model” is not neutral or objective; it is a specific framing of the field of medicine and of medical professionals by people with disabilities and/or their allies, many of whom aggressively oppose any kind of “medical” intervention.

For further reading and some helpful diagrams illustrating differences between “medical” and “social” models of disability, please consult the following pages:

http://ddsg.org.uk/taxi/medical-model.html

http://ddsg.org.uk/taxi/social-model.html

http://ukdisabilityhistorymonth.com/the-social-model/2011/9/13/understanding-the-social-model-of-disability-the-medical-mod.html

2 – For an expanded discussion of my views on the “impairment/disability binary,” see this thread and this document (especially pages 2-3 and notes on page 20).

3 – The NY Times piece is by a cardiologist who discusses grappling with tensions between paternalism and autonomy, and the Forbes article is by a physician criticizing what she refers to as “dinosaur physicians”—that is, “old guard” M.D.s who still practice rigidly paternalistic medicine.

4 – Many Deaf people do not view themselves as disabled, since Deafness can also be conceptualized as a cultural and linguistic difference rather than a “disability” per se.

5 – “(Im)perceptible disabilities” is a phrase coined by Stephanie Kerschbaum as a preferable alternative to the ocularcentric “(in)visible disabilities.”

6 – “In part” because DS’s enshrinement of “health” should not be underestimated as a motivating factor in the exclusion of chronically ill people, either.

7 – When scholars within DS do write about medicine, they tend to focus on eugenics, end-of-life care, and assisted suicide, thereby perpetuating the stereotype that medicine equals “sickness and death only.” See recent work by Lennard Davis (The End of Normal: Identity in a Biocultural Era, 2013), especially Chapter 7, and Tom Shakespeare (Disability Rights and Wrongs, 2006), especially Part II.


Triggernometry Redux: The “Trigger Warning” as Speech Act

An addendum to my earlier post on “trigger warnings,” inspired by a very late night discussion on Facebook:

**

The “trigger warning” can be viewed as a speech act. Considered as such, the act it performs is indirectly declarative; it (pro)claims for oneself and/or others the identity of “victim.” Because in the United States, in particular, the identity of “victim” is culturally enshrined, the deployment of the “trigger warning” is in essence an assertion of “moral superiority.” (It functions much like “not having privilege,” as described in Gawker’s playful online series “The Privilege Tournament”).

The (paradoxically privileged) status of “victim” confers upon its owner(s) the (unquestioned and unquestionable, because “sacred”) right to exert control over narratives (including the speech of other people, especially “non-victims”)—a right understood as unimpeachable owing to the (pro)claimed, privileged status of “victim” and the authority this status bestows.

This is what George Will meant when he stated that victimhood is a privileged status, and this is just about the only thing he got right in his op-ed. He didn’t mean (or say) that it is a privilege to be raped; he said that the status of “victim” comes with certain privileges. His greatest taboo, of course, was in exposing the culture of victimhood as one of power—in pointing out that the position of “victim,” at least in contemporary U.S. society, is a position of power.

In other words, Will’s “transgression” consists of naming the power that the label “victim” intends to occlude, and upon whose occlusion the maintenance of that power depends. In exposing both the underlying mechanisms of power at play and their occlusion, Will’s op-ed threatens to subvert the authority of “victimhood.” It is primarily for this reason that he is currently being skewered online, although no one skewering him is openly admitting that this is the reason—for doing so would force his critics to even more clearly detail the power structures underpinning the culture of “victimhood.”

[***DRAFT: WEDNESDAY, JUNE 11th, 2014. 17:44 EDT***]


Triggernometry

So “trigger warnings” are back in the news again.

I’ve been reading along for months while concertedly refraining from making any sort of public comment on the discussion, but now I feel kind of obligated since it keeps raging on and I’ve already participated with a certain degree of vigor on closed forums and in Facebook feeds.

This post will consist of two parts: the first in which I express my personal views on “trigger warnings,” and the second in which I offer a brief cultural analysis of the “trigger warning” in hopes of shifting the collective conversation in a new direction.

PART I.

One of the people who spearheaded the resurrection of “trigger warnings”–specifically their use on college campuses–is a sophomore named Bailey Loverin who attends UC Santa Barbara. Loverin has articulated her arguments in favor of implementing campus-wide policies apropos of “trigger warnings” on such national platforms as the NY Times and USA Today:

From music to movies, content and trigger warnings are everywhere. We accept them as a societal standard. 

With these introductory sentences, the author concedes that the impetus behind her support of “trigger warnings” on syllabi stems, at least in part, from having grown up in a society in which “warning labels” appear before films, on music albums, on food, and so on. Ms. Loverin is so used to the ubiquitous presence of warning labels that extending the presence of these labels even further seems not only “natural,” but positive.

“Warning labels” in the United States are a relatively recent trend, one that began in 1938 with the Federal Food, Drug, and Cosmetic Act. Although they started with food, they quickly spread to tobacco and alcohol, and finally to music in the late eighties and early nineties. The current ratings system for movies is somewhat older, dating back to 1968.

What Loverin does not acknowledge in her opening paragraphs is that these content warnings began to proliferate largely because of the uptick in frivolous lawsuits in the U.S. and companies’ desire to engage in what is essentially “defensive advertising”—strategically warning “consumers” beforehand about any and all possible risks associated with their products or services, so that said “consumers” cannot sue the companies for millions of dollars, claiming they “failed to warn” them of any particular risk factor.

A recent frivolous lawsuit provides a classic example of this phenomenon (and makes me wonder if we’ll soon see a new set of “warning labels” on sneakers): a Portland pimp, Sirgiorgio Clardy, sued Nike for 100 million dollars after being convicted and sentenced to 100 years in prison for beating to death a john who had refused to pay him for one of his prostitute’s services. Clardy’s argument?

[…] Nike, Chairman Phil Knight and other executives failed to warn consumers that the shoes could be used as a weapon to cause serious injury or death.

Clardy’s lawsuit against Nike is pending.

Regarding this aspect of Loverin’s apology for the “trigger warning,” I am inclined to agree with Tressie McMillan Cottom, who writes:

[…] the “student-customer” movement is the soft power arm of the neo-liberal corporatization of higher education. No one should ever be uncomfortable because students do not pay to feel things like confusion or anger. That sounds very rational until we consider how the student-customer model doesn’t silence power so much as it stifles any discourse about how power acts on people.

You can read McMillan Cottom’s full post on the subject here.

What bothers me about the “trigger warning” is this: it implies that it is my responsibility, as a speaker and writer, to preemptively modulate the emotional and psychological responses of anyone who might hear or read my words—rather than the responsibility of those individuals to learn how to modulate and/or regulate their own emotional responses to my words (and to the world in general).

More importantly, though, it seems to me that the mass deployment of the “trigger warning” threatens to perpetuate a cycle of victimization and helplessness: people are allowed to bypass material that might disturb them emotionally or psychologically, and thus potentially avoid ever learning how to modulate their own thoughts, reactions, and emotions when confronted unexpectedly with disturbing stimuli.  In this sense, “trigger warnings” are the helicopter parents of language: in seeking to protect, they inadvertently enable large numbers of people to remain walking wounds of unhealed trauma.

In fact, much of the available literature on trauma and PTSD advocates against the kind of maladaptive coping mechanism to which the “trigger warning” caters. One particularly apt passage of the Handbook of PTSD: Science and Practice (2010) flatly states:

Negative reinforcement of fear through behavioral avoidance is the primary process that is postulated to sustain, and even promote, the maladaptive fear response. Typical behavior avoidance manifested by traumatized individuals includes avoidance of stimuli associated with the traumatized event, not disclosing or discussing the traumatic event with others, social isolation, and dissociation. (41)

Translated into plain English, this quotation says: “Avoiding stimuli associated with a trauma as a result of fear leads to the perpetuation of both the avoidant response and the fear.” Or, even simpler: “Avoiding triggers perpetuates trauma and the ugly feelings associated with it.”

So much for the declarations of Loverin and others that “trigger warnings” “avert trauma.” Not only do they not “avert trauma,” they may actually serve to perpetuate the trauma and associated feelings of panic, in addition to stalling the healing process, which can only be initiated and sustained by confronting the trauma.

Much like well-meaning but overbearing parents who think they are doing right by their children when they refuse to let them play outside or intrusively moderate their children’s fights, “trigger warnings” do more harm than good to the very population they aim to “protect.”

And while Loverin alleges that “[“Trigger warnings” are not] an excuse to avoid challenging subjects; instead, they offer students with post-traumatic stress disorder control over the situation so that they can interact with difficult material,” it is difficult for me to see how the function of a “trigger warning” is anything but an invitation to do precisely that—avoid the subject matter, leave the classroom, and engage in other maladaptive coping strategies.

Exploiting the trope of the “mad student” so familiar from recent media reports and capably analyzed by scholar Margaret Price in her monograph Mad at School: Rhetorics of Mental Disability and Academic Life (2011), Loverin then goes on, in her USA Today op-ed, to paint the following grim picture of the “traumatized student”:

If students are suddenly confronted by material that makes them ill, black out or react violently, they are effectively prevented from learning. If their reaction happens in the classroom, they’ve halted the learning environment. No professor is going to teach over the rape victim who stumbles out in hysterics or the veteran who drops under a chair shouting.

Furthermore, seeing these reactions will leave other students shaken and hesitant to engage. With a trigger warning, a student can prepare to deal with the content. (bold emphasis mine)

Here, again, it is possible to see how proponents of the “trigger warning” are advocating for strategies of trauma avoidance—on the part of students with PTSD, on the part of faculty and staff, and on the part of students without PTSD who share classroom space with those with PTSD. “Trigger warnings,” according to Loverin, will cut down on classroom outbursts and avoid “disturbing” everyone involved. It is not at all difficult to see the specters of Eric Harris and Dylan Klebold or Kip Kinkel or Seung-Hui Cho lurking between the lines of Loverin’s text.

It is as though Loverin is suggesting that one kind of “trigger warning” will help prevent another, more gruesome “trigger warning”—that of the school shooting. While this type of neat and tidy logic may be very appealing to administrators, it is largely fallacious since the reasons for school shootings have very little to do with PTSD and “trigger warnings” and a lot to do with, basically, the availability of guns and our enshrinement of a culture of violence in the United States.

A claim I’ve heard repeated in various blog posts and op-eds by those in favor of the “trigger warning” stands out: namely, that post-traumatic or distressed reactions by students “hinder” or “prevent” learning. (Loverin takes it a step further, citing “halted learning environments” for both the student experiencing PTSD and others present in the classroom. Interestingly, her description flirts with the idea that witnessing another’s trauma is in and of itself a form of trauma—an argument parallel to the one which asserts that, for any victim of a past trauma, witnessing evidence of similar trauma in the present is always already traumatic.) When I read the passage above, though, I see something quite different: I see an opportunity to engage with the classroom (students and events) in real-time and to use that engagement to promote learning. I believe, in short, that pain can be a site of learning both for those who experience it and those who bear witness to it.

I am not in favor, obviously, of inflicting pain for the sake of inflicting it—that would be sadism. What I am suggesting is that it’s OK for classrooms to be messy, human places where messy, human reactions occur, and that I think it’s better for us to engage with them as they transpire than attempt to curtail them before they can take place. I do not buy the assertion that incidents such as those Loverin describes “prevent learning.”

One aspect of Loverin’s piece which I find compelling is her focus on the concept of “control.” She reiterates a couple of times that trauma victims need to feel “control”—indeed, mastery of trauma entails regaining this feeling. Where we disagree is about how that mastery should unfold and over what—or whom—that control should be exerted. My position is that mastery of trauma is best achieved by confronting trauma rather than seeking to avoid it and that learning to modulate one’s own emotions in a diverse array of settings and when faced with a wide range of subject matter is a good way to regain a sense of “control.” Seeking to exert control over course content or classroom discussions (or other people) for the sake of (unhealthily) avoiding one’s trauma is not.

Which brings me to another observation: whenever I have seen demands for “trigger warnings” deployed, they seem to be deployed by whoever wishes to regulate either a conversational topic or the manner in which it is being articulated. That is, I see “trigger warnings” being used strategically to silence some voices. I’m reminded again of Tressie McMillan Cottom’s “student-customer” model, since the question of who is attempting to exert control over the discourse has a lot to do with social class (and probably race as well).

I once read somewhere: “Being rich means being able to choose what one does and does not experience in life.” We could modify this statement to read: “The richer you are, the more control you have over what you do and do not experience in life.” It is reasonable to assume that places like Oberlin College, UC Santa Barbara, and Rutgers—three institutions of higher learning embroiled in debates about “trigger warnings”—are by and large populated by students from comfortably upper-middle-class families (or above). [1]

These students—more so than poor students—see themselves as “consumers,” which makes sense since the more disposable income you and your family have, the more you engage in patterns of consumption and, more importantly, the more you experience consumer choice. To give a quick, concrete example of this phenomenon at work: if you’re poor and going food shopping, you typically go to the cheapest grocery store around and look for the least expensive food item available (like Ramen). Your range of “choice” becomes limited to whatever is cheapest or—on a good day—to several equally cheap items. Conversely, if you’re upper-middle-class or wealthy, you have the ability to exercise choice over which supermarket you will shop at and then, once there, over which products you will purchase and, within any given food category, which brands you will select. Your horizon of choice is noticeably greater than that of someone with a fraction of your income, so you experience “choice” at every level of your shopping process. You grow accustomed to “choice.”

With “trigger warnings,” students are applying “consumer choice” models to education. This is not necessarily problematic in and of itself and, as some have pointed out, may even be beneficial in empowering students to participate actively in shaping their own learning. The quandary arises when one begins to consider who exactly is exerting their “right” to “consumer choice” through the arm of “trigger warnings.”

In the real lives of people not privileged enough to selectively choose what they will and will not be exposed to, “trigger warnings” do not exist. And it seems to me that we are currently more interested in protecting some students from mention of trauma than we are in protecting others from actual trauma. In a climate where, just yesterday, Johns Hopkins University suspended an entire fraternity for, among other crimes, gang rape, we appear more invested in “protecting” students with PTSD from reminders of past trauma than we are in protecting all students from lived experiences of trauma. In the process, we may also be discouraging students who do experience trauma on campus or while enrolled in our institutions from speaking or writing about their experiences, for fear of “triggering” their peers.

We are creating an environment where speaking, naming, or showing trauma is becoming more taboo than actually traumatizing another human being through an act of violence—and this is a problem, particularly for students from less-privileged socio-economic backgrounds who may leave our classrooms and encounter repeated, ongoing violence at home and in their communities. These students often cannot “choose to avoid” or even “prepare themselves beforehand” for repeated encounters with trauma, for it is happening all around them—to them—on a daily basis. We are coming dangerously close to fostering a culture of silence around trauma that threatens to perhaps “protect”—temporarily, for avoidance is not an effective long-term strategy for dealing with trauma—more privileged students while both failing to protect and silencing less privileged ones. Only if you are privileged enough to experience an end to your lived trauma do you have the time—the luxury, the choice—of insisting that literary and cultural objects reminiscent of your original trauma bear “warning labels.” Only if your lived trauma is not relentless does it even occur to you that you might be able to avoid confronting it (despite the fact that all evidence shows that failure to confront trauma is detrimental to recovery).

Unless you are fortunate enough to exert the kind of control over the rest of your life that you would propose to exert over potentially “triggering” material, avoiding that material in the (more or less) safe space of a classroom will in no way prepare you for what you will encounter after you graduate. On the contrary, you will likely be forced to deal with unanticipated “triggers” on a regular basis—at your job, in your neighborhood, when you travel. The question of “trigger warnings” then evolves into one about whether you’d rather learn how to modulate a panic attack in class or in a boardroom, at the university or the next time you’re deployed for military duty. My take on this is that the classroom and university—where stakes are still relatively low and support is available—would be preferable training grounds for learning how to successfully process trauma.

PART II.

I’d like to contemplate the possibility that demands for “trigger warnings” may not be what they seem, at face value, to be. Up to this point, I’ve dissected Bailey Loverin’s op-ed about these “warnings” and formulated some of my personal objections and challenges to the concept of “trigger warnings” as they intersect with issues of disability and class.

From a Disability Studies perspective, it is reasonable to ask not only whether “trigger warnings” do more harm than good (as I did above, in Part I), but also what it is that we do when we maintain, like David Perry does in “Should Shakespeare come with a warning label?,” that:

The classroom is not a therapist’s clinic […] Moreover, it’s a decision for a patient and a therapist or doctor to decide and advise a university, rather than for faculty or administrators to decide for themselves.

I’m not really sure that we can have it both ways. If “the classroom is not a therapist’s clinic” and the decision about when, how, and where a student should or should not be exposed to subject matter is “for […] a therapist and doctor to decide and advise a university,” then why are we even talking about implementing blanket policies on “trigger warnings” in university environments? (Perry himself is not arguing in favor of these blanket policies, but instead indicating that our existing systems of ADA accommodations policies can and should adequately address the needs of students with PTSD, and I am generally inclined to agree with him.)

I quote Perry at this juncture because I have read similar sentiments in tweets and Facebook posts by academics over the past several months—minus Perry’s astute qualification that our existing disability policies can and should sufficiently address the concerns of students like Loverin. For those academics who clamor “we are not therapists” but also support blanket “trigger warning” policies: your position appears internally contradictory.

Also from a Disability Studies perspective, it is worth pondering the advantages and/or drawbacks of such blanket policies. Does a failure to implement them effectively “medicalize” PTSD in a way that would be considered undesirable within the larger framework of Disability Studies? In other words, when we reject blanket policies on “trigger warnings” and instead direct students towards individualized solutions (via therapists and doctors, medication, and ADA accommodations), are we in essence “medicalizing” PTSD–and by extension disability in general? What might this question reveal to us about relationships between (mental) illness and disability as perceived by DS scholars? By the public?

What fascinates me about the idea of over-arching “trigger warning” policies is that, whereas ADA accommodations are tailored towards individual students—with all students enrolled in a given school presumed non-disabled until and unless they declare themselves disabled by requesting accommodations [2]—“trigger warning” policies operate via the inverse principle. They preemptively assume all students are in fact traumatized (or vulnerable to the effects of PTSD). Thus, from a purely theoretical point of view, blanket “trigger warning” policies are quite progressive since they assume disability—not able-bodied/mindedness—as the default state. In so doing, the policies fall more in line with “social model” approaches to disability; they identify the problem as residing in society instead of in the bodies/minds of disabled individuals, with these blanket policies acting as the ideological equivalent of an adaptive or assistive technology. If all this is true, then what we’re witnessing is a potentially revolutionary paradigm shift in the way we view mental/psychological disability.

The two kinds of trauma victims most often cited as needing the “protection” of blanket “trigger warning” policies are soldiers and rape victims. I question why we would be engaged in a discussion now, as a society, about whether or not we wish to move forward with the paradigm shift I’ve just described. Temporarily putting aside my arguments about the “student-consumer,” etc. — why now? I wonder if the desire for “trigger warnings” communicates something about us on a macro level, as a culture. For if, as I have insisted, we as a culture tend to avoid facing trauma—we suppress it, silence it—and if “trigger warnings” are about exerting control (however maladaptive the strategy may be), then perhaps we as a culture are struggling to modulate and control our own large-scale trauma: our nation’s legacy of violence.

When I re-read Ms. Loverin’s stereotypes of the “hysterical” rape victim and the “shouting” soldier along with that of the student-witnesses who become “shaken and hesitant to engage,” my mind pans reflexively through a Rolodex of events: 9/11; the wars in Iraq and Afghanistan; the financial crisis of 2008; years of gun violence in schools; the Marathon bombings; mass incarceration of U.S. citizens; natural disasters; rape on college campuses.

I remember that students of Ms. Loverin’s age have, for all intents and purposes, never known a world without war, natural disaster, gun violence, terrorism. And I wonder if the ongoing debate surrounding “trigger warnings” might actually be about something far greater, albeit unspoken—an expression of our students’ desire to try and mitigate collective cultural traumas. An attempt, if you will, to exert some control.

[***FIRST DRAFT: TUESDAY, MAY 20th, 2014. 23:45H EDT***]

**

Notes

1 – A complete breakdown of data (including reported family income) for UC – Santa Barbara students is accessible here, in .PDF format. If anyone can find data on Oberlin, please do contact me; I did some fishing but was unable to find anything like “average family income” for students enrolled. Here (also in .PDF format) is some info on demographics at Rutgers, with a breakdown by campus within the Rutgers system as well. Apparently (thanks, David!) one indirect measure of student/parent income is the percentage of students at a given institution who receive Pell Grants. Information for any institution about the percentage of its students who receive Pell Grants can be accessed here. In 2012, 31% of Rutgers students received Pell Grants. According to the figures posted in the U.S. News report, this would place Rutgers somewhere in the middle socioeconomically; far more students at Rutgers receive Pell Grants than at Oberlin, yet more students at UC Santa Barbara (whose overall student body is far from impoverished) receive Pell Grants than at Rutgers.

2 – That is, the very framework of “accommodations” presumes a “default” of able-bodiedness.

"The Falling Man," by Richard Drew.

“The Falling Man,” by Richard Drew.

 


“Can You Get Me Into College?” – Midnight in Southie

Southie1

Photo by Valéria M. Souza

It was midnight and we sat on the jungle gym of a South Boston playground designated as being “for ages 8-12” and “requiring upper body strength and coordination.”

We both had some degree of “upper body strength and coordination,” but neither of us was 8-12.

The young man, who had abandoned his skateboard nearby to come talk to me, interrupted my vaguely clumsy acrobatics on the monkey bars to ask: “Yo, what are you doing? Like, why are you on here?”

I dropped to the ground.

“I saw you skateboarding,” I said.

“Yeah—so?”

The retort was a bit defensive, challenging. Did he think I was a cop or something? “No, I mean—I don’t care. I just wanted to ask you: do you skate here at night? Do people bother you? Like: tell you to leave? Or is this place chill? That’s all….”

Instantly he relaxed. His shoulders dropped as he shrugged, open-palmed. “Oh, no—it’s cool. Nobody ever bothers us. They’ll tell us to leave during the day, but at night nobody cares.”

“So, like, you think I could come here a few times a week and climb and nobody would bother me?”

“Yeah, for sure. No one’s going to give you a hard time.”

“Cool—thanks.”

“But why are you climbing?”

“I’m training. Practicing.”

“For what?”

I smiled. Silence.

“C’mon—you’re not gonna tell me?”

“I can’t,” I replied.

“Are you gonna climb a mountain?”

“Maybe. Maybe I am.”

“You’re not gonna climb a mountain, I can tell. Are you like sponsored by Red Bull or something?”

“Haha—no. I am most definitely not sponsored by Red Bull or anyone else.”

We faced each other on one of the metal platforms in the playground.

“Do you mind if I smoke a bowl?” he asked.

“I’d rather you not.”

“OK—I won’t then. How old are you?”

“How old do you think I am?”

“Like 20-something.”

“I’m 34. What about you?”

“22. Listen—OK—can I ask you a question then?”

“Sure.”

“How do you feel about, like, dating younger people? Like would you date someone my age?”

“I would not,” I answered calmly. “To me that’s waaay too young. I’m a college professor. My students are 18-22. That would be like dating a student. That’s really weird, and I would never do it.”

Suddenly he stood up, his body a lightning bolt striking the air between us. Gone was the casual, off-hand questioning. Gone was the interest in smoking a bowl. “Wait. You’re a college professor?”

“Yeah—here: give me your phone.” I Googled myself, then loaded the faculty page from the university where I worked. “Here, that’s me. Read.”

He read. He looked at my faculty picture, then at me. Again at my faculty picture, then back at me.

“I need to talk to you,” he insisted, handing the phone back to his friend with terse instructions to bookmark that page, yo—the one she’s on. “How do you like….get into college?”

I squinted, unsure of what he meant. A specific college? College in general? Which aspect of “getting in”? This was a far cry from some of the elite universities at which I’d taught—places where students were already richer, savvier, and better-traveled at 18 than I’d be at 80. Those kids attended Milton Academy and Phillips Exeter and had schedules of meticulously planned extracurricular activities and spoke fluent Mandarin. Or fluent French. Those kids had SAT prep and could afford to do unpaid internships because their parents were rich and they didn’t need to work for money. Those kids—so smart and cosmopolitan and sure of themselves—were so different from me. From us.

“What do you mean?”

“I mean like….the whole process. Look. No one in my family has ever gone to college. Nobody knows what to do. The counselors at my high school didn’t help us. I try to research and I know which schools I want to get into, but I don’t know the process.”

“Wow. OK—well, you’re right. It is a process. There are a lot of steps involved. Hmmm. OK. We’ve got to fill out applications and financial aid stuff and…”

He interrupted, rattling off a list of four or five elite out-of-state schools he dreamed of attending and asking if we would have to complete a FAFSA. I blinked. This kid was obviously intelligent and had done his homework. He had a short list of schools. He could list the characteristics of each one that he found especially attractive. He knew the FAFSA existed. He was doing the best he could with what he had—and what he had was very little.

[Image: Southie2]

Photo by Valéria M. Souza

“OK,” I probed, “what’s your GPA?”

“Like 2-point-something.”

I sighed. “OK—that’s not high enough for the schools you’ve listed. So we’re going to have to do something a little bit strategic. Let me know what you think: first we get you into a lower-tier public school or community college here in Mass. I know you want to go out of state, but your GPA is not high enough yet. So you do a year at one of those lower-tier schools and you get straight As, and then we rig it so you can transfer out to one of your dream schools.”

“Straight As?”

“Straight As. You can be poor and brilliant or rich and mediocre, but you can’t be poor and mediocre. It just doesn’t work that way.”

He nodded in agreement. “I feel you. Straight As.”

“You’re going to have to work hard.”

There was a long pause. He fiddled with his marijuana and looked down. I felt my heart twisting. Not out of pity. Out of deep sadness because of all the people who had failed this kid. This bright, driven, earnest kid.

“Will you help me get into college?” he asked.

The request was so simple. A hand reaching across a divide, grasping. Hoping for someone to grab it and not let go. I remembered my own trajectory, long and far. I felt another twist in my chest for this boy who was just like I had been, once upon a time. I remembered filling out the FAFSA by myself at the kitchen counter in my Mom’s condo. I remembered trying to write a persuasive letter to the Financial Aid Office that included the phrase “onerous mortgage payments.” I remembered taking the SAT twice and with zero preparation beforehand. I remembered applying to only one school—NYU—because I wanted to go there and because nobody had introduced me to the concept of “the safety school.”

I placed my hands on two horizontal, parallel bars and pushed, lifting myself upwards ever so slightly, my feet maybe 3 inches off the ground. I still had a lot of work to do; my upper body strength was total shit. Need to build muscle, I thought, and lowered my body back down to the ground: “Yes. I will help you get into college.”

With those words, he was like a child in front of whom I’d just set a birthday cake. His eyes burned, two lit candles.

“You’ve done this before, haven’t you?”

“It’s my job.”

“You’ve gotten other people into college before.”

“There’s a name for this,” I said. “It’s called ‘being an advisor.'”

“You’re my advisor now?”

“I am your advisor.”

It was spontaneous. He threw his arms around me. He hugged me tight, pressing his fingertips into my vertebrae. I hugged back.

He didn’t want to let go. We had to exchange emails and cell numbers. He had to make sure he had the right information. He could not lose track of me.

“I promise you, I’m not going anywhere.”

Still, he had to make sure.

“I’ve wanted to go to college since I was in high school and I tried—I tried—but nobody could ever explain it to me. My family, they’re good people but they just don’t know anything about it. They never went to college. I tried asking people for help and nobody could ever help me. You’re the first person who has ever known how to help me get into college. I can’t lose you.”

“I know what that’s like. It’s hard. But I promise you, I’m not going to disappear. So let’s do this. Let’s get you into college.”

Grinning.

“Tell you what: you get me into college and I’ll train you.” The kid flexed, showing me biceps, triceps, rippling shoulder muscles. Granted, he was 22 and a boy—both advantages in terms of general fitness and strength—but he clearly trained. “I’ll train you.”

I extended my hand in the darkness to seal the deal. We shook.

“Deal.”

“Deal. You gotta problem with push-ups?”

“Nope.”

“Pull-ups?”

“Nope.”

“You gonna complain?”

“Nope. I am willing to work hard. You’ll see. I’ll work hard to build muscle and you work hard to get into college. And if we both put in the work, it might just go our way.”

“That’s right,” he said. “That’s right.”

[Image: Southie3]

Photo by Valéria M. Souza

[***FIRST DRAFT: THURSDAY, MAY 15th, 2014. 19:09H EDT***]
