Being normal isn’t normal

A man from Tanzania once told me that he hated radios. I was sitting in my 12th grade English class, startled by the opening phrase of this foreign visitor. Before they had radios, everyone in his village would gather weekly to sing and dance. This was so normal that the weird people were those who refused to sing, not those who belted off-pitch notes. But then radios were introduced to the village by a western philanthropist. Initially, the excitement was unparalleled—hearing Frank Sinatra and Tony Bennett’s voices without having to travel to see them was surreal, larger than life even. But that was exactly the problem. People upheld the studio-produced perfections on the radio as the new “normal,” and anything that fell short became unworthy or wrong. Within two months of getting radios, more than two thirds of the people had stopped singing at the weekly events because of how self-conscious they had grown.

We constantly compare our actions against what’s expected of us by society. When we fall short of those expectations, we feel outcast and try to curb our behaviors to feel like we belong. But what exactly is this norm, and who decides it? According to an article written in The New York Times last March, one in five high-school-age boys in the United States and 11 percent of school-age children overall have received a medical diagnosis of attention deficit hyperactivity disorder (ADHD). That’s a 40 percent increase in diagnoses in just ten years. The National Institute of Mental Health (NIMH) notes that “scientists are not sure what causes ADHD, although many studies suggest that genes play a large role.”

ADHD is defined as a “mental disorder” by the NIMH and even as a “mental illness” by the National Alliance on Mental Illness (NAMI). But what exactly is a mental disorder? People often say it’s an “imbalance of chemicals in the brain.” But what exactly would a proper balance of chemicals be? The Oxford English Dictionary (OED) defines “mental disorder” as “relating to the mind in an unhealthy or abnormal state.” The OED also defines “abnormal” as “deviating from the ordinary type.” But it is insane to assert that one in five school-age boys aren’t ordinary in terms of mental health. That reflects a deeply rooted societal misunderstanding of normalcy, not a mental-health epidemic among our youth.

No two people have the same genes or, by extension, the same brain chemistry. Think about how dangerous it would be if we had the technology to perfectly change and restructure people’s mental states. The American Psychiatric Association listed homosexuality as a mental illness until 1974, so to uphold normalcy then, we probably would have restructured everyone’s brains to be heterosexual. Our perceptions of normalcy can be dangerous because they don’t accommodate what we don’t understand.

When we see a man paralyzed from the waist down, we don’t judge him for not being able to move his legs. But when we see a woman talking to herself in the train station, we judge her “strange” behavior. We move ourselves away from her and deem her dangerous. Unlike a physical handicap, we can’t see what’s wrong with her, so we don’t register that her behavior is as out of her control as the legs of the paralyzed man.

While ADHD exists as a crippling mental illness, its overdiagnosis and overtreatment reflect our misunderstandings of normalcy and abnormality. Schizophrenia, on the other hand, is not overdiagnosed at all; it’s underdiagnosed. Schizophrenia is a damaging brain disease that often causes the afflicted individual to experience visual and auditory hallucinations, often leading to extreme paranoia. It’s not that schizophrenia disables people from functioning “normally”; it disables people from being able to function at all. According to the Ohio State University Wexner Medical Center, 2.4 million Americans are affected by schizophrenia. That ends up being about eight per thousand people, meaning statistically about 28 people at Brandeis have schizophrenia.

Interestingly, schizophrenia.com claims that, according to data from the Department of Health and Human Services, approximately 200,000 people with schizophrenia or manic-depressive (bipolar) disorder are homeless, constituting about one third of America’s homeless population. At any given point, more people with untreated psychological illnesses are on the street than in hospital beds.

Mental illness is scary because it’s unknown to us. We know that it can exist, and we often overcompensate by putting everything—lack of focus, lack of energy, lack of social skills—under this umbrella. People with schizophrenia indisputably need medical attention, but does every eight-year-old boy who has trouble focusing in class? We need to be compassionate and understanding towards mental illnesses, but not overeager to diagnose every social variant as one.

We are humans. And humans, after all, are normally pretty weird.

Common misconceptions about computer science

Computers are a pretty new thing. There is a lot of discussion around when the first computer was invented, because it all depends on how we define computers.

“Computer” was first recorded as being used in 1613 to describe any individual who did math or calculations. “Computer” later came to describe the Difference Engine, the first mechanical computer, which Charles Babbage began designing in 1822.

Columbia University credits the IBM (International Business Machines) 610 as the first personal computer because it was the first programmable computer intended for use by one person. The IBM 610 was announced in 1957, and because it cost $55,000 ($457,118 after inflation), only 180 units were produced.

Regardless of which exact computer was the first, computers did not begin to change the lives of average people until the late 1980s through the 1990s.

According to the 2011 U.S. census, home computer use began in the early 1980s and has been growing steadily since, with only 8.2 percent of households reporting having a personal computer in 1984, 61.8 percent in 2003, and 75.6 percent in 2011. Internet access has progressed similarly, with only 18.0 percent of households reporting access in 1997, 54.7 percent in 2003, and 71.7 percent in 2011.

We now exist in a time when a majority of the people living around us own and use computers. The importance of the study of computer science is at an all-time high and will only continue to rise as technologies develop.

However, while understanding how to use various technologies like Microsoft Office, smartphones, Google applications, and so forth is extremely important in the current day and age, that is not the heart of computer science.

Computer science is a craft of solving problems. As a computer scientist, you train to figure out the most efficient ways to perform various tasks, analyze root causes of problems that arise, uproot those problems, and create something tangible.

An article posted on The Huffington Post last August, titled “Six Reasons Why Studying Computer Science Is Worth It,” lists reason two as “You will feel like God,” citing the divine sensation of creating something that will last forever. It’s truly remarkable to think that, unlike most things in the world, there is no decay associated with electronic information.

The physical machinery merely serves as a vessel for the actual information, allowing the creation of new machines to perpetuate the old information.

Solving problems as a computer scientist is not strictly confined to the world of computers. When I tell people I’m a computer science major, they assume I can read lines of 0s and 1s as if they were English prose. This is not at all what computer science is about.

My two favorite examples detailing real-world analogies of famous computer science concepts are Binary Search and Recursion. Both have scary, foreign names, but the underlying ideas are quite simple.

Binary Search is a very fast way to find something from a sorted list. Imagine you have a dictionary and you want to tell a computer how to find a particular word, say “rupture.”

Unfortunately, it turns out that you can’t simply tell it to “turn to the r section,” and telling it to search every entry from the beginning until it finds “rupture” could take a really long time if you had a big dictionary.

Instead, we think about how we answer when someone says “guess my number between 1 and 100.” We simply guess 50; if they say larger, we guess 75, and if they say smaller, we guess 25. We keep cutting the possible range in half until we find the right number.

That’s exactly what binary search is. So with the dictionary, we just flip the dictionary halfway open, check to see if the entry is larger or smaller than ours, then cut the dictionary in half accordingly.

To find an entry in a dictionary with 1,000,000 entries by starting from the beginning and searching until we find the right one would take 500,000 tries on average, but using binary search takes only about 20.
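The halving idea above takes only a few lines of code. Here is a minimal Python sketch (the function name and the little word list are my own, just for illustration):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if it's absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # "flip the dictionary halfway open"
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:   # our entry is in the upper half
            lo = mid + 1
        else:                             # our entry is in the lower half
            hi = mid - 1
    return -1

words = ["apple", "banana", "cat", "dog", "rupture", "zebra"]
print(binary_search(words, "rupture"))  # prints 4
```

Each pass through the loop halves the remaining range, which is why a million-entry list is exhausted in roughly 20 passes.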

Recursion is a way of solving a problem by reducing it to a smaller version of the same problem.

Imagine you’re standing in a huge line to get into an amusement park and you want to figure out how many people are in line. You could try counting them all yourself, but that would be really hard to do if the line were made up of 100,000 people.

Instead, recursion says that to find out what position you’re in, you just ask the person in front of you, “What position are you in?” Presumably that person doesn’t know either, so he asks the person in front of him, who asks the person in front of her, and so forth. This goes on until it gets to the front of the line, where the person there declares that he is in position 1; then the person behind him knows that she is in position 2, and so forth, all the way back to where you’re standing.
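That line of people maps almost word for word onto code. A minimal Python sketch (the function name is made up for the example):

```python
def my_position(people_ahead):
    """How many people are in line up to and including me?"""
    if people_ahead == 0:
        return 1                # front of the line: "I'm in position 1"
    # Ask the "person in front," who has one fewer person ahead of them,
    # then add one for myself.
    return 1 + my_position(people_ahead - 1)

print(my_position(99))  # 99 people ahead of me, so I'm 100th in line
```

One caveat: every unanswered question waits “pending” on the call stack, so most languages cap recursion depth (Python’s default limit is around 1,000), and a literal 100,000-person line would need a plain loop instead.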

Neither of those two problems was particularly math-intensive, which brings up a common misconception surrounding computer science: that it’s math-intensive. Math is important for understanding some of the underlying structures at play, but much like in music composition, the math exists but is often not seen or used at the surface level.

Mathematicians can use Fourier series to detail harmonic values and the overtone series, analyzing exactly what will sound sweet or dissonant. But composers compose with their ears. They listen for the sounds they like, and they produce them.

Computer scientists compose with their perceptions of annoyance in the world. We listen for the things we don’t like, and we fix them. So pick up a text editor and start composing. Fix your (hello) world.

Poverty: hidden in plain sight

I remember walking through Harvard Square a couple of months ago and giving $20 to a homeless guy. I felt so proud. I had earned that money working and was originally going to put it towards a T-shirt I saw at Urban Outfitters. Not only was I being moral and generous, I was being frugal. Feeling particularly content, I entered a nearby Starbucks, ordered a hot chocolate with whipped cream, pulled out my laptop, and began to peruse my Facebook news feed. It was rest time and I had earned it. My high school’s environmental action club was fundraising through a bake sale to replenish rainforests in Madagascar. Sweet. The club that volunteers at the local homeless shelter was preparing a dinner. Awesome.

I messaged a good friend to boast about my accomplishment. He responded, “Awesome! What’d you guys talk about?” Flustered and embarrassed, I didn’t know how to respond. I didn’t stop to speak to the man I “helped”; I didn’t even look him in the face. In those six words, my friend (and I can’t thank him enough for this) transformed my self-proclaimed, world-saving deed into maybe one of the causes of the problem. Maybe instead of blindly donating, we need to actually embrace our societal guilt surrounding poverty, look it in the face, and remember its name.

A 2012 interview with a homeless Chicago man, Ronald Davis, went viral over the infamous media-sharing site worldstarhiphop.com. In the interview, Davis shares his experience panhandling, including being yelled at to “get a job.” Through tears, he notes, “No matter what people think about me, I know I’m a human first. And just ‘cause I’m down on my luck, don’t give nobody no excuse to call me no bum. Because I’m not.”

We dehumanize homeless people because we don’t think they deserve the same respect as everyone else. I never mentioned the man I spoke to by name, only as “a homeless guy.” I didn’t know his name, and didn’t care to try to find out what it was.

We think that if someone is homeless or poor, it is his fault. We think that if someone is rich or famous, it is her achievement. This claim is absolutely integral to everything we believe about America—the American Dream of being the next Bill Gates, Michael Jordan, Steve Jobs, Louis Armstrong, or one of the other thirty or so people we cite as evidence of opportunity paying off. Not only is that notion doing immense amounts of harm to how we view poverty, it’s also just incorrect.

A recent editorial in The New York Times written by Mark R. Rank, a professor of social welfare at Washington University, sets out to debunk common myths about poverty. Rank tackles the idea that if someone is homeless or poor, it’s because they aren’t working hard enough. Through his own and his colleagues’ research, he concludes that the attitudes of those in poverty mirror those of mainstream America, and that “a vast majority of the poor have worked extensively and will do so again.” In fact, he goes on to conclude that at least 40 percent of Americans aged 25 to 60 will experience at least one year below the official poverty line, that at least 54 percent will spend a year in poverty or near poverty (below 150 percent of the poverty line), and that 50 percent of all American children will at some point reside in a household that uses food stamps.

So if being “down on your luck” isn’t anomalous, and is actually quite “mainstream,” why do we stigmatize it? In “Hiding Homelessness,” an article published in Spare Change News (a newspaper largely run and distributed by homeless individuals), James Shearer, a member of the paper’s board of directors, writes:

“The other thing I’ve noticed quite a bit lately is how, as Americans, we get caught up in causes, especially when it comes to tragedies such as the horrific tornadoes that recently struck Oklahoma, or the bombings in Boston. Whenever things like this happen, we gather ourselves up to help, we set up funds, sell T-shirts and throw benefit concerts, all in a heroic effort to raise money for the victims and to raise awareness. We donate to causes like cancer, diabetes, heart disease or sick children. These are all noble causes, but there are times when I wonder why we will not do the same for those living in homelessness and poverty. Where is their benefit concert? Is not having one another way of hiding these social ills?”

“We need to stop hiding homelessness and poverty—and we need to stop hiding them from ourselves.”

If human kindness, not blind philanthropy, is needed to break the cycle of poverty, if anyone could be poor and everyone is human, there is no question about what we need to do. We need to realize that the problem isn’t with them. It’s with us.

College entitlement

Why do we go to college? If I were to ask someone that nowadays, they would probably look at me as if I were unlearned and respond, “Why wouldn’t we go to college?” The benefits of college have become such deep-seated truths for many of us that we’ve stopped questioning them. We’re told college is an investment: we borrow large sums of money now to make larger sums of money later. We’re told college is formative: we leave more intellectual and responsible; we are the saving generation. But are both of these claims really true? And what are the implications if they are not?

The whole investment argument is founded on receiving a degree. I think most of us agree that what’s actually important is what you learn in school, not a piece of paper saying you learned it. But the degree unfortunately holds a lot of weight. With today’s online resources, you could teach yourself any subject just as well as any undergraduate might have learned it in school, but still be at a competitive disadvantage in the job market because you don’t have a degree. Consequently, the desire for a degree has become driven more by our survival instincts than anything else. We are told we will not be successful in this world or have economic security (stable food, shelter, an attractive partner, etc.) without a degree, and so we convince ourselves that we need one.

Now, that argument isn’t without its merits. According to a recent Huffington Post article, people with a college degree have half the unemployment rate of people with only a high school diploma—though the article also points out that half of recent grads are working jobs that don’t actually require a degree. Even with this caveat, the competitive advantage a degree gives in getting jobs makes it economically worth it.

The degree is not the only thing we go to college for, though; college is also supposed to prepare us for the real world, both in intellect and responsibility. In my two months of living here so far, I am feeling prepared intellectually. All of my classes are engaging and challenging, and I’ve even been given the resources to start my own club about creating Android apps, which is what I want to do professionally in the future.

In terms of responsibility, however, I could not feel farther from prepared. I feel more disconnected from the world than ever. People cook my food, wash my dishes, clean my bathroom—the only responsibility I can try to claim is doing my own laundry. My family did also cook my food, wash my dishes, and clean my bathroom for most of my life, but I still had to communicate with them and acknowledge them on a human level. I have yet to see anyone talk to, let alone thank, the guy who cleans my bathroom. Granted, I’ve only lived here two months, but I’m skeptical whether this actually changes later on in college.

This one-sided service breeds a scary amount of selfishness and entitlement. I talk with my friends about how atrocious those gated communities in, say, Florida—bubbles of homogeneous wealth and culture—are. But is a college campus all that different? We’re separated from the rest of society (by a string in Brandeis’s case); we’re fairly politically homogeneous; we’re all around the same age; and we’ve convinced ourselves that we’re the most in-tune with society that we’ll ever be.

We’re choosing to exist in this bubble because we love it. We love being distant from the responsibilities of the real world. We love only having to focus on ourselves. We love being around other people who love those same things. The college degree alone might be worth the economic expenditure, but maybe the loans aren’t the most crippling debt we accrue. We leave college entitled and expecting life to be served to us on a silver platter. The scariest part is that with our “top-tier” degrees, it probably will be.

The Facebook epidemic

What was I doing in 2004? YouTube wouldn’t be created until 2005. Myspace had been created in 2003, but I wouldn’t make my own page until sometime around 2007. Facebook was created in 2004, but was only open to select college students until sometime in 2006—and I didn’t end up creating my own account until 2009, anyway.

So what was I doing? It’s strange to imagine not opening my laptop every time I step into my room, not pulling out my smartphone every time I’m waiting in a line, and not feeling the short-lived excitement of a Facebook notification that turns out to just be someone posting in the Brandeis class page. I was ten or eleven years old, so I was probably biking around with my good friends from the time. Jason, Jonathan, Devante, Asa—I could count them all on one hand.

As of writing this article I have 1,052 friends on Facebook, but they feel no more reassuring than my four close childhood friends did. I was initially surprised by this observation, because more is better, right? But the connections I’ve made and maintained over Facebook, and the persona I’ve created for myself, feel artificial in comparison with real life; I can’t even imagine how content I’d feel if I had 1,052 friends in real life and four Facebook friends.

It’s always gratifying to see the red number on the corner of our screens—John liked my status, maybe my opinions are valid; Mark accepted my friend request, maybe I am popular; Max liked my profile picture, maybe I am attractive. But those feelings and gratifications are shallow and defined extrinsically. We get them through other people and consequently depend on those other people to feel that same way.

It’s true that I felt similarly gratified through my close childhood friends, but it wasn’t overburdening or over-stimulating. It was always nice to see my friends, and I appreciated the sense of belonging that came along with them, but I didn’t feel that presence literally every second of my life. I wasn’t reminded that those feelings were, or weren’t, there the way I am now as I check my phone in line at the Hoot Market.

It’s very much like we’ve been conditioned. We see a red number and feel like we’re being noticed before we even check what that notification is about. We’re conditioned to feel accepted by these notifications and consequently at a loss without them. As I anxiously wait for the page to load—will I be accepted? If not, I feel restless, compelled to go and like someone else’s stuff, hoping they’ll reciprocate the “love” and fill the newfound void. But imagine how sad it would be if they’re liking my stuff for the same reason.

Aside from seemingly defining our self-worth, Facebook creates new personas for us. Individuals who I’ve known in person to be quiet and generally held back are often the most vocal on Facebook. It’s similar to the power that people find in anonymous Internet forums, but the key difference is that Facebook associates a name with your words. Usually the will to make bold statements comes from not having your name associated with your words, but on Facebook, there’s actually a matter of pride in claiming those words.

Some people will chime into huge debates with a safe (not new or interesting, but favorable) opinion to reap the benefits of social acceptance without the associated risks. Some people will disagree with just about anything to show off their intelligence and non-conformity. And some people actually provide thought-provoking, interesting, and unique opinions—though this last group has always seemed the minority in my experience. The large majority of users I’ve observed fall into the first two groups, and this is problematic because they don’t develop necessary social skills.

We live in a society where our natural human impulses are artificially stimulated and are consequently improperly developed. We feel conditioned belonging through our notifications. We feel contrived bravery through the constant presence of our peers. We belong online and are alone in person. I don’t know how to solve this problem, but I can safely say that Googling it won’t help.

My experience with religion at Brandeis University

I originally thought atheism meant “lack of religion,” but after hearing more sermons delivered from atheists than I ever did from any priest or pastor, I reconsidered. Most of the people I knew growing up were atheists, followed by Muslims, then Jews, and finally Christians. My dad was not an atheist; he was “spiritual but not religious” (a description that I now understand to mean he believes in a higher power, meditates, has strict morals, but under no circumstance will reveal what he actually believes). A couple of my friends from high school identified as “culturally but not religiously Jewish.” That ended up meaning their families were Jewish, probably celebrated Jewish holidays, but rarely went to services. All I knew about Christianity was what my Facebook friends posted about the Westboro Baptist Church being comprised of hateful bigots. It wasn’t uncommon for me to make presumptions about unfamiliar groups. I had little-to-no experience with religion growing up, so I had no idea what to expect from Brandeis. Yes, I’ll be politically correct and note that Brandeis is not officially a Jewish university, but the main selling points on my tour here were the half-kosher dining hall and that Brandeis is sometimes referred to as “The Jewish Harvard.”

Generalizing my preconceptions of the Westboro Baptist Church to all religious people, I was half-expecting to be thrown into a pit of hateful, close-minded people who ate babies, then protested their funerals. Needless to say, that didn’t happen. I was actually very quickly humbled and pleasantly surprised by the sense of welcome and community that Brandeis offered. I had thought that asking what “kosher” actually meant was taboo, but when I finally summoned the courage to ask the religious-looking man who supervises the kosher side of the dining hall, he simply smiled and explained the various dietary restrictions to me. As he explained, I felt compelled to apologize for my ignorance, out of respect for this belief system that I knew nothing about. He responded by chuckling, revealing a comforting smile, and then pointing me in the direction of the big entrée of the day. His tone revealed no indication of offense; in fact, he seemed incredibly accepting—a common trend among most religious people I’ve actually talked to at Brandeis.

Atheists had always insisted to me that atheism was the intellectually superior path and that religious people were incredibly close-minded, but whenever I questioned that, they told me I was wrong. My conversations with them were often brief because they’d get quickly frustrated when I didn’t see the same “truth” as them. And they often entered our conversations from a post-enlightened “I once thought that too” perspective.
In sharp contrast, during my first week at Brandeis, I had a really pleasant conversation with two Orthodox Jews. We sat down for lunch and they invited me to ask them anything I wanted about Judaism or about their particular beliefs. We began talking, and even though I disagreed with several of their views (particularly with regard to homosexuality), I did not feel like they assumed a moral superiority or forced their views on me; they were both actually extremely receptive to my articulating my disagreements. I’m sure there are plenty of close-minded religious people and open-minded atheists, but the point is that there are just people on all ends, and assuming anything about someone based on religious preference is a fallacy.

I remember how shocked I was in high school to hear that my Economics teacher was a Catholic. I remember being even more shocked when I learned that he was a Republican. But the biggest shock of all came when I realized that he was absolutely brilliant. He turned out to be one of the most perceptive, interesting, and intelligent people I’ve ever talked to. If I hadn’t had to sit in class and listen to him speak every day, I would’ve just finalized my opinions right then and there—ah, another Catholic, Republican bigot, racist, sexist, homophobe, earth-destroyer.

But think about how ridiculous it is that I formed all of these judgments from simply hearing him speak a sentence or two. Think about how ridiculous it is that I nearly did the same for Brandeis. Think about how ridiculous it is when any of us presume to know anything about anyone. It’s easy to lock ourselves away with beliefs that feel safer, more familiar, and more secure, to surround ourselves with other like-minded folks, but it’s really hard to enter an unfamiliar place and to not just tolerate it, but to embrace it. I definitely haven’t fully embraced Brandeis yet, but I feel like I’m on my way. Shalom.

Morality of the Boston Marathon bombings

The bomb squad trucks sped, sirens blaring, through our innocuous, earthy-crunchy streets, and most of us could think only one thought—kill him. Last April, bombs went off at the Boston Marathon, killing three people and injuring an estimated 264 others. The alleged bombers, Dzhokhar and Tamerlan Tsarnaev, went to the same high school as me. Shortly after the bombings, a police officer was killed fifty feet from where I had eaten a burrito the day before. The car chase that followed was through the streets I grew up on; I heard the explosions of the makeshift bombs thrown during the pursuit as I biked home that night. A picture surfaced on my Facebook newsfeed of a friend of mine from school surrounded by twenty FBI officers, their guns pointed at him—he was Dzhokhar’s neighbor. About halfway through the chase, Tamerlan was killed, but Dzhokhar got away, and the chase continued. Kill him. All of us were scared in liberal Cambridge, Massachusetts.

All of us who picketed the U.S.’s torture of Iraqis in Abu Ghraib, the infamous American war prison, just a couple of years ago were now glued to the TV news feed. We huddled in living rooms and neighbors’ houses for hours on end, yelping with excitement every time we thought the police might have killed Dzhokhar. We were peace-loving when the war was overseas, but not when it was on our streets. When Dzhokhar was finally detained (alive, but with gunshot wounds to his head, neck, and hands), everyone cheered; mission accomplished, we got him. We celebrated our police’s accuracy in killing Tamerlan early on, and some of us even celebrated Dzhokhar being alive because we thought death was too quick and easy. Our non-violent values were tried and proved to be conditional. Our disgust with the national security budget quickly turned into appreciation when hundreds of police and FBI officers swarmed our streets. Our liberal values proved to be privileged, existent only because we had never had to be afraid before.

Right, wrong, moral, immoral, secure, paranoid—the distinctions are entirely conditional. Because I grew up in a place where I was never afraid that a bomb would go off on a bus or train, I would have cried “dictator” and “imperialist” if I ever had to walk through a metal detector in the subway station. But what if buses and trains were being blown up? Would I still think that it was an infringement on my privacy or liberties? Are we capable of unconditional feelings?

In the infamous Stanford Prison Experiment of 1971, twenty-four male students were selected to participate in a mock-prison scenario. They were each randomly assigned the role of either prisoner or guard. The experiment spiraled out of control within days; prisoners were subjected to psychological torture, among other things, and the experiment had to be shut down early. Everyday people turned into monstrous jail guards, no better than those at Abu Ghraib. In Cambridge, we looked at Abu Ghraib and said that our soldiers were immoral. What do we say when we look at ourselves? The people selected for the Stanford Prison Experiment (though it was a small sample) were just college students, no different from most of us. Are we all monsters because we’re capable of the same things? How much of ourselves is inherently “bad” and how much is inherently “good”? How much is just human, susceptible to what’s going on around us? Would we all have been the admirable abolitionists in slave times? Would we have been the slave owners? Can we even predict who or what we are?

I believe in non-violence as much as I believe in any social justice. But if I wasn’t steadfast in my values now, could I have held on to any similar values in different situations? I’d like to think that I’d have been a freedom fighter in slave-time America, Nazi Germany, or Rwanda not even 20 years ago, but the hard truth is that I wouldn’t have been. Many argued that Dzhokhar deserved the death penalty, or worse. It absolutely terrifies me that while I disagreed logically, I agreed emotionally. I did not object to people condemning Dzhokhar’s life. I did not object to myself feeling the same, even though I tried to hide it. I’m no philosopher king, but I’m no monster either. I’m just like you.

Do it for yourself

What’s your major? If you’re a first-year like me, and maybe even if you aren’t, you’ve been asked this a lot over these first couple of weeks. Maybe you know for sure that you’re a neuroscience major. Maybe you’re kind of sure that you’re an English major. Or maybe you have no idea; you feel overwhelmed and unsettled by even labeling yourself with a major. In any case, this seems to be a big topic of discussion (at least until we can all remember each other’s names). But it’s paramount that we remember why we’re in school in the first place. We are not here just to get a job; we’re here to learn and discover, not just about the periodic table, but about ourselves. We’re here to find, explore, and develop passions and curiosities. We’re here to change the world. While we all have to work to live, we are not alive to work.


For many of us, high school wasn’t about learning and self-discovery; it was about surviving. We convinced ourselves that if we didn’t get into the elite Ivies (or Brandeis), we’d end up homeless on the streets. And so choosing a major became strictly an economic analysis: the safest path to the highest-paying job. A little research (googling “top majors 2014”) shows that engineering, computer science, finance, and of course law and health top the list. According to a recent Forbes article, the average starting salary for engineers is $63,000 a year, while the humanities and social sciences scrape by at $37,000 (sorry, English majors). But performing such an analysis and calculating our chances based on income seems like a lousy way to go about living.


Sure, going to college gives us more job security and more material satisfaction in the long run, but why are we looking for more security and material satisfaction? Are we insecure and discontented with our lives? I’m certainly not saying that anybody should forgo material comfort altogether and starve on the streets, but chasing materialism will lead to an insatiable hunger of its own. Instead of trying to fill this void of insecurity and dissatisfaction with temporary material things, why not fill it with something that lasts? Passions and curiosities never die. You gain and lose wealth because it’s out of your control, but what you love to do in this world, what you strive to understand, doesn’t go away until you find something more captivating.


I won’t assert that I know everyone’s plans, but I will say that we are outlining the paths we’ll follow for the rest of our lives, whether we like it or not. Just think about that for a second. If you still feel like you’re working towards an end—that this day, this week, this semester, or even this year is a necessary evil to achieve something bigger—I implore you to consider this: college is the something bigger. Take a class or join a club you’ve always wanted to try, or maybe even one you’ve never heard of. Don’t waste four years of your life (and a lot of money) studying something you decided on in the seventh grade without even exploring your options first. Go outside of your comfort zone and take advantage of your time here. College has literally everything you need to explore every walk of life.


In college, you can develop your inner jock, artist, scholar, or whatever else. Walking through Gosman last week, I saw a machine on the basketball courts that catches your missed shots and passes them right back to you, without you needing to move a step, and a machine in the swimming pool locker room that dries your bathing suit within seconds. This place has it all. But even if you’re not willing to don a Speedo or a pair of Jordans, how about trying out an intro class you’re not familiar with? Linguistics, computer science, sociology, philosophy, theater, Chinese music and its origins—whatever it may be, just go for it. If you end up not liking it that much, no big deal; it’s only one class out of the thirty-two or so you’ll take in college. But if you discover something you really do love, and you find yourself eager for the next class, reading ahead in the textbook, and feeling mentally engaged and stimulated, then you’ve made it. “Work” becomes learning. “College” becomes home. Your “major” becomes your life.


You can take this time to do what you want to do. No, don’t act impulsively and nap all day, but do let your curiosities take hold of you. Free yourself from society’s expectations, curiosities, and passions, and become captive to your own. Did you know that Feminist Sexual Ethics in Judaism, Christianity, and Islam is a class offered this semester? How about Mobile Application Development? Whatever classes you end up taking, majors you end up declaring, and clubs you end up joining—do it for yourself.