My expertise after a week of blogging

Writing a blog post is way different than writing anything else.

It’s much less formal than a newspaper article, much less “academic” (by which I mean pretentious) than a traditional essay, much less cited than a research paper, and certainly much punchier and shorter than just about any other style of writing.

On a blog, if people have to take more than five seconds to read something, they won’t. Common causes of such behavior include vocabulary words that you learned through memorization, long sentences, paragraphs longer than a couple of lines, boring content, etc.

To fill a paragraph with pedantic material is like starting a sentence with a preposition; it marks the sentence as airy or waspy, disengaging any reader interest, exacerbating the reader’s affinity for drifting off, and providing a generally lackluster opening, which in my experience, determines most of the interest for a particular body of writing. Using a semi-colon is similar in its effects because it provides a logical separation of thought without as much melodrama as a period, though producing sentences that run on and on (normally referred to as run-on sentences) seems not to do too well either, unless the writer has time to craft it just so, perfectly aligning every noun, verb, and adjective into the right formation, ensuring a nice, all-but-crunchy 700-word paragraph that sounds the least similar to Mad Libs as possible.

You didn’t read that last paragraph. Or at least I wouldn’t have. It’s way too boring and I used a thesaurus to write it.

We’re taught to write like that in school to sound smart. We sound smart to elevate ourselves above those who don’t. All we really end up doing is screwing over some 9th graders, who end up wondering why they have to memorize the meanings of ablution, abnegate, and abstemious.

Some people (many, actually) will only read the headline of a blog post and glean all meaning from that, so headlines usually end up sounding like marketing ploys to gain readership–“You’ll Never Look at the World the Same Way Again!”, “Want to Know How I Cured Depression?”, etc.

Even the mere sight of a long blog post will turn people away before they even begin to read it. And on that note, I bid you adieu.


From a lifelong vegetarian

I’ve never eaten meat in my life. But you’d be surprised at how many people have responded to that by asking if I eat chicken. I don’t eat fish either, but it’s more understandable that people ask that since so many vegetarians do eat fish.

The next most frequent thing I get asked is if I’m sure that I haven’t eaten meat. “Have you ever eaten pasta that was cooked in a pot that cooked meat several weeks before? Do you inspect all the food you eat to make sure it doesn’t have any signs of meat at all? Have you kissed a girl after she’s eaten meat?”

I’m sure that I’ve indirectly eaten meat at some point in my life. In fact, it probably doesn’t have to be as removed as some of the suggested scenarios. I’ve certainly eaten veggie burgers that have been cooked alongside actual burgers. It wouldn’t be unheard of for some particle of meat to have ended up in my veggie burger. Accidents happen. Whatever.

The point has never been about some sort of physical purity. In fact, that’s what people ask next. Why do I do it? Well, I’ll tell you the reasons that do not make me do it.

It’s not for diet. It’s not for politics. It’s not for religion. It’s not for pretentious ethics.

In fact, I’d like to take a second to address pretentious ethics. Nothing turns me away from the idea of vegetarianism more than pretentious vegetarians. They’ll always ask how long you’ve been a vegetarian and why you do it. But before you’re done answering, they’ve already cut you off to tell you why they’re better than you.

I’ve been told on many occasions that it’s easier for me to be a vegetarian since birth than it is for some college student on their 3-month “exploration” of vegetarianism because I never had to be tempted by meat at all. I respect that opinion because it’s probably true. I don’t feel any desire to try meat.

But then I consider that everyone who has told me that has only done so to emphasize how much better than me, and everyone else, they think they are. For such a person, the agenda is extrinsically driven. They want recognition, probably moral or political, and they want it now.

Doing something moral for recognition is already self-defeating, so there’s not much to say there. Doing something political for recognition makes sense, I guess. But I’ll still dislike you for it.

As to why I do it, it’s honestly because I was raised that way and it’s what I’m used to. I don’t think it’s a better way to live than anything else, though I don’t think it’s lesser either. I just think it is a way to live.

 

 

Humble Bundle, Panera Cares, and T-Mobile show promise for corporate ethics

“Pay what you can for 10 games from Electronic Arts (EA), a $240 value”. My jaw literally dropped when I read the e-mail a year ago. Yes, partly because I was excited about the video games, but more because of the implications of such a charitable offer. A business (one of those big, non-personal, corporate things) was doing something that actually directly benefited me.

Humble Bundle has been offering deals like this since 2010, but it hadn’t caught my attention until August 2013 when they released the Humble Origins Bundle with EA, their first major publisher.

The EA bundle retailed at $240, but consumers could pay as little as $1 to unlock 6 of the 10 games–to unlock the remaining 4, consumers simply had to pay more than the average ($5.16 when I bought it). The proceeds were then split between the game developers and charities of the consumer’s choice (popular choices have included the American Red Cross, Child’s Play, and Action Against Hunger).
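If you’re curious what that “beat the average” rule amounts to, here’s a rough sketch in Python–the function name and thresholds are mine, pulled from the description above, not from Humble Bundle’s actual code:

```python
# Rough sketch of the "beat the average" unlock rule described above.
# Hypothetical function and thresholds, based only on the description in this post.

def unlocked_games(payment, current_average, base_games=6, bonus_games=4):
    """How many games a payment unlocks under a beat-the-average bundle."""
    if payment < 1.00:                 # $1 minimum to unlock anything
        return 0
    if payment > current_average:      # beat the average: all 10 games
        return base_games + bonus_games
    return base_games                  # minimum tier: the first 6 games

# Paying $5.17 when the running average sat at $5.16 would unlock all ten games:
print(unlocked_games(5.17, 5.16))  # -> 10
print(unlocked_games(2.00, 5.16))  # -> 6
```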

Spirits uplifted, I also remembered hearing about how Panera Cares had just opened a store in Boston in January 2013. The logic behind it is the same as the Humble Bundle–pay what you can for high-quality products (food, in Panera Cares’ case). Their website describes the crux of their mission as “We will offer a dignified dining experience in an uplifting environment – without judgment – whether or not a person can pay”.

While Panera Cares only brings in about 70-75% of its expenses as revenues, it’s aiming to ultimately be self-sustaining. Though some customers are not able to pay the full “suggested price”, the management anticipates (and has since observed) that many customers will pay more than the “suggested price”, creating some sort of balance.

Okay, so charitable video game companies and restaurants. What’s next?

Last spring, T-Mobile decided to abolish phone contracts altogether and call bullshit on the phone industry. As it stood, customers were paying more per month to pay off “free phones” they received with their contracts. Everyone knew this was happening, fine. But, what we didn’t know was that the bill payments were never going down, not even after the phones had been more than paid off.

T-Mobile publicly called out other phone companies for being dishonest, apologized for being dishonest itself, and promised to discontinue any excessive billing. It also abolished all fees for going over allocated minutes, text messages, or data. Just a few weeks ago on June 18, T-Mobile announced that customers could now stream unlimited amounts of music without depleting any allocated data.

T-Mobile, Panera, and Humble Bundle have each embodied some form of conscious capitalism. But, as pleasantly surprised as I was by the charitable approaches these companies have taken, I couldn’t help but remember the clichéd but accurate “there is no such thing as a free lunch”, and that all of these organizations are primarily motivated by making money.

I certainly hesitate to promote the idea that we should make decisions primarily based on what makes the most money, but after looking at what these companies have done, is that even wrong? It actually sounds pretty awesome if the best way for these companies to make money is by being charitable.

Atrocities motivated by money are still far too relevant and frequent for us to ignore (see FIFA, Texaco/Chevron, Halliburton, etc.), and we should still remain apprehensive about any apparent corporate “benevolence”. But with Humble Bundle, Panera, and T-Mobile taking the lead, others will certainly follow–they’ll have to if they want to stay in business.

And a world where all, or even most, corporations follow in conscious capitalism sounds like a pretty good world to me.

Nobody has 7 midterms

If you’re in college, and probably high school, you’ve heard the word “midterm” thrown around quite a bit. In fact, you’ve probably heard people claim they’re studying for 7 midterms, making your meager 3 midterms seem unworthy, and forcing you to shy away from any and all conversation about them.

But it turns out that your friend does not have 7 midterm exams. They have 7 something-or-others (sometimes actual exams, sometimes projects, and sometimes just homework assignments) that someone (often a professor) decided to call midterms. But you need not even pick up a dictionary to know that “midterm” means “in the middle of the term”, which would make the idea of having a “midterm” at any other point in the term seem suspect.

So when a classmate mentioned to me that he had 10 take-home “midterms”, equally distributed throughout the semester, I couldn’t help but feel like he was being disingenuous. Why was he calling homework assignments “midterms”?

It seemed both peculiar and disadvantageous. “Midterm” is certainly more ambiguous, seeing as it could mean literally any assignment you’ve ever received (other than a “final”, which is a whole different can of worms). Plus, it just sounds kind of gross and formal.

But it then struck me that this was its very advantage. Calling something a “midterm” makes it sound more formal; calling something a “midterm” legitimizes it. Suddenly, a “midterm” becomes an easy way to garner sympathy or to excuse yourself from something you don’t want to do.

When you feel like you need some validation for your hard work, mention you have a “midterm”, and we all reflexively dip our heads, scrunch our brows, and offer our condolences. When someone who you don’t really want to hang out with asks to get dinner, mention you have a “midterm”, and feel good about having a “valid” excuse to pass on the opportunity.

So why is this a bad thing? Legitimizing schoolwork seems positive, surely. But if every piece of schoolwork is as legitimized as a midterm, suddenly none of it is. Suddenly the weight that the word once held is lost. Language is our most powerful currency, albeit an intellectual one, and to deflate the value of any word is to deflate the value of all the words we speak. For instance, I’m now less inclined to believe that Mr. Homework-Is-Midterms-Guy really has “the biggest problem of his life” when I read his tweets about it.

There is a reason that different labels and categories exist for different types of school work (homework, quizzes, tests, midterm exams, final exams, etc.). These do not exist to belittle the work we do–I’ve definitely had quite a few homework assignments that were more difficult/stressful than midterms. Midterm exams should certainly still be called midterm exams. But homework assignments should still be called homework assignments for clarity, lingual precision, and also just plain-old honesty. Doing so does not belittle our hard work.

The only person who belittles the value of his work is the person who uses “midterm” to describe it.

 

Why I run

The other day I ran up and down all the steps of a stadium for the first time. The first few flights felt great; I felt energized and confident about this whole endeavor. But by the time I reached a fourth of the entire thing, I could feel it–in my legs, lungs, heart, and head. At the halfway mark, I was “making deals with the devil”, as my friend who was running with me put it. At somewhere around three fourths, in a delirious and dehydrated state (it was also around 90 degrees and so humid that the air felt like soup), I started to question why I was running in the first place.

What was I gaining out of this? I certainly was losing a lot right now–sweat, feeling in my legs, and the will to keep going. I thought back to the most common pieces of advice I hear, “do it for the ladies” or “it’s summertime, you need that beach body.” Surely, for someone college-aged like myself, this is also motivating. It’s vain, but so are we at times, I’ll concede to that.

But I also started to realize a fault in this school of thought: there is an achievable end. As soon as I deem myself attractive enough (which, assuming I’m not terribly self-deprecating, should happen at some point), I’d lose the urge to keep working out. I’d have reached my end. It would also become easy to coerce myself with excuses. “Well, I feel pretty good about the way I look today, and I didn’t eat anything that unhealthy. Maybe I deserve a break.”

The reason this is a big problem to me is that running, swimming, biking, or whatever, have never been about the physical outcome. They’ve been about the mental discipline. Now, I know the word discipline sounds like it’s something you’d hear spoken in a movie about samurai fighting to reclaim their heritage or whatever, but seriously it’s important.

One thing I’ve always loved about exercising on my own is knowing that I’m the only one accountable, on all ends. If I mess up, I’m only affecting myself, but also, and maybe more importantly, the only thing keeping me going is myself. When I’m tired, sweating, hurting, and feeling like I should just give up, the only reason I keep going is because I’m making myself, and there’s something rewarding in that.

As I reached this conclusion, I started to push myself a bit harder, and my body began to push back against me. It didn’t seem like it was going to be as easy as it was in the sports movies’ 3-minute chumps-to-champs reels.

I also thought back to a reason a good friend of mine once gave, “every time you work out, you become a better person.” Now, whether or not you agree with that as an academic, it’s a damn good motivation if you can make yourself believe it as a person.

With that in mind, I reached the seven-eighths point and couldn’t bring myself to do anything more than walk. But I kept walking, my body pushing me to go more slowly. And when I reached the final two flights, I sprinted (which probably looked more like flailing) and collapsed immediately after finishing the final step, knowing that I probably just ran one of the slowest stadium times in history, that I’d be terribly sore the next day, and that neither of those two things mattered at all.

Growing up is weird, and a lot easier than I was told

Ever since I was little, I worried about my future school and work lives. I had been told that high school was going to be challenging, that college would be a true test of my mental strength, and that the workforce thereafter required a level of maturity that I couldn’t even imagine.

Two summers before high school started, I worked as a “Counselor in Training” at a summer camp where I got paid $20 a week for 30 hours of work–I think I was on record as a volunteer and the $20 was “under-the-table” because I was only 13 years old, and you had to be 14 to actually work or get paid. I worried that if a 7th-grade summer-job was that hard, then high school, college, and any jobs 5 years down the line would be insurmountable.

A couple of years into high school, I began to confirm my fears. I remember nearly breaking down in tears because I couldn’t understand what Machiavelli or Kant said in translated plain English, or what Newton and Leibniz said in their hybrid Roman, Greek, alphanumeric dialect that they reflexively insisted upon calling Math. On average, I was in school physically for 6.5 hours per day, commuting 30 minutes each way, and grinding through homework for another 2 hours once I got home. That’s 9.5 hours a day, ignoring sports, clubs, or a social life. 47.5 hours a week were legally and, more importantly, culturally assigned to me without my say, and it sucked.

Approaching college felt almost nightmarish. Was I really going to have to pull all-nighters, drink coffee, and grow out a scruffy beard because I forgot how to take care of myself from being too absorbed in my work?

Upon arriving at college, I was surprised that the first few days were dedicated to name games and tours of the very colorful and summer-camp-looking campus. OK, I figured, they had to lull us into complacency before dropping the mountains of work on our backs.

But much to my surprise, the “real work” never really started. In fact, as the year progressed, I realized that on a given day, I was only spending about 3 hours in class and only about 2-3 hours on homework, if even that. I had so much more free time than in high school that my friends and I were all actually able to sit down at a table for meals and enjoy multiple courses, eating slowly and discussing cool and interesting things, rather than stuffing our faces during our brief high school lunches, where being late to class afterwards too many times meant we failed, literally.

Now I was only spending 30 hours or so per week on blocked-out activity, but it was all stuff I actually enjoyed and wanted to do anyway. Choosing my own course of study meant I never got bored, never wanted to skip class (well, mostly), never watched the clock–I just listened, learned, and enjoyed. Furthermore, I now had much more time to play music, to swim and run, to be social, to watch movies, to read, to discuss philosophy and politics, and to write articles like this one.

What’s more, halfway through the year, I learned that the 10-day winter break I’m used to from high school is actually over a month long at college, and because it separates the first and second semesters, there’s no assigned work to do.

Second semester came along, and I actually had enough time to pick up a job too. I only worked about 10 hours a week, but even with that going on, I still had plenty of time to do everything else I was doing before. The dreaded “finals period” that I’d heard so much about ended up consisting mostly of watching movies (first semester, a bunch of friends and I watched every Lord of the Rings movie, extended editions of course).

Of course, I’ve only finished my first year of college, so I don’t mean to generalize, but speaking from what I’ve experienced so far, being in college has been a lot easier than being in middle school or high school. While I do know plenty of college students who’ll opt into 60 hours or more per week of school work, I know plenty (like myself) who don’t. The best part of this whole shindig is that we finally have a say in what we do.

 

Acknowledge the dangers of privilege

I recently stumbled across an article from The Princeton Tory (republished in Time Magazine) called “Why I’ll Never Apologize for My White Male Privilege,” written by Princeton University freshman student Tal Fortgang. In it, Fortgang vehemently protests the idea that all of his success in life (including his admission to Princeton) can be credited to his race or sex, and offers instead that to call someone privileged “[assumes] they’ve benefitted from ‘power systems’ or other conspiratorial imaginary institutions [and] denies them credit for all they’ve done.”

Before going forward, it’s important to note that I am a white, heterosexual male. This, of course, means that I’m writing with a particular bias based on living with those dispositions.

That being said, I observed two assumptions Fortgang makes in his article. The first is that hard work yields success, which to Fortgang, seems to denote financial and political power. The second is that we live in a society that allows everyone to work hard, and thus be successful.

I agree, in part, with Fortgang’s first assumption. For many, if not most people, working hard is necessary to become successful. Fortgang’s grandparents, for instance, who escaped the Holocaust to start a “humble wicker basket” business in the United States, would not have been successful had they not worked hard.

However, I disagree with his second assumption. There are countless people who work hard, just as hard as any wealthy or influential person, and some even more, who are not successful. Fortgang concedes that “white males seem to pull most of the strings in the world,” but does not think this is significant. Mr. Fortgang, why do you suppose white males are in these powerful positions? Is it pure coincidence? Luck?

In Fortgang’s worldview, if we live in a society that “ultimately allowed [Fortgang’s grandparents] to flourish” because it “cares not about religion or race, but the content of your character,” what are we saying about the characters of everyone who is not white and not male? Are women earning 80 cents for every dollar that men earn because women routinely don’t work as hard or are not as competent? Are African-American men being excessively stopped by police because they’re routinely more dangerous?

Fortgang’s worldview, namely that we live in a meritocracy—an entirely just society—implies that those who are not successful are solely unsuccessful because of their own failings. The idea of a meritocracy is extremely convenient for those who are successful—who could condemn the hard work that a majority of wealthy and successful people exert? But this same idea promotes faulty assumptions and prejudice on the foundations of race, gender, sexuality or whatever else we can construct to divide ourselves.

Fortgang’s belief that our society is post-racial is also known as a racial color-blind attitude. That is to say that race should not and does not matter in our society. The first part of this statement seems admirable, but the second part, that we live in a society that is post-racial, is not only false, but also incredibly dangerous.

The popular television channel MTV worked with pollsters to generate a nationally representative sample of people ages 14 to 24 to measure how young people are “experiencing, affected by, and responding to issues associated with bias.” A majority of participants believe that we are post-racial as a society, with 67 percent believing that Barack Obama being president proves that race is not a “barrier to achievements.” Seventy percent of participants believe that racial preferences (like race-based affirmative action) are unfair, regardless of historical inequities.

Another study, run by Brendesha M. Tynes, a professor of educational psychology at the University of Illinois at Urbana-Champaign, found a strong correlation between racial color-blind attitudes and racial discrimination on social media. In other words, those who believe we are post-racial tend to have less opposition toward stereotypical images online (e.g., “gangsta parties” that feature white actors in blackface).

Intrigued by these results, I ran my own small study on a sample of Brandeis’ student body. I had 64 students view two videos from the social platform Vine, each of which had been critiqued as racist in one study and two newspaper articles. Students then wrote a three-word response describing their reactions to the videos and filled out a version of the Color-Blind Racial Attitudes Scale (CoBRAS). The CoBRAS was designed by psychologists and social scientists to measure racial color-blind attitudes through assertions such as “Racism is a major problem in the U.S.” with which participants rate their agreement.

I analyzed the three-word responses using a composite analysis between what is described in Tynes’s study and what is described in “The Linguistics of Color Blind Racism: How to Talk Nasty about Blacks without Sounding ‘Racist,’” a study by Eduardo Bonilla-Silva, a Duke professor of sociology. I found a statistically significant correlation between students’ CoBRAS scores and how “racist” their language was.
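For anyone wondering what that last step looks like mechanically, here’s a minimal sketch of the correlation computation, assuming each student ends up with a numeric CoBRAS score and a numeric coded-language score–the names are hypothetical, and none of the actual study data appears here:

```python
# Minimal sketch: Pearson correlation between color-blind attitude scores and
# coded-language scores. Hypothetical names; no real study data is included.
from scipy import stats

def correlate_scores(cobras_scores, language_scores):
    """Return (r, p): the correlation coefficient and its significance level."""
    r, p = stats.pearsonr(cobras_scores, language_scores)
    return r, p

# Usage with per-student score lists (placeholders, not the study's data):
# r, p = correlate_scores(cobras, language)
# print(f"r = {r:.2f}, p = {p:.4f}")
```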

If racially color-blind attitudes can provoke prejudice, why do we harbor them? Fortgang fears that if we acknowledge that society is not just, and that it does indeed divide on race, then “everything I’ve done with my life can be credited to the racist patriarchy holding my hand.”

But our efforts and success need not be completely diminished nor completely attributed to ourselves. We must acknowledge some edge, albeit not all-encompassing. Of course no such edge could be responsible for all, or even most of Fortgang’s achievements. But to deny that any bias exists is naïve and dangerous—it allows us to continue on comfortably without addressing and changing the prejudices we all still harbor.

Fortgang, you need not feel your achievements are undermined—I’m sure that if you or your family had not worked as hard as they did, you would not be where you are today, and that is something to be proud of. You need not apologize to anyone—no one is asking you to. But you simply do need to recognize that structural racism does exist in our society, and that its effects are far too important to be ignored.

We’ve known about the NSA all along

Last week, I heard a colleague from the Computer Science department mention that his job is to monitor the campus’ downloads and streaming (yes, this includes porn). Apparently, the Brandeis community makes some, and I quote, “exotic” choices in their Internet browsing material. A guy next to me freaked out upon hearing this. “Isn’t that a violation of our rights, or something?”

Rights to privacy have been hotly debated since Edward Snowden revealed last June that the National Security Agency was maintaining electronic records of American citizens, without us knowing. Libertarians and liberals alike have been outraged at this infringement upon their perceived right to privacy.

I have no issue with Brandeis’ monitoring of our Internet traffic (in fact, it isn’t prohibited by the Fourth Amendment because Brandeis is a private institution), or with the NSA monitoring our phone calls.

I also have no issue with Edward Snowden—in fact, I think his disclosure of information makes him one of the bravest, smartest and most Time Person-of-the-Year-Worthy people out there. I have no issue with keeping what the NSA is doing private or public, I only have an issue with the notion that any of what the NSA has been doing is news, or that it even matters.

We’ve all known from the start that anything we put online was fair game. Those of us who grew up around the Internet have been given fair warning by parents, teachers and anyone else with formal experience with the Internet that anything that’s put on the Internet is not private. Remember the Patriot Act? Sure, Facebook has privacy settings, but faith in such a contract is naïve at best. Whether we like it or not, anything we put on the Internet (social media in particular) is for other people or society to see. After all, why else would we put it there?

This monitoring also does not affect us at all on a daily basis (we didn’t even know it was happening until we were told it was). Many of us spent every day before last June, and even after it, illegally streaming Sherlock, Downton Abbey, and Lord of the Rings movies. The only way it would’ve affected us is if we had committed an act of terror (or illicit pirating, in the case of Brandeis’ monitoring) and gotten unexpectedly caught in the act. It appears we’ve lost the right to break the law in private.

People will cry “Big Brother” and demand recognition of our dystopic, apocalyptic state. Now that the government knows our plans for this Friday night, they can more easily ruin our lives. The sad truth is that unless one is flagrantly posting about their planned terrorist activities, no one (not even the NSA) cares about what is on their Facebook accounts, in their messages to significant others or on their blogs.

In fact, there aren’t any NSA agents sifting through our messages and electronic paraphernalia at all. There are computer algorithms that will flag anything worth reading. No human eyes will ever see any of our stuff unless it gets flagged (and it hasn’t been, if you’re reading this newspaper). What the NSA does care about, however, is protecting us against violent, terrorist attacks.

Many liberals I talk to find protective measures like this unnecessary, stating that they encourage discrimination and infringe upon our intrinsic right to privacy.

But these people have often never been in any sort of actual danger. They’ve come to believe that any protective measures are superfluous because they’ve been brought up in extremely privileged, protected places–thanks, in part, to agencies like the NSA.

How often do we have to worry about school buses with our children being blown up? How often do we have to worry that our homes will no longer be standing when we get back from school or work?

Regarding an intrinsic right to privacy, I believe we have one, but not when it comes to the Internet. If you want a private conversation, have it in person. If you want a photo or video to remain private, don’t post it on the Internet. Anything posted publicly on the Internet is a cry for attention, and we should not conflate our desire for attention with a “right” to privacy.

There are serious issues in the world, and there are places where people wish they had the resources for their government to monitor Internet traffic to stop bombs from going off in the streets.

While the dissent toward the NSA comes from a good place, it’s important to remember that we’ve known it’s been happening all along, it doesn’t actually affect us, and, in effect, it doesn’t actually matter.

Grade inflation at Brandeis University

Last December, I read an article about Harvard University’s grade inflation in its school newspaper, the Harvard Crimson. According to the article, Harvard University Dean of Undergraduate Education Jay M. Harris revealed that the median grade was an A- and the most frequently awarded grade was an A.

I laughed. How silly was it that such a prestigious institution could have such low standards?

Last week, I picked up a copy of the Justice and saw an article titled, “Median fall grades released to the public.” I grew excited at the chance to shove Brandeis’ higher academic standards in Harvard’s face. According to the article, however, Brandeis students received a median grade of an A- and maintain an average grade point average of 3.4.

I was ashamed and humiliated. All of my pride about working hard in classes to achieve high grades began to dissipate; what did this mean for me as a student? Did this mean that I was taking easy classes and that my grades simply reflected that? Brandeis Senior Vice President for Communications Ellen de Graffenreid defended Brandeis’ grade inflation by stating, “The averages and the distributions have been remarkably stable over time, which would not indicate a pattern of grade inflation.”

Are we supposed to feel better that grades have always been inflated and that this isn’t simply a recent trend? To make matters worse, she later added, “The averages at Brandeis are consistent with those at other elite colleges and universities.”

Adopting the old “if you can’t beat them, join them” ideology is not acceptable. Rather than conform to the norm, we should be setting new trends and maintaining non-inflated grades.

Some might offer that everyone receiving high grades simply means that everyone is just doing really well in their classes. But this is incorrect. A high median grade simply indicates that grades have lost value as an intellectual currency. Best put by Syndrome, the evil villain of Disney’s The Incredibles, “When everyone’s super, no one will be.”

There are valid arguments as to why grades shouldn’t exist in the first place: they cause competition in an area where some believe competition isn’t necessary, they don’t always accurately represent what people actually know, and they often favor those who test well, among other reasons. But those arguments are irrelevant because, whether we agree with them or not, grades are currently the main metric by which students are evaluated. Grades are weighed heavily on applications for internships, graduate schools and even jobs.

It is concerning to think that we would be disadvantaging our students by issuing comparatively lower grades than other universities. However, if we come out publicly in strong opposition to grade inflation, and call out other elite institutions on their grade inflation, the issue will become better known among employers and recruiters, and the problem as a whole can begin to be solved. It’s the other schools, not us, who will be disadvantaged.

Additionally, Brandeis could even distribute “adjusted for inflation” grades, which compare current Brandeis grades against the school’s past inflated standards, or against a national average. This would map the C a student might receive against the A- they might have received before Brandeis became conscious of the issue. Rather than passively accepting the standard around grades as a necessary evil, we need to correct it by leading by example.
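To make the “adjusted for inflation” idea concrete, one plausible way to compute it is simple percentile matching against the old curve. Here’s a hypothetical sketch–apart from the A- median reported above, every cutoff and grade in it is invented for illustration:

```python
# Hypothetical sketch of "adjusted for inflation" grades via percentile matching.
# Only the A- median comes from the reported figures; every other cutoff is invented.

HISTORICAL_CURVE = [            # (minimum class-rank percentile, historical grade)
    (0.90, "A"),
    (0.50, "A-"),               # the reported median grade
    (0.25, "B+"),
    (0.10, "B"),
    (0.00, "B-"),
]

def historical_equivalent(percentile):
    """Map a student's percentile under a stricter curve onto the old inflated curve."""
    for cutoff, grade in HISTORICAL_CURVE:
        if percentile >= cutoff:
            return grade
    return HISTORICAL_CURVE[-1][1]

# A median student (50th percentile) graded honestly might earn a C, but the
# report would also show the A- the old curve would likely have handed out.
print(historical_equivalent(0.50))  # -> "A-"
```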

Grade inflation also provides a ceiling for learning—a point at which one becomes complacent with his or her current progress. This is ironic since one of the original purposes of academic assessments was to provide a floor for the basis of knowledge that someone should possess. For example, when lifting weights, you don’t set a finite goal of 100 pounds, then reach it and stay there. You constantly increment the weight once your current weight becomes too easy. The process never ends; it’s simply revised at each iteration.

If an A- or A is easy to reach, students are able to stop working once they’ve reached it. And why should this ever be a lesson educators encourage? Learning has no definite starts or stops. But providing an easy-to-reach maximum grade perpetuates the falsehood that it does.

As educators, as students, as lovers of learning, we need to hold ourselves to higher grading standards.

No justice in death

John J. Geoghan was a Catholic priest who molested nearly 150 boys over the course of thirty years. He was defrocked by the Catholic Church in 1998 in the face of numerous allegations, and was found guilty of indecent assault and battery in 2002 when a college student testified that he had been molested by Geoghan in a swimming pool at the Waltham Boys and Girls Club in 1991. Because he was only found guilty of the one charge, Geoghan was sentenced to 9 to 10 years in prison. On August 23, 2003, Geoghan was murdered in his cell by Joseph L. Druce. “Justice” was served.

Joseph L. Druce was serving a life-without-parole sentence for a 1998 murder. He had been picked up as a hitchhiker by 51-year-old George Rollo. After realizing Rollo was gay, Druce attacked him, stuffed him in the trunk of Rollo’s car, drove him to a wooded area, and strangled him. Druce is a reputed member of the Aryan Nations neo-Nazi group.

Geoghan’s murder had been meticulously planned months in advance. At 11:48 a.m. on August 23, 2003, all 22 cells on the block were opened for prisoners to return food trays to a common area. There were supposed to be two correctional officers on guard by the tray return area, but one of them was pulled off to escort another inmate to the nurse’s station. Druce snuck into Geoghan’s cell and used a book, nail clipper, and toothbrush to jam the cell door so that it could not be opened electronically. He then gagged Geoghan, threw him to the ground, repeatedly stomped on him and jumped on him from the bed, then strangled him with a pair of socks. Guards weren’t able to get the door open for 7 to 8 minutes. Geoghan was pronounced dead at 1:17 p.m.; the cause of death was ligature strangulation, blunt chest trauma, broken ribs, and a punctured lung.

While there’s no direct evidence to suggest that the prison facilitated the murder, many questions remain unanswered. Was it really coincidental that Geoghan was kept on the same block as Joseph Druce, a known homophobe who was serving a life-sentence for strangling a homosexual man? Was it really coincidental that only one correctional officer was on guard at the time? Was it really coincidental that the guards were unable to intervene for 7 to 8 minutes? I’m not suggesting a systemic conspiracy, but it’s no hidden fact that pedophiles and rapists are not treated well in prisons. Was Druce a lone vigilante or a soldier? And if the latter, who issued his orders?

Regardless of how it was done, the reaction to the murder was almost as atrocious as the act itself. One of Geoghan’s victims, Michael Linscott, said of the murder, “I thought about the victims that are still here that he would have to face had he lived…In my opinion, he got off the easy way.” Mitchell Garabedian, an attorney for more than 200 alleged victims of Geoghan, added, “Many victims are disappointed … They wish Father John Geoghan had time to be in prison to reflect.”

While this sentiment is understandable, and could even be sympathized with, it is not just. Geoghan’s murder was not the righteous end to his story, but not because it was easier than life in prison. Geoghan molested children. But if you, I, or the state issue an execution, directly or indirectly, and feel no remorse, we’re no better than Druce. Some see Geoghan’s murder as justice–he was a pedophile, so he deserved it. Some see injustice in the fact that Druce wasn’t murdered–he was a murdering homophobe, so he deserved it. But both of those views are oversimplified. There are no heroes or villains, no winners or losers, only tragedy. Killing Geoghan did not take away what he did to countless boys. Killing Druce would not bring back George Rollo or even John J. Geoghan.

We as a people should not cheer on murder—we should vehemently protest it. We as a people should not sit silent to the atrocities of our brothers in jail—we should scream in their defense. Everyone has certain unalienable human rights. We as a people need to recognize that.