Love, Freedom, and Equality – Lessons from “Beauty and the Beast”

The most important scene in Beauty and the Beast is not when our eponymous protagonists reunite, but when the Beast decides to let Belle go. The scene offers valuable lessons on the intimate links between love, freedom, and equality.

When It’s OK to Kill: The Ethics of Violence in ‘Game of Thrones’

The streets, rooms, and castles of Westeros are strewn with corpses: throats slit, bodies impaled, stabbed, or decapitated. Yet Game of Thrones is as popular as ever. Does this indicate a penchant for violence among fans and viewers?

One surely hopes not. Viewers may relish the entertainment value of the scheming and the killing, but they would not condone them in real life. There, all of the major GOT characters would be arrested, tried, and prosecuted for murder. In theory, at least.

But alas, Westeros is not a democracy, which is one reason Game of Thrones resonates in our world today. While art and literature do not simply “mirror” history, the series does portray an undemocratic society ruled by a few elite, one-percent families. There is no voting, no consultation, no participatory governance, no deliberative democracy: only war, power grabbing, and the eponymous game of the series.

More importantly, Westeros is a land where the traditional, old-fashioned moral order — where good and evil are easy to tell apart — has broken down, if not disappeared altogether. It is never really clear in Game of Thrones who the bida, the hero, is. This isn’t the case in, say, Lord of the Rings or Harry Potter, where everyone knows that Sauron and Lord Voldemort are the villains, and that Aragorn and Harry are the heroes.

Indeed, the plot of Game of Thrones deliberately frustrates the tendency, present in all narratives, to look for, choose, and side with the bida. More specifically, it brutally disabuses us of the long-standing assumption that the good guys will triumph in the end. Season 1 sets up one character as a protagonist, only for him to be decapitated later on. Something similar happens at the Red Wedding. To be fair, however, two of the nastiest characters in the series get their comeuppance as well, both of whom die violent, if not well-deserved, deaths. Many others in between — “innocent” and “guilty” — are struck down. As far as killing goes, Game of Thrones does not take sides. Anyone can die.

This is an undeniably cruel world. But this indiscriminate, bloodthirsty penchant for killing is paralleled, balanced, moderated, or contradicted (depending on your point of view) by an even-handedness that is less evident in the TV series: the books devote several chapters to each major character, allowing readers a closer, if not deeper, look into their respective personalities and motivations. It is as if the narrator gives everyone a turn at the microphone, letting each of them have their say, and letting us, the viewers and readers, sympathize and gain different vantage points. Such generosity, if not liberal democratic tendency, runs counter to the Machiavellian cruelty and Hobbesian violence of the series.

Through this fairness, Game of Thrones seeks to preclude us from making clear-cut moral judgments (that X is good while Y is bad) and gives us instead a gray moral landscape. This is particularly true of Jaime Lannister. In the first episode of Season 1, and in the first book, he starts off as a typical villain, but as we get to know him through his chapters, especially in his interactions with Lady Brienne, we see a different, perhaps kinder, side of him: he isn’t so bad after all. And Cersei, for all her ruthlessness, is painted somewhat more sympathetically (she is a victim of patriarchy and often complains, in an almost feminist mode, about the limits placed on women) the more deeply one delves into her mind. Even the good guys betray a streak of ruthlessness every now and then.

Even if there are no traditional protagonists in Game of Thrones a la Aragorn, Frodo Baggins, or Harry Potter, viewers still choose from several contenders for the role of bida, the one who will control King’s Landing. How they choose partly entails a moral decision on their part, one that justifies or ignores the killing(s) perpetrated by their chosen “hero.” This is just entertainment, but the kinds of moral reasoning employed here are troubling, for they resemble the justifications offered for real-life killings: even if he killed someone, he is a nice guy deep down; her heart is in the right place; he killed in self-defense; she just wanted revenge; it’s the fog of war; we understand where he is coming from; it’s a kill-or-be-killed society; she had no choice; he had to do it as a show of force; everyone in Westeros has blood on their hands, anyway.

There is, then, little solid moral ground, if any, from which to choose our bida. Whatever our reasons, they essentially boil down to a justification of killing and murder, and they reflect choices that we would not make, or would at least hesitate to make, in real life (one hopes): few of us would justify the murders committed in Westeros, even those by, say, Arya or Jon Snow, no matter how good and kind-hearted we find them. Whatever her merits and despite her history, Arya is a murderer.

In many ways, this kind of decision-making in fiction reflects a world of increasing cynicism and growing moral and political complexity, not least that of business, where the goal is to “win,” get ahead, and “defeat” competitors, often with advice from The Art of War, The Book of Five Rings, or The 48 Laws of Power; where questions of ethics are displaced by neoliberal notions of risk and return, gain and profit; where everyone is urged to be less idealistic and more practical; where compromise and getting our hands dirty are routine; where due process is done away with; where killing is justified, legitimized, sanctioned, and tacitly approved in various ways.

Game of Thrones reflects and responds to a world beyond good and evil, as it were, presenting and problematizing a social order dominated by power, competition, and violence. To what extent, if at all, can ethics work in such a world? Do such categories still hold, given the compromises that many of us have to make and live with? Are we forced to abandon our ethical ideals? Can ethics help us survive such a cruel world?

Such questions pervade our popular culture, from the first Iron Man film, which simultaneously questions and upholds the US military-industrial complex, to The Walking Dead, which asks just how much and what kind of morality applies in a post-apocalyptic, zombie-infested world.

Indeed, the exercise of power has been one of the perennial themes of sci-fi and fantasy, from Lord of the Rings and Wheel of Time to Star Wars and Harry Potter. But ethics, as we have seen, is not far behind. In true Aristotelian vein, morality is intimately tied to politics, concerned as it is with selflessness and humility, and with the selfishness and ambition that drive Tom Riddle and Anakin Skywalker to the dark side. Many of these films and novels function as, among other things, moral fables. And while it may seem naive, if not academically unfashionable, to speak of fables in this morally complex world, what makes ethics seem so superfluous and outdated is also what makes it so urgent and difficult to come by.

Enjoy the rest of Season 7, but do not forget that the world beyond is dark and full of terrors.

Reel Justice: Filipino Action Movies in a Time of Killing

Below are the opening paragraphs of an essay I wrote for Kyoto Review of Southeast Asia (Young Academic Voice).

This year saw the premiere of Itumba ang mga Adik (Kill the Addicts) in the Philippines. Shot in streets across the country, from narrow alleys to cramped rooms, the controversial film stars vigilantes, (suspected) drug users and dealers, crime syndicates, and innocent civilians. It has been a bloody tale of crime and punishment.

Experiences For Sale

Some time ago, I saw a photo in my Facebook timeline stating that people who spend on material things are less happy than those who spend on experiences. Whether or not it’s true and backed by research, it’s a sentiment many people will tip their hats to. The statement strikes against consumerism, but if it is an admirable stance against excessive buying on the one hand, it is a more troubling one on the other.

Guided by this thought, people would arrange a road trip with friends, go on a holiday with family, or travel to new places. Happy thoughts indeed. I am all for these things, but I still see an underlying danger in the “spend-more-on-experience” school of thought: the commodification of experience itself. It may help us turn our backs on (too much) consumerism, but the logic of spending is still intact: experience is something you buy and accumulate, just as you hoard shoes, toys, gadgets, books, or what have you. This, then, represents the ultimate triumph of commodification; capitalism has commodified not only things but also our non-material experiences.

But do people actually think this way, that by spending time with family and friends and traveling all over, they have commodified experience and are behaving somewhat like the capitalist, who wants to accumulate ever more capital? Do people count their trips and experiences the way businessmen go over their earnings and profits and worry about the bottom line? Can we even, or always, equate the commodification of experience with that of material things?

Though the equivalence may not be absolute, and there are certainly differences, I think we do commodify our experiences, not (just) in the sense of counting and accumulating them, but (also) in how they enhance our personal cultural capital, our brand, as it were.

For some, posting on Facebook is innocent self-expression. But this view ignores the relational context of the self: the self does not operate in a vacuum, and like many things, the cult of self-expression has social determinants. At any rate, self-expression in this case dovetails with the idea of the self on display, as William Davies writes in his book The Happiness Industry.

Self-expression here is thus not just about the self per se, but more about its relationship to others. There used to be a time when people expressed themselves by keeping a journal, which was as a rule not meant for public viewing. Today, however, people still express themselves, but why does their entire Facebook friend list have to know where they are and what they are doing? It is true that we gain much experience, perspective, and insight when traveling, but why do other people have to know this via our social media pages? Isn’t it enough that we learned and saw a lot, without having to shout it out? Or, something closer to home, why do I have to write this blog and make it public?

Perhaps this explains why some people, myself included, go so gaga over taking photos of the places they visit. Indeed, it has been lamented that people today no longer experience places the way they used to, because their experience is already mediated through the camera. Instead of seeing things and places with their own eyes and minds (if and however that is possible), they are more concerned with getting the perfect photo for that Facebook post, if not a selfie. They consume places with every click of the phone, and one shot or two no longer seems enough; they hoard photos the way businessmen hoard capital.

The value of a place lies not in itself but in photographs of it: their social media worthiness, their “postability,” their measure of “spectacle” a la Guy Debord. This superseding of the place or experience is captured at its most extreme in a line from one of my favorite TV shows: “if it’s not on Instagram, it didn’t happen.” The image stands in for the experience or the event itself, implicating, among other things, new notions of truth and of truth’s relationship to social media.

Running counter to this “social-mediatization” of experience is the notion of what has been called “immersive tourism,” which I take to mean, roughly, spending a lot of time in a place and getting to know it deeply. This, of course, is impractical. Even if we really wanted to, many of us couldn’t spend, say, two weeks in a foreign country. The most we can do, I think, is to read the history of a place and understand the meaning and significance of what we see, instead of its value as a Facebook photo.

This is why I sometimes think I do not need to travel in order to learn about a place, and why I feel less of an itch to travel. I feel that I can learn more about a place by reading about it than by visiting for a day or two and taking photos. We can always do both, of course, but how many of us actually read up on the history and culture of a tourist site before visiting it?

Of course, it’s not bad to accumulate as many experiences as we can. For what else would we be accumulating? What’s changed is how we experience our experiences, which are created, mediated, and expressed through technology, photographs, and social media.

There must be a progressive dimension to this kind of mediation, but a true dialectician who sees both sides of the issue will note its darker aspect: the erosion of the value of things and places in themselves in favor of the self who consumes them and puts them on display. It helps us forget that things, places, and even other people have an existence independent of us: that they are not fodder for our self-on-display.

By all means, let’s take photos, but let’s not forget this fact.

‘Pakiusap,’ ‘palusot’ and ‘pasaway’

The first few paragraphs of my essay published in the Inquirer.

Filipinos have an infuriating habit of (1) breaking the rules (“bawal pumarada o umihi rito” [no parking or urinating here], “pumila po tayo” [please fall in line], or “submit requirements by 5 p.m.”) and (2) justifying their actions before or after the fact. And they do so in at least three ways: pakiusap (appealing for an exception), palusot (making excuses) and pasaway (brazen defiance).