
Pacific Rim (TheByteScene Review)

Date: September 9th, 2013

TheByteDaily

Pacific Rim

3 Giant-Robots-Fighting-Giant-Monsters out of 4

It’s a movie about giant robots fighting giant monsters from a different universe. No, really, Guillermo del Toro, the famed director behind Pan’s Labyrinth, Hellboy, The Devil’s Backbone, and a wide array of other films, has returned to create a movie about giant robots fighting giant monsters from a different universe. And it’s awesome. No, seriously.

The movie’s premise is simple, succinct, and straightforward: giant monsters from another dimension, named Kaiju, attack Earth through a breach in the Pacific Ocean, and the world’s governments work together to build giant robots, named Jaegers, to fight the unearthly threat. Set days after the Jaeger program is decommissioned, the film follows the four remaining Jaegers as they mount a final resistance against the Kaiju menace in an all-or-nothing gambit for the fate of the world. Again, the movie’s premise is straightforward, and little time is spent on meaningless exposition; despite, or perhaps due to, the film’s ambitious nature, the plot is streamlined and all character interactions are limited by purpose.

What is the point of the conversation, what purpose does it serve to have these characters meet, how is the plot affected by this piece of dialogue? Once a scene answers these questions, the movie quite literally returns to the action, drawing in the audience with visuals, CGI, graphics, robots, monsters, and set pieces that are operatically epic. The film’s pace carries the audience from set piece to set piece, choosing instead to spend its time creating a world where the Jaegers and Kaiju reign supreme.

Above all else, Pacific Rim is an exercise in visual mastery.

Created by artists whose love for the Mecha and Kaiju genres, and for tokusatsu, is abundant and evident, the movie radiates with subtle homages, references, and pastiches of the works of masters such as Ishiro Honda, Hideaki Anno, Go Nagai, Akira Kurosawa, Yutaka Izubuchi, and Yoshiyuki Tomino.

To those unversed in the staples that these creators and their works pioneered, the movie is loud, beautiful, epic, and awesome. A score by Ramin Djawadi creates a powerful atmosphere that the movie relishes exploring, and though blockbuster action is present, watching Jaegers pummel, and get pummeled by, Kaiju is akin to watching master warriors dance across a vast apocalyptic canvas. The fight choreography is violent ballet; Jaegers and Kaiju match one another’s moves like dancers who have spent years learning each other’s intricacies and idiosyncrasies, and discovering new ways to adapt and conform to them.

Yes, the film’s plot is thin. Yes, the characters are fleshed out just enough to explain their motivations. Yes, the action is loud, bombastic, frequent, and worthy of the “Summer Blockbuster” distinction. Yet beyond these criticisms, Pacific Rim is visually stunning, expertly choreographed, masterfully directed, and spectacularly scored.

As always, this has been your Admin, the Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

– SC(EK)

Understanding Film Critics; Yet Another Frivolous Article on the Importance of Argument

Date: September 7th, 2013

TheByteDaily

Understanding Film Critics; Yet Another Frivolous Article on the Importance of Argument

I fear becoming a movie critic.

To anyone who claims that fear isn’t real, and that it’s nothing more than a metaphysical extension of danger, I say this: Start writing movie reviews. There’s something about criticizing movies – compared to criticizing music, books, or even theatre – that somehow manages to enrage entire populations of the educated world. Truthfully, there’s something about criticizing movies that makes me fear writing too many movie reviews without producing a completely unrelated article in between. Even considering an opinion piece about movies instead of an actual movie review sends nervous chills down my spine. Interestingly enough, anyone who pays attention to my frequent absenteeism will note that most of the articles I write before a major hiatus are about movies.

There’s no need to be needlessly ambiguous about my critical fears; my worries have nothing to do with the enormous public backlash my reviews conjure within society. Quite the contrary, I absolutely love it when I start a needlessly detailed and intricately laid-out argument about the current trappings of the silver screen. Putting it delicately, I’m afraid of writing too much about movies because of the small percentage of the population that doesn’t understand the need for criticism, critique, reviews, or opinion pieces on the current state of art, media, life, the universe, and everything.

Putting it indelicately, there are people who don’t understand why it’s important to criticize everything, and these people scare me. Granted, previous articles have approached the ridiculous notion of “Questioning Everything” with the necessary quizzical glare that Socrates himself would surely endorse. Still, I find it far easier to tackle a single sample than the entire population, so I’ll continue discussing my fears nonetheless. Back to my point: I’m scared of becoming a movie critic because of the people who unironically argue against the need to criticize.

I hope my point is made clear even at a perfunctory glance: I’m scared of becoming a movie critic because of the people who argue that we don’t need to argue about movies. I’m scared of writing too much – too often – about movies because of the people who attempt to dismiss, dismantle, and destroy the idea of criticism while simultaneously exercising, employing, and engaging their very rational desires to criticize and argue. Despite what a poorly educated pacifist might say, the universe runs on the intrinsic idea that there are positives and negatives, and these positives and negatives always interact even on microcosmic scales.

I admit that my explanation of quantum theory distracts from my argument regarding criticism, so I’ll be brief in my digression.

Why is it so important for me, or anyone, to criticize anything? Because that’s the whole point; without criticism, without argument, and without debate, things have a tendency to fall towards calculated tyranny and an eventual acceptance of blatant complacency.

Certainly, for all of my criticism of Transformers: Dark of the Moon, not a single penny was withheld by the people who helped produce its billion-dollar profit margin.

That is a very strong argument against film criticism.

For all the time invested into pointing out cinematic flaws, people still watch “Bad” movies, and for all the work done pointing out cinematic ingenuity, people will still avoid “Good” movies like I imagine medieval Europeans avoided the bubonic plague.

This is a strong argument regarding all forms of criticism.

Eventually, regardless of the work one might put into criticizing anything, one’s effort will be an exercise in futility. Presidents won’t be impeached – they’ll be re-elected; governments won’t fall – they’ll simply become more intent on inconspicuously brutalizing their people; and Michael Bay won’t be driven out of Hollywood by a crowd carrying pitchforks and torches – he’ll go on to direct another Transformers sequel and Pain and Gain (which, admittedly, wasn’t absolutely terrible).

I argue that results are only one part of the overall structure, and that meaning and knowledge are equally important derivatives.

Why is it so important for me, or anyone, to criticize anything? It’s so we can learn something instead of sheltering ourselves in a cocoon of safety and self-indulgence. Why is it so important for me to call a movie bad? It’s so I can start an argument and try to understand why I might be wrong, and why someone else might be right.

I’m still scared of becoming a movie critic, but thanks to this article I’ve learned that I’m only scared of being called one. I might never pursue cinema in any way after this article, but at least now I know that my fears are less physical and more immaterial.

In any form, criticism is an extension of self-examination, and I argue that there are few things less frivolous in this universe than attempting to understand this universe.

This has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-SC (EK)

The Man with the Iron Fists (TheByteScene Review)

Date: April 20th, 2013

 

TheByteDaily

 

The Man with the Iron Fists

 

3 Golden-Lions out of 4

 

I’m sure there’s a school of thought that believes period pieces should be shot as homages to the past, highlighting how far society has advanced and how much the overall human collective has achieved in the present, all while using the past as a pedestal for the future. As for Quentin Tarantino and company – the group of filmmakers who have studied under and worked with the cinematic trigger finger – it seems that the way to create an homage is to reduce an entire genre to the sum of its parts and mercilessly showcase their love for it in a brutal display of cinematic sensationalism.

 

RZA purportedly spent 30 days taking notes and watching Tarantino work during the shooting of the latter’s Kill Bill films, and it’s evident that the Wu-Tang Clansman has matured from pupil into master.

 

The Man with the Iron Fists is not so much an homage to the martial arts genre as it is an ode to the micro-epics that served as the backbone for the Western definition of kung-fu. The film is bursting with ancient Eastern philosophy, wise mystics, remarkably choreographed fight scenes, cheesy, bawdy characters, and almost every cliché the genre is known for, barring the poorly dubbed voices. If it weren’t for the paper-thin story that doesn’t actually tackle the main plot until almost halfway through the film’s runtime, this would be the greatest ode to kung-fu action cinema ever, and would actually deserve to be considered one of the greatest kung-fu films of all time.

 

It’s clear from the film’s opening credits that those involved in the production show a deep respect, fondness, and affinity for the martial arts genre, and for the kung-fu action cinema subgenre specifically. RZA’s directorial debut is outstanding, and while the writing is profoundly weak on near-spiritual levels, the film is a masterpiece in almost every other way. The editing is tight, the cinematography is crisp and gorgeous, the music is superb, and the fight scenes are so beautifully choreographed that the extras might as well be credited as backup dancers.

 

RZA’s vision is that of Jungle Village, a shanty, war-torn town ravaged by power-hungry clans. The execution of Gold Lion, leader of the ruling Lion Clan, by the conniving yet oddly camp Silver Lion acts as the spark that sets off Jungle Village’s proverbial powder keg, forcing Gold Lion’s son Zen-Yi to leave his fiancée, return to the village, and reclaim the lost honor of his family and his clan. Given the plot in the context of the genre, it all makes perfect sense. Add some of the Emperor’s gold, Russell Crowe as a British consul, RZA as a talented blacksmith, Lucy Liu as the head of the Pink Blossom brothel, and David Bautista as a mercenary named Brass Body into the mix, and the stage is set for an explosion of francium-based proportions.

 

Despite the wide range of acting (and musical) talent on display, the film suffers from extremely slow moments of exposition that neither add to, nor take away from, the film in any significant way. Not to mention, the film’s arguable main character carries a relatively weak story – never boring, mind you, but often weak. Hoping to escape the darkness of Jungle Village, the blacksmith is in love with a prostitute at the Pink Blossom. One would be excused for expecting a twist, a knife in the back, or a betrayal, but sadly the romance never amounts to anything more than screen time for the two lovers.

 

The film’s soundtrack serves as a strong highlight, featuring an eclectic mix of traditional Eastern influences and hip-hop, with some Ennio Morricone thrown in for good measure. The movie is directed by RZA, after all. What’s interesting is how well the tracks are edited together and incorporated into the film’s main score; it was rare for the film to mindlessly throw in a track from the soundtrack and risk ruining RZA’s and Howard Drossin’s own score.

 

The writing, yet again, is weak. Disappointingly so, especially since this is an otherwise strong film – one that’s all the more important because it serves as an example of an action flick worth watching specifically for the action. The Man with the Iron Fists is a rare film whose action is art, and whose director understands the genre and chooses to embrace every aspect of it.

 

Watching The Man with the Iron Fists, I’ve come to believe that the only way to shoot an homage is by mercilessly brutalizing the genre into submission, showing off everything that made audiences fall in love with it, and everything that made critics lampoon and deride it. RZA has made more than an homage to the kung-fu action cinema subgenre; he has created a singularity designed to appeal to fans specifically, and to everyone else who stayed past the hilariously cheesy opening credits. Under almost every circumstance, the film is a masterpiece.

 

Almost.

 

As always, this has been your Admin, the slightly-Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

 

-EK

 

The End of the World As We Know It; A Discussion of Nihilistic Complacency, and Universal Catharsis

There’s a scene in Seeking a Friend for the End of the World where Steve Carell’s character – Dodge Peterson – reunites with his father for the first time in 25 years. It’s revealed that the father (played skillfully by Martin Sheen) abandoned his family when Dodge was young, and the vacuum created by the end of the world acts as the appropriate motivation to force Dodge to seek out his absent parent in an attempt to bring their relationship to a cathartic resolution. At first I found it difficult to suspend my disbelief; after all, the movie’s plot involves a 70-mile-wide asteroid named Matilda approaching the planet, leaving Earth’s inhabitants with nothing but three weeks until the inevitable collision, and our inevitable end.

 

Despite the film’s insistence on maintaining a semblance of realism – including several scenes showing anarchy spreading throughout the world in the form of riots, rolling blackouts, and water shortages born of the government’s nihilistic apathy – the scene where Dodge attempts to resolve his longstanding issues with his father (and actually succeeds) was the sole moment I found unrealistic, the one point where I was unable to accept the film’s premise. The resolution was too perfect, and the reconciliation completely disregarded any possibility of the father’s negative qualities. In a more realistic situation, it would have taken Dodge months, even years, to fully trust his father again; given the characters’ circumstances, however, I’m ashamed to admit that the scene’s potency only recently struck me.

 

Today is December 21st, 2012 – the supposed End of the World as predicted by the Mayans and Nostradamus after them. Obviously, the world didn’t end today, because the Mayan equivalent of December 21st, 2012 is nothing more than a reset of the entire calendar, ensuring continuity for at least another 5 quintillion years. At the same time, no astronomical anomaly capable of ending the planet has been charted, meaning that the human species – and every other species we share the planet with – can continue existing until Earth is consumed by our sun 5 billion years from now, when the yellow star expands into a red giant, eventually consuming the solar system’s inner planets.

 

Interestingly, though the consumption aspect of the planet’s demise is debated, there appears to be little doubt that the heat radiating from the sun will leave the Earth as nothing more than a dead planet with a surface of molten rock within a few billion years of the sun leaving the main sequence.

 

Despite this theorized end, today is December 21st, 2012, and contrary to many irrational prophecies, the world has not yet ended. Certainly, the human species will continue attempting to eradicate itself through war, poverty, famine, illness, and the slow ecological destruction of the planet, but for all intents and purposes, we haven’t quite yet succeeded.

 

What I’ve only now come to realize, through the help of the widespread exposure given to the recent trends in predicting apocalypses, is that the end of the world is cathartically, complacently romantic. The apocalypse, as most prophecies tend to describe it, isn’t the end of the universe nor is it even the end of our solar system; instead, it’s the total destruction of the planet Earth, and the absolute annihilation of the human race – so far, the only discovered intelligent lifeforms in any part of the universe.

 

To the universe, it’s merely the end of all life on Earth and the end of humanity, but to humans it serves as an undeniable sign of the pointlessness of our existence.

 

Imagine for a moment that a 70-mile-wide asteroid were actually charted to crash into the planet in less than three weeks’ time. Yes, people would run around scratching items off their bucket lists, and some might even live to their heart’s content, doing everything they wished they could do but never had the courage or the time to; families would reconcile, lovers would reunite, parents would spend more time with their children, unhappy cubicle slaves would quit, people would go skydiving, some would go Rocky Mountain climbing, and a few brave souls might even attempt to ride bulls named Fu Manchu for 2.7 or more seconds. No human on the planet would have an excuse not to live like they were dying, for the simple fact that in three weeks’ time, every single human being on the planet (and the planet itself) would no longer exist, from a purely ontological perspective.

 

This isn’t to say that a handful of survivors might be capable of resuscitating the planet; once the world ends, everything is gone, including 5 billion years’ worth of ecological, geological, social, cultural, intellectual, educational, mathematical, scientific, literary, philosophical, religious, and psychological evolution and devolution. It’s daunting to know that generations’ worth of change would be eliminated in an instant (and that’s assuming a quick end, like an asteroid), and it’s even more daunting to be faced with the knowledge that everything the human race has done will ultimately be reduced to ruin simply because the sun will undergo an inevitable astronomical change 5 billion years from now.

 

Therein lies the dual nature of human existence.

 

Our existence is ultimately meaningless because any given astronomical anomaly can destroy our planet, and that’s assuming we don’t get there first on our own. However, human existence is only meaningless from a universal perspective; shifting the point of view to that of an actual human makes it difficult to validate one’s nihilism. The clichéd way of thinking is that an individual is held accountable for their actions because their motivations have the ability to resonate with the entire planet. A single shift in worldview can mean the fruition or completion of any given person, organism, nation, or idea, and though the universe couldn’t care less, the rest of the planet certainly does.

 

In essence, my ultimate point is dual in nature: The End of the World is cathartic insofar as one remains complacent through their life, choosing to remain inactive and refusing to attempt to exercise any amount of control over the events that occur around them. Certainly, the belief that the entire human race’s existence is meaningless and inevitably pointless is a universal truth, but as far as humans are concerned, we stand to lose everything once our planet ends.

 

Sadly, as many “End of the World” plots tend to highlight, including a man’s quest to find the love of his life before time runs out for the rest of the planet, we only seem to recognize the importance of our existence once our existence is brought into question.

 

As always, this has been your Admin, the Avid Blogger; today isn’t the end of the world, and DO remember! Always look on the BYTE side of life!

 

-EK

 

Frost/Nixon (TheByteScene Review)

4 Pseudo-Biographical-Documentaries out of 4

It’s not a film so much as a documentary leading up to and including the events of the interviews between David Frost and Richard Nixon. It’s not meant to produce any major resonance in the hearts of the audience – we already know who’s in the right and who’s in the wrong. Many of us already know the story behind Richard Nixon’s resignation, and even if the details are incomplete we recognize the scandal behind Watergate. For the minority that remains unaware of Nixon’s involvement in the scandal, the film begins with a series of clips swiftly detailing the event.

Indeed, though much of the film’s conflict centres around Frost cornering Nixon into confessing his mistakes and his involvement in the events of Watergate, the film makes no effort to hide the fact that its primary focus is on the interviews and not Nixon’s policy. Frost/Nixon takes an editorial approach to the 1977 interviews between British television host turned journalist David Frost and former US President Richard Nixon. It’s revealed that each of those involved has his own reasons for accepting the conditions of the interviews. Nixon wants a chance to give his side of the story, and to remind the American people that he made more positive contributions than negative ones. Frost just wants fame; near the beginning of the film a character remarks that Frost has sad eyes, and it’s true.

Throughout much of the film Michael Sheen plays David Frost with a talk show host’s sense of apathy – we see him smile and grin in almost every scene excluding the interviews. After each failed attempt to force Nixon into confessing, he reassures the members of his research team – Zelnick, Reston, and Birt, played by Oliver Platt, Sam Rockwell, and Matthew Macfadyen – that the next day of taping will mark an improvement. In many ways Sheen’s portrayal of Frost is perfect, and I struggled to find any literary flaws with the character – we see little of Frost’s pessimism. After failing to secure bids from America’s top networks, he decides to finance the interviews using his own capital, and the money of friends and other interested parties. After failing to make any progress with Nixon, he continues to smile and grin with his sad eyes. He reduces himself to effectively groveling for financing, yet he smiles through it all on the sheer optimism that Nixon will be forced to confess through some sort of editorial miracle.

In contrast, Langella’s Nixon is an unrelenting force that serves as the perfect foil for Sheen’s Frost. We see the former president conferring with his team regarding his political aspirations following his resignation, hospitalization, and subsequent discharge, and they’re all adamant that his recovery – both physical and otherwise – is imminent. Langella’s Nixon is portrayed as a hardened fighter who revels in each decision he’s made, showing no regret in any of his choices. Only two scenes show Nixon in a moment of weakness, and even while recovering from phlebitis he seems agile-minded and sharp. The audience is treated to a man who genuinely believes that he will succeed in acquitting himself of any charges in the court of public opinion following the interviews.

History proved otherwise, though this fact is lost on the film’s script; the audience knows who wins and who loses – we know that Nixon would go on to suffer a swift defeat at the hands of Frost and we know that his return to politics would never occur, yet these truths don’t change the fact that it feels like we’re watching history happen.

Because of the film’s documentary-style we are.

It’s interesting to note that neither Frost nor Nixon is recorded for a one-on-one interview with the camera. It’s even more interesting to note that we never feel like we’re watching the events unfold as they happen, but rather that we’re watching a recreation of the events that possibly did unfold. This seems a direct contradiction of my earlier statement, but the hypocrisy is only grammatical, not literal. Praise should be aimed at Salvatore Totino for his cinematography, and at Daniel P. Hanley and Mike Hill for their editing. Scenes are almost perfectly shot, and transitions are made in such a fluid manner that much of the film’s success rides on its style and editing.

Ultimately, the film is a mockumentary of sorts, though in style alone. Had the voice-overs and camera interviews been delivered by the actual figures and personalities, one would be excused for believing that Ron Howard had directed an actual documentary on the Frost/Nixon interviews of 1977, with a few artistic liberties taken for cinematic purposes.

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK

The Startling Fear of Intelligence Makes No Cultural or Evolutionary Sense; A Discussion of Gross Vocabularies, Brilliant Grammar, and Adequate Punctuations

TheByteDaily’s tagline of “Gross Vocabulary, Brilliant Grammar, Adequate Punctuation; Hilarity Ensues” is one of the few promises that I genuinely try to keep. The promise of operating with a relatively large vocabulary in order to produce articles that are grammatically accurate, if marked by a certain perfunctory absence of appropriate punctuation, is quite possibly the only one I haven’t turned my back on. Words are, without a doubt, the second most important aspect of verbal communication, with tone and presence jointly occupying the number one spot amid a considerable amount of tension between the two. Generally speaking, tone and presence produce the same result; the only variation is that tone requires one to use one’s words, whereas presence relies on one’s actions.

It’s due to this adamant belief in an immense vocabulary that I often draw criticism for appearing haughty or pretentious, despite the fact that using “Big words” is nothing more than an artistic frivolity I permit myself to toy with. It’s easy to understand that the size of a word doesn’t change the meaning of a work, and that the smallest words can have the greatest meanings if applied properly. Likewise, using grand or majestic words, idioms, and phrases guarantees a greater, and broader, sense of definition, allowing a writer (or anyone who would consider using words) to say a lot with little, without needing to resort to needless explanations that lend themselves to repetition.

My promise to maintain a large vocabulary has effectively managed to attract and repel would-be readers at every position on the spectrum; there are some critics who find me needlessly frivolous, others who don’t stay long enough to maintain a position, others who agree with my use of language, and some who stay despite being unable to understand why, or to find a reason for their lack of departure. Despite the fact that the words I use are quite standard, and are by no means “Large” or “Overly intellectual,” I still manage to find critics who disagree with my word choices for being needlessly complicated and far too convoluted to be enjoyed. I mentioned earlier that tone and presence go hand-in-hand when it comes to communication, but I admit this is mostly only true for live conversation, where both parties are capable of taking in their companion’s presence and appearance.

For conversations that exist without the ability to see the other orator, the only other measure of one’s meaning is their tone – the same applies to written conversations carried out without an actual voice to interpret. Therefore, tone and vocabulary are the only two paradigms I have at my disposal to convince an audience of the validity of my opinions and words. This leads me to question the belief that weakening one’s words will further validate the statement being made simply because a wider audience can be included in the conversation.

It’s a trope and a cliché to have a headstrong character cry out to the resident scientist, doctor, intellectual, professor, or academic, “Speak English to me,” in an attempt to force the intellectual to “Dumb down” their speech so that so-called “Regular audiences” can understand what they’re trying to say. It’s a tactic often used for comedic effect, because the program’s writers know that any regular audience is more than capable of understanding the “Convoluted and complicated” terms the intellectual is using. Joey Tribbiani from the hit NBC sitcom Friends would often ask other characters to give him time to process relatively straightforward information that the others had already moved past. The program’s writers produced a character that many would consider “Dumb but loveable,” a veritable ditz or “Idiot Hero” played for comedic effect. At times, Ross Geller, the program’s Ph.D. and paleontologist, would drone on to comedic lengths to draw an exasperated reaction from not only Joey, but any of the show’s cast of six friends. The use of Geller as an intellectual who uses “Big words” and Joey as the adorably dumb, but occasionally wise and fascinatingly mystical, funny man who only uses “Small words” is a comedic ploy rooted in the Nerd and Geek stereotypes that remain perpetuated to this day.

I must mention that explanations for the audience’s benefit often also exist within impossible realms, with possibilities that don’t exist in the “Real world.” In such circumstances, dumbing something down isn’t done for comedic effect as much as it’s done to explain the unbelievable. I digress, however.

Despite the fact that intelligence is a powerful paradigm that leads to a better understanding of one’s self and of those around them, it’s still somehow feared – especially in cultures that have cultivated intellectualism, art, science, technology, and some of the greatest minds known to humanity. Obviously, using a single NBC sitcom to validate an argument is not enough proof – The Big Bang Theory plays on CBS, after all – nor is any sitcom of any kind, for that matter. Comedy is one of humanity’s greatest treasures, and the beauty of a joke is that anyone can find themselves at the end of one, regardless of their intelligence, social status, physical appearance, monetary worth, net income, place of worship, place of work, home life, sexual orientation, or any other definition that requires social derivation.

The fact of the matter is that, in an age where intelligence plays one of its most important roles in history, the social norm appears to be that being smart – regardless of how much I disdain the simplistic virtue of the term – is not to one’s advantage, and is something that should be ridiculed and put down instead of encouraged and cultivated. It would be foolish to assume that I refer strictly to stereotypical definitions of the term “Smart,” since I refer to any form of intelligence when I use the term, whether this includes “Book” or “Street” smarts, spiritual intelligence, physical and metaphysical intelligence, philosophical intelligence, psychological intelligence, or even a working knowledge of the manufacturing process of a pencil.

As a society that extends far beyond community, city, and country, we’ve brought it upon ourselves to claim that certain kinds of knowledge are more important than others, that the pursuit of raw intellectualism is a needless and pointless feat, that there is no need to know things when the answer can easily be “Googled,” that retaining any form of trivia is both banal and trivial, and that, somehow, big words are equivalent to a big brain. The fear itself is one of the most rational ones that exist within the psychiatric spectrum, since not being able to think is a trait that is both evolutionarily and culturally useless. Not being “Smart,” in any way, shape, or form, is dramatically dangerous to oneself and to the propagation of the species, and it makes sense that one should fear being “Dumb” or not knowing things. However, it is far from rational or logical to claim that the solution to this fear is to render everyone equally dumb; it is irrational and illogical to believe that overcoming one’s fears is only possible by eliminating the singularity that initiated the fear.

One doesn’t overcome a fear of heights by destroying tall things, one doesn’t overcome an irrational fear of enclosed spaces by removing closets and cabinets, and I can’t possibly fathom how one would go about eliminating a fear of the outdoors based on these principles.

Fear is an evolutionary advantage that warns one about the prospect of imminent demise, and it makes sense to be scared of things that can kill you, but the path to overcoming fear does not lie in eliminating that which scares and confuses us. This strange fear of intelligence makes no cultural or evolutionary sense unless it masks a sensational insecurity in the minds of those who are afraid, and I genuinely believe that this is the case. The fear of intelligence stems from the insecure feeling that there are those who are more intelligent than ourselves, but instead of condemning these individuals, we should be striving to reach their heights – regardless of the difficulty associated with the task. I stand by my belief that taking the easy way out of things is fundamentally negligent towards the momentum of a task, but at least taking the easy way to being smart is better than avoiding intelligence entirely.

Ultimately, my point remains as such – mocking intelligence, high IQs, and intellectual pursuits is a task for lazy screenwriters incapable of producing interesting ideas for their audiences to draw entertainment from. The simpleminded belief that intelligence should be scorned, that knowing things is unnecessary, or that there’s no need to strive for greater intellectual feats is exactly that: simpleminded.

Intelligence, the ability to know, analyze, quantify, and understand events, is the true human advantage, and fearing it is negligent both to the species and to the wonders of the universe. Until such a time that intelligence is no longer feared, I’ll be using every “Big word” I can find to push the movement forward, regardless of the critics, their claims, and their supporters (few as they may be).

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK

It Was a Fascinating Trip; My Last Day In New York City, and a Discussion of Repetition, Interest, Stagnancy, and Movement (TheByteWeek Issue 16)

I originally planned for this final article about New York City to be published on July 31st, since my last day took place on the 30th, but circumstances arose and a momentary case of writer’s block managed to aid in delaying the publication. To be succinct, my last day had me visiting the Jackson Heights neighbourhood in Queens, attending a short film festival at the Queens Museum of Art, returning to the MoMI for a screening of Alfred Hitchcock’s well-known thriller Vertigo, and ending my day with dinner at a nearby Mexican restaurant. I was hoping to spend some time in Central Park – the entire day actually – but the film festival seemed far too compelling to pass up, and though the offerings were less than enticing, I don’t regret attending it.

The event featured a few short films from the Kashish-Mumbai International Queer Film Festival, a four-day event held in the city of Mumbai every year. The selections opened with a short documentary detailing the prominence of homosexuality in India, featuring several prolific and key players in the movement to bring awareness and legality to their cause. Though the introduction was reminiscent of many PBS specials, the documentary’s presence was undeniably felt by the crowd, especially since the feature was shot candidly, with attention paid only to the members of the movements, their reasons, their motivations, and the facts, never antagonizing one group or polarizing another. I will admit that the documentary was the highlight of the selections, with only one other short film truly catching my eye. The remaining few were an amalgamation of clichéd writing, poor directing, disappointing editing, and even worse acting.

To say that the features were amateur would be an understatement, though the core principles of storytelling were still intact – the only problem the directors seemed to have was utilizing these principles in a meaningful and interesting way.

I’ve commented on the difference between a cliché and a trope, and I stand by my statement that a story doesn’t need to be told in a new or different way as long as the end result is interesting to experience. It’s easy to misunderstand and misconstrue this statement as contemptuous and an insult to artists the world over, but my point is not that there isn’t such a thing as a new story, or that artists and creators don’t need to challenge themselves – and those that experience their work – by attempting to create something fresh. Instead, my point is merely that art needs to be interesting, fascinating, intriguing, and enticing to both the creator and their audience in order for it to encompass any derivation of the word “Good.” The directors and their writing staff managed to grasp the main tenets of storytelling, but simply failed to do anything interesting with them – not new, but interesting.

Repetition is even more important to understand, if only to recognize that there’s nothing wrong with – and certainly no shame in – repeating themes, ideas, or opinions in order to tackle them in interesting ways. Alfred Hitchcock, a director, writer, producer, and filmmaker renowned for making thrillers that instilled fear in the minds of their respective audiences, often used similar themes throughout his entire repertoire, choosing to repeat ideas in order to influence his audiences in interesting ways. Certainly, he created new work and used different plots to interest his viewers, but the core principles of his work were always the same; his goal was to inspire fear, and though his films aren’t scary by 2012’s horror-show standards, audiences in his time, and mine, were, and are, aware that the plot devices he used – in addition to the directorial choices he made – are universally known to send chills down one’s spine.

Excluding the fact that Hitchcock’s main purpose was to instill fear, his films also often focused on illuminating the supposed weakness of the female form, and almost all of them feature a female character receiving justice for her deception, or her decision to sin. Whether his reasoning was personal or otherwise is beyond my scope; what must be derived is that he chose to repeat this theme in almost all of his films, for whatever reason. Knowing that he repeated himself doesn’t dilute his work in any way, since the original goal remained intact; to this day, Hitchcock’s films are regarded as fantastic thrillers that deserve to be analyzed, critiqued, and viewed.

Ultimately, Hitchcock’s films, including Vertigo, are interesting.

This human desire to be kept interested and intrigued extends far beyond the realm of film-making or art, and Hitchcock and the directors at Kashish are in no way the only ones forced to combat interest. It’s undeniable that humans actively seek to be fascinated in every aspect of our daily lives, with concepts like boredom or ennui being regarded as matters that require dissection and analysis to eradicate and eliminate. We don’t want to be bored, for the simple reason that standing still – not moving – is a psychological and physical impossibility. I’ve often brought up the idea that the only path to move is forward, with difficulty needing to be overcome, conversation needing to be maintained, opinion needing to be provided, and ideology needing to be challenged and discussed.

Not doing so is a strange abdication and resignation of the most basic human desire to know, despite the fact that adamantly maintaining one’s stance on an issue, and on issues in general, requires just as much movement as stillness. It’s humourously paradoxical that stillness is nothing more than the equal balance of movement, especially since the two paradigms are entirely contradictory. Clearing one’s mind for meditation or concentration doesn’t eliminate movement; it merely shifts the focus to another, more pertinent topic that occupies a more important and more interesting spot in our collective cerebrums. It changes the focus and scope of movement by providing the illusion of stagnancy and stillness.

Whether through a film, or a vacation in a once-visited city, or even taking the bus instead of flying, ultimately, any matter of movement is a matter of stagnancy and stillness, and any discussion of novelty is a discussion of interest and the appropriate application of supposed novelty. I don’t digress by saying that my trip to New York has revolved around these concepts; every day I spent walking and exploring the city was a day spent exploring a new environment to discover something interesting – if experiencing a novelty wasn’t the core idea to begin with. My vacation is now absolutely and irrevocably over, but the point is that, for a few fleeting days, I overcame the psychological and physical stagnancy that I’m seemingly predetermined to struggle with until the inevitable end. My vacation is undeniably over, but the memories I’ve gained and the experiences I’ve accumulated will always remain interesting.

The journey is over for now, and though it was a repetition of a previously made trip, I must mention the fact that the repeated quest for interest never really ends. Until, of course, The End.

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK

Letting Each Moment Leave Us Breathless; My Second and Third Days In New York City (TheByteWeek Issue 13)

Coming back from the Museum of the Moving Image, I’ve learned three things in these past few days; the first and most important is not to leave a backpack containing a laptop, charger, two notebooks, and three pens in an art gallery during an auction for pieces created by South Asian artists. The story behind that series of events is actually quite straightforward, though the important news is that the plot culminates with my personal belongings returning to me the next day. The second lesson is that weather is highly unpredictable, and no amount of planning or foreshadowing (quite literally, in terms of precipitation) can stand as a reasonable warning of an incoming rainstorm. Finally, I learned that, while life is made up of proverbial moments that leave us breathless, we must actively make sure that we enjoy each and every single one of these moments instead of thinking about the moment in passing as a faded memory. Furthermore, we must ensure that we don’t spend the rest of our lives trying to replicate a moment for whatever reason; I’ll return to the final lesson later, though, considering my ability to retell events in an interesting way needs significant work.

The 19th of July began with the knowledge that, at some point during the day, I’d be attending the Indo-American Arts Council Benefit Auction at the Aicon Gallery on Great Jones Street, at approximately 7:00 PM. Knowing that arriving at the event on time would require skillful planning on my part, I decided to spend the day walking in the Union Square area rather than risking it. Around lunchtime, I was informed of the Strand Book Store on 12th St. and Broadway.

As anyone can imagine, I made it to the bookstore and spent an hour taking it all in; reportedly, the store houses over 8 miles of books (about 12.9 kilometres), and quite frankly I’d believe it. I didn’t spend very much time tracking down any individual book, choosing instead to browse the shelves and take note of any publication that seemed interesting. Over the course of the hour, I managed to find books from the mid-1850s, both available for sale and otherwise, all the way to the current day (though finding “Recent” books in a bookstore doesn’t seem like a very monumental feat). I realized quite quickly that, if I didn’t continue on soon, I’d spend the rest of my life in the shop.

Continuing on Broadway, I found myself entering Forbidden Planet, a comic book shop that coincidentally seems to be the graphical equivalent of Strand; I didn’t ask anyone for confirmation, however, for fear of shattering my already fragile dream. Not very much time was spent in Forbidden Planet, and I continued along Broadway until I arrived at 6th Ave. (renamed “The Avenue of the Americas” in 1945 by then-mayor Fiorello La Guardia to “…bring grandeur to a shabby street…”); in all honesty, I wouldn’t have paid very much attention to the street had it not been for the name, and realizing that my curiosity had gotten the better of me, I continued along 6th Ave. to discover what secrets it held. Sadly, the name appeared to be merely for show, though if I’ve been misled by my own observation I’d be more than happy to return to rectify any existing error.

I did learn about The New School though – it’s a liberal New York university founded in 1919 as an institution designed to encourage understanding and knowledge – and I was utterly fascinated by the philosophy that students are to choose and run their own courses (with certain input from academic advisors and professors). To prove this point, the student at the front desk of the school’s welcome centre pointed out that one of his courses included a subject named “Games 101,” a course I later noticed highlighted in the school’s Wikipedia entry (any similarity is, without a doubt, entirely unintentional, I’m sure).

Without being disingenuous or needlessly sarcastic, I must say that I’ve been fascinated with The New School since the moment I learned about its existence a few days ago. It seems quite interesting that the school focuses on its students instead of its monetary and financial gain, and the fact that the education is the focus and not the prestige of its graduates is refreshing. Following my brief time spent talking to students and security guards at The New School’s welcome centre, I continued walking around the area until it was time for the Aicon Gallery event.

Before I continue, I’d like to mention that I was apparently only a few blocks from High Line Park and the Meatpacking District, and would have ventured there had I known of my proximity. Ironically, this all ties into the ultimate point on moments, though I do digress. My second day in New York City ended with a delicious dinner at Schiller’s Liquor Bar. Granted, I also ended my second day forgetting my bag filled with almost all of my important technological equipment (really only the laptop), but I digress once again.

So far, I’d spent my first day trying not to be a tourist, all while accomplishing the sole feat of being a tourist, and I’d spent my second day having no plan whatsoever. I decided on my third day to fix these mistakes by spending time doing what I should have from the start – setting out with a planned destination in mind, instead of wandering around the city aimlessly. Ironically, the day I decided on that, I completely missed the fact that I was standing in front of the Flatiron Building. There is an explanation for this; having forgotten the bag, I’d arranged to retrieve it the following day, and exiting the 23rd St. subway station, I wasn’t paying attention to anything but the street names so I could get to the right building, where a mutual acquaintance works. As consolation, I got my bag back quickly, but didn’t notice an important New York City landmark (for tourists and otherwise). As far as not wandering around aimlessly and going out with a plan, I was focused on going to High Line Park, and on sneaking in the Meatpacking District if I had time – first, however, I was intent on finding lunch.

I noticed the line for the Shake Shack simply because it was so long – at first I thought it was two or three separate lines, but I soon realized my mistake. I suddenly found myself caught between two schools of thought; I wanted to try the burgers to see if they were really as good as people were making them out to be, but I didn’t want to waste too much time. Considering that it actually took me 45 minutes to get through the line and order, I wasn’t sure at the time if I had made the right decision, but the burger proved otherwise.

The burger was cooked medium, meaning that the patty was juicy, but it was somehow prepared in such a way that the juices didn’t dribble out of the burger and onto everything else. Quite the contrary: the meat was medium soft, but somehow managed to retain the overall fluid density of a well-done preparation – that is to say, the juices were there, but they weren’t going anywhere. The buns were incredibly soft and very warm, but not overcooked to the point that bits were breaking off into crumbs. At the same time, they were toasted to a golden brown and tasted, quite honestly, heavenly. Finally, the shack sauce was unlike any burger sauce I’ve ever had: it was not too sweet, not too salty, not too thick, and not too creamy.

I was speaking to a friend earlier – discussing the overall structure and taste of the burger – and concluded that everything else we’d had was a subpar replica, a quasi-burger, that was incomparable to the creation manufactured by Shake Shack. I will mention that I was incredibly hungry, so obviously that influenced my train of thought.

Incidentally, this is also when the New York City skyline decided to cloud the sun and release the second round of the downpour that began on the 18th. I got very wet, and a bit cold, and decided to call a personal day; I spent the remainder of the afternoon lounging around here and there, taking short but leisurely naps for no reason at all. In summation, I spent the remainder of my Friday quite casually, until 7:00 PM, when I went with two others to the Museum of the Moving Image for a viewing of Cabaret, the 1972 film adaptation of the 1966 Broadway production, itself based on the 1951 play based on the 1939 novel. The film was absolutely magnificent; Liza Minnelli provided a wonderful performance, Michael York was fantastic, and Joel Grey was terrifyingly sublime in his performance as Emcee of the Kit Kat Klub. My third day in New York City was remarkably lackadaisical, and my third night was filled with Cabaret.

Until now, I’ve touched upon my first two lessons; I’ve lost and regained my technological identity, and I’ve learned to never bet on or against an uncontrollable natural source – which only leaves me with my third and final educational sentiment.

These articles, and in many ways this blog, are designed as both a helpful tool and a memorial journal of my ideas, experiences, opinions, and thoughts, and though it exists on a public forum, it is undoubtedly for my own pleasure entirely. It’s always been a fervent rule of mine that I’d stop writing the moment I couldn’t, whether due to natural reasons such as a lack of time or rational reasons such as a lack of ability; until such a moment, the blog will be a compilation of my individual memories.

In more than a few ways, I’ve dabbled with the notion of moments and events, and have come to the current conclusion that much of the time we spend experiencing moments is also spent trying to recreate memories from past moments. This isn’t to say that a married couple attempts to reproduce the moment they first met, or first declared their love for one another, as much as they attempt to reproduce the emotions of those moments. It’s an absolutely logical human reaction to want to experience moments once again, though it’s less reasonable to compare and contrast events with events; the couple would quickly realize that, if the relationship had progressed positively, the emotions are still very much present, or that, if the relationship had progressed negatively, they might not be present at all – though it’s undeniable that they’d discover a new moment to remember regardless.

Yesterday’s Aurora, Colorado shootings aren’t the entire reason for this sudden proclamation, since the idea has been rattling in my mind for quite some time, though they certainly are the catalyst for my sudden declaration of memory. Lives are lost, families are torn apart, memories are crafted and destroyed in an instant, and while the pundits and so-called “Intelligent and Educated” individuals drone on about the cause-and-effect of tragic events, real people are forced to recreate and relive moments, all while comparing, contrasting, and critiquing their past decisions and memories to see if they’ve gathered any meaning.

I lost my bag two days ago, and got it back yesterday, and now this memory is ingrained in my subconscious for whenever I’m capable of recalling it. In much the same way as the moment I realized that I had completely missed the Flatiron Building, this moment will be a reminder of this 11-day trip I’ve found myself on. These are individual moments that deserve attention, and instead of spending my time trying to relive my past vacations in the city, I should be spending time making new memories (considering the weather’s unabashed randomness, I do have an excuse, however). Much like my past memories and experiences, the present ones deserve as much attention, pomp, and circumstance.

If there’s anything to take from this article, and if there’s anything to truly be gained from my vacation in New York City, it’s this: life is made up of moments that take our breath away, but only if we stop and really let them leave us breathless. Otherwise, we’re just spending our entire lives trying to relive and recreate the past instead of enjoying the present.

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK

Pick A Side, Maintain Your Convictions, Put Up a Good Fight, and, Above All Else, Admit Defeat; The Importance of Not Taking Things Too Seriously, With Help from the Anonymity of the Internet

At the heart of every non-confrontational form of communication lies the ever-present concept of anonymity. Certain emotions may seem obvious to convey, regardless of the medium, but the fact remains that, even when conversations happen with both parties directly speaking to one another, and every aspect of their language is made clear, details can still be misconstrued. It’s how comedians manage to make so many jokes about men and women, their differences, and the explosive danger of not listening to one’s significant other, and it’s how politicians manage to make even the most basic greetings devolve into an all-out media war.

Despite this truth, however, it goes without saying that anonymity allows many otherwise “Quiet” individuals to speak out, and speak up, about topics that they might not be able to discuss if their identity were made public. At the same time, this unknown quality also allows many to take advantage of the security of their true identity by acting out, and against, those around them; the colloquial “No one knows if you’re a dog on the internet” metaphor can be extended further, with many, otherwise silent individuals accepting a new persona with their “Bark” and “Bite” gaining equal intensity.

I use the term persona because anonymity allows us the advantage of being whoever we choose to be, whether these identities support Obama, Harper, Chavez, Jehovah, Xenu, Racial Insensitivity or Racial Equality, and the internet, as the most prevalent form of non-confrontational communication, continues to grow due to the power of anonymity, for better or worse. Of course, I use the term non-confrontational in much the same way that a pacifist who engages in verbal arguments defines their position on physical violence; certainly, the internet fosters, and seems to feed on, confrontation, but the truth is that, excluding certain exceptional cases, we rarely see our online opponents in the so-called “Real World.”

In many ways, this is a positive aspect, since much of what is said online is done as a form of absurd extremism to cultivate attention to an issue that the commentator feels is important and worthy of such attention. Though, in other ways, this also means that a significant amount of what is said online bears little value or meaning to those who said it, and to those who it was said to. Even worse, this also means that, to a select few people, topics that seem otherwise important, become nothing more than meaningless drivel spouted as a means to an end.

However, I digress, since anonymity has little to do with the overall thesis of this article; instead, my issue lies with conviction and maintaining one’s convictions. In essence, my overall point is relatively straightforward: pick a side, find arguments that support your beliefs, produce these arguments in a manner that doesn’t involve insulting your opponent, and accept that, if your opponent produces better arguments that seem more logical and more acceptable than yours, you might have been wrong all along. It’s not a matter of arguing against a fundamentalist about the creation of the universe, or arguing with an animal activist about the aspects of the economy that would suffer if animals were given the same rights as humans, or even discussing the possibility of Hamlet hallucinating his entire ordeal – it’s a matter of picking a side, believing in that decision, and standing by that decision until such a time that a singularity occurs that changes one’s mind.

I do recognize my literary redundancy (after all, I’ve already written two articles on giving up – one that’s all for it, and one that’s against it), but it’s a matter of enough importance that it warrants further emphasis. This nihilistic belief that it’s pointless to argue for fear of validating the argument’s topic is elementary and against the very nature of opinion – specifically, that any one individual is allowed, and often encouraged, to have an idea or thought that is contrary to another’s.

Furthermore, the opinion that both sides of an argument can happily coexist is far more insulting than it is mediating. The arbitrator who believes this insults both sides equally, especially when both parties have dedicated their minds to an opinion that they feel comfortable with (assuming that they aren’t arguing for the sake of arguing, and that they genuinely believe in the side they have positioned themselves on).

Once again, I recognize that history repeatedly shows that two conflicting sides will almost definitely prove incapable of maintaining a non-violent approach to conflicting opinions, which is why mediation and arbitration are more than necessary when dealing with large-scale conflict that escalates far beyond the control of the original parties. Moving forward, I also recognize that it’s even more difficult for a party to accept that they’re wrong in a matter, and my opinions on fanaticism remain unchanged (it’s a troublesome byproduct of maintaining a single point-of-view on an opinionated duality), which is why the most important aspect of picking a side is accepting the possibility that one might be wrong in their decision.

Granted, when discussing topics that can neither be proven nor disproved, it’s best to maintain a proverbial “Open mind.” Ultimately, however, this is a trivial point since, given the lack of violent escalation, and given that there isn’t an absolute right or wrong, an argument can continue indefinitely, so long as both parties firmly maintain their convictions. That’s really all it comes down to. Well, that and not believing in something strongly enough to become violent.

After all, not everything needs to be taken so seriously.

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK

Pretension; A Discussion of Films, Their Criticisms, and The Belief That a Movie Is Never Just a Movie

It happens in an instant, between the moment the film companies get their titles splashed across the screen and the moment the opening credits roll. We’re enthralled and intrigued; we’re waiting impatiently for the plot to begin and the characters to reveal themselves. In an instant, we’ve been taken away from our lives and placed in the centre of a new one, observing new people, new stories, and new events. A good film has us immediately hooked, wanting to know more about what we’ve seen onscreen; a bad movie has us questioning whether we made the right decision in choosing it; an even better movie asks us to wait and to be patient, because it knows it’s going to get better. A great movie knows that the beginning is just the beginning, and that it has time to amaze us; it knows that it needs time to amaze us and that, like a good illusion, the small details we miss while we’re waiting are the most important.

Of course, every director, like every illusionist, knows that we’re not really paying attention since we want to be amazed.

We continue to watch as the plot develops, regardless of the introduction and the opening credits, and as the characters begin to reveal themselves. We continue to watch as the visuals fascinate us, and as the effects fool us into believing such an existence is possible. We continue to watch as the sounds’ subtlety silently dulls us into a sense of ease and acceptance. We continue to watch as the cinematography and editing, two of the most overlooked aspects of film, transition us further into the film, while we pay all of our attention to the characters and plot onscreen. A good movie has, by now, eased its way into our psyche and is producing scene after scene of relative perfection. A bad movie has, almost definitely, made us regret our decision. A great movie has, no doubt, made itself evident and has made us aware of its mastery.

A critic may use words and phrases such as “well-cast,” “brilliantly shot,” “terribly edited,” or “wonderfully directed,” and the so-called “Regular” movie-goer may use words like “boring,” “bad acting,” “good plot,” and “interesting,” but the fact remains that the two opinions stem from similar psyches. A bad movie is still a bad movie, and a good plot is still a good plot. The occupation of the critic does not change the fact that criticism is dispensed, and the popularity of the judge does not change the critique they provide. An individual might not evaluate a film based on the same criteria as another, but that doesn’t make their opinion any less valid, and it doesn’t reduce the validity of a film in any way.

Certainly, bad movies and good movies exist within the same spectrum, but our criticisms cannot end there.

The plot finally eases its way into the climax, and the characters have reached the pinnacle of their development; from this point forward, the plot will begin to resolve itself and each conflict will come to a close. The music might swell, the editing might become smooth and fluid, and the plot might resolve itself in the most mundane and contrived of fashions, but it’s undeniable that the film has left a mark on those who have given up their time to experience it. The film ends, the credits roll, and the audience is left to their own devices, but the end, as always, is merely the beginning of a new series of events.

The conclusion of the film brings out our inner critics, and with them the eventual realization that the film we so heavily and strongly debate is, after all, nothing but a film. As one can probably imagine, this is simply untrue; though what’s onscreen might be a movie, it is not just a movie, in the same way that a piece of art isn’t just a piece of art, and a song isn’t just a song. Art, in any form, is never just art, and those who stand by the contrary belief in order to maintain a sort of detachment are, undeniably, wrong in their assessment.

Interestingly enough, the belief that a film is nothing more than what it appears to be is a sentiment that’s only really expressed when discussing bad movies – entire arguments and debates on The Iron Lady, The Happening, Battlefield Earth, and Showgirls have ended because an individual raised the opinion that “…these are just movies, and we don’t need to take them so seriously. They don’t mean anything.” I understand the sentiment behind the statement – the person who raises these claims is exhausted from debating the merits of a film and simply wants those around them to recognize that there’s little need in continuing the discussion any further. It makes sense for a person to want to move on to a different topic, especially when they’re not enjoying themselves – or when they’re on the wrong side of an argument and have managed to find themselves with their back against a very proverbial wall.

In such cases, the statement is not intended to be taken seriously, and is really an attempt at segueing the conversation forward. Honestly, I understand the sentiment, and given my penchant for debates and arguments, I recognize that there are times when enough is genuinely enough and I need to move on. My concern is not with the individuals who find themselves exhausted from debate, but with those who decide that debate is needless and unnecessary. My concern is with those who claim that a film is nothing more than the sum of its parts without trying to recognize why it might be so. Once again, this phrase never presents itself when everyone agrees that a movie is good, but only when someone thinks a movie is bad, or when someone produces an opinion that exists as the antithesis of one’s own.

Frankly, it’s incredibly pretentious, and more than a little condescending.

Granted, pretension isn’t so much the creation of a void of opinion as the act of giving a topic more heft than it truly requires; ironically, not treating a subject with enough seriousness produces the same effect. In every sense of the term, it’s as pretentious to create a vacuum as it is to fill it with meaningless drivel. Ultimately, it comes down to a matter of opinion and perspective; importance is a subjective concept, and obviously not everyone will treat the same topic with the same amount of interest. Despite this, however, art is never just art, and a movie, though definitely a movie, is never just a movie. Believing otherwise is an insult to those who dedicated their time to creating and producing the film, and an insult to those who dedicated their time to watching and experiencing it.

Ultimately, even the worst film ever made deserves to be discussed, debated, argued over, and, most importantly, challenged. Otherwise, the human desire to gain knowledge, and the concept of intelligence itself, is rendered null and void. Films are art, and as a form of art, they deserve to be recognized as the amalgamation and transcendence of generations’ worth of intellectual, psychological, and emotional evolution, regardless of how highly a film is rated on Rotten Tomatoes, how much money was spent making it, or how much money it made back.

As always, this has been your Admin, the Avid Blogger; comment, subscribe, and criticize, and DO remember! Always look on the BYTE side of life!

-EK