Artistic Integrity vs. Marketing Ploys

image

I recently read an article on Cinematical.com titled Lucas Didn’t Kill Han Solo Because of the ‘Star Wars’ Toys by Erik Davis. The short piece summarizes how producer Gary Kurtz split from George Lucas after the Star Wars imagineer changed the originally planned ending to the happier version we’re all so familiar with today: 

The original ending was supposed to include Luke walking off alone “like Clint Eastwood in the spaghetti westerns”, leaving a somewhat frazzled and grieving Leia to pick up the pieces and take on her new duties as queen. Kurtz disagreed with all of Lucas’ changes – including his insistence on putting in a second Death Star (because it’d be too similar to the original film, he thought), and, fed up, Kurtz and Lucas parted ways.

Davis’s blog ends with the question, “Do you think George Lucas 'sold out’ by changing Jedi, or was he just making smart business decisions?”

I mulled over this for a few days, and after some consideration decided that marketability versus artistic integrity would be an interesting topic to explore through several examples of popular media and creative entrepreneurs, with this question in mind: at what point does one lose artistic credibility by choosing to participate in marketing and commercializing one’s artistic product? 

To begin with, I think there are a few gradients on the artistic integrity versus marketing scale, which can be generalized into these four types: 

  1. Those who outright refuse to market their creation. 
  2. Those who take an existing canon and reinvent/create a new adaptation. 
  3. Those who create a new canon and choose to market it for greater exposure or profit. 
  4. Those who sacrifice artistic vision and taste for purely marketing choices. 

image

I’ll start with a famous cartoonist whom I admire greatly, Bill Watterson. Watterson created the legendary series “Calvin and Hobbes,” which I grew up reading (and still do on occasion), and famously left the American cartooning industry after tiring of the constant pressure from publishers to merchandise his work. He felt that selling mugs, stickers and T-shirts with spiky-haired Calvin and orange-black-striped Hobbes slapped onto them cheapened the characters and their personalities. Even after his retirement on November 9, 1995, Watterson has refused to sign autographs or license his characters – a resolve I completely respect. 

In his absolute refusal to license the self-centered, smart-aleck Calvin and the sensible, proud Hobbes, Watterson essentially spat in the face of the modern capitalist system: he refused to market his creation, believing that his stance was the only way to maintain his integrity and ideals. His ideals were a direct and polar-opposite response to Jim Davis’s approach to “Garfield,” a cartoon that at the peak of its popularity leaked into so many marketing alleyways – television shows, T-shirts, stickers, bookmarks, movies, etc. – that now, during its decline, Davis’s cartoon is less of a cartoon and more of a marketing logo. 

image

Watterson’s resolve to maintain artistic integrity never ceases to amaze me – the paneling and wit of his cartoons have been timeless ever since I started reading them over ten years ago (click on the picture for the full comic).

I believe Watterson’s front against marketing is what makes “Calvin and Hobbes” so much more appealing than most products. Yes, we would all like a stuffed animal of Hobbes, or a mug of Calvin and his shenanigans, but if it’s against the creator’s wishes, isn’t it completely counterintuitive and disrespectful to make one anyway? Additionally, the lack of marketing and consumerist product added a sort of purity to the canon: Watterson wasn’t selling out at all despite pressure from publishers, and he continued to cartoon out hilarity, wit and philosophy for the sake of integrity and quality. If anything, Watterson is the prime and rare example of an artist who fits the description of category one; the unfortunate reality, though, is that Watterson’s idealism would likely prevent any creative project from taking off and finding an audience in today’s economic conditions: marketing is almost inevitable, and I’ve yet to see any widespread artistic property that doesn’t have some sort of commercialism attached to it. 

image

Then you have those who take an existing canon and adapt it into a newer story that resonates more soundly with the current generation. With numerous remakes flooding the Hollywood blockbuster market – Hulk, James Bond, Star Trek, Iron Man – Christopher Nolan’s take on the Batman universe is arguably the prime zeitgeist of them all. Dark, brooding, and full of implications that echo pre- and post-September 11th sentiments and increasing suspicions of political and corporate authorities, Nolan’s revamp of Batman in 2005 and 2008 essentially made him the Godfather of superhero lore in movies (he’s even been commissioned by Warner Bros to oversee the upcoming Superman project). 

Adapting an existing canon puts you in an interesting position because regardless of your creative vision, the final product is still prone to marketing that is beyond your control. What matters here is how you reimagine the adaptation, and which elements you keep or discard while attempting to appeal to a demographic that includes fans of the canon and (potentially) those who would be interested regardless of the canon’s universe. In Christopher Nolan’s case, he successfully appealed to a wide demographic that includes Batman fans and those who simply want to enjoy a popular and good film (The Dark Knight is an excellent example of a film that is both commercially and critically successful, which I previously discussed here; interestingly enough, Nolan’s success with his second Batman installment established yet another demographic – Nolan fans). 

image

image

Nolan’s adaptations are significantly darker and less cheerful than Tim Burton’s famous films, Batman in 1989 and Batman Returns in 1992. Burton’s films were highly popular at the time, critically and commercially successful during their theater runs. Some Burton fans were thrown off by Nolan’s take on the Batman lore; some even felt that Nolan celebrated moral depravity, lamenting that the current generation had become too pessimistic and disillusioned. Personally, I prefer Nolan’s aesthetic over Burton’s, but this is simply a matter of taste: Heath Ledger even said that he would’ve outright refused to take the part of the Joker had Nolan envisioned the character in the same vein as Jack Nicholson’s famous interpretation (between the two, I believe Ledger’s performance was tenfold more haunting, disturbing, and memorable – a definite movie icon for decades to come, I’m sure). 

Then there are those who completely throw out all artistic integrity in order to squeeze out every drop of marketing potential they can. If you haven’t already guessed, I’m referring specifically to Joel Schumacher, director of the horrendous 1997 film Batman & Robin.

image

Remember the Batsuit nipples? How about those awesome Mr. Freeze lines? Or the tight rubber suits with molded muscles? And let’s not forget those awesome toys you could get with your McDonald’s happy meals, or how Six Flags debuted roller coasters themed to Schumacher’s film! (Let’s be fair – Alicia Silverstone in tight Batgirl leather was probably the best aspect of the entire movie). 

If anything, Schumacher effectively killed the Batman franchise until Nolan’s breath-of-fresh-air revival eight years later in 2005. Schumacher himself has admitted that the film was made with kid-friendly marketability in mind, stating that he was under heavy pressure from Warner Bros to do so. Regardless, Schumacher still did it, and at least he has the dignity to take full responsibility for his directorial decisions. 

Cases like Batman & Robin present major questions as to how far one should relinquish taste in favor of product marketing. Schumacher’s film is an extreme example of the marketing versus integrity argument, but nevertheless an important one to consider: would the film have been better if he hadn’t tried to create something with so many gimmicks and accessories that are otherwise superfluous, useless, and tasteless? I believe yes – marginally so, but very possibly yes: Schumacher’s film could have been much better if he hadn’t thrown away cinematic vision and taste in pursuit of creating a toyetic product. More important, though, is how this decision affected his reputation: a majority of his post-Batman films have been received poorly in critical circles (though I did enjoy the thrill ride of Phone Booth in 2003), and some have even subtly spoofed Schumacher’s infamous Batman film. Schumacher’s Batman & Robin is a worst-case scenario of someone completely giving up taste in favor of marketability – in which case the film will likely fail critically, and very possibly tarnish the reputation of the person responsible. 

image

So now we come all the way back to George Lucas, the pinnacle of the marketing versus integrity debate. He essentially helped create the Hollywood blockbuster; adjusting for inflation, Star Wars Episode IV: A New Hope has remained number one in the United States since its release in 1977. 

Star Wars was an incredible hit. From there, Lucas created two more films, though these sequels were nowhere near as jaw-dropping as the premiere of the film that started it all. While most critics and fans were equally (if not more) astounded and exhilarated by the second installment, The Empire Strikes Back, they were less so with the final arc of the original trilogy, Return of the Jedi – unsurprisingly, the most common criticism involved the Ewoks (Gene Siskel even expressed his dislike for the closing scenes that included the fluffy, bouncy Ewoks, and numerous comedians have joked about the supposedly un-defeatable Death Star 2.0 being taken down by teddy bears). 

To backtrack a bit, George Lucas was heavily influenced by the American mythologist, writer and lecturer Joseph Campbell, whose most famous work is probably The Hero with a Thousand Faces. Lucas was the first Hollywood filmmaker to credit Campbell’s influence, deriving the famous Star Wars characters and plot structures from classic archetypes and narratives (i.e. Han Solo is the anti-hero; Luke Skywalker’s entry into the tavern is the hero’s first step into something less innocent before progressing upon his journey; Darth Vader’s famous “Luke, I am your father” scene is the classic appeal of the forces of evil to the forces of good). Arguably, Lucas’s first three films were incredibly successful commercially and critically because he adhered so closely to classic motifs of storytelling that go back thousands of years. 

So what happened with the third movie? It’s well known that the third part of a trilogy rarely outshines or delivers on the same level as the first (or even second) film. In Lucas’s case with the first three Star Wars films, Return of the Jedi’s somewhat disappointing (but not unsuccessful) premiere highlighted a major question of artistic integrity versus business strategy – in changing the story and ending of Episode VI, did he sell out? Was it a smart business move? Or did Lucas simply want a happier ending, one that didn’t satisfy the palate of his viewers? 

image

I believe there are three possible scenarios that led to his decision: 

  1. Lucas felt a happier ending was more appropriate simply from a narrative standpoint, regardless of how the Star Wars franchise and market was doing worldwide. 
  2. Upon seeing how popular Han Solo was, Lucas changed the story from its original, darker version to the one we’re all familiar with now, hoping that his decision would please (rather than upset) fans. 
  3. Based on how high toy sales were, Lucas changed the story to appease what he believed his fans wanted – that all the principal characters stay alive. 

The first scenario is simply a creative motivation, in which case I disagree with Lucas’s decision but still respect his dedication to artistic integrity. The second and third scenarios are similar, but differ in one distinct way: the second is a measure of popular opinion that may or may not include toy sales (i.e. polls of favorite characters in the Star Wars universe), while the third is a measure of popular opinion based directly on toy sales. 

If the second case is true, then I’m less inclined to respect Lucas’s decision, but not to the extent of calling him a sell-out. However, if the third scenario is what actually happened, then yes – I believe George Lucas sold out. If making “smart business decisions” leads you to completely change your story, you have invariably skewed in favor of marketing over artistic integrity: what matters to critical and commercial reception is the final product, not the process, and if Lucas felt that changing the narrative of Star Wars was appropriate based on toy sales, then he sold out regardless of his intention. 

We’ll probably never quite know what went through George Lucas’s mind when he decided to rewrite Star Wars Episode VI: Return of the Jedi. And with the release of the more recent prequels starring Hayden Christensen and Natalie Portman, disappointed fans have cast their frustrations at the director in a recent documentary titled The People vs. George Lucas, and Skywalker enterprises might be responding with a documentary of their own to defend the namesake. Whatever the reasons or reactions or fandoms, Lucas has demonstrated one of the trickier aspects of creative endeavors on a large scale – balancing artistic integrity with pragmatic marketing economics. 

image

My love for you is like a lovely river of loving, love! (click on the picture for the full comic by Scott Rasoomair)

Recommended Reading, Articles and Links: 

Superheroes for Sale – David Bordwell on recent adaptations of superhero canons (specifically regarding Nolan’s approach to the Batman universe)

Stop, Nolan, Stop! – James Berardinelli’s commentary on the curse of third film disappointments in trilogies

Great Movies: Star Wars Episode IV, A New Hope – by Roger Ebert

The People vs. George Lucas – Gerardo Valero’s commentary on the whole business of disgruntled Star Wars fans and implications for George Lucas

Gritty Superhero Reboot – Spoof by CollegeHumor on recent Hollywood reboots of franchises

The Power of Myth – A PBS documentary comprised of six one-hour long conversations between Joseph Campbell and Bill Moyers. Five of the six episodes were filmed on Skywalker ranch, and Campbell comments on numerous Star Wars clips

Scott Pilgrim – A Tribute to Arcades, 32-Bit, and A.D.D.

If there was ever a movie that truly captured the essence of the digital generation, Scott Pilgrim vs. The World is the pinnacle of it all. Movies that have targeted this generation include Superbad, Juno, Pineapple Express, Knocked Up, The Hangover, The 40-Year-Old Virgin, Anchorman: The Legend of Ron Burgundy, Garden State, Forgetting Sarah Marshall, (500) Days of Summer, Shrek – these comedy films collectively broke away from ’80s archetypes of macho-nacho Arnolds and sexy-smexy Sharons. At the core, these films aimed to create more honest, more vulnerable characters on screen, presented naturalistically, stylistically, or slang-slinging snarkily – regardless, they are all hilarious in their own right. Most of these films’ soundtracks are compilations of songs, each a flavorful (and invariably) pop-culture tribute that listeners will catch here and there, further adding to the slice-of-life, down-to-earth sensibility (or diabolically manic attitude) that these movies try to depict. However, Scott Pilgrim goes where no movie of late has successfully gone before: it takes every aspect of arcades, video games, internet memes, hackers and trolls alike, and throws it up on the big screen for everyone to see in its ultimate glory. 

From here, I think it’s necessary to backtrack a bit – before Scott Pilgrim, before Facebook, before Wikipedia, before Google, before AOL, before MIDIs, before Windows '98 – the beginning of what we know as the digital generation. 

My dad is a hard drive engineer, and has been since I can remember when I started remembering. His occupation meant that my brothers and I had very early exposure to the desktop computer, which he brought home from work. There wasn’t any word processing, media center, or internet – none of what we would call absolutely standard today: it was DOS, and I remember asking and learning how to log into the main screen (“dad-day, what do I type in a-gain?”). My brothers and I would fight over the computer so we could play some of those awesome games like SkiFree, Minesweeper, that submarine game I can’t remember, some Mickey Mouse game I can’t remember either – we fought and clawed at one another just to play simple 32-bit games that were rather difficult (that damn Yeti would always murder us in SkiFree, and only recently have I learned how to overcome this obstacle. Blast!). My brothers and I remember our dad using those giant floppy disks that were “uber cool,” and that the concept of a portable computer, the laptop, was “way futuristic dude.”

We also remember the Nintendo NES and the subsequent Super Nintendo, and how we watched in awe as our friends played Mario on the non-flat, antenna-crowned television (“watch the turtle thing!”); how we were early observers of video sharing when my dad’s friends recorded multiple movies onto a tape (“keep rewinding – Beauty and the Beast is the second one on this video tape”); how much we begged our mom to let us buy and share one Gameboy mini (despite our desolate puppy eyes, we were rather unsuccessful on this front with our mother; however, my younger sibling managed to get away with it somehow and secluded himself with black-and-white Mario – I suspect my dad or his friend had a hand in this scheme); how addicting and costly it was to play your way through the entirety of an arcade game (one time, my brothers and I spent at least an hour playing this Simpsons game, and I suspect we dished out at least twenty dollars’ worth of change to stay alive and get our names on the hall of fame); how anime suddenly became mainstream pop culture when Pokemon hit the scene, with the trading cards and anime and movies and all (my old AP History teacher in high school compared the Pokemon trading cards to the stock market crash that caused the Great Depression, and boy was he right – I must have driven my mom crazy when I told her my card collection days were over after months of trading and buying like a proper model consumerist); how the N64 duked it out with the Playstation before the Xbox came into play; how getting on the internet involved jacking your phone line and annoying everyone in your family who needed to make an important phone call (“DUDE, get off the internet! Don’t make me phone jack you!”); how Nokia was still its own telecommunications company, and everyone had a Nokia phone with customizable face plates (“mine has flowers because I am HIP like that”); and how it turned out that typing in queries with an added dot-com didn’t exactly get you where you wanted (“let’s look up the white house – I’ll type in whitehouse.com and… AW GOD NO!”)

My brothers, my friends and I remember that transition period too: when cable television took off; when pedestrian web designing became available (I remember being in awe when my friend managed to upload clips of Pokemon onto a website: “DUDE HOW DID YOU DO THAT?!”); when floppy disks were phased out by CDs; when MIDIs were no longer the only option for streaming music on the internet; when Flash and JavaScript started ousting HTML (and ultimately deterred me from pursuing web design as an occupation); when mp3 players started showing up, years before Apple’s iPod launched; when DVDs started selling at Costco, modestly placed next to the aisles and stacks of video tapes; when our teachers started telling us to use Google instead of Metacrawler and the like (“it’s educational!”); and when the digital age became defined by the digitalization at hand – from games to encyclopedias to music to film to communication, everything. I can’t pinpoint the exact year it happened, but at some point it did, and with a tour de force. This is when the digital generation took off. 

What do I mean by the digital generation? I believe it to be the generation that grew up with the early and current development of the world wide web and video games – essentially the generation that saw the transition from VCRs to DVDs, CDs to mp3s, newspapers to online editions, and so on. Individuals of this generation don’t necessarily have to be involved with video games or the internet; rather, the digitalization of technology – of games and information – allowed this generation to research and look up infinite amounts of information at their very fingertips, and this availability has, in a sense, caused an acceleration of intellect and A.D.D.-like symptoms – there’s just too much information to learn. 

This acceleration of intellect and A.D.D. compounds into an interesting mix: there’s almost a manic desire to prove oneself on the net, where your physical identity dilutes down into the avatar you choose, the style and language with which you write, and the subjects and discussions that you habitually gravitate towards. There’s snark, there’s trolling, there’s administration, there’s moderating, there’s polemic-ing, there’s wit, there’s extremism, there’s thoughtfulness, there’s intellect, there’s meme-ing – essentially anything is possible on the net, and you can define an anonymous identity simply by choosing which characteristics of the net to display, ignore, or engage in. 

This is where Scott Pilgrim comes in. If there’s one thing this movie does right, it’s paying tribute to the genesis of the digital generation: all the way back from SkiFree to the Super Nintendo to arcades to the manic identities of the internet, Scott Pilgrim is a celebration of all the qualities that define this particular generation – everything that makes the internet and video games awesome and stupid at the same time. 

You’re pretentious, this club sucks, I have beef. Let’s fight.


Talk to the cleaning lady on Monday. Because you’ll be dust by Monday. Because you’ll be pulverized in two seconds. The cleaning lady? She cleans up… dust. She dusts.

Scott Pilgrim is jumpy, bouncy, shiny, punchy, quirky – all in a glorious bundle of gaming honor and back-and-forth internet-style quips that critics bemoan as the downfall of intelligent and meaningful discussion. It’s like 4chan come to life, barfing up Pedobear and Philosoraptor into the veins of these characters as they duke it out in arcade-style arenas with arcade-style consequences (in fact, some of the aesthetic reminded me of the glory days of Street Fighter and, more recently, Tatsunoko vs. Capcom). Vegans have psychic powers, music creates ferocious beasts, defeated opponents give you tokens, girls pull giant hammers out of their teeny-tiny purses, swords pop out of your chest – none of this makes sense, and trying to decipher its symbolic meaning is as pointless as a snuggie. You simply have to let everything explode around you in its fireworks display of green hair and ninjas, absorbing and digesting it all into a chyme of caramel popcorn and deep-fried Twinkies. And hell, what a chyme it is. 

The film makes no attempt to argue whether or not video games are art (a wise choice, given how much flak Ebert received for claiming they are not). Instead, the film is a grand celebration of the frenetic energy that compels players to participate in the fun, the vigor and enthusiasm that results in rapid-fire remarks that can be uncannily hilarious or sarcastic or both. Scott Pilgrim celebrates everything that makes the internet great and terrible, intelligent and dumb, moralizing and demoralizing, and everything in between. Moreover, director Edgar Wright makes an effort to depict these gamer and snark characteristics in a positive light – while these characters engage in what judgmental critics deem immature, insubstantial and trivial, they are still very much human, transitioning from the limbo of adolescence into something less immature, less insubstantial, and less trivial – or at least trying to. 

I don’t know how to explain this film except to say it truly is a great tribute to the digital generation, and that trying to understand the film logically is utterly useless. It’s a fervent display of colors and emotions, magnificently game-like and A.D.D. in its aesthetic. It’s an incredibly inventive film on multiple levels, and it has established Edgar Wright as a favorite director of mine (his credits include Shaun of the Dead and Hot Fuzz). Michael Cera plays Michael Cera again, but at this point he’s less of an actor and more of a presence on screen that will either draw us to or drive us away from the film itself. Regardless, I enjoyed it thoroughly and was able to ignore Michael Cera being Michael Cera (reportedly he was portraying Scott Pilgrim; I haven’t read the original comic books, but I have a sneaking suspicion that this Scott Pilgrim is incredibly similar to George Michael from Arrested Development, Evan from Superbad, and Paulie Bleeker from Juno – though I may be projecting in this case). 

So imagine my dismay when I heard that Scott Pilgrim was deprived of weekend box office success after getting sandwiched between two very gendered films: The Expendables and Eat, Pray, Love. It’s like a repercussion attack from the ’80s and ’90s – the macho-nacho Stallones rampaging up the screen with He-Man guns and explosions of the ’80s, and the self-righteousness of faux feminists that the ’90s otherwise defended as “female empowerment” when really, the movie-character of Elizabeth Gilbert could easily be one of the most unlikable characters to date (let’s be frank: do you really think you can reach a lifetime of enlightenment by simply taking meditation 101 for three months, then moseying off to some other paradise to have a bloody blastastic time?). The respective characteristics of these two decades that the digital generation actively rejects and departs from are, ironically, the same characteristics that kept Scott Pilgrim from a successful opening weekend run. 

Of course I’m disappointed. There’s a slight bitterness to how things turned out: I wanted – no, expected – Scott Pilgrim to succeed, yet the ’80s and ’90s are still a haunting force that drives demographics to decide which movies to see or not. In this case, it seems that the digital generation isn’t strong enough to empower its cinematic representative, even in the prime age of digitalization. And with ’80s remakes like The A-Team, I’m beginning to wonder if current political, economic and social constructions are starting to sway in favor of a decade that resulted in my university tuition being raised and financial aid being slashed, a decade which envisioned a trickle-down theory that has invariably failed on so many levels. 

I wonder these things with a slightly bitter heart, mostly because this digital generation – my generation – is still getting the axe from older generations. Scott Pilgrim celebrates the disco and bellbottoms of the digital generation, the meditation bongs of the internet and the folk songs of video games – not equivalently, but certainly analogously. So in an illogical, emotional way, the lackluster box office of Scott Pilgrim feels like the digital generation still isn’t getting the respect or recognition of moviegoers who’d rather see Sylvester Stallone blow up more stuff or Julia Roberts eat and consummate her way to happiness. 

I’ll fling all the RAGEEEE!!! and FUUUUUU!!! I want, but the box office numbers are what they are. I can appreciate what Wright does with Scott Pilgrim, lavish in the memes and hax0rs of the digital generation, and find someone to borrow the six graphic novels from so I can start reading them in my spare time, in case A.D.D. bounces me away from my current reading list of Anna Karenin and The Catcher in the Rye. However, when I see renowned critics like A.O. Scott syntactically jumping up and down with glee in their beaming reviews of films that represent the digital generation, I can’t help but smile and feel a little bit validated. Just a bit, but just enough to feel just peachy whenever I pick up the Wii to play some good ol’ Super Smash Brothers Brawl. 

Recommended Readings

• Two articles by Erich Kuersten on his criticism and defense of Scott Pilgrim (and Michael Cera)

Scott Pilgrim vs. The World Review by Emanuel Levy

Scott Pilgrim vs. The World Review by A.O. Scott

The Strengths and Weaknesses of Animating Action - Yoshimichi Kameda

image

Click on the image above to see a bigger/higher-resolution version

I watched Fullmetal Alchemist: Brotherhood from April 5th, 2009 to July 4th, 2010. It was a weekly ritual: every Sunday a new episode would be out, and since I’d read the original story beforehand I would practice my Japanese listening skills by watching episodes without subtitles the first time around (afterwards, I’d wait for an official subtitle version by Funimation to see what I’d missed). If there were a few things I looked for in each episode, they were directing, pacing, music selection, and animation quality. And if there was one animator that stood out the most, it had to be Yoshimichi Kameda. 

Kameda’s style is instantly distinct from the rest of the production’s animators for a few reasons: 

• Multiple angles/perspectives

• Generous use of close ups

• Frequently dutched/diagonal horizons

• Pencil-like line art in the finished animation

• Quick cuts that create a lot of dynamism within the scene

• Follow/tracking shots of movement (often paired with close ups)

Unsurprisingly, Kameda’s most noticeable work involves action sequences. In a video compilation, you can see why his style can be simultaneously enthralling and dizzying. Here are some screenshots of his work as seen in the linked video: 

image

Greed/Ling vs Wrath/Bradley

image

Alex Louis Armstrong

image

Original Greed

image

Envy disguised as Maes Hughes

image

Roy Mustang

image

Lust

image

Roy Mustang

image

Riza Hawkeye

image

Hobo man Scar

image

Jerso (sp?)

image

Envy vs Ed

image

Ling vs Envy

image

Ed, Envy, and Ling (left to right)

image

Roy Mustang blowing up Envy

image

Roy Mustang still blowing up Envy, with Riza Hawkeye in the foreground (right side of the screenshot)

The above screenshots are excellent examples of Kameda’s strengths and weaknesses in animation, especially with regard to conveying action. I’ve talked before about framing a fight/action scene and how you can storyboard movement to create different effects. With animation, you can push boundaries even further, since your characters aren’t bogged down by the physics that govern live-action actors. Because of this, animators like Kameda can experiment even more with close ups, tracking shots, dutch angles, skewed expressions, shaky cam, artistic renditions of the environment – anything you can come up with. 

However, there are moments when auteurs of animation may go too far with these kinds of experiments. Technical skill is one thing, but making sure what’s happening on screen is cohesive is another trick: if no one knows what’s going on for long periods of time, an animator’s attempt to be original may fall short of anything artistically effective. 

With Kameda, his greatest asset is his use of tracking close ups at multiple angles; his greatest problem is forgetting (or choosing not) to establish the horizon. This style creates a lot of visual movement, and works best when there’s at least some sort of establishing shot – even for a second – to put everything into context. One of Kameda’s best examples is the scene where Envy reveals his true form to Ed and Ling, as seen here (Kameda’s work ends at 1:30, after which a different animator/director takes over): 

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

I like this particular sequence of Kameda’s because while there are a lot of close-ups and a lot of movement on screen, there’s always something placing the characters into context and location. In this case, Envy’s massive size puts Ed and Ling into context by default: when the camera tilts up from our heroes to Envy stretching himself after the transformation, we instantly get a feeling for how small both Ed and Ling are compared to the massive homunculus. Also, in almost every shot a horizon is established, if only for a second before the camera cuts away to a different angle and action. The horizon is defined by the sea of blood that all three characters stand upon, and since there are no walls or major objects in the background (and the “sky” is pitch black), Kameda can get away with multiple tracking shots and close-ups because the viewer will catch the horizon of blood somewhere. That infinite sea of blood is like a physicist’s dream of infinite planes and dimensions - an animator can pull off almost anything and it’s unlikely the viewer will get disoriented during the entire action sequence. Kameda and the director do a nice job of creating lots of camera movement with close-ups, tracking shots and cutting while maintaining a sense of cohesion across the entire fight sequence. 

Conversely, Kameda’s heavy use of close-ups and tracking shots loses its edge when he forgets (or chooses not) to establish a distinct horizon within each cut, as seen here when Greed/Ling fights Wrath/Bradley in the Fuhrer’s office (the fight begins around 17 seconds in, where I assume Kameda takes over): 

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

image

The main problem in this fight scene is that Kameda doesn’t establish the horizon very well in most of the cuts. The horizon is defined by the floor that Greed and Wrath walk and run upon, not by the walls or ceiling. When we don’t see the floor between multiple cuts, the grounded physicality of the two sparring characters becomes less apparent (they could be floating in mid-air and we wouldn’t even notice). This becomes problematic over longer stretches (“longer” in both seconds and number of cuts) because the viewer starts losing a sense of where the ground is. Compared to the Envy vs. Ed and Ling scene, where the environment was immense and primarily defined by the sea of blood all three characters stood upon, the environment in this scene with Greed and Wrath is much more detailed and constrained (the Fuhrer’s office is a defined room and space), so it’s even more important to establish the horizon frequently so the viewer doesn’t become disoriented. 

The few times we actually do see the floor help reestablish what’s going on, but the part where it matters most - when Wrath pins Greed to the floor - is the most lackluster of the entire sequence: there’s no unique angle, no Dutch tilt, no skewing – just a plain old level horizon and framing that places both characters dead center – the dead zone of the Rule of Thirds. 

If there’s one thing you should avoid almost entirely, it’s placing the characters dead center on an un-angled horizon during the climax of an action sequence, especially in animation. In live action you can sometimes get away with this, since actors are always breathing and twitching and their clothes are always rustling, even when they stand dead still; in animation (especially on tight budgets) the artists can’t spend precious extra minutes drawing these details, so the characters on screen will look especially flat and boring if you don’t frame or angle them in an interesting way. 

As a last reiteration: it is always the ground that defines the horizon – not the sky, not the ceiling, not the walls, not even a cat. To maintain any sense of cohesiveness during an action scene, establishing shots are an absolute must (without them, kiss your chances of cohesiveness good-bye), and they are most effective when they incorporate the horizon upon which the characters stand. Cuts, multiple angles and perspectives, tracking shots, and Dutch angles are friendly tools you can use, but overusing them without establishing a horizon is cohesiveness suicide. 

Kameda does good work throughout the series, but he’s not without his faults, and his work demonstrates some prime examples of the strengths and weaknesses of animating action and characters. It’s a great reference for any aspiring animator and for anyone looking for different ways to frame action sequences. To see episodes of Fullmetal Alchemist: Brotherhood, please visit the official Funimation website (or search Hulu or YouTube for full episodes as well). 

… 

Additional Links/Readings

Yoshimichi Kameda profile on Anime News Network – shows a list of credits that Kameda has worked on. 

Talented Up-and-Coming Animators: Yoshimichi Kameda – a more comprehensive overview of Kameda’s work and the notice he’s received in his career so far. 

Greens, Fruit, and Candy - Hollywood versus Cinema

image

Back in 2008 I wrote a review of The Dark Knight, claiming that it was a “balanced, perfect chord that Nolan and his cast and crew [struck], a chord that few have touched or even come close to” and that the film “will be legendary by its own respect to the comic and movie medium, and moreover, by its respect for the general audience.” It was my first movie review, and I wrote that final statement without the knowledge of film I possess today. Watching an unconventional superhero story unfold, being awestruck by Heath Ledger’s haunting performance, becoming enthralled with Nolan’s film noir-esque vision of Gotham – I wanted to defend this movie on a critical level immediately. Commercial success was inevitable, but I didn’t want the movie to be shanked* from the Oscars because of its undeniable popularity; I wanted to defend it on an intellectual level, a critical level, so detractors and “film snobs” couldn’t deny Nolan’s Batman lore the credit I believed it deserved. 

It’s been about two years since then, and I’ve stopped writing reviews regularly in favor of writing on this blog (also, these days I don’t have time to watch movies quickly enough to write a relevant review). I know much more about film today, from its production to its history, and have even expanded my regular online reading from Roger Ebert to the likes of Todd McCarthy, A.O. Scott, Emanuel Levy, Michael Phillips, and more recently Jim Emerson, David Bordwell and Dennis Cozzalio (amongst other writers whom acquaintances and friends have introduced me to; I haven’t listed them because I usually read a minimum of three to ten articles – depending on the word count or subject – before citing someone as a regular read). I’ve even stumbled across books and academic articles on Film Studies across the net, such as Bordwell’s generous free download of his book on Ozu and an entry linking academic papers on Nolan at the blog “Film Studies for Free.” The internet is a vast world, and persistent searching (coupled with an undeniably stubborn attitude, possibly paired with procrastination) leads you to amazing finds. Most recently it brought me to some interesting commentary by film professor, scholar and critic Emanuel Levy: 

How do you evaluate the artistic and popular dimensions of a particular movie year? For example, was last year, 2009, a good, mediocre, or bad movie year? When can you say with some degree of assurance and coherence that 1959 was a better year than 1958? And what will be the evidence to substantiate our claim that 1939 was the best year in Hollywood’s history? As a film professor, scholar, and critic, I have been struggling with this question for decades.

Levy’s comment and his subsequent examples of successful films got me thinking on a tangential question: how do you critique a commercially successful film? Critics and cinephiles alike talk all the time about independent and foreign films and how they often receive less attention than they deserve; but at the polar opposite, how are you supposed to talk about films that get more attention than anyone could foresee? 

I find these two questions difficult to answer because it’s easier to bring the excellent qualities of an underdog film to a wider consciousness than to castigate, to any effect, the qualities of a well-funded, widely distributed film that’s already in the immediate public awareness. For instance, The Hurt Locker was the lowest-grossing film so far to win the Oscar for “Best Picture,” and became well-known through word-of-mouth reviews by acclaimed film critics. Then you have the opposite case, where films like Transformers 2 commercially succeed no matter how much you hack off the shiny lacquer of Megan Fox and Baysplosions hoping that people will realize the movie is a heaping pile of dung. 

image

Precious minutes of my life were wasted on this. These are moments where I wish I had a TARDIS.** 

Sometimes, in my bitterness, movies like Transformers 2 make me wonder if people just prefer to throw away priceless seconds of their lives on junk-food excuses for cinema; but then I hold myself steady, take a breather, calm down and think “whoa there, buddy – people are smarter than that,” and my optimism-to-pessimism ratio shifts back to its normal 55:45. Call me naive, but I like to believe that people want to see good movies – why else would they go to theaters in the first place? 

Films are an experience, and a personal one too. They tap into our consciousness and subconsciousness, and oftentimes the films we deem “personal favorites” are incredibly revealing of who we are as individuals. For instance, my list of favorite films currently includes Andrew Stanton’s Wall•E (2008) and Charlie Chaplin’s City Lights (1931): I loved the elegant and effective simplicity of the physical performances with minimal or no sound, to say nothing of the stories themselves, which I found uplifting, charming and uncannily sweet. Obviously this sentiment wouldn’t carry over to someone who’s primarily a fan of, say, Michael Baysplosions, but that goes to show how different and diverse our respective tastes can be. 

Now, we’d all like to believe our favorite films are, in fact, great films. However, I prefer to amend this sentiment: favorite films and great films can overlap, but they are not necessarily the same. I say this because films are simultaneously about taste and judgment. Personally, I’d love to believe The Dark Knight is a classic, flawless movie that deserves a “great films” slot; however, I’d be in the purgatory of denial if I didn’t acknowledge legitimate criticism of the film’s flaws, and that while the film might be “awesome,” that does not necessarily mean it is “great” (as Stephanie Zacharek put it with regards to Nolan’s Inception). Yet when I hear statements like Jeff Wells’ – that a commercially and critically successful film doesn’t need to be nominated for Best Picture to be great, and thus voters should vouch for films that are less noticed – I want to hit my head on the desk and write a letter to the Academy asking “what’s the bloody point of calling it ‘Best Picture of the Year’ if you’re just going to ignore commercial successes anyway?” My annoyance begins boiling again, but then I remember that the Oscars are always politically driven, and as A.O. Scott stated eloquently regarding the 82nd Academy Awards: 

The “Hurt Locker”-“Avatar” showdown is being characterized as a David-versus-Goliath battle, but melodrama and rooting interests aside, it is really a contest, within the artificial arena of the Oscar campaign, between the mega-blockbuster and the long tail. That last phrase, the title of a 2006 book by Chris Anderson, already has a bit of an anachronistic sound, but Mr. Anderson’s idea, shorn of some of its revolutionary overstatement, is still compelling. As digital culture makes more and more stuff available and spills it faster and faster into an already swollen marketplace, some works will establish themselves slowly, by word or mouth, social networking and serendipitous rediscovery.

That hypothesis is likely to be tested more strenuously than before in the movie world. The money to produce and publicize the kind of middle-size movie that has dominated the Oscar slates in recent years is drying up. Cheap acquisitions can be turned into hits — last year’s best picture winner, “Slumdog Millionaire,” being the most recent long-shot example — but there are likely to be fewer luxury goods for the prestige market.

Only one of the current crop of best picture candidates, “Up in the Air,” fits that description: it has a polished look, an established star, a literary pedigree and a medium-size budget. And it looks — all of a sudden, after a strong start in Toronto and in spite of perfectly good box office numbers — like an outlier, a throwback.

Which is to say nothing about its quality. The Oscars are never about that anyway. They are about how the American film industry thinks about itself, its future, its desires and ideals. Right now it is thinking big and small, trying to figure out how to split the difference, and hoping we will keep watching. Wherever and however we do watch.

Can a film be “awesome” and “great” simultaneously? More specifically, can a film be commercially and critically successful at the same time? My naive self would again say yes, but would qualify the statement with an additional “–but very rarely does it happen.” I say “rarely” because, when faced with inevitable commercial success, wide-release and blockbuster films are more prone to backlash for the very qualities that made them so successful and widely appealing (note: when I speak of “commercial success,” I mean films that pull in box office numbers, which by extension are a numerical indicator of a film’s popularity amongst moviegoers, but not necessarily its critical reception). Take Juno, for instance: it was a hit at the 2007 Toronto International Film Festival, yet come Oscar season there was enormous backlash from people who felt that Juno’s pop-slanging shenanigans were unnatural and unrepresentative of how teenagers actually talk and act, and that the film sent an “immoral” message to teenagers about sex and teen pregnancy. 

I think Juno is a fine movie with slick, witty writing. But do I think it deserves a slot in “great films of all time” lists? In its own respect, yes, I believe so. To be perfectly honest, though, Juno is not exactly my cup of tea: I like jasmine tea, but in the end I prefer the taste of green tea simply because of my personal preferences – and in this case, Juno is jasmine tea. No, this doesn’t detract from my appreciation of the film; actually, it compels me to be even more holistic when looking at movies, and to make a conscious effort to differentiate (but not separate) between movies I believe are great and movies I personally adore to no end. So if I were to compile a list of movies I believed were “great,” I’d make an effort not only to appreciate films that aren’t necessarily my favorites and what they do well, but also to defend and argue for my favorites if they were included. This gives me room to relish movies that are cheesecakey goodness and include them in my list of personal favorites (i.e. Kung Fu Hustle), and to extoll movies I find artistically and technically outstanding that also happen to be personal favorites (i.e. My Neighbor Totoro). I like to imagine the differences, similarities and overlap between “great” and “favorite” movies like this: great movies are your greens for cinematic fiber, favorite movies are your candy for cinematic sweets, and movies that are both great and personal favorites are fruit. 

image

I don’t bloody care what botanically correct scientists say: the tomato is still a vegetable in my books. And I will chuck it at whomever I wish to do so - Fresh or Rotten. 

We all want our favorite films to be fruit. But realistically, you’ll have to admit that your personal favorites will not all be fruit – some will be candy no matter how much you believe otherwise (i.e. caramel apples). At the same time, claiming that you only enjoy films of greens ignores a lot of what films of candy and fruit offer. After all, films are also about entertainment: I could sit through countless art films and analyze the brilliance of their auteurs, but if I don’t feel compelled to re-watch one like a hyperactive child, it’s unlikely to become a personal favorite of mine. A recommended film, possibly, but probably not fruit, and definitely not candy. 

So how many films are actually fruit? Here, I decided to take a cue from Levy’s Four Criteria of Evaluation – 

  1. Artistic: Critics choices
  2. Commercial: Public choices, films that were popular with moviegoers—for whatever reason
  3. Innovative: Films that pushed the boundaries (technical, thematic, stylistic) and had impact on the evolution of film as a singular medium with new potential and possibilities
  4. Oscar movies: The five films singled out by members of the Academy of Motion Picture Arts and Sciences (AMPAS) for Oscar nominations and awards.

– and then took a look at the list of worldwide box office records and compared the movies to their ratings on Rotten Tomatoes, Metacritic, and IMDb (note: click on the chart and graphs to see the full versions of these statistics, which were taken on 8/9/10; the numbers are taken from here, though they have changed with the addition of Toy Story 3 to the “Top 20” list as of 8/10/10): 

image

Now, with my handy-dandy Excel skills, I also made some graphs so we can visually see what’s going on (titles list the Y-axis versus X-axis values of each graph; note the highest-grossing film worldwide is the first value on the X-axis – essentially it goes from #1 to #20, left to right): 

image

image

image

image

All of the following findings are under the assumption that box office numbers are the best estimate we have of how “popular” a movie is – that is, how compelled people actually are to dig into their wallets and see it for whatever reason, regardless of the critical reception before and after the film’s release. So if we approximate 70% as a generally favorable consensus after averaging the critical percentages from Rotten Tomatoes, Metacritic and IMDb for the current top grossers, we find this: 14 of the 20 listed films are generally favored by critics and viewers alike, which means that about 70% of top box office grossers meet some amount of critical success. This group essentially consists of the fruit and candy of movies, films that could perhaps display at least two or three of the four criteria of evaluation that Levy presents. 

However, if we estimate what percentage of these movies meet universal acclaim by setting a minimum of 85%, we find that only about 5 of the 20 titles could potentially be considered “great films” – that is, only approximately 25% of top grossers could actually be considered fruit and possibly possess all four qualities of Levy’s criteria. 
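For anyone curious, the averaging and threshold counting described above can be sketched in a few lines of Python. The scores below are hypothetical placeholders for illustration only, not the actual 8/9/10 data from my chart, and the 70/85 cutoffs are the same rough ones used in the text:

```python
# A minimal sketch of the averaging described above. All film names and
# scores here are hypothetical, not the real "Top 20" data from the chart.

def average_score(rt, mc, imdb):
    """Average Rotten Tomatoes, Metacritic, and IMDb scores on a 0-100
    scale. IMDb's 0-10 rating is scaled up to 0-100 first."""
    return (rt + mc + imdb * 10) / 3

def count_above(films, threshold):
    """Count films whose averaged score meets the given threshold."""
    return sum(1 for rt, mc, imdb in films.values()
               if average_score(rt, mc, imdb) >= threshold)

# Hypothetical (rt, mc, imdb) tuples for three imaginary top grossers.
films = {
    "Blockbuster A": (94, 82, 8.4),  # averages ~86.7: "great film" tier
    "Blockbuster B": (72, 61, 7.0),  # averages ~67.7: below both cutoffs
    "Blockbuster C": (85, 70, 7.8),  # averages ~77.7: "favorable" tier
}

favorable = count_above(films, 70)  # the "fruit and candy" cutoff -> 2
acclaimed = count_above(films, 85)  # the "universal acclaim" cutoff -> 1
```

One caveat baked into this sketch: simple averaging treats the three aggregators as equally trustworthy, which they aren’t necessarily – Metacritic already weights its critics, while IMDb is a pure user vote.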

There are a number of things that could be wrong with these numbers. For one, the worldwide box office numbers in this data aren’t adjusted for inflation, hence the list consists mostly of films that are relatively recent in film history. Another is that these numbers are based on worldwide gross, so they account for films that were fortunate enough to be released internationally in addition to domestically – which invariably favors films that get lots of funding from studios, and more often than not these studios are big-time Hollywood players. Lastly, the review aggregates are from distinctly English-speaking sources, so the “universality” of critical acclaim may only cover the Western hemisphere while not necessarily reflecting the same appeal in the Eastern hemisphere (i.e. the reception of Danny Boyle’s Slumdog Millionaire (2008) was much warmer in America than in India, where the film was set). Still, I think it’s important to see how box office numbers and critical consensus relate, and what I’ve found leads back to my original assertion: it is rare for a commercially successful film to also meet universal critical acclaim. 

However, if we step back from the standard of critical acclaim and consider the more holistic standard of favorable acclaim, then my hope and naivete aren’t unfounded: a majority of the top box office grossers aren’t bad films. They may not be great, but they’re not bad either; in fact, we could even conclude that they’re the perfect amount of candy and cheesecakey goodness that most moviegoers want when they go to the movies – for entertainment, escapism, evaluation, everything. 

image

Michael Bay presents Explosions! By Michael Bay. 

In the end, does it matter who got the higher score or raked in the most money? Frankly, no; I’ll probably defend my favorite films until my deathbed and hate Transformers 2 for the rest of my life. That doesn’t mean I won’t find flaws or excellence in great, favorite and personal-vendetta films either – in fact there’s a certain joy in finding things that could’ve been done better (or worse) in the films you love or hate, almost as if you’re finding an Easter Egg the filmmaker forgot they even put in. For instance, who knew you could so excellently incorporate “pew pew pew!” sound effects and classy comedy like humping dogs into a $200 million film created by full-grown, mature and enlightened adults like Michael Bay? It’s as if they sympathized with my childhood, where I counteracted my brothers’ reign of sibling terror with my pointed index finger after my mum told me to suck it up and fend for myself against their shenanigans (“no, I will not buy you a Nerf gun to assassinate your brothers with - go use a stick or something. And stop climbing the stair railing like a monkey - no, I don’t care if you’re Quasimodo, you’re going to break the railing!***”). 

We may never know what truly is “the best film of all time.” Lists will consistently bring up the same movies, like Citizen Kane and Casablanca, but even then there’s a certain amount of subjectivity to any “greatest movies” list. Of course such films are always worth pursuing if the recommendation is compelling enough, but the decision is always personal and up to you alone. So while the probability of finding fruit might be low, it’s worth it after getting a palate full of greens and candy. 

image

*I was one hell of an angry cat when I found out The Dark Knight and Wall•E didn’t receive Best Picture nominations that year. Hell hath no fury like a person still without a cat.  

**Props to anyone who gets this reference. Even bigger props if you’ve got your own sonic screwdriver. 

***I like to believe this was her loving way of telling me not to fall to my peril and death. Regardless, I still climbed those stair railings, because if anyone was going to be an awesome Quasimodo, it would be ME. 

Referenced Articles and Links (ordered with regards to this article)

The Dark Knight: 2008 - my first movie review

From Books to Blogs to Books - David Bordwell

Christopher Nolan Studies - posted and compiled by Catherine Grant

• 1960: Here is Looking at You Movie Year - Emanuel Levy

The 'Best Picture’ Academy Awards: Facts and Trivia - Filmsite.org

Is Inception This Year’s Masterpiece? Dream On - Stephanie Zacharek

Will 'Wall-E’ be nominated for Best Picture at the Oscars? - Tom O'Neil of Gold Derby (this is a compilation of critics’ opinions on Wall-E’s prospects of a “Best Picture” nomination; search for Jeff Wells to find his quote) 

The Politics of an Oscar Campaign - Peter Bowes

• Huge Film, Small Film: Big Stakes - A.O. Scott

Jumping the snark: The Juno backlash (backlash) - Jim Emerson

All Time Worldwide Top 20 - The-Numbers.com

Recommended Articles and Links (no particular order)

• Masterpieces: How to Define Great Films? - Emanuel Levy

The Top Film Criticism Sites: An Annotated Blog Roll - compiled by Paul Brunick of Film Society of Lincoln Center

The Fall of the Revengers - Roger Ebert 

I’m a proud Braniac - Roger Ebert

Fade In Magazine - A nice magazine that looks into the nitty gritty workings of Hollywood business and the film industry

Superheroes for Sale - David Bordwell

Hey, Wall-E: Shoot for the Top (Great animation deserves shot at Best Picture) - Joe Morgenstern

Trivia: Can The Dark Knight Win the Best Picture Oscar as a Write-In Candidate? - David Chen

Just for Fun (because comedy is the best relief for bitter film memories)

Rifftrax: Transformers 2 – BATTICAL!

Michael Bay presents: Explosions! – courtesy Robot Chicken

Michael Bay Finally Made an Art Film – Charlie Jane Anders of io9

Cat Safety Propaganda - How I reacted when I learned Wall•E and The Dark Knight did not get “Best Picture” nominations (see, the cute little girl is the oppressive hand of the Academy and its innate biases against animation and comic book lore, and the cat is the cat whose fury hell hath none like when it is angered… I’m not sure where I fit in here, except that the cat’s reaction to the cute little girl is more or less how I reacted when I heard the Oscar news those years ago) 

Horus: Prince of the Sun - A Look into Studio Ghibli's Origins

image

After an incredibly optimistic writing session on Revolutionary Road, I decided I was in dire need of a mood lifter, lest I risk falling into a deep, brooding state that even a fluffy cat couldn’t cure me of (unless it were the cat I am still without, but that is a different matter). So I perused the list of foreign films I’ve been wanting to see (thanks to various recommendations) and lo and behold – Horus: Prince of the Sun, as recommended by Allan Estrella, was exactly what I needed. 

Horus is a 1968 anime film and the feature debut of Isao Takahata, director of the classic and haunting Grave of the Fireflies (1988). The film is about a boy named Horus who is entrusted with the Sword of the Sun after pulling it out of the ancient stone giant, Mogue. Before his father dies, Horus learns that he and his father were the last survivors of a sea village devastated by a wicked sorcerer named Grunwald, and he thus sets off to avenge his village and stop Grunwald once and for all. 

Watching the film was an interesting experience: there are a lot of Studio Ghibli thematics throughout – the enigmatic forces of nature, the strong female characters, the complexity of motivations and emotions – yet there are a lot of distinctly Disney thematics as well – the evil sorcerer, the bubbly side characters, the clashing forces of good and evil, and so on. In a sense, Horus really establishes the distinct divide between the legacies of Disney and Ghibli regarding thematics, animation, aesthetics, and writing. The film is largely unknown outside of Japan because it ran for only 10 days in theaters (for business reasons I’ve yet to really understand), and at that point in time popular culture and public awareness were overshadowed by the student protests, civil rights movements and anti-war sentiments pervasive during worldwide political and social unrest. Here, I’ll be highlighting some distinct elements reminiscent of classic Western and classic Eastern storytelling that set Horus apart from any prior or subsequent production by Disney or Ghibli, and why it’s quite a gem in the history of anime and animation. 

Animation-wise, Horus is top-notch for its time. There are a few scenes with no animation at all, just a panning/tilting camera over a still image with an audio track (a clear sign of budget issues), but beyond that, Takahata directs some of the most awe-inspiring scenes that even some recent animated features don’t come close to. For one, multiple framing types are used throughout the film: 

image

High-angle shot

image

Low-angle shot

image

Establishing/master shot

There are also multiple fields of depth and focal points: 

image

Grunwald is also holding the axe that Horus threw at him. The rope that holds the axe is blurred since Grunwald is the main focus. 

image

Here, there are multiple depths of field, with Horus being the closest and Grunwald the farthest – all indicated by their size relative to the screen. 

These compositional traits were severely missing from Horus’s chronological Disney contemporaries, The Aristocats (1970) and Robin Hood (1973), both of which relied heavily on minimal dimensionality (the majority of each film plays out on a flat, linear horizon, with the characters simply moving left and right across the screen) and repetitious animation (each character performs a set repertoire of movements, resulting in rather limited characterization and performance from the animated heroes and villains). With regards to animation, Horus outdoes The Aristocats and Robin Hood by a long shot, and is even auteuristic in certain ways: 

image

A Technicolor-like effect was used in the Enchanted Forest sequence. 

image

Mogue, the Rock Lord. 

image

Grunwald’s Mammoth of Ice, fighting flames created by the villagers. 

image

The sequence where the artists animated the reflection of sunlight on ice was visually astounding, to say nothing of Mogue’s epic entry into Grunwald’s lair. 

image

Use of a soft focus on a particular person/object, further emphasizing the focus by blacking out everything surrounding the person/object. 

image

Overlap of animation cels. 

image

Snow-Ice Wolves flying down the mountainside; these reminded me of Haku in Miyazaki’s Spirited Away. 

image

The Ice Mammoth and Mogue battle sequence reminded me a lot of the Forest Spirit from Miyazaki’s Princess Mononoke.

image

I don’t think anyone can ever outdo the animation of Monstro from Disney’s Pinocchio, but the Pike sequence in Horus does an excellent job on a lot of levels; I liked this screenshot the most because of the field depth inferred from the unfocused branch/tree/rocks in the foreground with Horus and the Pike in the background and in focus.

image

Gorgeous yet frightening sequence where Hilda unleashes mice upon the village in an act of terror.

image

Hilda’s Owl reminded me of Archimedes from Disney’s The Sword in the Stone - except not funny, less floofy, and white. But back to my main point: evil characters are drawn menacingly, like this here owl (who is the less hilarious version of Archimedes)…

image

… and good characters are drawn with a charm! (also, they are floofy)

image

Comparatively, Disney animators used strikingly similar animation in both The Aristocats (right) and Robin Hood (left)

Technical aspects aside, Horus presents some interesting Western and Eastern thematics in its narrative as well. Set in Iron Age Scandinavia, the story is a classic Western fable, rich with mystical powers of good and evil that tamper with humans. Foremost, Horus is a very pure and very pious protagonist: no evil thoughts cross his mind, and he’s the perfect archetype of the Western hero; in fact, the first scene revolves around him fighting a pack of vicious wolves, from which he is only saved by the rock lord Mogue. Mogue’s first appearance is the classic set-up for such an adventure: the chance encounter with a powerful entity who sets forth a goal for the protagonist to strive towards, and who warns of an evil entity the hero must be wary of. Additionally, the death of Horus’s father lends further momentum to the story and the protagonist’s motivation: the same evil entity Mogue spoke of is the one Horus must take his revenge upon. Good and evil are established very early in the film, and while we know Horus will persevere, we also know he will encounter numerous barriers that may prevent him from attaining his ultimate goal. Horus includes song and dance like Disney films as well, though I felt these were less like musical numbers and more like the natural customs and practices of a small village; singing is also a distinct characteristic of Hilda, the main female protagonist, and it plays a role in how the plot progresses rather than serving as a classic depiction of femininity. There are also some side characters that are Disney-esque in their animation, though not quite to the extent of the candy-covered nudnik that becomes so obnoxiously giddy and uplifting as to induce mental diabetes (I’m looking at you, Cinderella – and don’t think I won’t go and sic my cat-that-I-am-still-without on your singing mice if they start messing with my pumpkins). 

Trials of character, essential to Western lore, are also present: there’s a scene where Horus confronts a giant pike terrorizing the fishing village that saved him after his first encounter with Grunwald, and it’s a scene that echoes the classic fairy and folk tales of the Brothers Grimm, where a hero or heroine must destroy an elemental force in order to restore natural order (i.e. The Twelve Brothers, The Seven Crows, The Glass Coffin, The Nix in the Pond, The Ball of Crystal). Characters of good are drawn in a friendly manner while characters of evil are drawn with a sinister disposition – the good look good, the bad look bad (for instance, Grunwald’s henchmen wolves are drawn menacingly while Horus’s bear, Coro, is drawn amicably); in a sense, the extremes of morality are personified almost literally, just as Jiminy Cricket was animated as Pinocchio’s conscience in Disney’s 1940 film. Then there’s old Grunwald himself, who simply wants to eliminate all humans in his sight because he’s a pleasant fellow like that. 

image

The villagers collectively decide how to deal with Grunwald’s antagonism. 

However, there are distinctly Eastern elements to the story as well. Various elements of nature are personified into distinct personalities: Mogue, the rock lord, is booming and almost like an Ent from J.R.R. Tolkien’s universe; Grunwald, the sorcerer, specializes in ice magic and sends out spells of snow-ice wolves; the collective, not the individual, is necessary to accomplish any feat; the environment is distinctly beautiful, dangerous, and omnipresent, trumping all human attempts at control (in fact, Hayao Miyazaki was responsible for “Scene Design” during production; his painter’s hand can easily be seen in his later famous works, such as Spirited Away in 2001); and most important of all, not all characters are solely evil or good without motivation. 

This last characteristic is particularly important and poignant in most of Studio Ghibli’s films to date (I say most because I haven’t seen all of them yet – I’ll get there soon though!). This is a sentiment I agree with wholeheartedly: I’m of the opinion that there’s no absolute good or evil without motive, and even then the term “absolute” is difficult for me to fully endorse at face value; instead, I usually try to analyze the narrative or psychological significance of moral extremes. Even so, I feel that absolutes are much more common in Western narratives than in Eastern ones: Eastern stories commonly deal with the undertones of actions rather than the actions themselves, and thus often lend themselves to more nuanced (“grey”) characters where good and evil are concerned. In Horus, there’s a corrupt deputy named Drago who manipulates everyone so he can gain power and dispose of Horus; purportedly a spy for Grunwald, Drago is obviously not a “good guy,” but his motivations for power and prestige are very much human. Even more interesting than Drago is the character Hilda, who marks a very important theme in Ghibli’s most famous productions – the strong, independent, and nuanced female character. 

image

It took Disney 52 years to progress towards strong female characters, beginning in 1989 with The Little Mermaid after starting with the classic damsel-in-distress princess archetype of Snow White in 1937. Takahata included a strong female lead from the very start with his 1968 directorial debut in Horus, a philosophy and tradition additionally spearheaded by his contemporary, Hayao Miyazaki, in many subsequent Ghibli films such as Nausicaa and Princess Mononoke. In Horus, this character is none other than the solemn and tormented Hilda: though she is initially under Grunwald’s control, it’s obvious that she’s neither pleased nor happy with her choice of immortality; in fact, a good portion of the film focuses on Hilda alone (at one point I wondered if Horus had gone M.I.A. just for the heck of it), and generously fleshes out the internal conflict she feels when she’s ordered to wreak havoc upon and kill the very villagers she’s grown attached to. There might be a bit of the damsel-in-distress about her – the siren-like singing, her daisy-like physical appearance – but beyond looks Hilda is a mentally and emotionally strong individual, especially considering the personal conflicts she deals with for almost the entire span of the movie. Comparing Takahata’s Hilda to her earlier Disney counterpart, Aurora/Briar Rose (Sleeping Beauty, 1959), is like looking at two different eras of social progress – the former a progressive advocate of gender equality, the latter bent on chivalry and perpetual D.I.D.s who like being swept off their heel-toed feet. 

image

Hayao Miyazaki on the left, Isao Takahata on the right

I like to believe Horus: Prince of the Sun marks the beginning of Takahata’s (and Miyazaki’s) conscious effort to move away from traditional Disney fare, storytelling, and animation aesthetics; yet ironically, Horus shares numerous elements with Disney productions, which makes it an interesting hybrid: a product highly influenced by Western endeavors while actively trying to establish distinctly Eastern foundations – all with regard to animating stories and the characters within them. Horus may easily be one of the most exceptional and overlooked gems in the world of animated film, and it’s one I can’t recommend enough to those interested in animation, anime, Studio Ghibli, and Disney productions. 

Additional Reading/Links for Those Interested

Notes on Horus and its production history

A nice video comparison of Disney’s linear animation (also shows how some animation was recycled between Mr. Toad and The Jungle Book)

Opening credits of Disney’s Robin Hood: here you can see a prime example of linear animation in which the characters primarily move to the left or right, but not away or towards the screen to establish a sense of depth

Peter Schneider and Don Hahn interview on Waking Sleeping Beauty, the documentary about Disney’s rise, fall, and comeback during the years 1984 to 1994

Online resource for short stories in Grimm’s Fairy Tales, for those interested in the short stories I listed in this article

image