The first two installments of Roman Polanski's Apartment Trilogy, "Repulsion" and "Rosemary's Baby," rank among my favorite films. Watching Catherine Deneuve and Mia Farrow travel Polanski's long, winding paths of cinematic paranoia toward similar, but not quite identical, destinations still gives me a thrill every time (the third film, "The Tenant," alas, leaves me cold). And yet I've refrained from bringing up my admiration for Polanski's work, especially in recent conversations about the director's legal troubles. And when his newest film, "The Ghost Writer," hit the art house theaters a few months ago, I hesitated a good long while before deciding I could wait for the DVD rental.
As I type this, Roman Polanski is still under house arrest in Switzerland, trying to block his extradition back to the United States for heinous crimes committed in 1977 against Samantha Geimer, then a thirteen-year-old girl. I've heard too many mealy-mouthed attempts to excuse Polanski for what he did, which essentially boils down to statutory rape, or just plain rape, depending on which version of the facts you prefer. All of the subsequent legal shenanigans about the crooked judge and the botched plea agreement don't change that. Neither does the fact that three decades have elapsed and the grown-up Geimer has forgiven her attacker. It galls me to no end that Polanski escaped any major negative consequences for his actions, and I can't say I'd be unhappy to see him sent back to California to finally be sentenced for the crimes he was convicted of.
And yet I still enjoy Roman Polanski films. I thought he was deserving of the Best Director Oscar he won for "The Pianist," and I still want to see "The Ghost Writer." Is this hypocritical of me? How do I reconcile the films with the director who made them? Art is not created in a vacuum, especially not for top directors at Polanski's level. Does my condemnation of the man mean that I also have to condemn everything he has produced in the thirty-three years he's been a fugitive from the law, no matter its artistic value? Or even his work from before that? I've heard arguments that the Apartment Trilogy, "Knife in the Water," "Chinatown," and the other early films can be exempted from scorn because they were created before 1977, but this is just splitting hairs. Polanski's reputation is still enhanced by their elevation, and it's hardly possible to bifurcate the man.
To muddle things even further, there was the petition circulated by several French and international filmmakers calling for Polanski's release back in September of last year, when he was first seized by the Swiss authorities, signed by Martin Scorsese, David Lynch, Steven Soderbergh, Michael Mann, Terry Gilliam, Wim Wenders, Mike Nichols, Wong Kar-wai, Sam Mendes, Costa-Gavras, and dozens of other cinema luminaries. If I stop watching Polanski's films because of his failings of character, do I stop watching the films of his supporters too, for taking up a position I adamantly disagree with? Where should the line be drawn?
In the end, I think the work has to be considered separately from the artist. If the fanboys can love "Star Wars" and bemoan George Lucas, then it follows that I should be able to love "Repulsion" and think unpleasant thoughts about Roman Polanski. The essence of art, after all, is communication. A horrible person may still have valuable ideas to convey, and even if the message is abhorrent, the method or means of conveyance may be worthy of attention. Leni Riefenstahl is still admired for the artistry of her Nazi propaganda, and no one disputes the technical or historical importance of "The Birth of a Nation," which is essentially a KKK recruitment film. It's strange to think of Roman Polanski in these terms, as none of his work carries any overtly objectionable messages, though you could easily read nefarious themes or intentions into his older films, which often featured women in peril or women betrayed. Maybe "Chinatown" was a secret cry for help.
But in spite of all the rationalizations, I still feel guilty about supporting his films, especially the newest ones. I simply can't bring myself to disassociate Polanski-the-director from Polanski-the-man entirely - there's too much of him in his films, from the evocations in "The Pianist" of the Holocaust that blighted Polanski's childhood, to the alienated hero of "The Tenant," whom he decided to play himself to mortifying effect. And frankly, I haven't forgiven him even if his victim has. I get no sense of remorse from Polanski, no indication that he's sorry for his actions. If he'd stayed in Los Angeles to face the music, or even gone back under his own steam after that judge had died, I'd be more apt to leave the past in the past.
But as it stands, I can only give his films qualified praise. Oh yes, I loved "Rosemary's Baby" - even though it was directed by Roman Polanski.
Friday, April 30, 2010
Thursday, April 29, 2010
A Modern Take on the Melancholy Dane
The only reason I watched the new TV adaptation of the recent Royal Shakespeare Company's staging of "Hamlet" was for David Tennant in the title role. I'm glad I did.
I don't dislike Shakespeare, and I've seen "Hamlet" performed live before, as well as about half a dozen filmed adaptations. The trouble is that it's always been out of a sense of obligation. I read the play in class during high school, playacted a few scenes for assignments, and watched film clips for reference - none of them very engaging for an adolescent viewer. The Mel Gibson, Nicol Williamson, and Laurence Olivier versions were too remote in the past, and the Ethan Hawke version was still to come. The four-hour 1996 Kenneth Branagh production had just been released, I think, though I don't remember if I watched it in theaters or on video later. I do remember that Branagh's "Hamlet" came off as overly grandiose and stuffy, though I did appreciate the celebrity cameos by Robin Williams, Charlton Heston, Gerard Depardieu, Jack Lemmon, and Billy Crystal as the gravedigger. But for a play where the hero shouts accusations of incest and murder, and the last act features about half a dozen onscreen deaths, it was a remarkably tame experience.
So roughly fifteen years later, after reading a lot more Shakespeare in college and becoming mildly infatuated with Julie Taymor's spectacularly bloody "Titus Andronicus" adaptation along the way, I was ready to give the melancholy Dane another go. The new RSC version that aired on many PBS stations last night was a modernized update in all respects except for the original language, and trimmed down to a brisk three-hour length. And I loved it. Not only was it gratifying to see the play again, but I realized that I now understood nearly all the language, the puns, the wordplay, the humor, and the naughty bits that had been too opaque for me to grasp as a teenager. The dialogue was mostly delivered in conversational fashion instead of recited, which made it far easier to follow. I could remember sitting in English class, listening to my fellow inmates droning through the then-impenetrable lines with all the vigor of boiled noodles. And what an amazing contrast to hear the same lines again as natural speech, shrieked or sighed or simply thick with sarcasm. You could understand what was going on from the characters' behavior even if you couldn't decipher the words.
David Tennant might have come across as too manic for some, but he brought an energy and a vitality to the role that I thought was sorely missing from most of the other versions I'd seen. His Hamlet broods only briefly before throwing impulsive temper tantrums and aggressively pursuing his reputation for madness. Maybe it's because I'm older now, but Hamlet as a character never seemed younger: a grieving son called home from school to the unpleasant shock of his mother's remarriage. In one act he pads around the set barefoot, in a printed orange T-shirt and skinny jeans, and in another he totes around a camcorder to record his uncle's response to provocation - no doubt to be subsequently uploaded to YouTube. Most of his later scenes result in physical altercations of some sort, including the confrontation with Gertrude that ends with him sobbing into her lap. English teachers everywhere are likely thanking their lucky stars to finally have a decent Hamlet their students may actually be able to identify with. No, Ethan Hawke doesn't count.
The rest of the cast suffers a bit in comparison to Tennant, but certainly holding his own was Sir Patrick Stewart as an excellent, enigmatic Claudius. He also doubled as the Ghost, which added some interesting dimensions to the filial and marital relationships. Mariah Gale as Ophelia and Penny Downie as Gertrude both had strong moments, and Oliver Ford Davies played his Polonius as more broadly comic than I remember the character being. Considering the intensity of the rest of the cast, though, his take was a welcome addition. Most of the minor characters suffered from the cuts to the play, which excised nearly all the political intrigue involving Fortinbras, the pirates, and even the famous declaration that Rosencrantz and Guildenstern are dead.
As for the presentation of the film, it was about on par with most British television productions. The set design was clearly transplanted almost directly from the stage play, with various props like the shattered mirrors and omnipresent cameras providing odd distractions. However, I have to give kudos to them for pulling off an oppressive modern setting that wasn't overly utilitarian, and all of the modern costumes worked with a minimum of clashing. Though the director was good about keeping the pace brisk, there were some odd cuts and strange shots that I could have done without. Whenever the POV would switch to that of a security camera or the Ghost, I felt like I had been plunked down into an episode of "Doctor Who."
Is this new filmed version of "Hamlet" one for the ages? Maybe, and maybe not, but for this age, it's not a bad one to have.
Wednesday, April 28, 2010
"Iron Man 2"? Probably Rusty, But Nobody Cares
The first rumblings of discontent began on Monday, when early "Iron Man 2" reviews from the UK started trickling onto the web. First there was the disappointed account from the blog HeyUGuys, followed by a blistering notice from The Mirror. A few fans got a little worried, but the alarm klaxons didn't go off full blast until roughly around midnight, when reports from the world premiere screening rolled in along with reviews from the two big industry trades, Variety and The Hollywood Reporter. Brian Lowry at Variety delivered a mixed review, but The Reporter's Kirk Honeycutt savaged the film, leading off with, "Everything fun and terrific about 'Iron Man,' a mere two years ago, has vanished with its sequel." Ouch. Subsequent reviews from various sources have been all over the place, but there have been few raves aside from, predictably, Harry Knowles over at Ain't It Cool News.
Once upon a time, these negative early reviews might have given studio executives heart palpitations, but no longer. "Iron Man 2" is primed to be this summer blockbuster season's most massive event film, and is easily the most highly anticipated title of the year. The media hype and the goodwill from the excellent first film are so strong, it's hard to imagine that it won't end up breaking box office records upon its release next week. Meanwhile, traditional print critics have seen their influence wane in recent years, to the point where even the most universally panned films, like "Transformers 2" or "GI Joe," see no impact on their box office earnings or future franchise prospects. The announcement was made today that the poorly received "Clash of the Titans" reboot is getting a sequel, mostly due to strong performance overseas. No matter what the critics say, people are going to see "Iron Man 2."
As a self-avowed pretentious movie geek, I've long made my peace with the fact that the studios make movies for general audiences, not movie critics, and that people genuinely enjoy pictures I consider miserable drivel. However, it doesn't feel like the audience has as much say as it used to when it comes to a film's financial success either. When so much of the audience interest is drummed up by studio hype, it often feels like the quality of a film's marketing campaign is a better predictor of success than the quality of the film itself - or whether anybody actually liked watching it. Of course, big, hyped-up films can still be affected by negative audience response. Highly anticipated projects like "Clash of the Titans" or last year's "Wolverine" will often have great first weekends and then plummet like so many turkeys. "Spider-man 3" has one of the highest domestic grosses of all time, but nobody seems to have enjoyed it much, and Sony backed away from follow-up films. Yet all of these films made millions in profits.
The troubling early reviews might be a bellwether for negative audience reaction to "Iron Man 2," but this won't pose much of a hurdle for Paramount. The movie business has structured itself to capitalize on front-loaded films that do big business in their first week or two of release, then drop out of sight to be replaced by the next big title. Since they're relying almost exclusively on hype and other factors independent of the film's actual quality, distributors put films into as many theaters as they possibly can in the first few weeks, and marketers expend the bulk of their budgets months in advance. Because no one listens to reviewers anymore, it takes at least a few days for word to get around as to whether a film is actually worth seeing, and by that time chances are viewers have already seen it. Steady performers like "How to Train Your Dragon," which won the past weekend's box office after four weeks, are becoming more and more of a rarity, and long gone are the days when popular films could linger in theaters for a year or more. The current system of tight exhibition windows, with DVD release dates worked out far in advance, makes shorter theatrical releases a necessity. Occasionally you'll see limited encore engagements, like FOX recently did with "Avatar," but these days it takes a bona fide cultural phenomenon to win over exhibitors for that kind of deal.
I worry that in the studios' efforts to hedge their bets against negative word of mouth, they also undermine positive word of mouth in the process. "Alice in Wonderland," for instance, garnered some controversy when Disney announced that it was shortening the period between the theatrical run of "Alice" and its DVD release to about three months. Theater owners protested that it would cut into their earnings, and there were threats of a boycott and other dramatics that ultimately came to nothing. "Alice in Wonderland" turned out to be a huge hit, everyone made lots of money, and industry observers concluded that an earlier DVD release date didn't deter audiences at all. And yet, the film is not going to make as much box office bank as it could have, because it will be leaving theater screens earlier. Disney only made the initial decision to bump up the DVD release because it assumed that "Alice" had fairly niche appeal and was going to have the same sort of front-loaded theatrical run as its other recent genre films. Even "Avatar" had its initial run cut short when theater owners were contractually obligated to turn over many of its screens to other films.
Of course, not all films are subject to these constraints. Smaller films, art-house films, and indies still benefit enormously from critical response and word-of-mouth reactions. The hits that come out of this world, usually the prestige films that crowd awards season, are the ones that build on genuine support from their viewers. Most of these titles are modestly budgeted, and most are never expected to turn much of a profit from their theatrical runs, so distributors are more willing to take risks. But when it comes to the monster studio action films, full of expensive CGI and A-list stars, the executives are not willing to take any risk that people won't like them, and rely on aggressive pre-selling to make sure the audience shows up. So they like films that are easier to sell, like adaptations, sequels, and reboots of familiar properties. They like predictability, and the dirty little secret of most film fans is that we like it too.
I confess that I've been caught up in the "Iron Man 2" hype too, and I'm planning to see it as soon as I can. I might have held out on "Spider-man 3," but this is different. The first "Iron Man" was one of my favorites of 2008, ranking well above "The Dark Knight." I've skimmed the negative reviews I've linked in this post, but haven't read them too closely for fear of spoilers. And though they have put a damper on my expectations for the film, I'm sufficiently invested in the characters and story already that I want to go and see for myself how bad it is. But more than that, since the presumption is that everyone else is excited for the film and will be seeing it in theaters, I'm anticipating the communal experience of discussing it with others. Whether I end up defending or damning "Iron Man 2," I like participating.
And if I'm going to be disappointed, at least I'll have plenty of company.
Tuesday, April 27, 2010
What Do Women Want From Comic Book Movies?
"Kick-Ass" underperformed at the box office last weekend and took a plunge in numbers this week, and now "The Losers" is a certified bomb. There's little in common between the two films except that they're based on smaller comic-book properties, had few stars, and were aimed square at roughly the same demographic of young, male, action fans. Of course, two films of even superficial similarity makes a trend on the Internet, and the apparent connection is boosted by the fact that a lot of the comic-book and fanboy film sites were salivating for these two projects for months in advance, and are now experiencing bouts of severe disappointment. As various and sundry have been dissecting the flops, one observation caught my eye: the Bleeding Cool comics blog points to both films' dismal performance with women and suggests that studios need to start looking at more female-friendly franchises. But what is it that women want out of a comic-book movie?
Now, I'm a woman, a little older than the target audience for most of these films, but I have a yen for the superhero genre and I've read comics on and off myself over the last decade. Maybe I can offer a few insights into how future comic-book films might avoid becoming the next "Kick-Ass" or "Losers." Clearly women watch films based on comic properties, even the ones that are nearest and dearest to traditional comic geeks' hearts. Women went to "The Dark Knight" and "Iron Man" in droves, and "Wolverine" was on the Fandango list of female moviegoers' most anticipated 2009 films right along with "New Moon." I don't see that any drastic changes need to be made to current comic-book action films in order to draw female crowds, but there are some things that the creative types could be doing better, and there are a couple of pitfalls that should be avoided at all costs.
First, and most importantly, male and female viewers aren't all that far off in what they like about comic-book films. Cool visuals? Exciting action scenes? Yes. Appealing, larger-than-life characters? Absolutely. But there needs to be a balance when it comes to these elements. Male action film fans are happy with nonstop action and big effects sequences. Female viewers need more than that. For us, stronger stories, humor, and characters with stronger personalities are more of a draw, and if we sense the lack of them, we lose interest. Films that are sold by simply glorifying violence without any reference to a more concrete narrative, or showcasing any interesting actors or performances, tend to read as female-unfriendly. "Kick-Ass" might have picked up more interest from the girl geeks if it had bothered to let us know there were human beings under the masks. Instead, all we got in the ads were people in silly costumes beating on each other, and some unconvincing audience reactions trying to convince us that it wasn't really as silly as it looked.
Another common mistake is the assumption that having women feature prominently in a comic-book film means female audiences will show up. Some comic-book heroines are great, but more often than not, the emotionally remote, strongly sexualized portrayals of women in comics are geared toward men and can be off-putting to female viewers. Rather, we need to see better female characters: more diverse, more involved, more active and well-rounded women whom girls would actually want to identify with. Some of the comic titles with the highest female readership include "Sandman," "Fables," "Love and Rockets," and "Y: The Last Man." They have lots of strong female characters, but few of the busty, idealized superheroine types that grace the pages of "Elektra," "Wonder Woman," and "Supergirl." The biggest mistake the studios have made when trying to push comic properties to female audiences is that they don't understand who these titles are aimed at. "Aeon Flux" and "Catwoman" were mostly watched by guys with the hots for Charlize Theron and Halle Berry, and shunned by the women who felt they were being pandered to.
Some filmmakers seem to think a romantic subplot or a female love interest in an action film is a draw in and of itself. Usually not true. I don't think anyone out there is seeing "Iron Man 2" for Gwyneth Paltrow or her charming banter with Robert Downey Jr. If female viewers respond positively to signs of romance in a comic-book film, it's often because of how it functions in the plot - letting us see the softer, more vulnerable side of the badass heroes when they're not out punching bad guys. Sure, women love the bad boy, but most of them don't want the bad boy 24/7. Of course this can also be accomplished by a little male bonding, by domestic scenes with their parents or kids, or simply humorous moments to deflate the hero's image a bit. Yet romance feels perfunctory in comic-book films these days, often spoken of as a concession to the female viewers. Since many action writers and directors are fairly useless when it comes to portraying romantic relationships, I'd prefer to see less of them. Instead, let the leading ladies find other roles in the story to fill - James Cameron and Quentin Tarantino certain know what to do with them.
Finally, there's the content issue. It's important to keep in mind that sensitive viewers aren't just limited to children - there are people of all ages out there who don't want to see or hear certain things onscreen. I don't have a problem with R-rated content in comic-book films myself, so long as it doesn't detract from the rest of the movie. For instance, I really wanted to like last year's "Watchmen" adaptation. I've read the graphic novel multiple times, roughly once a year since I picked up a copy in college, and I adore it. And Zack Snyder's lavish production had me roughly up until the third or fourth endless slow-motion gore scene completely killed it for me. From the constant complaints about lily-livered studio suits I hear from many comic book purists, I get that the visceral thrills that come with graphic violence, sexuality, and profanity are a plus for young male viewers. But for female audiences, they can provoke extremely negative reactions. Several of my girlfriends, all in their 20s and not a "Sex and the City" fan among them, went to see "Sin City" on the basis of the interesting visuals and strong cast. And they almost uniformly panned the film for its over-the-top scenes of cannibalism, castration, dismemberment, and other such delightful activities.
The important thing to remember is this - comic-book films don't have to be niche films, but that requires thinking outside the box and finding ways to appeal to audiences that aren't traditionally comic-book fans. If comics are truly a medium rather than a genre, as many an indignant fanboy will insist, maybe it's time to give the comic-book films a chance to grow up a bit and prove it.
Now, I'm a woman, a little older than the target audience for most of these films, but I have a yen for the superhero genre and I've read comics on and off myself over the last decade. Maybe I can offer a few insights into how future comic book films might avoid becoming the next "Kick-Ass" or "Losers." Clearly women watch films based on comic properties, even the ones that are nearest and dearest to traditional comic geeks' hearts. Women went to "The Dark Knight" and "Iron Man" in droves, and "Wolverine" was on the Fandango list of female moviegoers' most anticipated 2009 films right along with "New Moon." I don't see that there are any drastic changes that need to be made to current comic-book action films in order to draw female crowds, but there are some things that the creative types could be doing better, and there are a couple of pitfalls that should be avoided at all costs.
First, and most importantly, male and female viewers aren't all that far off in what they like about comic-book films. Cool visuals? Exciting action scenes? Yes. Appealing, larger-than-life characters? Absolutely. But there needs to be a balance when it comes to these elements. Male action film fans are happy with nonstop action and big effects sequences. Female viewers need more than that. For us, stronger stories, humor, and characters with stronger personalities are more of a draw, and if we sense the lack of them, we lose interest. Films that are sold by simply glorifying violence without any reference to a more concrete narrative, or showcasing any interesting actors or performances, tend to read as female-unfriendly. "Kick-Ass" might have picked up more interest from the girl geeks if it had bothered to let us know there were human beings under the masks. Instead, all we got in the ads were people in silly costumes beating on each other, and some unconvincing audience reactions trying to convince us that it wasn't really as silly as it looked.
Another common mistake is the assumption that having women feature prominently in a comic-book film means female audiences will show up. Some comic-book heroines are great, but more often than not, the emotionally remote, strongly sexualized portrayals of women in comics are geared toward men and can be off-putting to female viewers. Rather, we need to see better female characters - more diverse, more involved, more active and well-rounded female characters who girls and women would actually want to identify with. Some of the comic titles with the highest female readership include "Sandman," "Fables," "Love and Rockets," and "Y: The Last Man." They have lots of strong female characters, but few of the busty, idealized superheroine types that grace the pages of "Elektra," "Wonder Woman," and "Supergirl." The biggest mistake the studios have made when trying to push comic properties to female audiences is that they don't understand who these titles are aimed at. "Aeon Flux" and "Catwoman" were mostly watched by guys with the hots for Charlize Theron and Halle Berry, and shunned by the women who felt they were being pandered to.
Some filmmakers seem to think a romantic subplot or a female love interest in an action film is a draw in and of itself. Usually not true. I don't think anyone out there is seeing "Iron Man 2" for Gwyneth Paltrow or her charming banter with Robert Downey Jr. If female viewers respond positively to signs of romance in a comic-book film, it's often because of how it functions in the plot - letting us see the softer, more vulnerable side of the badass heroes when they're not out punching bad guys. Sure, women love the bad boy, but most of them don't want the bad boy 24/7. Of course this can also be accomplished by a little male bonding, by domestic scenes with their parents or kids, or simply humorous moments to deflate the hero's image a bit. Yet romance feels perfunctory in comic-book films these days, often spoken of as a concession to the female viewers. Since many action writers and directors are fairly useless when it comes to portraying romantic relationships, I'd prefer to see fewer of them. Instead, let the leading ladies find other roles in the story to fill - James Cameron and Quentin Tarantino certainly know what to do with them.
Finally, there's the content issue. It's important to keep in mind that sensitive viewers aren't just limited to children - there are people of all ages out there who don't want to see or hear certain things onscreen. I don't have a problem with R-rated content in comic-book films myself, so long as it doesn't detract from the rest of the movie. For instance, I really wanted to like last year's "Watchmen" adaptation. I've read the graphic novel multiple times, roughly once a year since I picked up a copy in college, and I adore it. And Zack Snyder's lavish production had me on board right up until the third or fourth endless slow-motion gore scene, which completely killed it for me. From the constant complaints about lily-livered studio suits I hear from many comic book purists, I get that the visceral thrills that come with graphic violence, sexuality, and profanity are a plus for young male viewers. But for female audiences, they can provoke extremely negative reactions. Several of my girlfriends, all in their 20s and not a "Sex and the City" fan among them, went to see "Sin City" on the basis of the interesting visuals and strong cast. And they almost uniformly panned the film for its over-the-top scenes of cannibalism, castration, dismemberment, and other such delightful activities.
The important thing to remember is this - comic-book films don't have to be niche films, but that requires thinking outside the box and finding ways to appeal to audiences that aren't traditionally comic-book fans. If comics are truly a medium rather than a genre, as many an indignant fanboy will insist, maybe it's time to give the comic-book films a chance to grow up a bit and prove it.
Monday, April 26, 2010
A Post About Michael Sheen
To go along with last week's post on Amy Adams, I thought I'd look at another actor with a similar career arc - Michael Sheen, a Welsh actor best known for portraying Prime Minister Tony Blair in "The Deal" and "The Queen," and the David Frost to Frank Langella's Richard Nixon in "Frost/Nixon." What struck me about his recent film roles is the clear divide between his work in British and independent films, and his work in mainstream American films. Watching only one or the other would give viewers completely different impressions of Michael Sheen.
On the one hand, there's the Michael Sheen who's been doing films like "The Damned United," where he played famed football manager Brian Clough, and showed up in the ensemble of "My Last Five Girlfriends," a romantic comedy. He'll be returning to the role of Tony Blair later this year for "The Special Relationship," the third of Peter Morgan's films on the former Prime Minister, and will also make an appearance as a terrorist villain in "Unthinkable," an indie suspense thriller. These are all fairly typical parts that we might expect to see for any up-and-coming leading man or character actor.
On the other hand, you have the Michael Sheen who starred in "Underworld: Rise of the Lycans" as a werewolf freedom fighter, and popped up in "New Moon" as a decadent vampire elder. Both were broad, larger-than-life performances that were a lot of fun, but you'd never guess that this was the same actor who racked up awards for his nuanced portrayals of real-life politicians and media figures. Sheen is also set to play a flamboyant nightclub owner in "Tron Legacy" at the end of the year, though no one has yet explained what a nightclub is doing in the computer innerspace of "Tron."
Despite the prevalence of big, loud action-adventure films on Hollywood studio slates, they're not the only kind of film being made these days. So, it stands to reason that Michael Sheen should be showing up in crime thrillers and mid-range dramas, as Gerard Butler, another relative newcomer, has been. My guess is that these are the same kinds of films that Sheen is already doing in the UK and in smaller productions, so there's no danger of him being pigeonholed into wacky blockbuster roles. And there's no indication that he has any interest in crossing over to become an American star and giving up the visibility he already enjoys in the UK film industry. So while he and Stephen Fry and Alan Rickman will occasionally make appearances in high-profile films like "Alice in Wonderland," for the most part they seem content to make more modest pictures outside of Hollywood.
Lots of other international film stars have the same sort of bifurcated filmographies. Jackie Chan, for example, is primarily known for action comedies like "The Spy Next Door" in the US while doing more interesting, serious films at home in Hong Kong like "The Shinjuku Incident." Gerard Depardieu made a bid for American stardom after the success of "Green Card" in the early 90s, but his subsequent American films were mostly badly-conceived comedies like the awful "My Father the Hero" and "Bogus." He still pops up occasionally for minor supporting roles in films like "The Last Holiday," but enjoys a far more prolific career in France.
What's nice about many major studio productions is that they're increasingly global enterprises, drawing not just from American talent but from film industries and performers around the world. These days an overseas actor who wants global visibility doesn't have to abandon their local film scene and head for Hollywood in order to maintain their standing with Hollywood filmmakers and casting directors. Many do, of course, in order to cash in on more lucrative US-based productions, but it's nice to know that for others, it's no longer a necessity.
And while it's good to have Michael Sheen, the over-the-top genre actor who's always good for a chortle, it's good to have Michael Sheen the dependable British dramatic actor too.
Sunday, April 25, 2010
The Dark Vision of "The White Ribbon"
Michael Haneke is a filmmaker I knew by reputation, but I had never seen any of his films before "The White Ribbon." He's known for playing with the narrative form, using meta-narrative elements in both versions of "Funny Games" and premising "Cache" on the search for a menacing voyeur that may actually be the film's audience. "The White Ribbon," though tackling less sensational material, presents its own unorthodox and fascinating enigmas.
Painters talk about the use of negative space, areas of the landscape defined by an absence of content. In essence, this is how the narrative of "The White Ribbon" operates. It is a film that exists in reaction to events that are never addressed directly. The story seems to follow the formula of a mystery at first. Set in a small German village just before World War I, we learn in the opening scenes that someone set up a trip wire that felled the town doctor from his horse. The son of a rich man is severely beaten by persons unknown. In each case, groups of children were present just before or just after the misfortune. We never see the acts of aggression and violence directly, except for a few small, telling scenes that appear innocuous out of context. Rather, it's the reactions of the puzzled, frightened adults that feature most prominently.
Shot in simple black and white cinematography, with most of its scenes centered around long conversations and domestic situations, "The White Ribbon" feels like a much older film, perhaps something Ingmar Bergman might have made in one of his darker periods. The knowing, almost willful self-awareness of the narrative, however, is decidedly modern. The result is a film of wonderful menace, a picture of a pastoral, peaceful-looking little piece of the German countryside where unspoken oppressiveness and fear permeate everything. The characters seem agreeable and kindly at first, though the product of an older, more rigidly traditional era. The adults are mystified at the events that occur, yet also strangely accepting of their occurrence. The children seem almost monstrous when hints emerge that they may be responsible, until we begin to see their interactions with their parents, and the way the adults interact with each other.
The rich baron is landlord and employer to most of the village, and has near-tyrannical power over the laborers who work for him. The pastor is strict and unyielding in his faith, and has perhaps unreasonable expectations. And the town doctor and the midwife seem perfectly respectable in public, but have disturbing secrets to hide. All of these characters have children, and all of them suffer misfortune through their children – some are victimized, and perhaps some are responsible for the victimization. Our central protagonist is the young schoolteacher who suspects something is terribly amiss with the children, but as an outsider he is continually thwarted and undermined by the village's authority figures.
No explanation is offered for the crimes that take place, though it's clear by the end of the picture who is responsible. Several key events are left unresolved, such as the sudden disappearance of the midwife, who may have learned the truth about what happened. There is no need to suss out the details, however, because they aren't the important part of the story. Haneke explains through the film's narrator that the events of the film are meant to provide some insight into subsequent events, alluding to World War I. This suggests that "The White Ribbon" is an allegorical examination of the origins of the attitudes and cultural shifts that led to the war. I think it works on a far broader level, as a look at how moral and social values are transmitted from generation to generation, sometimes inadvertently.
The film's central image is the white ribbon tied around the arm of a child who is deemed to have misbehaved, supposedly a reminder of inner purity and the potential for salvation. Instead, it's a mark of something far more insidious being instilled by the well-meaning parent, an unspoken threat that is ultimately delivered back upon the originator. In that sense, it's a far more terrifying sight than any act of violence that Haneke could have shown us. And in the end, it seems that the film is not a mystery story at all – it's a horror picture.
Saturday, April 24, 2010
The Best Films of 2009 List - A Work in Progress
End of the year lists and awards always present so many little aggravations. I like watching or reading them, and I like discussing them, but I never feel like I can properly make my own because I haven't watched most of the films. The lists are especially hard to consider because I feel like I have to put in so many caveats. I used to end up with lists of "Best Films of the Year That I Actually Saw This Year" or "Best Films of the Year That I've Seen So Far." It grates because I'm a completist. I like getting a comprehensive look at the array of films that are actually out there before I commit to anything. And I like having lists of the "Best Films of the Year" actually be lists of the "Best Films of the Year."
I'm a film nut with no ties to the industry, no access to critics' screenings, and my budget for movie tickets is pretty modest. Fortunately I live in a major metropolitan area with a lot of independent and art-house theaters so I have access to a wider variety of films than most. But as the scheduling for smaller films is always unpredictable, and the cost of movie tickets quickly adds up, I end up seeing a lot of stuff on DVD. I always end up finding out about some foreign gem or other six months after the theatrical run has passed. I don't think home viewing can ever match the theater experience, but I'm not a stickler either.
The problem with waiting for DVDs, though, is that the gap between a film's creation, its domestic theatrical release, and its DVD release can be very, very long. Exhibition windows may be shrinking for mainstream studio films like "Alice in Wonderland," but smaller films often have labyrinthine distribution deals and can take much longer to work through the system. The Danish war film "Flame and Citron," for instance, played film festivals in 2008 and had a Stateside release in July of 2009, but didn't come out on DVD until only a few weeks ago. Then again, some smaller films all but skip the theatrical exhibition window and go straight to DVD, like the troubled "The Maiden Heist," a Christopher Walken heist picture that never secured theatrical distribution.
But then there are the categorization problems. The Jet Li martial arts epic "Hero" had a very small-scale theatrical run at the tail end of 2002 to qualify for awards, and snagged a Best Foreign Film Academy Award nomination for the trouble, but was practically inaccessible to the general public until it saw a wider release in 2004. So is "Hero" a 2002 or 2004 film? Does it make sense to wait on completed films that have been delayed or shelved by their distributors? I've seen lists where critics will include older films that have undergone restorations or that received no official release when they were originally completed. Is it fair to compare the 2006 reissue of "Army of Shadows," a Jean-Pierre Melville film that was made in the 60s, with contemporary features?
So far I've seen about sixty films from 2009, mostly studio pictures and a handful of the bigger foreign films. I could make a top ten list based on those titles, but there's the final, nagging question – how many films do I need to see from a given year before I've seen enough to make a good assessment? There's always that possibility that the one film I didn't see will be my favorite of the year. It's happened before, with "The Devil's Backbone" in 2001.
What I do now is create preliminary lists of films to see for any given year, and don't make my "Best of the Year" list until I've checked them all off. On the current list are "My Son, My Son, What Have Ye Done?" and "The Secret in Their Eyes," which are still in the middle of their theatrical runs, and "Agora," which won't be in theaters until next month. So, I expect that my "Best of 2009" list should be ready around spring of 2011.
Friday, April 23, 2010
Comedy Central Caved, But "South Park" Still Won
I was a late convert to "South Park." By the time I had cable and had worked up enough curiosity to watch a few episodes, the initial furor over the cartoon's content had long since died. "South Park" and its parcel of pint-sized reprobates didn't win me over right away. I'm not a fan of the toilet humor Matt Stone and Trey Parker gleefully indulge in, and I did find some of the material pretty offensive - but it wasn't as offensive as I was led to believe. More importantly, the show was smart, well-written, funny, and fearless. When it got crass and mean, there was always a point to it. The satire was vicious, but it was also insightful, timely, and provocative in the best way. The creators had their leanings, but they went after everybody, left and right, young and old, famous and infamous.
Which brings us to the controversy that's making headlines right now. "South Park" is in trouble again, after an Islamic group delivered ominous warnings about the planned appearance of a cartoon caricature of the Islamic prophet Muhammad in Wednesday's episode. Comedy Central was so concerned that it blocked the image of Muhammad with black boxes, even when the character was concealed in a bear suit, then bleeped out all mentions of his name and cut more referential dialogue for good measure. I haven't seen the episode in question, or most of the current season since I drop-kicked my cable subscription, but I remember the last time the "South Park" guys tried to put the image of Muhammad on the air, after the Danish political cartoon fracas. Comedy Central didn't let us see him that time either, though it refrained from the heavy bleepery.
The reactions to the decision have been predictably mixed. Parker and Stone revealed that they delivered an uncensored episode to the network and that the decision to censor came from Comedy Central's executives. Many sympathize with the creators, pointing out that this could set a worrying precedent, allowing religious fringe groups to curtail speech. There have also been charges of hypocrisy, as Muhammad already appeared briefly on the show years ago in a group of major religious figures that included Buddha and Moses. Others point to the murder of director Theo van Gogh after a similar controversy, and suggest that the network had no choice but to censor in the interest of protecting itself and the creators. In January, over four years after the publication of the Danish Muhammad cartoons, one of the cartoonists was attacked by a radical Islamist in his own home.
Yet it's difficult to imagine this kind of controversy happening with any other show on any other network. The United States has stronger legal safeguards on freedom of speech than most countries, but tends to be more restrictive in actual practice, especially with speech in the media. Comedy Central is one of the few major television networks that occasionally allows uncensored speech and other button-pushing content in its programming. "South Park" is one of Comedy Central's flagship shows and has pushed the boundaries of acceptable content since day one. Its creators have been called racist, sexist, blasphemous, un-American, and not funny. Provocation is the show's modus operandi, but it's only been able to push as far as it has thanks to a network willing to take risks and sustain outrage from everybody from the Scientologists to Steven Spielberg.
The fact that Comedy Central backed down and acknowledged that its boundaries exist isn't as important as the fact that the "South Park" guys are still pushing them. And this is why "South Park" consistently makes headlines, even after more than a decade on the air. It has never reached the popularity or widespread acceptance of "The Simpsons," but "South Park" has never lost its edge and has stayed culturally relevant at least twice as long. It would have been easy for Parker and Stone to rest on their laurels when celebrating their big 200th episode last week, but instead they reignited an old controversy, and the show is back in the middle of a raucous free speech debate.
Now that's the way to honor "South Park."
Thursday, April 22, 2010
I Want an "American Idiot" Movie
Theater fans have been trumpeting the arrival of the new "American Idiot" musical on Broadway, after a successful run at the Berkeley Rep. Based on the 2004 Green Day concept album that catapulted the band back into the spotlight after some lean years, the stage version has been getting raves from just about everybody and is already being talked up as the show to beat at this year's Tonys.
Popular music tends to elude me, as I am perpetually behind the times and I have trouble matching up bands and singers to the songs coming out of the radio. But I remember "American Idiot" roaring onto the scene at the height of the second Iraq war, encapsulating the moody zeitgeist of Gen Y in the millennial era of Dubya and WMDs. I knew Green Day vaguely from my teens as the hot punk band of the moment, roughly when I was a sophomore in high school. "Good Riddance" was inescapable at graduations and end-of-the-year functions for a while, and several of my friends were madly in love with the band members.
It actually took me a while to connect the scruffy rockers of my youth with the scruffy-and-slightly-older rockers behind "American Idiot." They didn't sound especially different from their "Dookie" days, but there was suddenly a message coming in loud and clear, a frustrated, disillusioned refrain being shouted back against the war, the establishment, and the media, backed up by fevered guitar riffs. And when the album wasn't hollering it seemed to be drawing out our need to mourn, with the melancholy "Boulevard of Broken Dreams" topping the charts for months. It was the perfect soundtrack to the burgeoning discontent of the times, akin to Coldplay's "Viva La Vida" playing Wall Street day traders off into the recession.
Listening to the album and watching the videos, the thought of a movie version popped into my head more than once. Some of it was due to the similarities between "American Idiot" and other concept albums like The Who's "Tommy" and Pink Floyd's "The Wall," which were converted into cinematic visions of freaky and curious delight. Mostly, though, I think it was because there were so many tantalizing hints of something bigger in the works. The "Wake Me Up When September Ends" video featured Jamie Bell and Evan Rachel Wood in essentially a mini-movie, and there were constant rumors about a mystery project that would expand on the album.
And what can I say? I'm a movie nut. I don't know much about popular music, but I love films built around it. Rock operas and rock musicals especially fascinate me, everything from "A Hard Day's Night" to "Hedwig and the Angry Inch." The verve and vigor of even the most flawed projects, like "Velvet Goldmine" or "Rent," never fail to get me excited. I wish we had more of them, and I wish our best musicians would bring their material to the screen more often – not just with biopics and concert films, but full-fledged creations conceived for film.
"American Idiot" has so much potential to cross over to other media, not just because of its themes and messages or its strong evocation of a particularly fractious point in time, but because it has a different point of view. Rock musicals we've seen before on Broadway, but a punk rock musical? A Green Day rock musical? That's a new generation and a new kind of beast rearing its head, and I'm not surprised that the critics are eating it up.
But while I think an "American Idiot" stage musical is a great thing to have, and I hope it has a long, happy run, I want a movie version. I want it on celluloid, distilled, refined, and flickering in the dark. I want those green-filtered images from the music videos expanded, blown up, and given more and more dimensions until they're spilling over the edge of the frame. They could shoot it in 3D – I hear that's in these days. But I want to see the moment captured on film, before it's too far behind us. I want an "American Idiot" movie.
And I want a Gorillaz movie too, but that's a post for another day.
Wednesday, April 21, 2010
For the Love of the Franchise
I feel the need to clarify my position on franchise films, after the post on my misgivings about "The Avengers" the other day. I love good franchise films, but I think they're hard to pull off, and the handling of some of the more popular ones frustrates me to no end. By franchise, I mean films built around the same major characters – your "Spider-Man," "Planet of the Apes," and "Star Wars" films. The actors are not necessarily the same, and the individual entries don't always progress in a linear fashion from picture to picture, but the audience knows they're getting the same universe with certain common rules and tropes. There are so many variations and incarnations of the oldest franchises, like "Sherlock Holmes" and "Tarzan," that we don't even think of them as franchises anymore.
Arguably the most successful modern franchise is "James Bond," a great perennial that's survived multiple decades and six actors in the title role. It was announced at the beginning of the week that the latest film, the twenty-third, was being put on indefinite hold while MGM's financial woes get sorted out. A frustrating development, perhaps, but where this could spell doom for other properties, like Peter Jackson's "Hobbit" films, "James Bond" has survived far worse - the six-year gap between "Licence to Kill" and "GoldenEye," the fickle whims of George Lazenby, dueling official and unofficial "Bond" films in 1983, and several subpar outings like the bizarre "Die Another Day." Even if MGM's death throes go on for years, they won't kill the franchise. "Bond" is practically bulletproof.
Not every franchise follows the "James Bond" template, though many of them try. "Harry Potter," by contrast, is a rare, wonderful example of a film series with a single story that has progressed in sequence over the course of six films to date with two more on the way. Maintaining consistency by retaining nearly all of its core actors, and rarely straying far from its source material, the "Potter" films make up a remarkably cohesive whole. Similar to the "Lord of the Rings" trilogy, the franchise is finite by design, though Warner Brothers did manage to squeeze out an extension by splitting the last film into two installments.
But what would have happened if the first "Harry Potter" film had stumbled out of the gate? Or the fourth? In Hollywood a director is only as good as his or her last picture, and the same holds true for franchise films. It seems like the instant a big corporate-branded picture runs into trouble these days, the studio executives run for the reboot or spinoff buttons. In some cases franchises do run their course, like the "Star Trek" films of the late 90s, which saw interest and quality drop off with each successive entry. After a much needed break, the series rebooted last summer into one of the biggest hits of the year.
Increasingly, though, there are cases like "Spider-Man." Everyone seems to agree that the latest film was a misstep, though it made millions at the box office and the series certainly has enough goodwill in reserve to carry on for another picture or two. But Sony, deciding the damage was done, scrapped the entire existing series, including three potential future films. The plan is to start over with a reboot in 2012, a scant five years after "Spider-Man 3" and barely a decade after the first "Spider-Man." There have been reboots of flops before, like the "Hulk" and "Punisher" films, but to remake a massively successful film series while it's still fresh in the public consciousness is tantamount to self-cannibalism.
Franchises are so prevalent right now because they're stable brands for film companies, easily marketable and predictable performers. It's easier to sell a familiar superhero like "Batman" than it is to sell a crime thriller starring an A-lister, so sustaining and exploiting high-profile franchises is a high priority. Unfortunately, the alchemy is never as easy as it looks. Tales of the multiple attempts to relaunch "Superman" as a franchise have been far more entertaining than the films themselves. The itch to continuously capitalize on proven successes like "Spider-Man," however, has led to troubling decisions like rushing into the planned reboot. With so much on the line financially, it's no wonder that execs get nervous at the first sign of failure and try to cut their losses and move on in a different direction. Of course, reboots aren't the only option these days. "X-Men" spun off "Wolverine," "Terminator" and "Star Wars" explored prequels, and there's the massive Marvel Films crossover experiment in the works.
In most cases, I think the best antidote to a bad franchise film is time. Sony has just announced a new sequel to the 1997 film "Men in Black," which spawned the awful "Men in Black II" in 2002. After eight years, the embarrassing particulars of the second film have mostly faded, though the original remains popular enough that viewers are still familiar with the property. A new film after all this time is a gamble, but it's worked before for "Indiana Jones" and "James Bond." Even dear old "Rocky" had a great last hurrah. I wish the execs would keep that in mind, since I think the Sam Raimi "Spider-Man" films and an awful lot of other stymied franchises out there still have a lot of mileage left in them.
Tuesday, April 20, 2010
A Post About Amy Adams
It's rough being a movie star today, with so many features leaning on CGI effects where they once banked on the name recognition of A-listers like Tom Cruise and John Travolta. Star vehicles are fading quickly, being replaced with franchise juggernauts. Only a handful of actors and actresses really attract enough attention to sell a film by their presence alone anymore, and the lucrative production deals that they once commanded are a thing of the past.
The movie star era has been in decline for a while now, with no new names coming up to take the places of Meryl Streep and Will Smith, though there is never a shortage of contenders. Take Amy Adams, for instance, who won critical accolades and her first Academy Award nomination with "Junebug" in 2005 and made a mainstream splash with the much-adored "Enchanted" in 2007. You couldn't ask for a more promising leading lady. And to nobody's surprise, she has gotten a decent amount of work as a result of her recent successes, but it's depressing to see how little that means these days.
Looking at Adams' pictures produced after the release of "Enchanted," we've got "Doubt," "Night at the Museum: Battle of the Smithsonian," "Julie and Julia," and "Leap Year." All mainstream projects, all relatively successful, and all a total waste of her talents. Admittedly I haven't seen "Leap Year," but it fits the mold of all the other female-friendly, lightweight romantic comedies that tend to get released in late winter and spring. Every young cinema debutante has had a brush with them at one point or another. Alas, Kate Hudson and Jennifer Lopez never escaped their treacly pastel clutches.
"Doubt" was a well-regarded prestige picture based on the Tony Award winning play, where Adams had a small supporting part and garnered a few good notices and a surprise Academy Award nomination. Compared to the meatier roles of Meryl Streep, Philip Seymour Hoffman, and Viola Davis, however, she was easily overshadowed. The "Night at the Museum" sequel, which opened in the summer of last year was much more high profile, but Adams' role as an Amelia Earhart waxwork brought to life was brief and didn't leave much of an impression. Playing the love interest in a CGI-heavy effects picture, she didn't have much to do besides run around after Ben Stiller and shout one-liners at appropriate moments.
What should have been Amy Adams' big role of the year was Julie Powell in "Julie and Julia," the double-memoir of a culinary blogger and the legendary chef Julia Child. A late summer release for blockbuster-weary grown-ups, directed by Nora Ephron, it featured yet another fine performance by Meryl Streep in the Julia Child segments and an absolutely miserable one by Amy Adams in the Julie half of the picture. The problem wasn't the actress, but the character, an overwrought, self-obsessed young woman who spends much of the film's running time whining incessantly about how awful her picture-perfect life is.
The sad thing is, these roles are really about the best that mainstream Hollywood has to offer to young actresses these days. Neurotic romantic heroines, second bananas to Meryl Streep, and action movie love interests are par for the course in major studio films. A step down from there, and you're looking at thankless wife, mother, or girlfriend roles. It's only in the indies, art house pictures, and horror films that you see strong parts for women, the kinds of parts that are necessary to make lasting movie stars.
But then I look at Sandra Bullock, who is about as big and popular as it gets these days, and all of the dull, awful romantic comedies she's made, and how few really great roles she's had, like Leigh Anne Tuohy in "The Blind Side," and I have to shudder inwardly. Is it even worth it to try for mainstream success anymore if this is where it leads? The paychecks must be nice, but it's no wonder that so many of our best film actresses, like Glenn Close, Holly Hunter, Toni Collette, Laura Dern, and Sally Field, have decamped for television screens. Or simply disappeared, like Debra Winger.
Next up for Amy Adams, it's back to the indies with David O. Russell's "The Fighter," where she's playing Christian Bale's love interest, and there has been talk of an "Enchanted 2" project at Disney. I wish her the best of luck.
Monday, April 19, 2010
The Ignominious and the "Inglorious"
A gentleman of advancing years, upon seeing the trailer for "Inglourious Basterds" one evening, leaned over in his theater seat and told me in no uncertain terms that the film looked like a piece of garbage. At the time I did not disagree with him, as this gentleman is a dear friend of mine, and has a particularly colorful history, including intimate knowledge of modern warfare in all its various forms throughout the years. I understood his objections. The trailer didn't do much for me either, leaving a vague feeling of embarrassment and unease at the thought of Quentin Tarantino mining WWII for splatterfest material. Those fears were mostly allayed by the positive reviews that accompanied the film's release and the smattering of kudos during awards season.
However, I couldn't help thinking of my friend's reaction to the trailer as I was watching "Inglourious Basterds." Thankfully the gory excesses of the title characters alluded to in the trailer were brief, but they were still intrusive. "Basterds" was sold as a pulpy, over-the-top action picture akin to Tarantino's last features, "Death Proof" and "Kill Bill." The anticipation of orgiastic violence hung over every frame, a distraction from what turned out to be a fairly restrained outing for the director, even with a fiery finale where much of the Third Reich is blown to smithereens. Intimate, dialogue-heavy scenes far outnumbered the broader action sequences, and anachronistic flourishes were relatively few.
Still, there's no getting around the major liberties Tarantino took with historical fact, and by the end of the movie, "Inglourious Basterds" resembles nothing so much as WWII alternate-universe fanfiction. The characters exist in a recreation of the cinematic depictions of WWII in '40s period films, rife with Tarantino's trademark homages, rather than the actual reality of the time. I had a hard time with the tone, which often shifted from farcical to dead serious and back again, sometimes in a single scene. The film opens with a suspenseful, utterly serious encounter with the villain Hans Landa, quickly followed by the Basterds' introduction from the trailer, in all its cheerfully pulpy excess.
And what to make of scenes where relatively realistic characters like Diane Kruger's Bridget von Hammersmark share the screen with wild caricatures like Brad Pitt's Lt. Aldo "The Apache" Raine and Eli Roth's bat-wielding Sgt. Donny "The Bear Jew" Donowitz? Tarantino wants to have it both ways, the exaggerated stylization and the genuine sentiment, and ends up with a picture that quick-steps around heavier themes and feels devoid of depth and consequence. This works for lighter fare like "Zombieland" and "Tropic Thunder," but "Inglourious Basterds," which evokes the weight of so much history, often comes off as dreadfully flippant with its material, just as my dear gentleman friend had feared.
In spite of this, I have to say the movie was well-made and parts of it were very entertaining. The predominantly European cast was excellent, especially Christoph Waltz as Hans Landa and both of the female leads. It took me a while to figure out what Brad Pitt was doing, but once it clicked that he and the other Basterds were Tarantino's amped-up answer to the Dirty Dozen, and meant to act as a counterweight to the film's hysterical Hitler, I got the joke. I just wish it didn't make me wince so badly.
A few days ago I watched another WWII film from last year, the Danish "Flame and Citron," about two resistance assassins operating during the Nazi occupation of Denmark. Like "Inglourious Basterds," it had plenty of intrigue, violence, and visceral thrills, but I found "Flame and Citron" much more satisfying. It couldn't match Tarantino for style or wit or dazzle, but it did present its material directly and candidly. And in the end I thought it had much more to say.
Sunday, April 18, 2010
"Kick-Ass"? Not at the Box Office
I do enjoy schadenfreude in the morning.
Early estimates of this weekend's box office numbers have come in, and the two films that are neck and neck for the number one spot are "Kick-Ass," which opened Friday to mostly positive reviews but underperformed, and "How to Train Your Dragon," the DreamWorks animated film that is in its fourth weekend of release. Both films pulled in roughly $20 million apiece. "Dragon" spent the last two weekends in third place, behind "Clash of the Titans," "Date Night," and the latest Tyler Perry movie, "Why Did I Get Married Too?" The industry folks are in an uproar, because nobody expected this to happen. No film in recent memory has come back after so many weeks to challenge what was supposed to be a major release.
What's striking is that "Dragon" hasn't experienced some sudden resurgence in popularity. Despite a soft opening, it's retained modest but steady numbers from week to week while the other, subsequently released films have done big numbers upfront and faded away quickly. It's doing almost the same business that "Monsters vs. Aliens" did last year, actually trailing about $5 million behind what that film had generated at this point in its release, though "Dragon" has much smaller audience drop-offs from week to week and may end up surpassing it. Also in its favor is the lack of competition for its audience. Aside from "Furry Vengeance," the dreadful-looking Brendan Fraser comedy coming next week, there's nothing on the schedule for the family audience until "Shrek 4," the next DreamWorks animated film, debuts on May 21st.
On the other side of the equation, the totals for "Kick-Ass" didn't match early estimates that pegged it as taking in $25-30 million, mostly based on the hyped-up early response of the film's intended audience of young adults. Most analysts are now looking at the film's "R" rating and the recent controversy over the film's content as the culprits. While most critics praised the film, almost all did so with reservations and strong warnings to parents about the graphic violence and profanity. A few, like Roger Ebert and A.O. Scott, went further, addressing the controversy directly and explaining their objections in greater detail.
My pet theory is that the film's marketing did the most damage by studiously avoiding any mention of the film's copious profanity and ultraviolence. Without them, "Kick-Ass" looks like something aimed at a much, much younger audience. All the ads focus on the characters in brightly colored superhero costumes, reminiscent of something from the Disney Channel. A major point that most commentators glossed over in the content controversy was that many people assumed the film was intended for children because of its comic premise and visual style. Nobody blinked an eye at foul-mouthed tots in films like "Role Models" because those are clearly intended for adult consumption. "Kick-Ass," by the creators' own admission, was meant to blur the line, and blur it did.
There are still a few hopefuls suggesting that "Dragon" had inflated numbers because the kids were buying tickets in order to sneak into "Kick-Ass." Even if this were true, it still highlights the high systemic hurdles facing any film trying to woo teenagers with adult content. Some predict "Kick-Ass" will benefit from good word-of-mouth and overcome its opening weekend numbers, or that it will find an audience on DVD. It's clear, however, that "Kick-Ass" will not be the major hit that so many of the fanboy sites were giddily anticipating. Frankly, the dismissive attitude of many of the film's champions when confronted with questions about the film's provocative elements should have raised alarm bells early on. Fanboys can be terribly myopic and tend to forget they only comprise a fraction of any film's potential audience.
Lionsgate and Matthew Vaughn, who self-financed "Kick-Ass," should make a tidy profit in any case, but it doesn't look good for future comic-based properties with similar conceits. Instead, animation fans and animators are having a very nice day.
Saturday, April 17, 2010
Episode I - A New "Who"
My last couple of entries have been pretty negative. Let's switch gears.
"Doctor Who" is back after the fireworks of David Tennant's departure from the title role three months ago. The new season brings us a new Doctor, a new companion, a new Tardis, a new title sequence, and even a new logo for the show. That last one is a major improvement right off the bat. I won't be missing the radioactive surfboard. Hello DW/Tardis icon!
It's difficult to judge the new actors and the new creative folks behind the cameras after only one installment, but if the rest of the season is like the premiere, "Doctor Who" is in safe hands. I expect that Matt Smith as the Doctor and Karen Gillan as his new companion, Amy Pond, are going to take a few episodes to settle into their characters. But going on very preliminary impressions, Matt Smith should be fine. He's got the energy without being as manic as Tennant, the ability to shout ridiculous lines of dialogue without any hint of irony, and he's a little odd looking, which endears him to me immediately. I'm not sure if he can go to the dark and serious places that "Who" sometimes explores, but I look forward to watching him try. Karen Gillan is a lovely, bright presence, though the little girl who played the younger Amy upstaged her. I was actually a little disappointed that the seven-year-old couldn't be this series' companion.
Of the writing, I'm more apprehensive. There's no question that Steven Moffat, the new head writer, has turned in scripts for some of the best hours of the new "Doctor Who" so far, winning three Hugos in succession for them. I loved the earlier episodes he did, but was severely disappointed by the "Silence in the Library" two-parter from the last season. It had a very uneven tone, had no clue what to do with one of the major characters, and recycled several earlier ideas. Moffat has a tendency to use the same themes and little gimmicks in his stories, such as the effects of time dilation, creepy repeated phrases, monsters who follow rules, and wistful romances. Some of these work better than others. For instance, I never get tired of his exploration of time paradoxes and time dilation, which is one of the most fascinating elements of "Doctor Who" that the other writers sometimes forget about. Conversely, though all of Moffat's episodes have had their little moments of humor and lightness, his tone is darker than that of Russell T Davies, the previous head writer, and tends toward the somber and thoughtful. I worry that we'll lose some of the fun and kitsch of "Doctor Who" if Moffat steers the series too far into dramatic waters.
The premiere episode was fine, with a lot of good one-liners and a lot of breakneck action in the midst of a very sweet time-travel story. A few semi-spoilery thoughts: Though budget cuts for the series were reported, it doesn't show in the production quality - at least not yet. The introductions to the Doctor, to Amy (twice), and to the new Tardis were all handled very well - though I did have to roll my eyes at the end of the episode when it was revealed that the new companion is yet another runaway bride. I'd have thought Donna Noble would have filled the quota for a very long time. The peripheral characters didn't make much of an impression, and Amy's familial and romantic relationships haven't been well defined yet. The bulk of the running time was wisely spent acquainting us with the new leads, and cementing their relationship. It seems to have worked, because the buzz around the series has been very positive so far and I'm certainly sold on them.
Looking ahead, we've got a few familiar faces coming back from the earlier Moffat episodes, including River Song from "Silence in the Library" and the supremely creepy Weeping Angels from "Blink." So far Moffat's avoided using any of the usual "Doctor Who" villains like the Daleks or the Cybermen, but previews have confirmed they'll be showing up later in the series too. It's good to have the Doctor back. I know it's only been a few months, but it still felt like he was away for far too long.
Friday, April 16, 2010
"The Avengers" - I Have a Bad Feeling About This
The headline popped up on Google News on Sunday evening: "Joss Whedon to direct 'The Avengers'?: Existence of God no longer in doubt." The Entertainment Weekly proclamation was par for the course for most reactions to the news that Whedon was reported to be at the top of the shortlist of possible "Avengers" directors. A few scant hours later, more buzz when it was announced he might also rewrite the Zak Penn "Avengers" script, and the new "Captain America" for good measure. Euphoria abounds in Geekland. My reaction was somewhat different. I have nothing but goodwill towards Joss Whedon, genre television guru, comic book scribe, and the guy who birthed "Dr. Horrible's Sing-Along Blog." But I am deeply, deeply skeptical about "The Avengers."
For comics naifs, The Avengers was the Marvel superhero team comprising several of their most popular headliners: Captain America, Iron Man, The Hulk, Thor, and second stringers Ant-Man and The Wasp. Marvel Studios is planning a film version that would be a massive crossover of four of their major film properties: "Iron Man," "The Incredible Hulk," and the yet to be released "Captain America" and "Thor." They've already planted little hints in the movies released so far, quick cameos of characters from the other franchises to establish a common universe. But as Erik Davis over at Cinematical has already pointed out, a full-scale "Avengers" film is a risky, logistically daunting operation that has the potential to massively backfire on Marvel Studios.
The success of "The Avengers" is going to depend on several factors, most crucially the performance of two films that are far from completion. What happens if "Captain America" or "Thor" don't do well? At first glance, both productions look solid, but then so did Ang Lee's notorious "Hulk" with Eric Bana and Jennifer Connelly before its release. "Thor" is being helmed by Kenneth Branagh, who isn't exactly known for big effects films, and made a spectacular bungle of his last directing gig on "Sleuth." "Captain America" will be directed by Joe Johnston, a dependable veteran of genre pictures like "The Rocketeer," but who is also coming off of this year's most unfortunate underperformer, "The Wolfman."
And then there are the existing films to consider. The rebooted "Incredible Hulk" that came out shortly after "Iron Man" did mediocre business, but the numbers were good enough to ensure the big green galoot a spot on the "Avengers" roster, as confirmed by Marvel's president Kevin Feige back in January. A sequel has been nixed, however, so the question will be whether audiences will still remember Louis Leterrier and Edward Norton's version of the Hulk in 2012. "Iron Man" has been a runaway success, with a hotly anticipated sequel due in less than a month. On the other hand, there are already mild rumblings of discontent from a few corners that "Iron Man 2" couldn't really tackle any major story arcs that would require multiple films to resolve, as other franchises have done, since they had to keep a clean slate for "The Avengers." In this sense, the project could end up holding back the most successful Marvel film properties.
I haven't even gotten to the challenges of the potential "Avengers" film itself, which is going to have to juggle multiple main characters, who may all come from the Marvel universe, but inhabit very different corners of it. Thor has mythological underpinnings and fantasy concepts, Captain America is historical fiction with deep roots in World War II, and Iron Man is pure tech-driven sci-fi. The sensibilities of these films will be necessarily different, so there is the very real danger of the characters and their separate movie-verses becoming diluted, diminished, or simply clashing once combined. A Superman/Batman crossover film has been discussed for years, but no one has ever come close to putting those two together because of these kinds of concerns. Now Marvel wants four different properties sharing the same screen?
Films about superhero teams have usually been a mixed lot. The "X-Men" and "Fantastic Four" series had their ups and downs, but even in the best of them some characters always suffered at the expense of others. There is no guarantee, even if all the upcoming single-hero Marvel films turn out well, and all the tone and story pitfalls are avoided, that four big superheroes – five with Samuel L. Jackson's Nick Fury – are going to play well together. I expect that Robert Downey Jr. is going to end up dominating the picture, as Chris Evans and Chris Hemsworth, our future Captain America and Thor respectively, are relative newcomers. But then there's Edward Norton, who had the rumor mills going overtime with his own rewrites of the "Incredible Hulk" script – Zak Penn's "Incredible Hulk" script, ironically. Oh, this could get ugly very fast.
I'd love it if "Avengers" turned out well, but it really feels like Marvel is pushing their luck with this one.
Thursday, April 15, 2010
Saying Goodbye to "Ugly Betty"
My fling with "Ugly Betty" was fairly quick. I was rooming with a "Betty" fan during the second season, and it was the only show she watched on a regular basis. I got sucked in by the brightly colored decor and girl-power spunk of the premise. America Ferrera, who played Betty Suarez, was a welcome break from the norm in her braces and disastrously uncoordinated outfits. Also, I was struck by the resemblance between Eric Mabius, who played Betty's boss Daniel Meade, and my own supervisor at work at the time.
Aside from a very brief "General Hospital" phase in high school, I was never much for soaps. And that's what "Ugly Betty" was - a primetime soap. More specifically, it was an adaptation of the Colombian telenovela, "Betty La Fea," one of several international versions that have been hits around the globe. The American version encapsulated several rising trends of the time - "The Devil Wears Prada" fashionistas, Latino and openly gay characters emerging in the popular culture, and a nice little backlash against increasingly emaciated female body images.
During the first season, it was hard to escape "Betty-mania." The critics loved it, and the show collected heaps of awards at the Golden Globes and the Emmys. It attracted a respectable number of viewers, enough to help ABC put up some real competition on Thursday nights against "Survivor." America Ferrera and Vanessa Williams, who played the deliciously evil magazine editor Wilhelmina Slater, were everywhere. But then the love disappeared just as quickly. The show's subsequent seasons steadily lost viewers and went through several different timeslots until it was finally canceled this year.
For my part, I appreciated the show more than I enjoyed it. After a season of watching Betty juggle potential love interests, foil Wilhelmina's plots against the Meade family, and bail loved ones out of trouble, my interest petered out. My roommate and I split ways, and I quickly lost track of the show. I had the same peeves about "Betty" that I had with most soap operas - lack of resolution, repetitiveness, and very slow plot progression. I watched the finale last night and was caught up in about two minutes. But credit where credit is due - it was by far the most entertaining soap I'd watched in recent memory.
I got a real kick out of seeing successful career women at the forefront of many storylines, battling each other over work as well as men. I loved that everyone had some ambition, from Betty's sister Hilda trying to become a hair stylist to Wilhelmina's catty minion Marc turning up in the same editors-in-training program as Betty. And best of all, the writers weren't scared of going for laughs. The humor was always spot-on, often playfully winking at the show's ridiculous plot twists and oddball logic. And it helped to enliven the obvious messages of tolerance and diversity that were always at the heart of "Betty."
It's only been four short years, but "Ugly Betty" already feels like it was made for a different era, when print publications weren't being downsized left and right, and the recession hadn't upended every twenty-something's job expectations and career prospects. Nor is it the only multi-culti comedy on the air anymore, with ABC's "Modern Family" and FOX's "Glee" featuring similarly varied casts of characters. Though the creators of "Betty" apparently had a five-year plan for the series, it feels right that it's ending now, closing out the cultural moment that spawned it.
I watched the finale last night and was glad to see Betty got a proper sendoff, finally achieving closure for several of the show's storylines and letting most of the characters find their happy endings. Rumors are already floating about a feature film follow-up. As an alternative, I'd love to see a new series about Betty's adventures in London. For now, though, I'm happy to know "Ugly Betty" made good on its premise. The braces came off and the wardrobe got an upgrade, but the biggest change to Betty Suarez after four years was her attitude and confidence. All grown up, with her own editing gig, and calling her own shots, Betty is an ugly duckling no more.
And in the show's final shot the title card followed suit, with the "Ugly" fading away at long last to leave "Betty," ready to take on the world.
Wednesday, April 14, 2010
My "Golden Palace" Epiphany
I once had an epiphany while watching a bad sitcom - I'm not sure how relevant this is to media, or to anything really, but it's one of those little things that's stuck out in my mind for about twenty years now, so I thought I should write it down.
This needs a little context. I was a pretty sheltered kid, kind of a homebody, and very close to my family. Nonetheless, we didn't talk about certain subjects, sex being the one most relevant to this little trip down memory lane. I'm one of those poor, tragic souls who had to look up sex in the dictionary, and didn't figure out the actual mechanics of being in flagrante delicto until college. To prove the point, my usual Saturday night ritual in junior high was watching "The Golden Girls" with my mother. It was the raciest program that I was regularly exposed to. I'm dead serious. Upon reflection, I'm pretty sure it's where I learned what an affair was, plus about two dozen charming euphemisms for intercourse. And it was on the spinoff, "Golden Palace," that I finally figured out what being gay meant.
"Golden Palace" was a terrible show, a bald act of desperation to keep the audience of "Golden Girls" after Bea Arthur quit. The remaining ladies, along with Cheech Marin and Don Cheadle, were placed in charge of a ramshackle hotel and subjected to painfully contrived sitcom shenanigans over the course of the show's first and only season. It was a low point in the careers of everyone involved. Why a simpler change in scenery couldn't have sufficed for a follow-up show I will never understand. Have Blanche, Rose, and Sophia move to New York! To San Antonio! Just because a premise worked for Bob Newhart doesn't mean it works for anybody!
Anyway, you get the point. I watched "Golden Palace" because it was on in the same timeslot as "Golden Girls," and the characters were familiar. I couldn't tell you what happened on any single episode. However, I do remember the subplot of one episode, where Rose, played by Betty White, was running the front desk. After noticing several couples checking in as Mr. and Mrs. Smith, she finally caught on that people were meeting at the hotel to conduct affairs, and spent the rest of the episode tying herself in disapproving knots with each new arrival.
All of this built up to the big joke at the end of the episode. Rose, finding yet another guest signing in as "Mr. Smith," asked wearily if there was a "Mrs. Smith" with him. No, there wasn't, but there was a "Mr. Jones" sharing the room. And as Rose brightened up in obvious relief, she completely missed the sight of Mr. Smith and Mr. Jones merrily walking off to their room, hand in hand. You couldn't have a better visual representation of love in bloom.
Keep in mind that I didn't have more than a vague understanding of what sex actually was at this point, but I understood all the business of serious and casual relationships that went on around them. Until that second, I didn't realize that there was such a thing as a homosexual relationship in the romantic sense. I didn't know they could do that. I knew gay meant different, and gay was about men liking men. I just never equated it with the same feelings that there were between heterosexual men and women. This was the early 90s, so anything more than a vague reference to homosexuality was still pretty rare on television. If it did come up, the AIDS scare was almost sure to follow in the next breath.
The joke on "Golden Palace" was so simple, it was almost sweet, and there weren't any tawdry or negative connotations attached to the pair at all – it was just met with a burst from the laugh track and then went to credits. My mother was watching with me and didn't say anything to explain. I think she acknowledged that they were gay, but her reaction was neutral. I didn't learn until years later that she had spent a lot of time in San Francisco in her twenties in the hippie days and was used to people with alternative lifestyles, though she's fairly conservative otherwise. It wasn't the sort of thing we talked about.
Media can be a powerful thing, even in its most seemingly innocuous moments. "Golden Palace" might be remembered by most as an embarrassing mistake. But I learned something important from it, even though I don't think that anybody meant for me to. As someone with a pretty blank slate as far as attitudes toward sexuality then, having no real religious upbringing and parents who did not acknowledge such things in the presence of their teenage daughter, it stuck. To this day I can remember that joke crystal clear, and that moment where suddenly I understood years of innuendoes and offhand remarks. And it was strangely thrilling – like discovering a new continent populated by people I didn't know existed.
I turned out to be straight, but I'm glad that from fairly early on, I knew it would have been okay if I wasn't. And I have a bad sitcom to thank for that.
Tuesday, April 13, 2010
"Preacher" - Gone to Hollywood
I'd say I qualify as a comic-book fan, meaning someone who reads comic books independent of any film or television influences, if just barely. In college I picked up a couple of volumes of "Sandman" and my roommate tracked down all the issues of "Johnny the Homicidal Maniac," and I found myself sporadically leafing through graphic novels like "Creature Tech" and "From Hell" ever since. I never collected and I never invested, but I've read a fair amount of the big titles and know who most of the big names in the industry are. And the knowledge has proved very helpful for being a film buff, as Hollywood's been snapping up more and more comics properties for development over the last decade.
I mentioned in yesterday's post that darker material from the comic book realm has been in ascendancy, as embodied by the runaway success of "The Dark Knight," and the willingness of a major studio to put some serious money into Zack Snyder's adaptation of the very adult "Watchmen." Though "Watchmen" didn't do well, it hasn't stopped filmmakers from continuing to push the boundaries with edgier comic properties like the upcoming "Kick-Ass." There's still a reluctance, though, to produce truly adult films based on comics for fear of alienating the younger demographic that most action films are geared towards. Of course, nobody seems to want to make films for grown-ups these days in any genre, but it's interesting to look at some of the higher profile comic-based projects that are struggling with this barrier.
One title that's been percolating for a while now is "Preacher," based on the popular Garth Ennis comic that was published by DC's Vertigo imprint for mature readers in the 90s. I've read the entire run, all 66 issues, five specials, and the four-part miniseries collected into nine trade paperbacks. Centered around the adventures of a young Texan preacher who acquires supernatural powers akin to the literal Word of God, and declares war against the Almighty, it neatly straddles the line between Western and horror genres with a little passion play thrown in for good measure. In the right hands, I'm sure the material could spawn a good action film trilogy, or better yet a miniseries.
And over the last ten years, it's nearly been both, with the adaptation rights bouncing from the Weinsteins at Miramax to HBO to Columbia Pictures. James Marsden was attached to star. Then he wasn't. Sam Mendes was attached to direct. Then he wasn't. The latest word from producer Neal Moritz at the Collider yesterday was that they have a script (not one of the three drafts written by Garth Ennis), and are looking for a director. From what I can tell, the biggest holdup with "Preacher" has been the comic's reputation. It is one of the most influential titles to come out of the 90s, near the top of many a comic fanboy's list of most desired film adaptations. Kevin Smith and Stephen King are fans. Nobody wants to get this one wrong.
But on the other hand, faithfulness to the original comic would result in a film or TV series with a pretty narrow audience. The "Preacher" comic is a fun, pulpy read, but touches on all sorts of hot-button issues, revels in graphic content, and has an extraordinarily sick sense of humor. One of the major recurring characters is a young man who calls himself Arseface, a Nirvana fan who was so distressed at the suicide of Kurt Cobain, he tried to follow suit and blew half his face off with a shotgun. And survived. Arseface goes through the comic with features so disfigured people retch at his presence, and his speech is so garbled it requires additional translation in footnotes. To add insult to injury, these horrors are constantly played for laughs.
I don't have any issues with films like "Preacher" and "Kick-Ass" being made. But when we're talking about content that's practically in the realm of John Waters and Harmony Korine, is it any wonder the studios have had second thoughts? If audiences were cringing at the plight of Precious Jones, how are they going to react to Arseface, who's just the tip of the iceberg when it comes to the carnival grotesques of the "Preacher" comic? In this case, the more determined the filmmakers are to be faithful, the less viable making the film becomes, but cutting anything would only alienate the existing die-hard fans. At least one version was scuttled because the studio wasn't willing to risk putting up a fairly modest figure for the film's budget. The fact that "Preacher" has actually come so close to being made is more surprising than anything else.
If we ever do get a "Preacher" movie, it won't be another big budget, frame-for-frame geek love letter like "Watchmen." That ship has sailed. I expect we'll either get something drastically watered down and mainstream friendly or else a low-budget cult film with the content intact. The current film-making system isn't capable of dealing with anything else.
Monday, April 12, 2010
The "Kick Ass" Conundrum
Friday will see the release of "Kick-Ass," which its Wikipedia article describes as "the story of teenage Dave Lizewski who sets out to become a real life superhero only to get caught up in a bigger fight. He meets Big Daddy, a former cop who, in his quest to bring down an evil drug lord Frank D'Amico, has trained his eleven-year-old daughter to be the ruthless vigilante Hit-Girl." Based on the comic book of the same name, and generating some serious controversy for its profanity-heavy red band trailers featuring little Chloe Moretz cursing like a sailor, "Kick-Ass" has had the geekier fan sites like AICN buzzing for a while now. But from where I'm sitting, this thing isn't going to do much business.
To put it bluntly, I suspect that this is going to be "Watchmen" all over again. We've got another title that's hotly anticipated by a core group of comic-loving geeks, who have been singing the praises of the film's graphic content, completely heedless of the fact that even the title of "Kick-Ass" is giving a good chunk of theater owners heart palpitations and leaving mainstream audiences cold. The film is being listed as "Kick A" in some venues, with theater workers being coached to refer to the film as such. The last time this happened was Kevin Smith's "Zack and Miri Make a Porno," which had its posters censored and its DVDs sold in Wal-Mart as "Zack and Miri." That one didn't do so well either.
Fanboys may scoff at the prudish leanings of the corporate purveyors of our entertainment, but the awkward truth of the matter is that though everyone in Hollywood wants to make films that cater to the young adult male these days, they're only willing to go so far. The attitude that explicit sex and violence are always good, and the more extreme the better, is a common assumption among the young male audience. And it's the sort of attitude that gets films like "Kick-Ass" made. What these same fans don't seem to realize is that strong content tends to alienate everybody else, especially families and female viewers.
Everyone points to "The Dark Knight" as a darker superhero film that cleaned up at the box office, but "The Dark Knight" was also a PG-13 Batman film that played to all four quadrants of the general audience and couldn't have broken all those box-office records without all the non-geek viewers. Now compare the performance of "Dark Knight" with "Watchmen," a highly anticipated, heavily marketed, and very expensive R-rated superhero epic that garnered enthusiastic response from its target audience – and only its target audience. Niche audience films have been strong performers before, notably the "Twilight" films aimed at teenage girls, but the difference is that the "Twilight" films know they're niche and are made and marketed to scale. "Watchmen" was built up to be a blockbuster and fell on its face.
To the credit of director Matthew Vaughn and the film's other producers, "Kick-Ass" was made with a modest indie budget, so it has a good chance of making its money back. However, I am worried about how the film is being sold to appeal to broader audiences. The promotion has been mum on the film's adult content, which is probably going to cause headaches down the line. With a cast of superheroes wearing brightly colored costumes, and an eleven-year-old playing one of the leads, this has all the hallmarks of a general-audiences action comedy. Too much misleading advertising will undoubtedly lead to backlash from angry parents once they hear what's coming out of Hit-Girl's mouth – the red band trailer issue already has the MPAA looking at potential new rules for online content.
To put it bluntly, "Kick-Ass" is not going to be the breakout hit that the geeks are breathlessly anticipating. Geeks may be Hollywood darlings these days, but their values aren't the same as those of the mainstream audience. I don't envy Lionsgate, which has to be light on the cuts to make the geeks happy while keeping angry parents at bay. But I wouldn't push too hard in either direction, because this one has the potential to blow up in everyone's faces.
To put it bluntly, I suspect that this is going to be "Watchmen" all over again. We've got another title that's hotly anticipated by a core group of comic-loving geeks, who have been singing the praises of the film's graphic content, completely heedless of the fact that even the title of "Kick-Ass" is giving a good chunk of theater owners heart palpitations and leaving mainstream audiences cold. The film is being listed as "Kick A" in some venues, with theater workers being coached to refer to the film as such. The last time this happened was Kevin Smith's "Zack and Miri Make a Porno," which had its posters censored and its DVDs sold in Wal-Mart as "Zack and Miri." That one didn't do so well either.
Fanboys may scoff at the prudish leanings of the corporate purveyors of our entertainment, but the awkward truth of the matter is that though everyone in Hollywood wants to make films that cater to the young adult male these days, they're only willing to go so far. The attitude that explicit sex and violence are always good, and the more extreme the better, is a common assumption among the young male audience. And it's the sort of attitude that gets films like "Kick-Ass" made. What these same fans don't seem to realize is that strong content tends to alienate everybody else, especially families and female viewers.
Everyone points to "The Dark Knight" as a darker superhero film that cleaned up at the box office, but "The Dark Knight" was also a PG-13 Batman film that played to all four quadrants of the general audience and couldn't have broken all those box-office records without all the non-geek viewers. Now compare the performance of "Dark Knight" with "Watchmen," a highly anticipated, heavily marketed, and very expensive R-rated superhero epic that garnered an enthusiastic response from its target audience – and only its target audience. Niche audience films have been strong performers before, notably the "Twilight" films aimed at teenage girls, but the difference is that the "Twilight" films know they're niche and are made and marketed to scale. "Watchmen" was built up to be a blockbuster and fell on its face.
To the credit of director Matthew Vaughn and the film's other producers, "Kick-Ass" was made with a modest indie budget, so it has a good chance of making its money back. However, I am worried about how the film is being sold to appeal to broader audiences. The promotion has been mum on the film's adult content, which is probably going to cause headaches down the line. With a cast of superheroes wearing brightly colored costumes, and an eleven-year-old playing one of the leads, this has all the hallmarks of a general-audiences action comedy. Too much misleading advertising will undoubtedly lead to backlash from angry parents once they hear what's coming out of Hit-Girl's mouth – the red band trailer issue already has the MPAA looking at potential new rules for online content.
The upshot is that "Kick-Ass" is not going to be the breakout hit that the geeks are breathlessly anticipating. Geeks may be Hollywood darlings these days, but their values aren't the same as those of the mainstream audience. I don't envy Lionsgate, which has to be light enough on the cuts to make the geeks happy while keeping angry parents at bay. But I wouldn't push too hard in either direction, because this one has the potential to blow up in everyone's faces.
Sunday, April 11, 2010
On the "Runway"
Another season of "Project Runway" is drawing to a close. It's my favorite reality show despite my disinterest in fashion generally. I love shows about people making things, as I've mentioned before. I love seeing creative, skilled professionals at work. I have a special fondness for movies about artists and composers, and I find any depictions of the creative process fascinating. So it's no wonder I like watching the "Project Runway" designers turn out dozens of garments, with the cameras following every step in the process from initial sketching to the final catwalk twirl. On the other hand, there are a lot of other elements at play that help to make it work.
"Runway" is now in its seventh season, the second to be broadcast on Lifetime. I've followed all of them since the beginning. There have been good and bad years, and it's been fascinating to see how delicate the show's alchemy for success really is. Last season was set in Los Angeles instead of New York, featured guest judges for most of the early rounds of competition instead of regulars Michael Kors and Nina Garcia, and most of the contestants were already established professionals. It was a lousy season, full of pedestrian personalities and pedestrian clothing. The entire atmosphere was different, from tamer challenges to milder critiques in the judging sessions to a bevy of finalists who looked like they belonged on the modeling side of the show.
The current season, by contrast, is back to its old tricks and much more fun to watch. The past week saw one designer and one model drop out of the competition by their own choice, the return of another designer to fill in the empty slot, a celebrity judge whose comments led to the declaration of two winners instead of one, and finally this week's eliminated designer going home after constructing three different dresses that never quite came together. There's enough going on in one good episode of "Project Runway" to fill a whole season of a lesser program, and the best part is that the tension is focused around the creative process rather than interpersonal drama.
"Runway" doesn't engage in the same amount of manipulative editing as you see on other shows – there is some to accentuate the wilder personalities and build up tension in the elimination segments – and the drama that does result tends to be born out of the extremes of the competition where the designers are essentially working around the clock for days on end. From clips that have surfaced in retrospective programs and from unedited segments online, it's surprising to realize how little most of the footage has been touched. It certainly comes through in the show itself, where you get a very genuine vibe from what's going on onscreen.
And I suspect that this is one of the major draws of the show – it's pulling back the curtain on an industry that is obsessed with appearances, to let us see a little of the rough-edged reality behind those million-dollar dresses worn by anorexic twenty-year-olds. The lack of perfection – the quirkiness and volatility of the participants, the chaos and pandemonium before every runway show, and the occasional catastrophe – brings the world of high fashion back to the realm of mortals and turns the spotlight on the designers. It's still plenty pretentious and elitist, one of the few places on television where this is a plus, but now it's also more accessible and we get to see more of the players involved.
The show's hosts are iconic precisely because they're so pleasantly off kilter and go against expectations. Host and model Heidi Klum seems to be perpetually pregnant, yet still always looks fabulous, speaks in a thick Germanic accent, and bids each departing competitor "Auf wiedersehen." She's also one of the tougher critics on the show's judging panel. Tim Gunn, the indispensable mentor figure, shepherds the designers through the competition and provides early critique. He's equal parts urbane sophisticate and den mother, and has proven so popular he briefly had his own show in 2007, "Tim Gunn's Guide to Style." He's an unlikely icon of American fashion, yet also a long overdue one.
The New York Times ran a piece last year noting that young women who want to work in fashion these days no longer want to be models exclusively, but have shifted their aims toward being designers and fashion editors too. The trend was directly attributed to the popularity of shows like "Project Runway" and "Ugly Betty" and their portrayals of the fashion industry. And no surprise. The contestants on "Project Runway," harried and stressed as they are, always look like they're having way more fun than the poor girls on "America's Next Top Model." Sometimes you have to peek behind the glitz and glamor to find the people who are really worth talking about, and as the age of the supermodel declines, I look forward to the rise of the "Project Runway" generation.
"Runway" is now in its seventh season, the second to be broadcast on Lifetime. I've followed all of them since the beginning. There have been good and bad years, and it's been fascinating to see how delicate the show's alchemy for success really is. Last season was set in Los Angeles instead of New York, featured guest judges for most of the early rounds of competition instead of regulars Michael Kors and Nina Garcia, and most of the contestants were already established professionals. It was a lousy season, full of pedestrian personalities and pedestrian clothing. The entire atmosphere was different, from tamer challenges to milder critiques in the judging sessions to a bevy of finalists who looked like they belonged on the modeling side of the show.
The current season, by contrast, is back to its old tricks and much more fun to watch. The past week saw one designer and one model drop out of the competition by their own choice, the return of another designer to fill in the empty slot, a celebrity judge whose comments led to the declaration of two winners instead of one, and finally this week's eliminated designer going home after constructing three different dresses that never quite came together. There's enough going on in one good episode of "Project Runway" to fill a whole season of a lesser program, and the best part is that the tension is focused around the creative process rather than interpersonal drama.
"Runway" doesn't engage in the same amount of manipulative editing as you see on other shows – there is some to accentuate the wilder personalities and build up tension in the elimination segments – and the drama that does result tends to be born out of the extremes of the competition where the designers are essentially working around the clock for days on end. From clips that have surfaced in retrospective programs and from unedited segments online, it's surprising to realize how little most of the footage has been touched. It certainly comes through in the show itself, where you get a very genuine vibe from what's going on onscreen.
And I suspect that this is one of the major draws of the show – it's pulling back the curtain on an industry that is obsessed with appearances, to let us see a little of the rough-edged reality behind those million-dollar dresses worn by anorexic twenty-year olds. The lack of perfection - the quirkiness and volatility of the participants, the chaos and pandemonium before every runway show, and the occasional catastrophe – bring the world of high fashion back to the realm of mortals and turn the spotlight on the designers. It's still plenty pretentious and elitist, one of the few places on television where this is a plus, but now it's also more accessible and we get to see more of the players involved.
The show's hosts are iconic precisely because they're so pleasantly off kilter and go against expectations. Host and model Heidi Klum seems to be perpetually pregnant, yet still always looks fabulous, speaks in a thick Germanic accent, and bids each departing competitor "Auf wiedersehen." She's also one of the tougher critics on the show's judging panel. Tim Gunn, the indispensable mentor figure, shepherds the designers through the competition and provides early critique. He's equal parts wry, British sophisticate and den mother, and has proven so popular he briefly had his own show in 2007, "Tim Gunn's Guide to Style." He's an unlikely icon of American fashion, yet also a long overdue one.
The New York Times ran a piece last year noting that young women who want to work in fashion these days no longer want to be models exclusively, but have shifted their aims toward being designers and fashion editors too. The trend was directly attributed to the popularity of shows like "Project Runway" and "Ugly Betty" and their portrayals of the fashion industry. And no surprise. The contestants on "Project Runway," harried and stressed as they are, always look like they're having way more fun than the poor girls on "America's Next Top Model." Sometimes you have to peek behind the glitz and glamor to find the people who are really worth talking about, and as the age of the supermodel declines, I look forward to the rise of the "Project Runway" generation.
Friday, April 9, 2010
Why is "Merlin" on at 10PM?
I caught a couple of episodes of the BBC "Merlin" series during its summer run on NBC last year. It's a fun show, a reimagining of the Arthur mythology to fit a sort of "Smallville" prequel template, set in the years before Arthur became king. All the familiar characters are teenagers or young adults, including a version of the wizard Merlin who is actually younger than the Arthur that he's charged with protecting. NBC ran it an episode or two at a time in 8-10PM Sunday night slots, but it didn't do well enough for them to pick up the second season. So the Syfy (formerly SciFi) Channel stepped in to pick up the rights - and scheduled new episodes for Fridays at 10PM.
This is some pretty bizarre scheduling. "Merlin" was explicitly made to be accessible to younger viewers. The BBC airs it on Saturdays, usually as early evening "tea time" programming. NBC ran it in prime time, starting at 9PM, but edging up to the more family-friendly 8PM timeslot by the end of its run. Even taking into account that more kids might be up late on Friday nights because of the weekend, 10PM is traditionally reserved for more adult programming like police procedurals and dramatic serials. With its CGI dragons and occasional slapstick, "Merlin" has absolutely no content that would require it to air on the cusp of late night. I wouldn't be surprised if the encore broadcasts in the 7PM hour end up outperforming the premieres.
The 8PM and 9PM slots preceding "Merlin" are filled by episodes of "Stargate Universe," the third incarnation of the "Stargate" franchise for those keeping track, and one of the Syfy network's original programs. Understandably, they want to give their own show the benefit of more exposure through earlier, more accessible timeslots. However, the audience for "Stargate" skews older and its content is a bit racier. There's nothing on the level of "Battlestar Galactica," but it's not exactly all-ages viewing either. Having this kind of lead-in plus the counterintuitive scheduling is going to make it all but impossible for "Merlin" to attract a good chunk of its intended audience.
Syfy has programmed several British genre shows before, including ITV's "Primeval," the resurrected "Doctor Who," and its spinoff for younger viewers, "The Sarah Jane Adventures." "Doctor Who" was the only one that lasted longer than a year, occupying a comfortable 9PM berth between "Stargate" and "Battlestar Galactica" on the Friday schedule for several years. "Merlin" was probably picked up to appeal to the same viewers, but the show has the potential to play to a much broader audience than "Doctor Who" or any of the others, and it's an awful shame to see it buried so far out of sight. Surely Syfy could swap "Merlin" over into an earlier timeslot somewhere else on its schedule where the kids could find it? There's no reason why all their British acquisitions have to air on Friday night.
There's such a lack of decent kid-friendly genre programming these days, I can't help but think "Merlin" could do so much better with the CW or ABC Family or other networks with more experience with younger viewers. NBC did a pretty dismal job of marketing the show when they had it, playing up the geeky Arthurian elements instead of the photogenic, multiracial young cast that sets it apart from similar shows. Even the eye-catch screens were godawful looking. Syfy isn't doing much better so far, though they're getting a lot of mileage out of marathoning the older episodes.
If "Merlin" was a US production it probably wouldn't have lasted beyond the first season. Fortunately for fans, it’s doing well enough on the BBC and in other countries to ensure that more series will be produced. But I don't see it lasting on American television for much longer – at least, not where anybody not looking for it I will find it.