I don't care about Hollywood films. I'm not against Hollywood films, you know? Hollywood films were very good before, in the 1950s.
I think Hollywood... well, there is no Hollywood anymore, so let's just call it the mainstream, since the business is no longer Hollywood producing its own films and then distributing them; they just distribute.
I became super claustrophobic with Hollywood. I don't like Hollywood. I don't like what it represents. I think that Hollywood is great if you're an established actor or actress. But for reality people, or what people perceive as reality, it's tough. People are constantly discrediting everything you put on camera.
Well, I don't think Hollywood's a dirty word at all; I love a lot of Hollywood films.
There's nothing in Hollywood that's inherently detrimental to good art. I think that's a fallacy that we've created because we frame the work that way too overtly. 'This is Hollywood.' 'This isn't Hollywood.' It's like, 'No, this is actually all Hollywood.' People are just framing them differently.
The primary job for women in Hollywood is still super-attractive actress. That is the most high-profile women's job in Hollywood.
Of course, Hollywood is still making some excellent pictures which reflect the great artistry that made Hollywood famous throughout the world, but these films are exceptions, judging from box office returns and press reviews.
Film is universal. All the countries of the world are making films. Hollywood is the only major unsubsidized center for films. To my knowledge all others are at least partially subsidized. I'm glad Hollywood isn't.
When you think about it, there's never really been a realistic exposition of Hollywood, I mean, from the inside, showing Hollywood what it can do, and what it has done, to people.
I have no interest in changing Hollywood. Hollywood is a place so consumed by the spirit of the world that I don't even want to try to think about how to infiltrate that.
You don't see Indians in Hollywood films around whom a story can revolve. As soon as we have a social presence in your society, I am sure there will be many actors from our part of the world acting in Hollywood films.
Hollywood's thinking is very typical. And it's just really predictable too. And I think in Hollywood, these box office movies are flopping. I mean, there hasn't been an original thought coming out of Hollywood since the '80s.
I think Hollywood's gotten more reactionary and conservative over the years, because there's no longer art in Hollywood. Art suffers in Hollywood.
I think the message has already been sent to Hollywood, which is that this kid's a hard worker, he's talented, and people are coming out to see him. And when you have box-office results, Hollywood treats you different. Hollywood stands up.
I never had a desire to leave mainstream Hollywood. And still don't think that I've left mainstream Hollywood.
Hollywood has successfully produced many films framed by anti-racist or pro-integrationist story lines. I'm going to guess that since 'Gone With The Wind,' Hollywood realized films about racism and segregation pull at the heartstrings of everyone and hopefully serve to purge a sense of guilt.