OMG: Has Johnny Depp been boycotted by Hollywood? The Reason Will Break Your Heart


Johnny Depp is one of Hollywood's most popular and celebrated actors, and he has enjoyed the love of fans from all over the world ever since the beginning of his career. Throughout his professional journey, he has strived to grow and become a better version of himself. However, a recent turn of events may have played spoilsport and come as a roadblock in his life.

As per the latest reports in Masala, Johnny revealed in an interview with The Sunday Times that ever since he lost the case involving Amber Heard in November last year, he has been dealing with a boycott in the industry. He was quoted as saying:

“Looked those people in the eyeballs and promised we would not be exploitative. That the film would be respectful. I believe that we’ve kept our end of the bargain, but those who came in later should also maintain theirs. Some films touch people and this affects those in Minamata and people who experience similar things. And for anything for Hollywood’s boycott of me? One man, one actor in an unpleasant and messy situation, over the last number of years?”

He further added:

“Whatever I’ve gone through, I’ve gone through. But, ultimately, this particular arena of my life has been so absurd.” He also hasn’t forgotten his fans. “They have always been my employers. They are all our employers. They buy tickets, merchandise. They made all of those studios rich, but they forgot that a long time ago. I certainly haven’t.”

What’s your take on this, folks? Let us know your views in the comments below, and for more updates, stay tuned to IWMBuzz.com.