Economics professor and finance YouTuber Patrick Boyle has built a channel with over a million subscribers, where his videos routinely pull hundreds of thousands of views. But two of his most-watched videos, both covering Jeffrey Epstein, were demonetized by YouTube in quick succession.
The first incident followed a video Boyle published examining how Epstein accumulated his wealth. His entire channel was demonetized. Boyle described the moment in a conversation with Hidden Forces host Demetri Kofinas: “I was getting tons of views and it’s kind of exciting when you’re getting way more views than you normally do, and then suddenly no views, and I thought, what’s gone wrong? Why am I getting no views? And the whole channel had been demonetized.”
He reached YouTube through its support chat. A representative reviewed the situation and restored the channel, but left that one Epstein video offline. When Boyle pressed for an explanation, he was told it had been reviewed by the AI and found inappropriate. He asked for a human review. Ten minutes later, the answer was the same. “They said it’s been reviewed by a person and it’s inappropriate,” he recalled. He accepted it at the time, telling himself it might simply be a mistake.
The second incident was harder to brush off. Boyle published a video titled “The Epstein Files Are Worse Than You Think,” walking through what he described as apparent problems with the government’s release of Epstein-related documents. He was deliberate about the title, reasoning that it would not attract viewers unprepared for the subject: “No one … who doesn’t want to hear about the horrors of Epstein is going to click on that.”
Within hours of publishing, the video was demonetized. At the time it was on pace for his biggest day ever, and it went on to reach a million views in 24 hours, a first for his channel. He posted a community notice to his audience and got some additional traction from people sharing the video elsewhere, but the momentum was effectively finished.
Boyle is careful not to frame either incident as a deliberate act of suppression. What concerns him more is the structural effect on smaller creators. “If you’re a small YouTube channel, you’ve got 5 or 10,000 subscribers and you’re trying to grow a channel, if you keep getting dinged on videos about certain topics, you’re going to adjust your content to avoid that.”
He pointed to the practice known as “algospeak,” in which younger creators substitute words to avoid triggering automated review systems, using terms like “unalive” in place of words that might get their content flagged.
For Boyle, the deeper concern is not any single decision, but what these systems collectively do to the range of information that finds an audience. As he put it: “Do we live in a world of kind of algorithmic censorship?”