YouTube is tightening its restrictions on gambling-related content as sports betting and online prediction markets continue to grow in popularity across the United States.
On Tuesday, the platform announced that it will prohibit content directing users to “unapproved” gambling websites through links, images, text, logos, or verbal mentions. According to YouTube, unapproved gambling sites are those that fail to meet local legal requirements or have not been reviewed by YouTube or its parent company, Google.
This policy update strengthens YouTube’s existing rules, which already ban linking to external sites that violate its guidelines, including unauthorized gambling platforms.
“We’ve reinforced our policies to prohibit content that directs viewers to unapproved gambling websites or applications,” YouTube spokesperson Boot Bullwinkle told CNN ahead of the announcement. “We will also start age-restricting content that promotes online casinos.”
With the new policy, users under 18 and those not logged into an account will be blocked from accessing content that depicts or promotes online betting platforms.
The online sports betting industry has boomed since the U.S. Supreme Court allowed states to legalize sports gambling in 2018. This surge in popularity has also fueled interest in other types of online betting, such as wagers on election outcomes.
YouTube videos promising to teach viewers how to profit from online sports betting and prediction markets have racked up hundreds of thousands of views.
Gambling regulations, however, vary by region, and experts warn that millions of Americans are at risk of developing serious gambling addictions.
YouTube has long prohibited content that uses sensational language to promise guaranteed winnings or loss recovery from online betting sites. The platform now clarifies that any content guaranteeing returns — even from approved sites — will be removed.
This announcement marks YouTube’s latest effort to refine its content moderation practices. In recent years, the platform has taken decisive action to curb harmful content, including videos spreading false claims about vaccines and abortion or promoting dangerous behaviors related to eating disorders.
In 2023, YouTube also introduced rules requiring creators to disclose AI-generated content that could mislead viewers.
For many social media platforms, the ongoing challenge lies not in establishing content moderation policies but in consistently enforcing them. YouTube has faced criticism — alongside other tech giants — for failing to adequately uphold its own guidelines.
The updated policy takes effect on March 19, with enforcement beginning immediately.