The YouTube video shows two women, dressed in suits and ties. They smile; they sniffle back tears; they gaze into each other's eyes. They are reading their wedding vows to one another.
The four-minute video titled "Her Vows" contains no nudity, violence or swearing. There's no revealing clothing. No one is engaging in activities that have a "high risk of injury or death." And yet, YouTube has deemed the video unsuitable for people under 18.
Several YouTube users, many of them in the lesbian, gay, bisexual and transgender community, have been complaining that their videos are categorized as "restricted" for no obvious reasons. Besides the vows, targeted videos include coming out stories and one from YouTube celebrity Tyler Oakley titled "8 Black LGBTQ+ Trailblazers Who Inspire Me."
After several days of complaints, Google hinted Monday that it might have made a mistake and said it was investigating.
The "restricted" designation lets parents, schools and libraries filter out content that isn't appropriate for users under 18. Turning on the restriction makes restricted videos inaccessible. YouTube calls it "an optional feature used by a very small subset of users."
It's unclear whether the types of videos in question are now being categorized as "restricted" for the first time, or whether this is a long-standing policy that is only now getting attention.
The complaints spawned the hashtag #YouTubeIsOverParty over the weekend. One person even made a video to voice her complaints.
YouTube said in a tweet Sunday that LGBTQ videos aren't automatically filtered out, though some discussing "more sensitive issues" might be restricted. But the company, which is owned by Google, did not specify what it counts as "more sensitive issues."
In an emailed statement on Monday, YouTube said "some videos that cover subjects like health, politics and sexuality may not appear for users and institutions that choose to use this feature." In the case of LGBT topics, which are by definition intertwined with health, politics and sexuality, filtering out what is and isn't appropriate can be difficult.
YouTube followed that statement with another hours later: "We recognize that some videos are incorrectly labeled by our automated system and we realize it's very important to get this right. We're working hard to make some improvements." The statement offered no further explanation.
YouTube content creators can decide to age-restrict their videos themselves. But that's just one of the ways sensitive content is filtered out. YouTube says it also uses "community flagging," which means users who have a problem with content in a video can flag it to YouTube for possible restrictions or removal.
But a flagged video is not automatically removed. Once a video is flagged, YouTube says it reviews it.
"If no violations are found by our review team, no amount of flagging will change that and the video will remain on our site," YouTube says in its online support page.
What sorts of content get filtered out in restricted mode can vary by region, based on countries' differing community standards. In general, though, restricted content includes "sexually explicit language or excessive profanity," as well as violence or disturbing content, according to YouTube's policies.
YouTube's rules also state that videos "containing nudity or dramatized sexual conduct may be age-restricted when the context is appropriately educational, documentary, scientific or artistic. Videos featuring individuals in minimal or revealing clothing may also be age-restricted if they're intended to be sexually provocative, but don't show explicit content."
Videos that show adults engaging in "activities that have a high risk of injury or death" may also be age-restricted.