"The man quickly walked in, held his arm out, and tracing his forearm, said, 'Kids, remember, cut this way for attention, and this way for results, ' and then quickly walked off", the woman reported anonymously.
The mother, Free Hess, said she first saw the video in July 2018 after another mother told her about it; the clip showed kids how to commit suicide.
"How can anyone do this?"
Hess, who is a pediatrician, alerted YouTube to pull down the video, and she said it took the firm about a week to remove it. Another video, titled "Monster School: SLENDERMAN HORROR GAME", features a character enacting a school shooting.
"This is not OK". The statement said, "We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video".
She campaigned to have the clip removed but found this month that it had resurfaced on the site, apparently hidden in the middle of a cartoon.
"All of these videos were found on YouTube Kids, a platform that advertises itself to be a safe place for children 8 years old and under", she wrote.
Ars Technica added, "Videos have been found with adult content ranging from foul language to depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations."
Most parents feel fairly safe letting their children watch YouTube Kids, the child-friendly version of the video platform. According to CNN, the company took the video down after several days of flagging by concerned viewers, including Hess, who has been working to get such videos removed amid an outcry from parents and child-care professionals alike. "We agree this content is unacceptable and are committed to making the app better every day," the company said.
"I had to stop, but I could have kept going", Hess said. "For children who have been exposed, they've been exposed. Flagged videos are manually reviewed 24/7 and any video that don't belong in the app are removed".
"But no system is ideal and inappropriate videos can slip through, so we're constantly working to improve our safeguards and offer more features to help parents create the right experience for their families", the website's description says.
"We have to start doing something NOW, and we should start by educating ourselves, educating our children, and speaking up when we see something that is risky for our children."
"We also need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed and when they do not remove inappropriate and/or unsafe material when reported", she said.