Some users got a nasty surprise when searching for video. Some of the suggested terms reportedly involved sexual and child abuse, as well as videos of the Florida school shooting.
Even after the offensive search terms stopped being displayed, users still reported odd algorithmic suggestions, seemingly far from what Facebook would normally offer, such as "zodwa wabantu videos and pics" (a South African celebrity) and "cristiano ronaldo hala madrid king video call".
The social media giant has since apologized for the error and said it is investigating the matter. "As soon as we became aware of these offensive predictions we removed them", the social network said in a statement to TheWrap.
"We're very sorry this happened", a Facebook spokesperson told CNN. The company said it's looking into the matter and working to improve the search feature.
Since Facebook bans nudity and sexually explicit content on its platform, its search suggestions should not surface explicit material.
The news first broke on Thursday, as multiple users posted images to Twitter showing the inappropriate results that appeared after they typed the words "video of" into the Facebook search bar. Users also noticed that the suggestions differed from what they usually see when searching for a particular term. Facebook's vice president of product, Guy Rosen, issued an apology, saying the platform does not tolerate such content and that the suggestions were a mistake. Facebook is not the first company to fall foul of its own search algorithm: in 2016, Google removed the predictive suggestion "are Jews evil" from its search engine.