NEW YORK — Apple and Google removed three dating apps from their app stores after the US government said the apps could expose children to sexual predators.
The Federal Trade Commission said in a statement Monday that the dating apps — Meet24, FastMeet and Meet4U — were in potential violation of the Children’s Online Privacy Protection Act. COPPA prevents websites and apps from collecting personal information from children under 13 without parental consent.
The three apps are owned by a Ukraine-based company called Wildec LLC. The FTC said the company allowed children under 13 to be contacted by other users, contradicting the apps’ privacy policies, which claimed they would follow applicable laws, including COPPA.
The FTC said its staffers found users as young as 12 on the apps. The apps collected extensive information about those users, including their birth dates and location data. The FTC said the apps could cause “substantial consumer injury.”
“Several individuals have reportedly faced criminal charges for allegedly contacting or attempting to contact minors using Wildec’s apps,” the FTC said in a letter addressed to the company. The agency said Wildec was “aware that children under 13 were using all three apps.”
The FTC said Apple and Google took the apps out of the iTunes App Store and the Google Play Store. Wildec’s owner didn’t immediately respond to CNN Business’ request for comment. Apple and Google also didn’t immediately comment.
In the letter to Wildec, the FTC said the company should “immediately remove personal information from children on the three apps.” It also told Wildec it must seek parental consent before allowing minors to access the apps.
The FTC issued a consumer advisory on dating apps, highlighting Wildec’s apps.
“Adults — including sexual predators — can search by age and location to identify children nearby,” the advisory warned.
Regulators around the world have increasingly cracked down on child exploitation over the internet and on apps. The UK government has said it could criminally charge tech executives whose companies host unlawful content or material that is damaging to individuals or the country.
The UK said in April that an independent regulator would be created to enforce the new rules, which focus on removing content that incites violence, encourages suicide or constitutes cyber-bullying. Content related to terrorism and child exploitation would face even stricter standards.