The app, called Social Protect, uses AI to ensure that athletes see as few offensive messages as possible.
It automatically scans incoming social media messages on platforms like Instagram, Facebook, TikTok and YouTube in real time, looking for more than two million offensive words and phrases in its database.
Posts containing these terms are automatically hidden from comment sections and replies; athletes can also add words or phrases they personally find disturbing.
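The mechanism described above — scanning each incoming comment against a blocklist that users can extend, and hiding any match — can be sketched roughly as follows. This is purely illustrative: the class, the matching strategy, and the sample terms are assumptions for the sketch, not details of Social Protect's actual (non-public) implementation.

```python
import re

class CommentFilter:
    """Illustrative keyword-based comment filter (hypothetical names)."""

    def __init__(self, blocklist):
        # Normalise terms to lowercase for case-insensitive matching.
        self.blocklist = {term.lower() for term in blocklist}

    def add_term(self, term):
        # Athletes can extend the list with terms they find disturbing.
        self.blocklist.add(term.lower())

    def should_hide(self, comment):
        # Split the comment into words; also keep the full lowercased
        # text so multi-word phrases in the blocklist can match.
        words = re.findall(r"[\w']+", comment.lower())
        text = comment.lower()
        return any(
            term in text if " " in term else term in words
            for term in self.blocklist
        )

f = CommentFilter({"scam", "total fraud"})
f.should_hide("What a scam!")      # hidden
f.should_hide("Great race today")  # shown
f.add_term("loser")
f.should_hide("You LOSER")         # hidden after the user adds a term
```

A real service would also have to handle obfuscated spellings, emoji, and non-English abuse, which is where the article's "AI" presumably goes beyond simple keyword lookup.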
Company founder Shane Britten compares the app to antivirus software that runs unnoticed in the background.
“The goal is to keep the comments section free of racism, hate and scams – all the terrible things that can exist on social media,” he said.
But the software is not without errors. The contract paid for by UK Sport does not include social platform X, formerly known as Twitter, which a BBC Sport investigation found is the source of 82% of abuse sent to football managers and players.
The terms of the deal also mean the system can only scan messages that have been made public – abusive direct messages sent to athletes will still be visible.
Some services can block offensive direct messages, but require users to provide their private login credentials to third-party companies and are typically more expensive.


