
What Can Players And Clubs Do About 'AI Slop'?



By
Dale Johnson


Football problems correspondent


2 March 2026




You do not have to look far on social media to find images and videos of footballers in unlikely or bizarre situations.


Scroll through TikTok and you might soon stumble across Lionel Messi and Cristiano Ronaldo cutting each other's hair, or boarding the Titanic in Edwardian dress. You may even see Kylian Mbappe on a ski-lift with a turtle.


This is the result of the exponential growth of artificial intelligence (AI). Or, more precisely, AI 'slop'.


AI can be asked to produce practically anything. By anyone. The tools are becoming ever more advanced and easily accessible.


It will become even harder to spot what is real and what is, in AI terms, a deepfake.


It may seem, for the most part, like harmless fun. After all, who really believes Messi and Ronaldo have been serving burgers?


But is there a point at which players and clubs will try to set a limit?


Options for players to act are limited


As football has become a commercial juggernaut, players and clubs have had to learn how to look after their brands.


That might be by protecting the club crest or challenging the use of a player's name in unauthorised marketing material.


Take Chelsea midfielder Cole Palmer, who has trademarked the term 'Cold Palmer' with the UK government's Intellectual Property Office. The 23-year-old did the same with his name, autograph and signature 'shivering' celebration.


Creating protections is one thing. Being able to tackle this new AI world of relentless content is another.


In the UK there is limited legislation covering somebody's likeness. Or, as it is known in football, image rights.


Jonty Cowan, legal director at law firm Wiggin LLP, told BBC Sport that AI was presenting "lots of novel challenges".


" Various governments all over the world are trying to figure out ... how do we react to AI?" said Cowan.


AI is being used to put players into real-life situations, as well as ones that are more obviously fake.


Take the unveilings of Antoine Semenyo and Marc Guehi by Manchester City in January.


The club's official photos show each player with director of football Hugo Viana. Yet before those pictures had even been taken, you could find AI images of Semenyo and Guehi signing a contract alongside manager Pep Guardiola.


There was another of Semenyo being greeted at the training centre by former player Yaya Toure, whose old squad number - 42 - he was expected to take.


None of these events happened, but it was difficult to tell the photos were fake.


Last month, an image appeared of Manchester United head coach Michael Carrick with Frank Ilett - the supporter who won't cut his hair until the Red Devils win five games in a row.


Once again, it did not happen but looks so realistic.


And Cowan said it was difficult for there to be any recourse when content is presented "in a non-contentious manner".


Unless a person has suffered commercial or reputational damage, options are limited.


" It's constantly been quite challenging for an individual to impose IP rights," Cowan stated. "If it is a deepfake that is revealing them in a compromising position, let's say, that's various."


The Data (Use and Access) Act came into force last month, making it a criminal offence to create, share or request a sexually explicit deepfake.


But then you have AI-generated videos such as Celtic's Luke McCowan punching an assistant referee. Could it damage his reputation, or is it simply not believable?


A more pressing concern for players might be 'passing off'. This is where somebody unfairly associates their own products and services with the reputation and goodwill of an established brand or company - or player.


It is intended to deceive customers into believing they are connected - to the detriment of the established brand.


Cowan explained that in December 2024, as part of an AI-related consultation, the UK government said it was considering "introducing some sort of personality right".


That would give a player more scope to take action.


Clubs, for their part, have a few more options open to them.


Social media accounts putting players in the shirts of their new team - or any team - is nothing new.


But what if a club wanted to take issue?


" Where you have actually got, for example, the Man City package they might look at other IP rights," Cowan said.


" Have they infringed the trademark in their crest? Or design rights in their t-shirt? For that type of image, that's what a club or an individual would likely be taking a look at."


BBC Sport understands City believe fans know official channels remain the only places to go for any real news, images or videos.


But as the lines blur further, will clubs keep that position?


Tackling platforms more realistic than court action


While clubs and players may consider taking the creators of AI images to court, it is a long and costly battle.


Cowan says there is a quicker and cheaper route: challenging the platforms directly.


" The Online Safety Act has been introduced in the UK just recently, and that is putting an obligation on platforms to take on unlawful content," he added.


" It might well be that we will see more systems that platforms will introduce to have actually that content taken down. Often, that is the simplest and quickest way to take on these images."


This could lead to a growth in companies managing the digital rights of clubs and players.


Those that already exist scrape sites and apps - using AI, of course - to identify where a company's intellectual property or an individual's image may have been used.


They can request takedowns, effectively tackling the use of AI without the affected parties getting directly involved.


Bad actors may use AI for nefarious means


AI presents opportunities as well as problems. Adverts and promotional material can be created without players even having to leave their homes.


But alongside the genuine AI-generated adverts, it is easy for unauthorised parties to take a player's likeness and use it to promote their business.


Last year the oversight board that runs Meta's appeals process banned an advert for a betting app on Facebook that was created using AI.


It featured a manipulated video of former Brazil striker Ronaldo which mimicked his voice. It was not picked up by Meta's automated detection tools.


Meta was told to develop "easily recognisable signs that distinguish AI content" to prevent "significant amounts of scam content".


It was a prime example of a platform being forced to act.


The Football Association has had to tackle controversy, too.


England head coach Gareth Southgate was targeted during Euro 2024. Fake AI-generated interviews showed Southgate making derogatory remarks about his players.


The videos were reported and removed. They were found to have breached TikTok's AI-generated content policy, which prohibits content that "falsely shows public figures in certain contexts".


But by that point, the videos had been seen and shared by thousands of people.


Should users be forced to declare they have used AI?


Scrolling through apps today, it is rare for anybody to declare that AI has been used.


That is even with TikTok's community guidelines asking users to "label realistic AI-generated content" and banning content deemed to "harmfully mislead or impersonate others".


Cowan believes there is unlikely to be any major change to legislation, but platforms may be given tougher rules.


" There are openness requirements under the EU AI Act," Cowan described, with the act not covering the UK.


" Under advertising guidelines, influencers have to disclose where a video they produce has been sponsored.


" I think we may end up with comparable transparency requirements. A little '#AI generated' or comparable label in the corner."


The problem will be whether creators care, and how easy enforcement is for platforms.


Cowan added: "If you have actually got those outright videos, where somebody's putting out a horrible deepfake, they're not going to stress over adding that label."


For now, at least, it seems clubs are not too concerned - that AI is simply something happening on social media.


There may come a point when they decide more action is required.