Music giant Sony Music says it has requested the removal of more than 135,000 songs by fraudsters impersonating its artists on streaming services.

The so-called deepfakes were created using generative AI, and targeted some of the company's biggest acts, who include Beyoncé, Queen and Harry Styles.

The proliferation of such counterfeits causes direct commercial harm to legitimate recording artists, Sony said - and the fraudsters deliberately target musicians who are promoting a new album.

"In the worst cases, [the deepfakes] potentially damage a release campaign or tarnish the reputation of an artist," said Dennis Kooker, president of Sony's global digital business.

The company says the number of songs generated in this fashion is only increasing as artificial intelligence technology becomes cheaper and easier to access.

Since last March alone, it has identified some 60,000 songs falsely purporting to feature artists from its roster. Other acts who may have been affected include Bad Bunny, Miley Cyrus and Mark Ronson.

"The problem with deepfakes is that they are a demand-driven event," said Kooker. "They are taking advantage of the fact that an artist is out there promoting their music."

The revelation came at the launch of the music industry's Global Music Report in London on Wednesday.

Figures released by the International Federation of the Phonographic Industry (IFPI) showed that recorded music revenues grew by 6.4% last year, reaching $31.7 billion (£23.8 billion).

Unofficially, the music industry believes up to 10% of content across all streaming platforms is fraudulent. Kooker pointed out that the French streaming company Deezer already had software capable of detecting AI-generated tracks - and claims that 34% of the songs submitted to its service are now categorized as AI-generated.