From the United States to Southeast Asia to the UK, there are performers who either sound like him or imitate his act.[5][6][7][8]
In England and Wales, the Poor Law Amendment Act 1851, section 3, made it an offence to impersonate a "person entitled to vote" at an election.[citation needed]
Audio deepfakes have been used as part of social engineering scams, fooling people into thinking they are receiving instructions from a trusted individual.[11]
In 2019, the CEO of a UK-based energy firm was scammed over the phone by an individual who used audio deepfake technology to impersonate the voice of the chief executive of the firm's parent company, instructing him to transfer €220,000 into a Hungarian bank account.[12]
As of 2023, the combination of advances in deepfake technology, which can clone an individual's voice from a recording of a few seconds to a minute, and new text generation tools enabled automated impersonation scams that target victims using a convincing digital clone of the voice of a friend or relative.