Deepfake Caller Impersonates Ukrainian Official in Attempt to Deceive US Senator

The head of the Senate Foreign Relations Committee, Sen. Ben Cardin (D-MD), was recently targeted by a sophisticated deepfake scheme. According to The New York Times, Cardin received an email that appeared to be from Dmytro Kuleba, Ukraine's former foreign minister, requesting a Zoom meeting. During the call, the impersonator convincingly mimicked Kuleba's appearance and voice but behaved oddly, posing politically charged questions about the upcoming election and pressing Cardin on foreign policy issues, including his stance on firing long-range missiles into Russian territory.

The unusual nature of the conversation raised Cardin's suspicions, prompting him to report the incident to the State Department. Officials confirmed that Cardin had indeed spoken with an imposter rather than the real Kuleba, though the identity of the perpetrator remains unknown. In a statement, Cardin described the encounter as a "deceptive attempt to engage in conversation" with him.

Senate security officials alerted lawmakers to be vigilant for similar attempts, predicting that such efforts are likely to increase in the coming weeks. They noted that while social engineering threats have grown in recent years, this incident was particularly notable for its technical sophistication and believability.

As AI technology becomes more accessible, deepfake incidents are on the rise, particularly in political contexts. In May, the Federal Communications Commission proposed significant fines against a political consultant for a robocall campaign that impersonated President Joe Biden, misleading voters ahead of the New Hampshire primary. Additionally, Elon Musk shared a deepfake video featuring Vice President Kamala Harris, while former President Donald Trump posted an AI-generated endorsement from Taylor Swift, which she later confirmed was not genuine.
