The head of the Senate Foreign Relations Committee, Sen. Ben Cardin (D-MD), recently joined a Zoom call with someone using deepfake technology to impersonate Dmytro Kuleba, Ukraine's former foreign minister. The incident began with an email invitation for a meeting purportedly from Kuleba.
During the call, the imposter closely resembled Kuleba in both appearance and voice but behaved oddly, pressing Cardin with politically charged questions about the upcoming election and his views on foreign policy, including his stance on the use of long-range missiles against Russia. The nature of the conversation aroused Cardin's suspicions, and he reported the encounter to the State Department. Officials confirmed that he had been speaking with an imposter rather than the real Kuleba; the identity of the person behind the deception remains unknown.
In a statement, Cardin described a "malign actor" who had attempted to mislead him by impersonating a known official. While Cardin's statement did not name the official who was impersonated, Senate security officials identified him in their communications to lawmakers. They cautioned that further impersonation attempts are likely in the near future and urged vigilance.
The Senate security office highlighted the increasing prevalence of social engineering threats, noting that this instance stood out for its technical sophistication and believability. With AI tools becoming more accessible, politically motivated deepfakes are growing more common and more convincing. A notable example occurred in May, when the Federal Communications Commission proposed heavy fines against a political consultant over a robocall campaign impersonating President Joe Biden. Similarly, Elon Musk circulated a deepfake video featuring Vice President Kamala Harris, and former President Donald Trump shared an AI-generated endorsement by Taylor Swift on Truth Social, which Swift later clarified was not genuine.