‘Deepfakes’ in action: Seattle TV station accused of doctoring Trump speech video


10 January 2019, 07:59

As the nation tuned in to President Trump’s national address on border security, one Seattle TV station apparently manipulated its coverage on the fly, editing the footage to show Trump sticking out his tongue at viewers.

In a side-by-side comparison, Q13 Fox in Seattle appears to have edited its coverage of Trump’s address, turning the president’s skin color a ludicrous shade of orange. In between sentences, the station seems to have doctored the footage to show Trump sticking out his tongue and licking his lips.

Q13 told MyNorthWest that the footage was indeed doctored, and that the culprit has been placed on leave.

“We are investigating this to determine what happened,” said Q13’s news director. “This does not meet our editorial standards and we regret if it is seen as portraying the President in a negative light. The editor responsible for editing the footage is being placed on leave while we investigate further.”

Faking video footage has become easy in recent years, thanks to the widespread availability of video editing software. A combative exchange between CNN anchor Jim Acosta and President Trump at a November press conference put the issue in the spotlight, after internet detectives accused Infowars editor Paul Joseph Watson of editing video footage of Acosta pushing a White House intern in order to make the anchor look bad. The ‘edited’ video was shared by the White House, Watson denied the accusation, and the controversy eventually faded.

This is huge.

Seattle news station caught doctoring video of Trump to make him look ridiculous sticking his tongue out.

The media rinsed me for “doctoring” the Acosta video, which didn’t happen.

Zero mainstream media coverage of THIS story. https://t.co/dyArWLfQkJ

— Paul Joseph Watson (@PrisonPlanet) January 10, 2019

Deepfakes could change porn and politics

Slowing down video footage is one thing, but so-called ‘deepfake’ videos – real and fake footage spliced together with the help of artificial intelligence – are becoming increasingly hard to spot and can be put to a virtually limitless array of malicious uses.

Deepfake technology has been used by the porn industry to superimpose celebrity faces onto porn actors’ bodies.
