Deepfakes
by Kerby Anderson, Contributing Author
Some of the discussion in Europe over deepfake videos has now reached America. If you have ever seen someone use Photoshop or Lightroom to put another person's face on a body, you have a pretty good idea of what is now being done with video.
Until recently someone needed access to cutting-edge video technology to make it look like someone was saying something or doing something they never did. Jim Geraghty reminds us in a recent commentary that such professional technology could “make it look like Forrest Gump was shaking hands with John F. Kennedy.” Now, such technology is within the grasp of people outside of Hollywood.
As you probably know, a fake video of Nancy Pelosi was passed around social media. That is why the discussion of deepfake videos has surfaced in America. Anyone wanting to harm the reputation of another person could create a video intended to embarrass that person. And the targets may not just be politicians and other celebrities. It could be you and me.
Most of these fake videos in Europe and America aren’t that convincing. Most of us have seen enough videos and movies to spot a fake. But Cameron Faulkner reports that researchers at Samsung’s AI Center have developed a method that can use a single photo to make a fairly convincing video. That is why we all need to exercise some discernment when we see a video on social media.
One other thought is worth mentioning. Once we start hearing more about deepfake videos, we will probably also see people caught in a real video mounting a deepfake defense, proclaiming that "the video of me is false, don't believe what you see on that video."
Most of us grew up hearing the phrase, “seeing is believing.” In this new world of deepfakes, that isn’t necessarily true.
Info video added by ARRA News Service
Kerby Anderson (@kerbyanderson) is a radio talk show host heard on numerous stations via the Point of View Network (@PointofViewRTS) and is endorsed by Dr. Bill Smith, Editor, ARRA News Service.
Tags: Kerby Anderson, Viewpoints, Point of View, Deepfakes