Deepfake Technology Lets Companies Use You in Commercials

The further technology moves ahead, the more problems seem to crop up, and deepfake technology can now allow companies to use your likeness in commercials without violating any consent policy. According to those in the know, however, it might still be an issue, since the people being recorded for the source footage would still have to give consent, as Chazz Mair of ScreenRant has pointed out. That makes a great deal of sense, but it doesn't sound like a serious impediment to technology that can so easily recreate an image of another person. So far this technology has come across as amusing and even witty, but the implications are creepy: just about anyone can be turned into an asset by this kind of programming, and anyone with less than noble intentions could find a way to get into mischief with it. Yes, there are laws against such uses, but there are laws against turning guns on innocent people as well, and that hasn't turned out so well in recent years. It's a rough analogy, but it hits the nail on the head, since almost any technology is dangerous in the wrong hands these days.

I know, that sounds like a bad line from a futuristic movie in which technology has been put to some devious use. But the truth is that the further things go with technology, the wider the net is spread and the harder it becomes for those keeping a close eye on it to do their job. Call it paranoia, but in matters such as this a little paranoia is better than none. When it comes to privacy, a lot of people are going to become far more paranoid if they start seeing their likeness on TV, and some might go so far as to start petitions and even lawsuits that could be shot down initially but gain serious ground if deepfake technology continues to use images obtained without proper consent. The fact that, as far as anyone knows, nothing insidious has been done with the technology so far isn't likely to placate many people, since the common sentiment is that they don't want their image used, for fear of one privacy concern or another. Liz Tung of WHYY has more to say about this.

Deepfake technology has been building for a while now, and it keeps getting better with each technological marvel it achieves, but many people are still worried that it needs to be monitored and possibly regulated in a way that will ensure it's never used for any illicit purpose. After all, it already can't be used for political purposes, which matters with the next election coming so close. Imagine the chaos that would erupt if deepfake technology were used to sway voters; that kind of fraud would rock the nation, since it would mean the rules and regulations had flown out the window and that 'anything goes' had finally taken over in a bid for the White House. It sounds over the top, but it also sounds all too real, since a lot of us have seen deepfake videos at this point and have noticed how much better they get with each new rendition. The fact that they can replicate just about anyone is unsettling, given that eventually there will likely be a way to do this without consent, meaning anyone's likeness could be used in ways they might not agree with, for purposes they know nothing about. Call it paranoia, fear-mongering, or whatever, but innovations in technology have almost always needed some of the most ruthless and over-the-top checks and balances, if only because of human nature and its inherent flaws. Michelle Drolet of CSO has more that you might want to read on this.

That's right: it's not necessarily the technology that's the most disturbing part of all this, it's the morals and values of those behind it and what they want to use it for. Creating this technology just to push commercials without having to gain as much consent for this or that spot seems kind of nonsensical. If anything, deepfake looks like a bit of technological amusement more than anything else, but as with all amusing toys, there's usually a risk that someone will find a way to make it a little more exciting, useful, and possibly problematic. Whether that will happen is hard to say, since so far it hasn't been a major issue, but hopefully things will stay that way down the line.
