The impact of deepfakes on marketing

While researching AI experts, I came across a deepfake. It wasn’t obvious at first, given his seemingly legitimate profile and social media engagement. But after seeing the same spooky AI-generated photo of Dr. Lance B. Eliot all over the web, it was clear he was not a real person. So I followed him to learn how the grift works.

The omnipresent Dr. Lance B. Eliot

Eliot has over 11,000 followers on LinkedIn, and we have two connections in common. Both connections have thousands of LinkedIn followers of their own and decades of AI experience, with roles spanning investor, analyst, speaker, columnist and CEO. LinkedIn members engage with Eliot, though all of his posts are repetitive conversations that funnel readers to his many Forbes articles.

At Forbes, Eliot posts every day or so, often with nearly identical headlines. After reading a few articles, it becomes obvious that the content is AI-generated tech jargon. One of the bigger problems with Eliot’s extensive Forbes portfolio is that the site limits readers to five free stories a month before directing them to buy a subscription for $6.99 a month or $74.99 a year. That gets complicated now that Forbes has officially put itself up for sale at a price close to $800 million.

Eliot’s content is also available behind a Medium paywall, which charges $5 a month. A short profile of Eliot also appears on Cision, Muck Rack and the Sam Whitmore Media Survey, expensive paid media services used by the vast majority of PR professionals.

Then there is the online sale of Eliot’s books. He sells them on Amazon, fetching just over $4 a title, although Walmart offers them for less. At Thriftbooks, Eliot’s pearls of wisdom retail for around $27, which is a pretty good deal compared to Porchlight’s $28 price tag. A safe bet is that book sales are boosted by fake reviews. However, some disappointed humans bought the books and gave them low marks, accusing the content of being repetitive.

The damage to big brands and individual identities

Clicking a link to Eliot’s Stanford University profile brought up a convincing page. But when I opened the actual Stanford website in another browser, a search for Eliot yielded no results, and a side-by-side comparison showed that the brand red on Eliot’s Stanford page was not the same hue as on the authentic site.
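A quick automated check can back up that kind of eyeball comparison. Below is a minimal sketch, assuming the Pillow package is installed and using hypothetical screenshot file names (they are not files referenced in this article), that extracts the dominant non-white color from two logo screenshots so the hues can be compared numerically.

```python
# Minimal sketch: compare the dominant logo color in two screenshots.
# Assumes Pillow is installed; the file names are hypothetical examples,
# not screenshots referenced in this article.
from PIL import Image

def dominant_color(path):
    """Return the most common RGB color in an image, ignoring near-white pixels."""
    img = Image.open(path).convert("RGB")
    counts = {}
    for pixel in img.getdata():
        if sum(pixel) < 720:  # keep only pixels that are not near-white background
            counts[pixel] = counts.get(pixel, 0) + 1
    return max(counts, key=counts.get)

real = dominant_color("stanford_real_logo.png")
suspect = dominant_color("suspect_page_logo.png")
# A large per-channel difference suggests the two pages use different brand colors.
print(real, suspect, [abs(a - b) for a, b in zip(real, suspect)])
```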

A similar thing happened at Cornell’s arXiv site. With just a minor tweak to the Cornell logo, one of Eliot’s academic papers had been published there, full of typos and more low-quality AI-generated content presented in a standard research-paper format. The paper cited an extensive list of sources, including Oliver Wendell Holmes, who apparently published in an 1897 edition of the Harvard Law Review, three years after his death.

Those not interested in reading Eliot’s content can turn to his podcasts, where a bot spouts gibberish. An excerpt from a listener’s review reads: “If you enjoy hearing someone read word for word from a paper script, this is a great podcast for you.”

The URL posted alongside Eliot’s podcasts promotes his website about self-driving cars, which initially led to a dead end. An update to the same link led to Techbrium, one of Eliot’s fake employer sites.

It’s amazing how Eliot is able to do all this and still make time to speak at executive leadership summits hosted by HMG Strategy. The fake events feature big-name tech companies listed as partners, a who’s who of consultants, and real biographies of executives from Zoom, Adobe, SAP, ServiceNow, and the Boston Red Sox, among others.

Attendance at HMG events is free for senior technology executives, as long as they register. Per HMG terms and conditions, “If for any reason you are unable to attend and are unable to submit a direct report in your place, a $100 no-show fee will be charged to cover the costs of meals and service staff.”

The cost of ignoring deepfakes

Further investigation into Eliot led to a two-year-old Reddit thread that called him out and quickly veered into hard-to-follow conspiracy theories. His name may not be an anagram, and he may not be linked to the NSA, but he is one of the millions of deepfakes making money online, and they are getting harder to detect.

Looking at the financial effects of deepfakes raises questions about who is responsible when they generate revenue for themselves and their partners. That’s not to mention the cost of downloading malware, targeting fake leads, and paying for spammy affiliate marketing links.

Arguably, a close eye can spot a deepfake by its messy or absent background, weird hair, weird eyes and a robotic voice that doesn’t sync with its mouth. But if that were universally true, deepfakes would not be costing billions in losses through financial scams and impersonations of real people.
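One tell that gave Eliot away was the same AI-generated headshot reused across profiles. As a rough illustration of that kind of vigilance, here is a minimal sketch, assuming the Pillow and imagehash packages are installed and using hypothetical file names (not images from this article), that flags a profile photo as a near-duplicate of photos already seen elsewhere.

```python
# Minimal sketch: flag a reused profile photo with perceptual hashing.
# Assumes the Pillow and imagehash packages are installed; all file names
# below are hypothetical examples, not images referenced in this article.
from PIL import Image
import imagehash

def looks_like_reused_photo(candidate_path, known_paths, max_distance=8):
    """Return True if the candidate photo is visually near-identical to any known photo."""
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    for path in known_paths:
        # Subtracting two perceptual hashes gives a Hamming distance; small
        # values mean the images look almost the same, even after resizing
        # or recompression.
        if candidate_hash - imagehash.phash(Image.open(path)) <= max_distance:
            return True
    return False

# Example: compare a suspicious LinkedIn avatar against headshots saved from
# other profiles that appear to belong to the same "person".
print(looks_like_reused_photo("suspect_avatar.jpg",
                              ["forbes_byline.jpg", "medium_avatar.jpg"]))
```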

AI hasn’t yet fixed all of the flaws that make a deepfake identifiable, but it is actively fixing them, and articles like this one are exactly the kind of material that helps AI learn and improve. That leaves the responsibility for detecting deepfakes with individuals, forcing them to be vigilant about who they let into their networks and lives.

Kathy Keating is a real person and the founder of ProsInComms, a public relations consultancy.
