This week on the podcast we discussed clickbait ads and, the hottest topic in digital publishing news right now, ChatGPT and Google’s new clapback, Bard.
Additionally, it seems Google has altered its perspective on AI-generated content.
But first, let’s chat about clickbait and malicious ads.
You can watch the podcast on YouTube or listen here, or wherever you listen to podcasts.
Clickbait ads and protecting audiences
Ad quality is increasingly difficult to control in programmatic advertising, as it is easier than ever for clickbait, offensive, or misleading ads to appear on sites.
Research by GeoEdge, an ad security platform, and Digiday shows that 76% of publishers reported that user experience on their site has been affected by ad quality challenges, and 66% reported that it negatively impacted their revenue. In fact, approximately two-thirds of security issues detected by GeoEdge are clickbait scams with deceptive creative rather than ads with malicious code.
Why is this important? Bad user experience will result in worsening site metrics, which discourages high-quality advertisers from buying inventory and impacts revenue.
Since there is no universal standard for what a ‘good-looking ad’ versus ‘bad-looking ad’ is, it’s up to publishers to decide what is most effective and acceptable on their site.
Ideally, publishers would block clickbait at the pre-impression level, before the ad is ever shown, but that is difficult, and not all technology is capable of it. The problem is complicated for two reasons.
One, some ads are simply disliked: people don’t like the way they look or the product being advertised. Publishers can choose not to show these ads on their site, but they may have to accept lower revenue from the advertisers that remain.
Two, there are malicious ads, which may carry code that infects your computer, or ads that have found a way around ad policies and run for a while until they get caught.
With those things in mind, ads are hard to block at the pre-impression level because inventory from Google Ad Exchange (AdX) can be resold through other SSPs and DSPs, making it difficult to know where an ad is actually coming from. Tracking it down is tedious, and there is no automated way to do it.
There are technologies and services that specialize in finding these types of ads, such as Ad Lightning and Confiant, but they can be expensive and are not perfect. Publishers that aren’t part of a larger brand or media house often can’t afford these services.
So, what can the average publisher do?
Anytime you see a bad ad, you can right-click on it, copy it, and send it to your service provider along with the URL. This way, they can at least stomp out that campaign. If you’re managing your own ad stack or you can identify where the ad is being served from, turn the malicious party off. However, this is only worthwhile if one party in particular seems to be causing a lot of trouble on your website.
The last thing to consider is that if you happen to be seeing a lot of lower-quality ads, it may be due to your browsing behavior. The real-time bidding protocol bases ad bidding on historical data, and if you browse around a lot, the historical profile for someone like you may be that of a person who visits many pages but doesn’t click on or buy anything. After a while, high-quality advertisers aren’t interested in showing you ads because you aren’t likely to respond, leaving only lower-quality advertisers to bid on the space.
If you use an ads.txt file, you may be wondering how it ties into clickbait or malicious ads. Ads.txt was designed to prevent domain spoofing, where a seller pretends to offer inventory on a domain it doesn’t actually represent.
For example, a spoofer might claim to be selling ad space on the New York Times; the buyer pays premium prices, but the ad never actually runs on the New York Times. Ads.txt lets publishers declare exactly who is authorized to sell their inventory, hopefully keeping spoofers out of that space. However, all DSPs and SSPs are susceptible to being infiltrated by malicious advertisers, so there will still be those who work around the system and get in.
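For readers who haven’t set one up, an ads.txt file is just a plain-text file served from the root of your domain (e.g. example.com/ads.txt), with one authorized seller per line: the ad system’s domain, your account ID with that system, the relationship (DIRECT or RESELLER), and an optional certification authority ID. A minimal sketch, with placeholder account IDs:

```text
# ads.txt — authorized digital sellers for this domain
# Format: <ad system domain>, <seller account ID>, <DIRECT|RESELLER>, [certification authority ID]
google.com, pub-1234567890123456, DIRECT, f08c47fec0942fa0
exampleexchange.com, 12345, RESELLER
```

Buyers’ systems crawl this file and can refuse bids from sellers who aren’t listed, which is what makes spoofed inventory easier to reject.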
There is never going to be a perfect solution, and as a publisher, there isn’t a ton you’re able to do outside of what we suggested. The real changes that would make a difference need to happen further up the ladder.
Microsoft Bing and ChatGPT, Google and Bard
Microsoft recently previewed its new version of Bing powered by OpenAI’s ChatGPT technology. The preview showed how users can chat with the AI, ask questions, create content, or even have code translated from one programming language to another with the new AI-powered search engine.
The new Bing homepage will load with a new greeting that says ‘Ask me anything.’ Results appear on the righthand side of the search results page, much like a featured snippet on Google Search. The results include annotations and cite sources.
ChatGPT for Bing will provide more real-time answers using the Bing index and will include feedback links to let it know when results aren’t what you wanted, so it is constantly learning to provide better answers.
According to Microsoft, there are 10 billion search queries a day but it is estimated that about half go unanswered because people are using search for something it wasn’t originally designed to do. Search engines are great for finding resources and websites but are not built to answer more complex requests or tasks. The new Bing search engine will provide both.
It comes as no surprise that Google quickly responded with its own AI-powered search tool, Bard, the next day. Bard is powered by a lightweight version of LaMDA (Language Model for Dialogue Applications) and will become widely available in the coming weeks. Because it uses less computing power, Google should be able to scale it to more users quickly.
Bard uses AI to respond to lengthy queries with a long AI-created response and is able to distill complex information and multiple perspectives into formats that are easy to understand.
Google has been working on this for a long time, but it seems as if they were holding onto it for ‘the perfect time’ or planning to roll it out slowly. However, with Microsoft’s debut of the new Bing, Google suddenly feels pressure to accelerate the release of its version.
Based on our sources, Google’s LaMDA model is just as good as OpenAI’s, or even better.
As for both AI-powered search engines, this is just the beginning. Much like the early days of the internet and the smartphone, we have no idea just how AI is going to affect the world or how anything operates.
These changes don’t have to be scary; they can be exciting. When we consider how much the world has changed with the technology that has emerged in the last 30 years, we can only imagine what things will be like 30 years from now with AI. And that is just it: all we can do is speculate and stay in tune with the trends.
It’s our belief that even this specific application of AI will look different in the coming years; the search engine interface isn’t necessarily the right one for this type of technology, so something better suited to AI will likely be created. It’s also likely that another big player, much like Google or Meta, will emerge that is focused specifically on AI.
Already, other Big Tech companies like Amazon and Apple have been creating their own versions of this for a while; however, it takes a large amount of computing power, which is why many charge for access or impose limitations.
One interesting thing is that Microsoft decided to put OpenAI’s technology behind the existing Bing brand rather than launching something ‘new and shiny.’ It may be because Microsoft, and then Google, are afraid that if they don’t get something out there now, something else will come along and steal their thunder.
Do you have something to tell us?
We now have a suggestions box on The Publisher Lab page, publisherlab.org. Give us topic suggestions or let us know how we can make the podcast better. You can also find previous podcast episodes.
We’ll be back next week with another episode of the Publisher Lab.