
Facebook, YouTube may lose billions of advertising dollars because of bad content

The World Federation of Advertisers (WFA), whose members include PepsiCo, P&G and Diageo, has called on its members to put pressure on social media platforms to keep their ads from appearing alongside harmful content.

Advertisers have not hesitated to pull money from Facebook or YouTube campaigns after their ads landed next to controversial content on those platforms, but they always seem to come back.

They have repeatedly found themselves in this awkward position. The technology companies have been caught hosting content such as terrorist videos or predatory comments from pedophiles, yet the enormous audiences these platforms command mean advertisers never quite manage to leave.

But at least one large group of advertisers, representing nearly a trillion dollars of purchasing power each year, has begun drawing up plans to make the technology companies clean up their mess.

The World Federation of Advertisers (WFA), whose members include PepsiCo, P&G and Diageo, has called on its members, as “sponsors of the online advertising system,” to press the platforms to do everything in their power to stop bad actors from abusing their services and technology.

Raja Rajamannar, Mastercard’s Chief Marketing Officer, assumed the presidency of the WFA at the end of March. Because internet companies earn so much advertising revenue, “they absolutely cannot ignore the legitimate interests of advertisers,” he said in an interview last week.

Raja Rajamannar, Chief Marketing Officer of Mastercard. Photo: CNBC

Rajamannar framed this as a broader issue than simply giving brands a “safe” place to advertise. When something like the New Zealand shooting can be broadcast live on Facebook and spread across YouTube and Twitter, “it is no longer a brand safety issue. It is a societal safety issue, and as marketers we have a responsibility to society.”

At the end of March, the WFA urged its members to “think carefully about ad placement” and to weigh moral responsibility alongside the effectiveness of advertising on social media platforms. The call came after reports of pedophile rings leaving comments on YouTube, content related to self-harm and suicide on Instagram, and the livestreamed shooting at mosques in New Zealand on Facebook.

“But despite these problems, people don’t turn away from the platforms as easily as they perhaps should,” said Rajamannar.

“These are social media giants with extremely wide reach and the ability to target exactly the right audience, which is something you can’t ignore,” he said. “You can’t walk away from something of that size.”

This trade-off has become more obvious in recent months.

“Do you want a shooting to be broadcast live? Definitely not,” he said. “Have people tried to do specific things? Yes. Is it enough? Not yet. Should we keep pushing? Yes.”

He said there are concrete plans to press the platforms to take action.

“We are asking them to show us a plan to resolve this. We want to see a clear plan,” he said.

Asked how they will fix the content problem, the platforms usually answer that they are adding more people to the task. In recent years Facebook and YouTube have hired thousands of people to moderate content on their platforms.

“That’s not exactly a plan. So we told them there has to be a solution: is it technology? Is it people? Or a combination of both? We are asking them to rethink their strategy and come back and share it with us,” he said.

There is no clear solution

Last week at the Association of National Advertisers’ media conference in Orlando, Procter & Gamble Chief Brand Officer Marc Pritchard said his company plans to favor platforms that moderate content and comments, including tying comments to the poster’s real identity and ensuring a balance of viewpoints. He said P&G will pull its advertising money from platforms that cannot do so, and that it will give priority to suppliers that “will improve quality, ensure brand safety and control content.”

Rajamannar said he agrees with Pritchard, but added that this is not easy to do.

“In principle, I agree with Marc,” he said. “But that is not the only way to get there.”

He said several approaches could be tried. One would be to set the current advertising ecosystem aside and “rebuild it from scratch.” That is possible, he said, but it would be a challenge: “You have to run the business. You also have to deliver business results.”

He said emerging technologies such as blockchain could hold the promise of a solution.

Facebook’s vice president of global marketing solutions, Carolyn Everson, responded to Pritchard’s comments: “We welcome and support Marc Pritchard’s call for the whole industry to once again work together to do more for our customers. We will continue to invest heavily in the safety and security of our community and are committed to making our platform safe for all users.”

Responding to the WFA’s call for platforms to better manage harmful content that may appear next to ads, Facebook pointed to a recent blog post by Chief Operating Officer Sheryl Sandberg, which outlined steps including restricting who can livestream and using artificial intelligence tools to identify and remove hate groups.

Google did not respond to a request for comment.

What is “good intent”?

Brands obviously have an incentive to ensure their ads do not appear alongside offensive content. But Rajamannar said social media companies have a stake in this too.

“Social media companies don’t want this to happen,” he said. “Their intentions are not bad; they are good. But what matters is turning that good intention into action that delivers the result you need, which is brand safety.”

Until the problem is solved, he said, marketers have a number of options, such as buying only from reputable social media outlets or using whitelists and blacklists. In the future, he said, there may be another option: third-party technology built to keep ads from being attached to bad content.

“Social media companies are trying to hire more content moderators, they have tried to improve their algorithms, and so on,” he said. “They are taking positive steps toward brand safety, but it is still not enough.”
