Here’s a news story that broadcast media executives eager to win advertising dollars back from digital media should read with keen interest: The Ranking Member of the powerful House Energy and Commerce Committee has asked the heads of the three largest digital and social media companies to hold a meeting to discuss content management and advertising policies on their respective platforms.
“Your platforms have taken on a role of policing content, but your practices for doing so are not clear,” said Rep. Frank Pallone, Jr. (D-N.J.) in a letter sent Monday (10/20) to the CEOs of Google, Facebook and Twitter that requested a meeting to address “multiple reports of vague and confusing content guidelines that are frequently applied inconsistently.”
In his letter, Pallone wrote, “The influence of the internet over our national dialogue and our lives has skyrocketed over the past decade. At the same time, the number of websites handling this traffic has consolidated to a handful of key platforms. The combination of these trends has led to these few companies taking on a quasi-governmental role policing content, and therefore a large amount of communication, on the internet.”
Pallone also claimed that the companies’ content management policies may be influenced by a desire to increase page views and ad clicks, leading to inconsistent and inadequate content policing on their platforms.
“With a goal of ad clicks or driving page views, these companies’ policies are not neutral; they actively shape content on the web,” he said. “And to the extent that these companies’ platforms have publicly available policies for moderating content, those policies are vague and applied inconsistently. This lack of transparency makes it difficult for consumers to understand how content is controlled and for the government to oversee the market. Ultimately, algorithms and employees become the arbiters of what is acceptable content in the public forum without transparent guidelines.”
Pallone is requesting a briefing with the companies to review their policies for moderating content and advertising and to discuss the implementation of those policies.
Specifically, Pallone asked each of the companies to be prepared to discuss how they develop and enforce their policies, how users are made aware of those policies, what safeguards are in place to prevent creators of fabricated content from gaming algorithms to promote their stories, and what processes are in place for appeals.