Science

Social media firms catching more misinformation, but critics say 'they could be doing more'

From deleting content to including warnings to offering grants, social media platforms have taken unprecedented steps to fight misinformation online because of the COVID-19 pandemic, but some critics say it's still not enough.

Facebook, Twitter and Google have ramped up efforts to police content that contains incorrect or harmful information

The COVID-19 pandemic has put huge pressure on social media companies to police and remove disinformation, and the companies have taken big steps to limit the spread. However, some critics say they could still do more. (The Associated Press)

Social media platforms have taken unprecedented steps to fight misinformation online because of the COVID-19 pandemic, but some critics say they could still do more.

Facebook, Twitter and Google/YouTube have ramped up their efforts to police content that contains incorrect or harmful information, taking down the worst offenders, attaching warnings to content that has been fact-checked and linking to official sources, such as the Public Health Agency of Canada.

That includes posts such as a viral video by an American doctor on disciplinary probation in which he claims 5G technology causes coronavirus (it does not) or a false post implying the Canadian Armed Forces were in Toronto, but which turned out to be a photo of a tank taken during a festival in 2016.

On Thursday, Facebook said it has attached warnings to 40 million posts about COVID-19, and that 95 per cent of the time, users did not click through to see the content. Twitter says it has taken down over 2,000 tweets related to COVID-19 and "challenged" 2.8 million accounts, which can mean limiting who sees certain tweets, requiring a tweet to be removed or placing a warning on tweets that violate rules but are in the public interest to leave up.

The company also announced that it will be notifying users who have liked, commented on or reacted to content that was later deemed misinformation and taken down, by placing messages at the top of their news feed.

But, according to people who spend time monitoring false and misleading information, it's still not enough.

"The number problem here is insurmountable. Fundamentally, Facebook is too large to monitor for this sort of thing," said Robert Evans, a journalist for the open source investigative website,Bellingcat. "As somebody who spends a lot of his free time studying how disinformation spreads on a platform like Facebook, I don't see how you could stop it without shutting large portions of the site down."

WATCH | 5G technology did not cause the COVID-19 pandemic:

"I think it's disappointing to see that the social media platforms are not doing enough right now to combat misinformation, and some are doing a little bit of degrees better than others. But there are some that are really just dropping the ball right now," said Susie Erjavec Parker, a social media and digital strategist in Winnipeg.

For example, users who want to report a tweet as disinformation on Twitter have no option to do so, she pointed out.

Cristina Tardáguila, associate director of the International Fact-Checking Network (IFCN) at Poynter, says that she has seen platforms act more quickly and more effectively as a result of the pandemic, and they have been more open to discussing issues with fact checkers. She also noted that Facebook and WhatsApp have provided grants to the IFCN. On April 2, Facebook announced $1 million US in grants to fund 13 fact-checking organizations around the world in partnership with the IFCN.

"I would say Twitter needs to do a little bit more. Twitter has been deleting tweets that can cause harm, but we need to promote good content, too. It's not just deleting, but it's also exposing what is being done by fact checkers," said Tardguila.

Platforms balance free speech and harm

Michele Austin, head of public policy at Twitter Canada, said it's "mission critical" for Canadians to have reliable, real-time information about COVID-19.

She said the company partnered with the Public Health Agency of Canada (PHAC) in January, and anyone searching for the terms "coronavirus" or "COVID-19" will get a link to PHAC at the top of the results.

When it comes to cracking down on people maliciously spreading false information, she said Twitter is being "extremely vigilant."

But she said, "we realize people make mistakes. So we'll give them a warning on their account, we'll freeze that tweet and ask them to take it down. And in most cases, most people really aren't trying to spread misinformation."

Austin also said Twitter is mindful of not going too far.

"We certainly are trying to strike that delicate balance between freedom of speech,the ability to dissent when it comes to an idea,and we also are very cautious to be labelled the arbiters of truth," she said. "But we have a comprehensive list of rules and regulations and guidelines that we are implementing on a daily basis with regard to COVID-19."

Kevin Chan, Facebook Canada's head of public policy, said finding a balance is also a challenge for his company, which also owns Instagram and WhatsApp.

"What's very important is to make sure that we do have this ability to provide people with as much space or freedom of expression as possible," Chan said.

Facebook Canada's global director and head of public policy Kevin Chan emphasizes that the pandemic means it's not business as usual at Facebook. He says the company isn't as fast as usual in reviewing content posted to its platforms because of challenges with employees working from home. (Chris Wattie/Reuters)

So for some false content, Facebook is including a warning label and linking to correct information provided by third-party fact checkers, instead of removing the content completely.

However, Chan said some things are black and white.

"For misinformation that actually does potentially lead to real-world harm, we are removing those things so they're immediately off the system."

Moderation challenges

At the same time, Chan said it will take longer than usual to review everything, as some staff are working from home without access to Facebook's normal systems to protect private content.

"The fact is it is not business as usual," Chan said. "That will have an impact on the ability for us to be as fast as we normally would be to review all the things that are reported to us."

Google, which also owns YouTube, has taken a multi-pronged approach: restricting who can advertise with the keywords COVID, COVID-19 or coronavirus, demonetizing videos that talk about the virus, moving content from reliable sources to the top of search results and providing free banners and advertising to public health agencies.

"We're committed to providing Canadians with authoritative information during this critical time and making sure quality content from sources like the Public Health Agency of Canada are easily accessible on Google," said ColinMcKay,head of public policy and government relations for Google Canada.

'We certainly are trying to strike that delicate balance between freedom of speech, the ability to dissent when it comes to an idea,' says Twitter's Michele Austin. (Getty Images)

People trust family, friends more

But Evans is critical of one of the ways platforms are promoting reliable information.

"That's not reducing disinformation. That's adding a banner ad to disinformation.And that banner ad will just be ignored by the people who believe that this is a hoax."

He added that Facebook's challenge is even harder.

"Facebook is fundamentally based around sharing you content that your friends and family [presents]to you. And people trust their friends and family more than they trust the media more than they trust the government."

So, Evans said, "Facebook disinformation is particularly dangerous because it's more personal. I think the disinformation is more pervasive, and it sticks more in people's heads."

A silver lining

People might also trust influencers on Instagram, who often push beauty, fitness and fashion content by appearing relatable. But it can be difficult for the average person to evaluate that information, says Erjavec Parker, and platforms should play a role.

"I absolutely think that they could be doing more. Now, the fact that they're not doing more, I think, says a lot in terms of where their values lie and where their revenues are being driven from, which is advertising revenue," she said.

But, for others, like Tardáguila, there's a silver lining in this.

"We are getting money from platforms to do projects that were in our drawers for, I don't know, for years," said Tardguila."We want to fight disinformation, but we don't have money, and now we do."

With files from Elizabeth Thompson and Katie Nicholson
