In a recent TV interview, I explained why news organizations are being held accountable for defamatory comments on their Facebook pages, particularly when they inflame such arguments with rabble-rousing, click-bait reports.
The Australian High Court has ruled that some of the country's biggest media groups can be held responsible for defamatory third-party comments published on their social media pages.
The case was triggered by Facebook comments about Dylan Voller, a former detainee in an Australian youth detention centre. He claimed that hostile comments posted beneath a picture of him shackled in a hood on social media were defamatory. Prominent news outlets, including the Sydney Morning Herald and Sky News, appealed an earlier ruling, arguing that they could not be held accountable for third-party comments made on their social media posts, but the High Court upheld the decision.
The court found that the appellants, by facilitating and encouraging the posting of comments by third-party Facebook users, had rendered themselves publishers of those comments. The appellants' attempts to portray themselves as passive and unwitting victims of Facebook's functionality, the court said, have an air of unreality.
Having taken action to secure the commercial benefit of the Facebook functionality, the appellants bear the legal consequences. Facebook has yet to comment on the court's determination. Still, it is possible that Australian news media may now disable the comments feature or post fewer stories on social media – something they say restricts freedom of speech.
This decision may discourage the exchange of opinions by prompting Facebook page owners to turn off commenting altogether. I urge Australia's attorneys general to rectify this anomaly and bring Australian law into line with that of comparable Western democracies.
Let's talk about the decision now with a panel of experts. Joining me on the show are Chadwick Moore (journalist and commentator), Jennifer Dempster (legal analyst), and Bill Mew (privacy activist and technology expert). Thank you all; I'm sure you each hold strong opinions about this. Chadwick, I'll start with you. As a journalist, what are your thoughts on this? Is it fair to hold the media accountable for third-party posts?
Chadwick Moore: "Interestingly, it does appear, you know, that it's going to stifle speech. Interestingly, the media would be held accountable for a Facebook page which, you know, they don't own. They may post content on that page, but they don't own it. Facebook owns everything that goes onto its platform. That's very clear. This is very different from a media company being held accountable for comments on their own articles on their own websites. So here Facebook is once again sort of reiterating the lie that they're a neutral platform and that the onus is all on the people who post, when Facebook is obviously not a neutral platform. They're a company with an editorial voice. They're a publisher. They just get free content from news organizations, so it's interesting that it almost says that these pages are owned by these news organizations, which they're not. Another thing is that it supposes that people actually take comment sections seriously. It speaks very lowly of the court's opinion of the public – that people are reading comments as though they have gone through an editorial process at the publication on which these people are commenting – when, you know, most people read comments to be entertained. They know it's maybe a bunch of yahoos, and most people are basically able to do their own research and look into things that people are claiming in these sections. It's kind of a stretch to hold a publisher accountable for defamation, or what have you, based on just what someone on the internet wants to say on their Facebook feed."
Jennifer, it seems like this could have massive ramifications – you tell me. Does this stop at Facebook? Does it extend to all social media firms? Could it lead to the monitoring of comments on all websites worldwide?
Jennifer Dempster: "Oh, I think it will, you know. Anytime there's a groundbreaking case like this, other courts and other potential litigants are going to start to see that sort of opportunity to go forward. So this could, in essence, spread to other companies, of course. I agree completely with what Chadwick was saying about how these tech giants – these social media companies like Facebook, Twitter and others – are able to skirt around their own liability despite setting up these platforms as publishers themselves, because we all know that they absolutely have the ability to go in and take out comments. And where this puts the liability on the actual users, I think that again allows Facebook to skirt some of this liability, but it also could open the door to more responsibility for these social media giants, because we all know that the Communications Decency Act in the US, which gives these tech companies carte blanche to allow whatever they want and to take down and remove whatever they want, does make them publishers. And, you know, some of these news outlets have had lots of troubling articles. There is a lack of trust and faith in these major news corporations, especially in recent years, and people are starting to see that. So comments that continue to perpetuate that, or that might cause some kind of defamation, need to be at least looked at – especially when these are the people with much more of a platform. But again, that is a platform provided by Facebook, Twitter and others that gets it out to a much bigger audience. The information channels have completely changed. That's why we're going to see continuing litigation in this field – because people are going to address this.
And I think even in the coming months and years, we’re going to continue to see [a] backlash against not only news outlets but also social media companies that have allowed some of this stuff to perpetuate, which has hurt people and also hurt even public health.”
Bill, comment sections on any website seem like a sort of double-edged sword. It appears the companies relish them. Perhaps the engagement has more appeal: they generate a buzz around a story or a certain topic. But they also come with that perilous side, where individuals will say a lot of nasty, potentially defamatory things. Is what's been suggested the best solution, or is there a better way to go?
Bill Mew: "There are a large number of responsible publishers who are reporting the news in a very responsible manner, but there are also, at the other end of the spectrum, a number of tabloid-like publications that are publishing rabble-rousing content in a sort of click-bait format, which is designed to drive attention and attract as much comment as possible. It's like lighting the fire to the sort of rabble-rousing, extremist defamation that they just stand back and watch with glee, because they're driving up readership, they're driving up advertising clicks, and they stand to gain from it. And they seek to wash their hands of it by standing back and claiming, 'Oh, but we're just publishers.' But you need to differentiate between the publishers that are actually doing a thoroughly professional job and those that are publishing this sort of rabble-rousing click-bait, and I think we need some accountability here. There needs to be accountability directly on those posting the comments, and if you want to pursue those people, you can. But at the same time, if the media companies are encouraging this by skewing what they publish on Facebook and other platforms to generate interest – to have sort of rabble-rousing click-bait – then they need to share some of the blame here."
And I want to bring it back to Jennifer, just on the legal component. Jennifer, I can imagine certain tabloid websites, such as Bill was referring to, covering a story perhaps about a celebrity. I can think of a couple of recent footballer scandals, in one of which it was alleged the footballer had been involved in a child sex abuse case. Their identity wasn't disclosed for fear that it would expose the victim's identity.
Nevertheless, you go into the comment sections, and the identity quickly emerges, obviously, because they get doxed by the commenters. There's this 'Wild West' where anything goes. So would it be a good thing if this ruling could shut down that sort of thing?
Jennifer Dempster: "You know, it is a double-edged sword. This is a tough issue because, in one way, we should take personal responsibility. We should be allowed to have freedom of speech, regardless of what that is, and hope that people have the wherewithal to know what to follow and what to ignore. But on the other hand, I think what the Australian court was saying here is that social media has become so large – the audience is so large. Before Facebook, a tabloid or any kind of news outlet that was going to publish a story about somebody – say, this particular football player or anybody else – whether it was true or false, would maybe reach whoever their subscribers were: a couple of hundred thousand, ten thousand, a couple of hundred. Now, with Facebook, this outlet has an audience of millions and millions of people – 1.2 billion people around the world – so the audience is much, much larger. You're looking at a lot more chances for damage or harm to one's reputation, and so that's why I think, at the very least, Facebook should have been mentioned as complicit, because part of that is having these large audiences, and we know that they can also choose what is said and what goes in – especially in comments. But what the court was saying here was that when an article is published, even if they try not to mention the name, those comments still become almost part of that article, because you chose to put it up. You're encouraging people to comment. You're encouraging people to say things – whatever that might cause to a third party or somebody else. And that is what they're looking at: a lot more people are going to read comments from, say, a John Doe with 10 followers if it's on some major tabloid article or any major news outlet."
Chadwick, in the sense that there is now a battle for attention among media organizations, and they use the comment area to drum up interest in a story, shouldn't they bear some obligation here? They must be fully aware of the incendiary remarks made there, many of which are emphatically anti-Semitic and defamatory on all sorts of topics, but it suits their purpose. So is it essential for them to take on more accountability for this?
Chadwick Moore: "Well, media organizations [are] already held to extremely high standards. [It] varies, obviously, by which country you're in. It's much more lax here in the United States in terms of getting sued for defamation, and, you know, if they do not do their due diligence – if they're not careful in their words and how they present facts – of course they can and will get sued, and they will suffer greatly for that. So the media organizations are covered there and, you know, when it comes to tabloid and salacious things – I mean, when has that ever not been the case? That's been the way journalism …"
What do you think about the comment section, Chadwick?
That's the bit I'm talking about, because the journalists may be quite careful, but then anything goes in the comments section. And a lot of people now, if you just go on, for example … online, people will go in there just for the salacious comments rather than the article.

Chadwick Moore: "Right! Well, most news organizations do monitor their comments, especially on their own websites. They employ full-time people to get rid of extremely hateful language – maybe, you know, if somebody is using racial slurs or something, they will get rid of that – so, you know, they do have some employees to do that. I think the difference here is that you're talking about Facebook – it's a Facebook property, you know. We always talk about how social media is the new public square. Well, obviously the social media companies like to look at it both ways. So this is the new world we live in. This is the digital world. You put information out there, and it spreads faster. Obviously, it spreads more widely than if you're, you know, yelling from a soapbox in front of the local post office, but if this is the new town square, this is what we have to get used to. And we have to start training people. People need to be smarter about how they take in information and what they read online. Most intelligent, rational people know when they're being misled. They know when something is click-bait. They know when some of the comment sections may be full of it, or when they need to look into something more deeply. You know, as a free speech absolutist, I think that if you want to go down this route, the onus needs to go on the owner of the property, which is Facebook. You know, The New York Times or someone has a Facebook page and someone's commenting. The New York Times does not own that page.
Facebook does, and that's very clear in the terms of service – Facebook owns everything you put on their site. So doesn't it seem suspicious that this comes down onto the publisher – the publisher being the news organization and not Facebook itself? And I would still say let people talk, because the truth is gonna get out there. Of course, there are certain things that most people agree they don't want to see online, even in comment sections. Things like that …"
My apologies, Chadwick. My apologies for interrupting you. We are just running a bit short on time. I want to quickly go back to Bill to give him another chance to comment. And Bill, I'd also like to raise a simple point: what about the cost of moderating? For a media colossus like Facebook, it might not be a big deal to hire people to moderate, but is this going to drive up costs so much that some media organizations will simply shut down their comments?
Bill Mew: "I think media organizations are crying foul when actually there are two very sensible means at their disposal to address this. First of all, within Facebook, they can shut off comments. If they're not particularly worried about driving up a big debate or a big furore, then they can simply turn off comments. And secondly, they can't be sued directly if a comment appears against one of their articles. They have to be given the opportunity to take it down, and therefore, if somebody points out that a defamatory comment has been published and the publisher has a chance to take it down, then the publisher should do so. If the publisher ignores this opportunity and doesn't take it down, [then] very rightly the publisher should face the music. It isn't going to take an enormous amount of resources for them to take action and just go and have a look at comments that have actually been flagged up to them. If they choose to remove them, then that's one thing; if they think the comments are not particularly derogatory or inflammatory, then they can choose to keep them up; or they can just disable comments entirely. So I think it's a little bit overblown for the publishers to complain."
I hate doing this, and I certainly don't want to cut anyone short, but we're definitely out of time. In fact, we're over time. I really appreciate your time – Chadwick Moore, Jennifer Dempster, Bill Mew. Thanks so much.