
Public Facebook pages expose SMEs to a barrage of legal risks: Here’s how to protect yourself

The risks associated with operating a public Facebook page have increased following a recent Supreme Court of New South Wales decision.
Lisa Fitzgerald

The risks associated with operating a public Facebook page have increased under Australian (state-based) defamation law following a recent decision of the Supreme Court of New South Wales, referred to here as Voller. The decision deemed certain administrators of public Facebook pages ‘primary publishers’ of third-party comments, ruling out the defence of innocent dissemination. Although the defendants in Voller were media companies, the reasoning could apply to any company or person operating a public Facebook page. In reaching its decision, the Court relied on the degree of control an administrator has in operating a public Facebook page and the purpose for which the page exists, which in this case was commercial gain.

Fundamentally, the Court asserted each defendant could: 

  • Exercise absolute control over the third-party content published (short of not running a public Facebook page at all) by “forbidding” comments entirely; or
  • Take steps to hide most comments from most Facebook users, by using the tools offered by Facebook. 

However, in our view, and with all due respect, this finding appears to be open to challenge. We say this because Facebook’s functionality, affording administrators ‘control’, is not always available or not easily accessible, and therefore obstructs the usual editorial process for primary publishers.

Indeed, the ability to exercise absolute control in this context really depends on the tools made available for: 

  • The particular page (such as public pages, private pages or group pages); and
  • Third-party content type (such as posts or comments). 

Currently, the blocking and moderation tools available for public Facebook pages (compared with Facebook group pages, for example) appear to differ for posts by third parties versus comments by third parties — a distinction which, respectfully, does not appear to have been addressed by the Court.

While the Voller decision has been appealed (with a hearing date yet to be given), it has immediate and significant implications for administrators of public Facebook pages. Plus, it raises important considerations when using social media as a business tool:

  • A company could be held liable as a ‘primary publisher’ for defamatory content posted by third parties on its page.
  • The defence of ‘innocent dissemination’ (which would ordinarily be available to a secondary publisher who was not aware the material was defamatory) will not be available.
  • Active monitoring and moderation are required to reduce the risk of liability for potentially defamatory third-party comments.
  • Substantial damages claims for defamation may be payable (despite the existence of statutory caps) following record payouts for reputational damage, with the cases of Rebel Wilson and Geoffrey Rush being two recent examples.

What has changed?

The number of ‘primary publishers’ on social media has expanded due to the decision in Voller. This matters because businesses operating on social media may have previously assumed they were protected against defamation claims by the defence of ‘innocent dissemination’, or simply not thought of it as a risk at all.

Until now, businesses seeking to engage with their online community and customers on social media have done so without needing to invest heavily in the monitoring and removal of such content. However, following Voller, the mere capability to exercise control over third-party comments, including due to the existence of Facebook’s tools, is now enough to put companies in the firing line as a primary publisher.

Although the Voller case did not determine whether the comments themselves were defamatory, the preliminary question considered by the Court (that being, whether the media companies involved were ‘primary publishers’) has much wider significance for all companies running public Facebook pages or providing online ‘discussion forums’. Indeed, the emphasis given to the ability to exercise control could mean that Voller has paved the way for an ever-widening group of primary publishers.

This may cause concern for social media platform providers themselves. For example, Facebook and administrators each have an ability to exercise some control over third-party content on Facebook. They each play a part in facilitating discussion, whether by providing ‘public pages for business use’, providing the ‘tools of control’ or activating the available tools.

Both parties also derive commercial benefit from maximum end-user activity on the platform (from advertising revenues generated by unique visitors and click-through rates).

An administrator may ‘prompt’ discussion based on the original content it posts but Facebook’s algorithms, which control what Facebook users see in their newsfeed, may equally prompt discussion. It would not be surprising if this critical question of control, and consequent liability, is tested further in the courts in the context of social media for business use.

Are the tools offered by Facebook enough to eliminate risk?

In our view, Facebook’s tools will help, but may not be enough to eliminate all risk relating to defamation when using Facebook for business. Rather, reducing risk effectively is likely to involve actively and manually monitoring all comments, establishing clear guidelines for the removal of defamatory content, activating Facebook’s filtering tools and considering blocking new posts.

The reason Facebook’s tools alone may not be enough is partly due, with respect, to the terminology used in the Voller judgment. In our view, the terms of reference obscure exactly what third-party content can be prevented from being published in the first place. In summarising the expert evidence of Ryan Shelly (given in cross-examination), the Court reached the conclusion “all comments” could be blocked “totally”. However, it is unclear whether this is a reference to: 

  • New posts by third parties (posts); or
  • Comments by third parties in response to existing posts (comments); or
  • Both. 

The judgment suggests an administrator can “forbid all comments” from being published entirely. This seems to encapsulate both posts and comments. However, it seems clear from Facebook’s tools (including guidance on Facebook’s Help Centre) that the level of control currently differs, depending on the type of third-party content (namely, posts by third parties versus comments by third parties).

While our own independent investigations have demonstrated it is possible to disable the capacity of third parties to publish any new posts, this functionality does not appear to be available for comments. If this is correct, it would mean the highest level of control over comments will ultimately depend on the effectiveness of the filtering tools and any manual monitoring undertaken.

In using Facebook’s filtering tools, an administrator can ‘hide’ certain content containing specified words. However, this method is unlikely to be foolproof as it would be virtually impossible to anticipate every potential word which may be used, particularly given the potential for different spelling, and the fact that defamation arises from the imputation arising from the words.
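The weakness described above can be illustrated with a short sketch. The code below is not Facebook’s actual implementation; it is a hypothetical naive word filter of the kind the filtering tools appear to offer, with an illustrative blocklist and sample comments, showing how a trivially altered spelling (or a defamatory imputation that avoids the listed words altogether) slips through:

```python
# Hypothetical sketch of a naive keyword filter, similar in spirit to
# hiding comments that contain specified words. The blocklist and
# sample comments are illustrative only.

BLOCKLIST = {"fraudster", "criminal"}

def is_hidden(comment: str) -> bool:
    """Hide a comment if any of its words matches the blocklist."""
    words = comment.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# An exact match is caught...
print(is_hidden("This company is run by a fraudster!"))   # True

# ...but a trivial misspelling slips through, and a defamatory
# imputation can arise without using any blocklisted word at all.
print(is_hidden("This company is run by a fraudst3r!"))   # False
print(is_hidden("Everyone knows where their money really comes from"))  # False
```

As the sketch suggests, a word-matching filter can only ever catch the words an administrator anticipates, which is why manual monitoring remains important.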

Further, hidden comments remain visible (greyed out) to the owner of the public Facebook page, to the third party who posted the comment, and to that third party’s Facebook friends. Given publication to a wide audience is not required for a defamation action, there is still a risk of liability for defamation for owners of public Facebook pages even when these tools are activated.

Given this apparent distinction between posts and comments is not addressed directly in the judgment, there is some uncertainty as to an administrator’s ability to moderate all third-party content. As a result, the functionality of Facebook’s moderation tools (and how easily and readily available they are) may need further interrogation as this is not, with respect, set out conclusively either in the judgment or in wider commentary on the implications of this case. This might be an area that is explored further should the defendants appeal.

Key takeaways and tips 

  • Administrators (meaning, the owners and operators) of public Facebook pages can now be considered ‘primary publishers’ of the content on their sites, including third-party comments.
  • As ‘primary publishers’, administrators can now be held liable for defamatory comments posted on their public Facebook pages.
  • The defence of innocent dissemination is not available to primary publishers.
  • While it is possible to disable third-party posts, it may not be possible to disable comments completely (rather, just ‘hide’ certain comments from most Facebook users, using Facebook’s filtering tools). Therefore, the monitoring and moderation of comments is recommended to ensure publication of defamatory comments is kept to a minimum and high-risk comments are deleted promptly. This will mitigate damages associated with any successful claim for defamation.
  • It is recommended clear internal guidance be developed for administrators of public Facebook pages setting out how to implement Facebook’s filtering tools effectively and to help ensure decisions made regarding the moderation of third-party content are proportionate to the associated risks.

As with all case law, the application of this decision will depend on the facts. A material consideration in Voller, as stated at the outset, was the commercial purpose for which the public Facebook pages were used. Therefore, an assessment of the particular public Facebook page in question is highly recommended in order to determine a proportionate response. 
