Welcome to the newest issue of Socially Aware, our Burton Award winning guide to the law and business of social media. In this edition, we discuss the impact online trolls are having on social media marketing; we revisit whether hashtags should be afforded trademark protection; we explain how an unusual New Jersey law is disrupting the e-commerce industry and creating traps for the unwary; we explore legal and business implications of the Pokémon Go craze; we examine a recent federal court decision likely to affect application of the Video Privacy Protection Act to mobile apps; we discuss a class action suit against an app developer that highlights the legal risks of transitioning app customers from one business model to another; and we describe how Europe's Right to Be Forgotten has spread to Asia.

All this—plus infographics illustrating the enormous popularity of Pokémon Go and the unfortunate prevalence of online trolling.

ARE ONLINE TROLLS RUINING SOCIAL MEDIA MARKETING?

By John Delaney

Earlier this year, I helped moderate a lively panel discussion on social media business and legal trends. The panelists, who represented well-known brands, didn't agree on anything. One panelist would make an observation, only to be immediately challenged by another panelist. Hoping to generate even more sparks, I asked each panelist to identify the issue that most frustrated him or her about social media marketing. To my surprise, the panelists all agreed that online trolls were among the biggest sources of headaches.

This contentious group proceeded to unanimously bemoan the fact that the comments sections on their companies' social media pages often devolve into depressing cesspools of invective and hate speech, scaring off customers who otherwise would be interested in engaging with brands online. And it isn't just our panelists who feel this way. Many online publishers have eliminated the comments sections on their websites as, over time, those sections became rife with off-topic, inflammatory and even downright scary messages.

For example, Above the Law, perhaps the most widely read website within the legal profession, recently canned its comments section, citing a change in the comments' "number and quality."

The technology news website Wired even put together a timeline chronicling other media companies' decisions to do the same, suggesting the trend may reflect the fact that, "as online audiences have grown, the pain of moderating conversations on the web has grown, too."

Both brands and publishers are right to be concerned. Unlike consumers who visit an online branded community to voice a legitimate concern or share a valuable insight, trolls "aren't interested in a productive outcome." Their main goal is harassment, and, as a columnist at The Daily Dot has observed, "People are generally less likely to use a service if harassment is part of the experience." That's especially true of online branded customer communities, which consumers mainly visit to get information about a brand (50%) and to engage with consumers like themselves (21%).

Of course, it's easy for a brand to eliminate the comments section on its own website or blog. But, increasingly, brands are not engaging with consumers on their own online properties; they're doing it on Facebook, Instagram, Twitter and other third-party social media platforms, where they typically do not have the ability to shut down user comments. Some of these platforms, however, are taking steps to rein in trolls or eliminate their opportunities to post disruptive comments altogether.

The blog comment–hosting service Disqus, for example, recently unveiled a new platform feature that will allow users to "block profiles of commenters that are distracting from their online discussion experience." The live video-streaming app Periscope also recently took measures to rein in trolls, enabling users to flag what they consider to be inappropriate comments during a broadcast. If a majority of randomly selected viewers vote that the flagged comment is spam or abusive, the commenter's ability to post is temporarily disabled. And even Facebook, Instagram and Twitter have stepped up their efforts to help users deal with harassment and unwanted messages.

Brands, however, are seeking a greater degree of control over user comments than what is being offered even by Disqus and Periscope. Given that branded content and advertising are crucial components of many social media platforms' business models, we can expect to see platforms becoming more willing to provide brands with tools to address their troll concerns.

In fact, the user-generated content site Reddit has already taken steps in this direction. Because of its notorious trolling problem, Reddit has had trouble leveraging its large and passionate user base. Last year, in an effort to capitalize on the platform's ability to identify trending content and create a space where brands wouldn't be afraid to advertise, Reddit launched Upvoted, a site that culls news stories from Reddit's popular subgroups and doesn't allow comments.

Other platforms will presumably follow Reddit's lead in creating comment-free spaces for brands. Although this may prove to be good news for many brands, one can't help but feel that this inevitable development undermines, just as trolls have, the single most exciting and revolutionary aspect of social media for companies: the ability to truly engage one-on-one with customers across the entire customer base.

(Note: This article originally appeared in MarketWatch.)


Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved