How TikTok creators can stay on the right side of copyright law

The growth in TikTok's audience has gone hand in hand with an increase in copyright strikes and takedown notices. As more people become online content creators, what should social media platforms and users be doing to educate themselves about intellectual property?

Locked down and working from home, many people have turned to TikTok, Twitch and YouTube to while away the time. It has even become a side hustle for some.

TikTok user numbers increased by an estimated 85.3 per cent in the United States in 2020, according to eMarketer, with almost a fifth of adults (65.9 million) accessing the app at least once a month. 

In the UK, an Ofcom survey revealed that in April a third of adults were watching more video content online than on terrestrial TV. Not only that, two in five were broadcasting or uploading their own content, while 17 per cent said they were earning revenue from vlogging.

In pursuit of viral fame, TikTok users can avoid copyright infringement by selecting songs from the app’s music library, because the ByteDance-owned platform has been signing licensing agreements with some of the major music labels. Users who stick to the library should never be subject to a takedown request. Nonetheless, TikTok still received 10,625 copyright takedown notices in the first half of last year, 86 per cent of which led to content being removed, according to its latest transparency report, published in September.

Copyright infringement on YouTube is a bit more of a minefield. The platform uses an automatic copyright filter, Content ID, which scans videos and compares their contents to a database of material submitted by copyright holders, primarily record labels and film and TV production studios. Showing just a few seconds of copyrighted material can result in a user automatically receiving a copyright claim.
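How such a filter works in principle can be sketched briefly. Content ID’s internals are not public, so the short Python sketch below is only a toy illustration under broad assumptions: it hashes short overlapping segments of an upload and looks them up in a database of reference fingerprints supplied by rights holders. All names and data here are hypothetical, and real systems rely on perceptual fingerprints that survive re-encoding and cropping rather than exact hashes.

```python
# Illustrative sketch only: Content ID's real matching pipeline is proprietary and
# uses perceptual audio/video fingerprints with fuzzy matching, not exact hashes.
# This toy version hashes short overlapping segments of an upload and checks them
# against a database of reference fingerprints registered by hypothetical rights holders.

import hashlib

def fingerprint(samples, window=5):
    """Return a set of hashes, one per overlapping window of the content."""
    prints = set()
    for i in range(len(samples) - window + 1):
        chunk = ",".join(str(s) for s in samples[i:i + window])
        prints.add(hashlib.sha1(chunk.encode()).hexdigest())
    return prints

# Reference material a hypothetical rights holder has registered.
reference_track = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9]
reference_db = {"example-label/track-42": fingerprint(reference_track)}

# An upload that reuses a short excerpt of the reference track.
upload = [0, 0, 3, 1, 4, 1, 5, 9, 2, 0, 0]
upload_prints = fingerprint(upload)

for work_id, ref_prints in reference_db.items():
    matches = len(upload_prints & ref_prints)
    if matches:
        print(f"Possible match with {work_id}: {matches} overlapping segment(s)")
```

Matching on short segments is what allows a few seconds of borrowed material to trigger a claim, and it is also why legitimate fair-use material can be swept up along the way.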

A fair reaction to IP problems?

However, there are issues with copyright filtering. Tools such as Content ID can hamper some forms of creativity; the reaction video is an example. In these, creators film themselves listening to an old song or watching a classic TV comedy for the first time, a format that has become increasingly popular over the past year of lockdowns.

While including a full song, TV episode or clip in a reaction video without a licence is an obvious infringement, YouTube’s Content ID tool can discourage fair use. 

The platform’s fair-use guidance states that copyrighted material can be built upon if the resulting content is “transformative”. When it comes to reaction videos, simply describing a song or what’s happening in a clip would probably not be considered fair use. But, as the 2017 case of Matt Hosseinzadeh versus Ethan and Hila Klein showed, if a reaction video includes critique or commentary, it might be deemed fair use.

The case saw Hosseinzadeh sue the Kleins over a video they created critiquing his original video, arguing it was copyright infringement. The Kleins contended fair use and the judge sided with them, while noting that this wasn’t a “blanket defence”. How much of an original work can be shown under fair use is likely to depend on how much of a song, video or clip is needed to put the critique or commentary into context for viewers.

Clearly, deciding what constitutes fair use on a case-by-case basis, rather than relying on an algorithm to flag copyright infringements, would be an arduous process. Still, YouTube could arguably be doing more to prevent copyright infringement from occurring in the first place.

“There’s a commercial balance that needs to be struck between [platforms] functioning as a valuable method of content discovery and the monetisation and control of that content. But there’s no reason why [platforms] can’t work to better educate users over how third-party content can and can’t be used,” says Steve Kuncewicz, partner and head of creative, digital and marketing sector at commercial law specialist BLM. 

“Policies are one thing, but ensuring they’re adhered to should go beyond a vague threat of enforcement and account suspension.”

Kuncewicz suggests platforms should build awareness of the legal risks of infringing copyright into the user experience and the sign-up process.

“We’ve seen plenty of work done to educate users on when to take action in the event of a privacy or harassment issue, yet there’s very little attention being paid to the fundamental issues around the sharing of third-party content,” he says.

Confusing copyright law

A big problem for education is that current policies are often vaguely worded and can be confusing, because copyright laws differ between jurisdictions. For this reason, it would be “unrealistic” to expect platforms to provide detailed information on whether copyright exists and what constitutes an infringement, argues Emma Ward, partner at IP solicitors firm Nelsons.

“In the UK, copyright will only be infringed if a substantial part of a work has been reproduced. This becomes a particularly thorny issue when considering how that applies to the use of GIFs and memes,” says Ward. 

Video reactions to meme compilations have racked up hundreds of thousands of views in recent months. In America last January, two YouTubers were each hit with a $3,000 demand from Jukin Media over the use of a single meme in a reaction video, before coming to an agreement with the entertainment licensing group. Under the US Digital Millennium Copyright Act, platforms themselves are shielded from liability for what their users upload, provided they respond to takedown requests.

Member states of the European Union have until June to implement a divisive copyright directive that will require platforms to take greater responsibility for the content they host, rather than simply filtering or removing copyrighted material when asked to do so by rights holders. Opponents of the directive argue it could lead to a meme ban.

“The procedures that are currently in place to address copyright issues are very much reactive, dealing with complaints as and when received,” says Ward.

Who should be responsible for IP?

Timothy Watkins, IP expert at Harbottle & Lewis, believes the differing nature of copyright laws is why content creators also need to take more responsibility for what they post.

“In reality, the average user isn’t going to spend time reading and becoming familiar with their country’s copyright laws. Some platforms have tried to demystify the laws by creating plain-language articles, guidelines and, in the case of YouTube, its online Copyright School. These can be helpful, but they still require users to actually read them. So there’s a limit to what platforms can do to educate their users and some of the responsibility has to lie with content creators themselves,” argues Watkins. 

Navigating the world of IP can be frightening for any content creator, especially for teenagers and young adults who are using the platforms simply for fun and to entertain viewers.

Ward says content creators should ideally seek legal advice before starting out, though of course this isn’t always feasible. At the very least, she adds, they should familiarise themselves with each platform’s terms and conditions and community guidelines before posting content, so they can avoid inadvertently infringing copyright.

“Unlicensed use of third-party content can be an expensive and stressful error if that third party then decides to sue,” she warns. “Users should remember that infringing copyright isn’t just about having a post taken down.”

Striking the right balance between platforms empowering their users to be creative and entertain viewers, and respecting IP rights is “going to take a lot more work on all sides”, Kuncewicz concludes.