Creating Safe Online Games: The Online Safety Bill

The explosive growth of the digital world has been largely unregulated until now, on the grounds that too much regulation would stifle innovation. But high-profile examples of sports stars being inundated with racist abuse, teenagers being encouraged to take their own lives and the prevalence of child sexual exploitation and abuse have turned the political appetite for regulating online spaces on its head. The upcoming legislation primarily targets platforms that facilitate offensive content and practices, which lawmakers say have benefited from the lack of regulation to date.

In a gaming world that designs a “metaverse” to recreate and augment real-life interactions online, the risks of harm are obvious. Children can be exposed to adults without the usual buffer from parents, guardians, or teachers, and many users have reported virtual groping and verbal abuse in VR settings.

The Online Safety Bill (OSB) is the main piece of UK content legislation in this space. The OSB is intended to protect users, especially children, from harm online. It will primarily focus on user-generated content, covering both illegal and harmful content, and it introduces a legal duty of care for certain online providers to protect their users from harm. It will apply to:

  • user-to-user services (internet services that allow content generated, uploaded or shared by one user of the service to be encountered by another user)
  • search services (a service that allows users to search multiple websites or databases).

Although not yet finalized as law (which will presumably become the Online Safety Act), it has undergone extensive legislative review and was nearing completion before being delayed following the resignation of Boris Johnson.

While the biggest tech and social media companies are likely to be seen as most at risk due to high volumes of user-to-user interaction and sharing of user-generated content, online games undoubtedly fall within scope for the same reasons.

What types of games are captured?

Generally speaking, online games that facilitate player interaction or content creation by players fall within the scope of the OSB. The geographic scope test requires that at least one of the following applies:

  • there is a significant number of British players
  • the UK is a target market, or
  • the game can be played in the UK and there are reasonable grounds to believe that there is a significant risk of material harm to UK persons.

In particular, the following types of functionality are likely to fall within the scope of the OSB:

  • Text or voice chat functionality (team chat in team-based games such as Counter-Strike, or chat on large open servers that bring player avatars together, such as Fortnite).
  • Games built around generating and sharing user-generated content (like Minecraft).
  • Virtual reality/metaverse functionality in common online spaces.
  • Games with integrated live streaming functionality, forums, marketplaces or other platforms that facilitate user interactions.

Games without online functionality, or games whose online functionality does not meet these criteria, are unlikely to be affected. In the context of games, there is also an exception for one-to-one live oral communications. This may apply to spoken chat between two players in a specific chat room, server or lobby, but as soon as more than two people join, or if there is also text chat functionality, the exception will not apply.
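By way of illustration only, the narrowness of that exception could be encoded as a simple internal check along the lines of the sketch below. The VoiceSession fields, the recording condition and the function name are hypothetical assumptions made for this article, not anything prescribed by the OSB or by Ofcom.

```python
from dataclasses import dataclass


@dataclass
class VoiceSession:
    """Hypothetical representation of a live voice channel in a game lobby."""
    participant_count: int       # players currently connected to the channel
    has_text_chat: bool          # whether the same lobby also exposes text chat
    is_recorded_or_shared: bool  # recordings/clips are no longer purely live oral speech


def is_one_to_one_oral_exemption_likely(session: VoiceSession) -> bool:
    """Rough check of the one-to-one live oral communications exception.

    The exception is narrow: exactly two participants, live speech only,
    with no accompanying text chat. Anything beyond that should be treated
    as in scope and risk-assessed accordingly.
    """
    return (
        session.participant_count == 2
        and not session.has_text_chat
        and not session.is_recorded_or_shared
    )


# Example: a duo voice channel that also has text chat does NOT qualify.
print(is_one_to_one_oral_exemption_likely(
    VoiceSession(participant_count=2, has_text_chat=True, is_recorded_or_shared=False)
))  # False
```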

Of course, organizations in the gaming space that are not strictly developers or publishers can also be affected. Gaming forums, marketplaces, distribution platforms, live streaming and video-on-demand platforms are all likely to be caught. For more details on the scope of the OSB, please read here.

What should gaming companies do?

The OSB bases its regulations around three types of content: illegal content, content harmful to children and content harmful to adults. For more details on what this means, please read here.

In summary, organizations will have safety obligations depending on the specific types of content they need to manage. These include:

  • carrying out risk assessments. It is essential from the start that companies understand the types of content, players and harm that could be facilitated by their game, as this will inform what further compliance measures are taken.
  • using proportionate measures to mitigate the risk of harm arising from illegal or harmful content. What that means will depend entirely on the specific game or environment and the corresponding harm. Ofcom (the enforcer of the OSB) will issue codes and guidance to support businesses, which may include recommendations for using particular tools for content moderation, user profiling and behavior identification.
  • using proportionate processes to prevent individuals from encountering various kinds of illegal or harmful content. Again, this will depend on the specific game. Protections may include age verification measures, automatic blocking of certain words (see the illustrative sketch after this list), acceptable use policies or strictly enforced codes of conduct, restriction of certain content or features to certain groups of players, etc.
  • using proportionate processes to remove illegal content when they become aware of it. Gaming companies will need to be nimble in removing offensive content. Most online games already allow players to report one another (for example, for cheating or offensive language) – this feature may need to be expanded and more strongly moderated.
  • specifying in terms and conditions how individuals are protected from illegal and harmful content and implementing them consistently. Further information will need to be provided to players about the measures deployed to ensure that a company protects its player base, which could take the form of FAQs or policies which will need to be understandable for all ages involved.
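As a purely illustrative example of what a “proportionate measure” might look like in practice, the sketch below pairs a simple word blocklist with a player-report queue that escalates to human review. The blocklist contents, the report threshold and all names here are assumptions made for illustration; real tooling would need to follow whatever codes and guidance Ofcom eventually publishes.

```python
import re
from collections import defaultdict

# Hypothetical blocklist; a real one would be maintained, localized and regularly reviewed.
BLOCKED_TERMS = {"slur1", "slur2"}
REPORT_THRESHOLD = 3  # assumed number of reports before escalation to a human moderator


def filter_chat_message(message: str) -> str:
    """Mask blocked terms before the message is shown to other players."""
    def mask(match: re.Match) -> str:
        return "*" * len(match.group(0))

    pattern = re.compile(
        "|".join(re.escape(term) for term in BLOCKED_TERMS), re.IGNORECASE
    )
    return pattern.sub(mask, message)


class ReportQueue:
    """Collects player reports and flags content for moderator review."""

    def __init__(self) -> None:
        self._reports: dict[str, int] = defaultdict(int)

    def report(self, message_id: str) -> bool:
        """Record a report; return True when the message should be escalated."""
        self._reports[message_id] += 1
        return self._reports[message_id] >= REPORT_THRESHOLD


# Example usage
queue = ReportQueue()
print(filter_chat_message("gg, slur1!"))  # "gg, *****!"
print(queue.report("msg-42"))             # False (only one report so far)
```

In a real game, the same reporting pipeline would also feed the record-keeping an organization may need in order to show that its moderation is proportionate to the risks identified in its risk assessment.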

Won’t someone think of the children?!

All gaming organizations will be required to perform child risk assessments to determine whether their player base includes children. This assessment should determine:

  • whether it is possible for children to access the game. In practice, this will probably be the case unless age verification or age assurance is used.
  • if such access is possible, whether there is a significant number of children playing, or whether the game is likely to attract a significant number of child players.

It seems likely that all but the most mature indie games, which are also behind age walls, will meet this threshold. In addition to the requirements summarized above for relevant gaming businesses, there are requirements that relate specifically to child protection. These include an overarching obligation to protect children’s online safety, broken down into sub-obligations to:

  • manage the risk of harm for different age groups of children (see the illustrative sketch below). For example, depictions of some violence may be unlikely to harm 16 and 17 year olds, but would be more likely to harm those under 10.
  • prevent children from accessing “priority content”, some of which will be determined by secondary legislation. This means that there will be types of content that are automatically deemed harmful to children.
  • plan for children’s reading ability when writing policies and information about the protections deployed by the organization.

For more details on child protection under the OSB, please read here.
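To make the age-group point concrete, here is a minimal, hypothetical sketch of gating labelled content by verified age. The content labels and minimum ages are invented for illustration; the actual categories of “priority content” will be set by secondary legislation and Ofcom guidance, not by a lookup table like this one.

```python
# Minimum age assumed for each content label; illustration only, not a legal categorization.
MINIMUM_AGE_BY_LABEL = {
    "mild_violence": 10,
    "realistic_violence": 16,
    "priority_content": 18,  # content deemed harmful to all children
}


def can_view(player_age: int, content_label: str) -> bool:
    """Return True if a verified player age meets the assumed minimum for a content label."""
    minimum_age = MINIMUM_AGE_BY_LABEL.get(content_label, 18)  # unknown labels default to adult-only
    return player_age >= minimum_age


# Example: a 16-year-old can see realistic violence but not priority content.
print(can_view(16, "realistic_violence"))  # True
print(can_view(16, "priority_content"))    # False
```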

What about the “what abouts”?

The OSB leaves open a number of questions and uncertainties. Some of these may be resolved by the statutory codes of practice that Ofcom is required to produce, but others will come down to operational implementation.

Which party in the game life cycle is responsible?

Should it be the developer, the party most responsible for producing the game? Should it be the publisher, the party financing and marketing the game? Should it be the distributor (if different from the publisher), wanting to protect the brand of its platform? The OSB itself is not specifically targeted at gaming and therefore does not directly address these questions. We anticipate that once the OSB is in place, the contractual arrangements between these parties will determine who is responsible for what, including operational requirements and liability in the event of a problem.

What will SMEs and independent developers have to do?

A thorough OSB compliance project, involving a myriad of risk assessments and operational changes, will likely require significant resources to manage. While this is achievable for larger organizations with bigger compliance budgets, unfortunately there is no exemption in the OSB for organizations under particular size thresholds. That means the likes of indie developers will still be caught.

In practice, Ofcom will manage its own enforcement resources and priorities. It is not yet clear whether the games industry is specifically in the crosshairs or whether it will simply be collateral damage, but it is likely that the smaller the organization, the less likely it is to attract the attention of the regulator unless it does something particularly egregious. Ofcom has said that part of its regulatory responsibility will involve “learning on the job”, so it will first engage with the organizations it regulates rather than jumping straight into enforcement action. This may give smaller organizations room to get up to speed rather than face immediate penalties.

The frequent references to “proportionality” should also help to allay some fears. What is proportionate for the largest publisher in the world will not be proportionate for a new independent mobile game developer.

In-game customization

There will no doubt be edge cases of content that is neither clearly in scope nor clearly out of scope. For example, are custom characters or avatars captured? And what about custom emblems, logos and other game elements – for example, those used to indicate association with particular individuals or groups (clan tags)?

Hopefully any Ofcom guidance or gaming-targeted codes of practice will help in this regard, but edge cases may require testing in court.

What should you do now?

The UK’s Children’s Code has already drawn a line in the sand in an attempt to protect children and their personal data. This and the OSB are part of a wider regulatory effort in the UK and EU to protect people online. It is therefore very important that all gaming companies, large and small, start planning for the implementation of the OSB and greater regulation of problematic online content.
