Artificial Intelligence is here to stay. It can be an incredibly powerful tool when used properly and can drastically speed up a variety of workflows. With a games industry under pressure, finding ways to speed up production is a tantalising prospect. But what are the risks, and how can you mitigate them?
Below is a list of some of the most exciting potential applications for AI in game development, along with the potential risks of each and how to mitigate them if you decide to go ahead.
1 – Artwork, code & content generation
This is the one most people think of first when considering AI: using it to generate content for you, or to assist in creating it, which can save a huge amount of time.
The risk here lies predominantly in IP infringement and ownership.
If you get AI to create content (this could be artwork, code, audio, lore, the list goes on…) you run the risk of the AI creating something that could be accused of infringing the IP rights of another party. It is difficult to determine where many of the AI tools are getting their data from, and they could be regurgitating content from anywhere. The example below is fairly obvious, and such cases are getting harder to detect, but any work you produce (whether you got it from AI or not) will still be subject to intellectual property scrutiny, and you may find yourself having to defend the claim that you were the original creator, which in the case of AI you were not.
Image Credit: Hackaday.com
Getty Images Is Suing An AI Image Generator For Using Its Images: https://hackaday.com/2023/02/09/getty-images-is-suing-an-ai-image-generator-for-using-its-images
The other big issue is around ownership.
This is a fairly big topic to unpack, and the legal landscape is still developing, but broadly the USA requires “human authorship”, so you may find you don’t actually own copyright in key elements of your game if you used AI to assist. That can make it difficult to enforce protection against copycats, and if you were considering selling the IP or the business, it could complicate a sale if you couldn’t prove complete ownership of the work(s) being sold.
Several countries, such as Australia, Germany, Brazil, Colombia, Mexico, and Spain, adopt a similar stance to the United States, mandating human authorship for copyright protection. However, other jurisdictions like the UK, Ireland, Hong Kong, India, New Zealand, and South Africa explicitly acknowledge copyright protection for computer-generated works through statutory provisions.
☝ Prompt: “create an image of someone creating something using an AI system, but not actually owning the copyright, and they are therefore sad because they can’t sell it”
How to mitigate the risks:
- You can use AI as a starting point, or for concept art and storyboarding, but if you want to put it into production you should add your own touches to it as much as possible.
- Save your original editor files and show the journey to a transformative work to prove authorship and therefore ownership (a simple sketch of one way to log that journey follows this list).
- Get a lawyer to run a trademark search before releasing anything (this can add cost).
- Consider getting insurance that includes IP infringement defence.
- If you outsource any of these functions, consider including in the contract a warranty that AI was not used to deliver content to you.
- Check whether your country recognises ownership of AI-generated work (bear in mind that even if you owned the work, you would still be liable for copyright infringement if the AI spits out something infringing).
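To make the “show the journey” point concrete, here is a minimal Python sketch of logging timestamped, hashed snapshots of your working files as they evolve from AI starting point to transformative, human-authored work. The file paths, note text and log format are all illustrative, and this is one simple approach rather than a legal standard of proof.

```python
# Minimal sketch: append a timestamped, hashed snapshot record for each
# working file, so you can later show the journey from AI starting point
# to transformative, human-authored work. Paths and format are illustrative.
import hashlib
import json
import time
from pathlib import Path

LOG_FILE = Path("provenance_log.jsonl")

def record_snapshot(file_path: str, note: str) -> None:
    """Append a hash plus a short note for the current state of a working file."""
    data = Path(file_path).read_bytes()
    entry = {
        "file": file_path,
        "sha256": hashlib.sha256(data).hexdigest(),
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "note": note,  # e.g. "AI concept art imported", "hand-painted pass 2"
    }
    with LOG_FILE.open("a") as log:
        log.write(json.dumps(entry) + "\n")

# Hypothetical usage (file path is illustrative):
# record_snapshot("art/hero_portrait.psd", "repainted face and armour by hand")
```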
2 – Data collection, player analysis and Quality Assurance
AI can automate the collection of player data from diverse sources like in-game interactions, social media, and forums. It can also analyse data to uncover valuable insights, from player behaviour patterns to trends that inform game design, marketing tactics, and monetization strategies.
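For illustration, here is a minimal Python sketch of the analysis side: grouping players into rough behaviour segments from aggregated session stats. It assumes scikit-learn is available, and the field names, sample values and three-segment split are all made up.

```python
# Minimal sketch: grouping players into rough behaviour segments from
# aggregated session telemetry. Schema and segment count are illustrative.
from sklearn.cluster import KMeans  # assumes scikit-learn is installed

# Each record is one player's aggregated stats (hypothetical schema).
players = [
    {"id": "p1", "sessions_per_week": 12, "avg_session_min": 45, "spend_usd": 30.0},
    {"id": "p2", "sessions_per_week": 2,  "avg_session_min": 15, "spend_usd": 0.0},
    {"id": "p3", "sessions_per_week": 7,  "avg_session_min": 90, "spend_usd": 5.0},
    {"id": "p4", "sessions_per_week": 1,  "avg_session_min": 10, "spend_usd": 0.0},
    # ... in practice this would come from your analytics pipeline
]

features = [[p["sessions_per_week"], p["avg_session_min"], p["spend_usd"]]
            for p in players]

# Cluster players into broad segments (e.g. casual / regular / highly engaged).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for player, label in zip(players, labels):
    print(player["id"], "-> segment", label)
```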
There are a couple of inherent risks to this approach.
AI-based testing tools may produce false positives, and AI-generated test results would need some form of validation and refinement. AI testing tools may also have limitations in detecting certain types of defects or assessing subjective aspects of gameplay quality. For many things, a qualified human QA tester is simply better.
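One way to build that validation in is to route AI findings into a human review queue rather than accepting them automatically. Below is a minimal Python sketch, assuming a hypothetical AI testing tool that reports a confidence score alongside each defect; the threshold and the “subjective” keyword list are placeholders.

```python
# Minimal sketch: routing AI-reported defects to a human QA queue instead of
# trusting them outright. The threshold and fields are illustrative.
from dataclasses import dataclass

@dataclass
class AIDefectReport:
    test_name: str
    description: str
    confidence: float  # 0.0 - 1.0, as reported by the (hypothetical) AI tool

def triage(reports, auto_accept_threshold=0.95):
    """Split AI findings into 'accepted' and 'needs human review' buckets."""
    accepted, needs_review = [], []
    for report in reports:
        # Findings about subjective qualities (feel, fun, frustration) go to
        # a human reviewer regardless of the tool's confidence.
        subjective = any(word in report.description.lower()
                         for word in ("feel", "fun", "confusing", "frustrating"))
        if report.confidence >= auto_accept_threshold and not subjective:
            accepted.append(report)
        else:
            needs_review.append(report)
    return accepted, needs_review

reports = [
    AIDefectReport("collision_test_12", "Player clips through wall in level 3", 0.98),
    AIDefectReport("pacing_check_04", "Boss fight may feel frustrating", 0.91),
]
accepted, review_queue = triage(reports)
print(len(accepted), "auto-accepted,", len(review_queue), "sent to human QA")
```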
Furthermore, there are potential data privacy concerns.
Collecting and analysing player data, particularly personally identifiable information (PII), could cause issues. Companies must adhere to data protection regulations and implement robust security measures to safeguard player data, which could be further complicated if the data goes “offsite” to a third-party AI tool. This can be a complicated area of law (especially across multiple jurisdictions), but there are heavy penalties for getting it wrong.
Fortnite Video Game Maker Epic Games to Pay More Than Half a Billion Dollars over FTC Allegations of Privacy Violations and Unwanted Charges: https://www.ftc.gov/news-events/news/press-releases/2022/12/fortnite-video-game-maker-epic-games-pay-more-half-billion-dollars-over-ftc-allegations
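One practical safeguard before player data goes “offsite” is to strip or pseudonymise obvious PII first. The sketch below is a minimal Python example with illustrative field names; note that salted hashing is pseudonymisation rather than full anonymisation, and none of this replaces proper legal advice on your specific obligations.

```python
# Minimal sketch: stripping and pseudonymising obvious PII before player
# telemetry is sent to a third-party AI/analytics tool. Field names are
# illustrative; real data-protection compliance needs legal review.
import hashlib

PII_FIELDS = {"email", "real_name", "ip_address"}  # drop these outright

def pseudonymise(record: dict, salt: str) -> dict:
    """Return a copy of the record with PII removed and the player ID hashed."""
    clean = {k: v for k, v in record.items() if k not in PII_FIELDS}
    # Replace the player ID with a salted hash so the third party never sees
    # the real identifier (keep the salt, and any mapping, in-house only).
    clean["player_id"] = hashlib.sha256(
        (salt + str(record["player_id"])).encode()
    ).hexdigest()
    return clean

raw = {
    "player_id": 81234,
    "email": "player@example.com",
    "ip_address": "203.0.113.7",
    "session_minutes": 42,
    "purchases": 2,
}
print(pseudonymise(raw, salt="keep-this-secret-in-house"))
```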
☝ Prompt: “a robot testing a video game, and collecting personal information without people’s knowledge”
How to mitigate the risks:
- Supplement automated testing with manual testing and user feedback to ensure more comprehensive QA coverage.
- Ensure transparency, obtain informed consent, and avoid discriminatory practices wherever possible.
- Familiarise yourself with the data privacy laws where you operate and engage with a lawyer who understands data privacy to help you get it right (it can be complicated!).
3 – In-game content (e.g., talking NPCs and personalised content)
Using AI to create truly emergent gameplay offers potentially limitless content and replayability. Using a closed AI system trained on your own in-house data could also help avoid potential copyright issues.
Artificial intelligence could also use player data to deliver personalized content, including targeted offers, in-game rewards, and tailored gaming experiences.
Putting current technical limitations aside, this also raises potential data privacy concerns. Collecting data on players to offer customised experiences and individual price points could cause problems with Consumer Protection Laws.
Many countries have laws and regulations in place to protect consumers from unfair or deceptive practices. Offering different prices or offers to different players without their knowledge could potentially violate these laws if it’s deemed to be discriminatory or unfair.
There is also the risk of unintended consequences. Having AI “running loose” in your game could cause a number of problems for which you could be held responsible. If it was integrated into your game experience, it would be difficult to argue that “it was the AI’s fault”. For example, if an NPC started acting offensively towards a player, or some element of emergent design was inappropriate for the age rating or censors, you could land in trouble and the game could be pulled from shelves while this was fixed. It could also become a public relations issue, depending on the severity.
Image Credit: arstechnica.com
Endless “Seinfeld” episode grinds to a halt after AI comic violates Twitch guidelines: https://arstechnica.com/information-technology/2023/02/endless-seinfeld-episode-grinds-to-a-halt-after-ai-comic-violates-twitch-guidelines
How to mitigate the risks:
- Train the AI on in-house data only.
- Put in robust checks and balances to avoid any inappropriate content (a simple sketch of one such check follows this list).
- If offering personalized deals and pricing, check compliance with Consumer Protection and Anti-Discrimination laws, and consider getting legal advice.
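To illustrate the “checks and balances” bullet above, here is a minimal Python sketch that gates AI-generated NPC dialogue behind a content check before it reaches the player. The blocklist, fallback line and stubbed-in generator are all placeholders; a production system would use a proper moderation model and human-reviewed rules tuned to your age rating.

```python
# Minimal sketch: gating AI-generated NPC dialogue behind a content check
# before it reaches the player. Blocklist and fallback line are placeholders.
BLOCKED_TERMS = {"slur_example", "explicit_example"}  # placeholder terms
FALLBACK_LINE = "Hmm, I have nothing to say about that."

def safe_npc_line(generate_line, player_prompt: str) -> str:
    """Run the (hypothetical) dialogue generator, then filter its output."""
    line = generate_line(player_prompt)
    lowered = line.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        # Log the rejection for human review rather than shipping the line.
        print(f"[moderation] blocked line for prompt: {player_prompt!r}")
        return FALLBACK_LINE
    return line

# Example with a stubbed-in generator standing in for your AI model.
def fake_generator(prompt: str) -> str:
    return "Welcome to the village, traveller!"

print(safe_npc_line(fake_generator, "greet the player"))
```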
4 – Upscaling, synthesising and extending the life of existing content
Using AI to get more out of existing or older content can extend the lifespan of a product, and can even create brand new content.
Image Credit: theVerge.com
Artificial intelligence is helping old video games look like new: https://www.theverge.com/2019/4/18/18311287/ai-upscaling-algorithms-video-games-mods-modding-esrgan-gigapixel
This is very cool, although results may vary. There is a danger that copyrighted material could inadvertently sneak in, especially if there were things like posters or billboards in the original pixel art that the AI could reinterpret into something unintended.
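One way to keep a human in that loop is to pair every upscaled asset with an entry in a review manifest so nothing ships unchecked. The sketch below uses Pillow’s resize purely as a stand-in for a real super-resolution model (an ESRGAN-style upscaler would slot in where the resize call is); the folder layout and manifest format are illustrative.

```python
# Minimal sketch: batch-upscale old textures and write a review manifest so a
# human can check each result for reinterpreted logos, posters or other
# third-party material before it ships. Pillow's resize is only a stand-in
# for a real super-resolution model.
import csv
from pathlib import Path
from PIL import Image  # assumes Pillow is installed

def upscale_folder(src_dir: str, out_dir: str, scale: int = 4) -> None:
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(out / "review_manifest.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["source", "output", "ip_reviewed"])
        for src in Path(src_dir).glob("*.png"):
            img = Image.open(src)
            big = img.resize((img.width * scale, img.height * scale),
                             Image.LANCZOS)
            dest = out / src.name
            big.save(dest)
            # Every upscaled asset starts life unreviewed.
            writer.writerow([str(src), str(dest), "no"])

# Hypothetical usage (paths are illustrative):
# upscale_folder("textures/original", "textures/upscaled")
```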
Artificial Intelligence can also be used to synthesise voices, motion capture and other development assets, long after the original creator has finished their involvement.
For example, you might hire a voice actor, pay them for 500 lines of dialogue, and then feed those lines into a model to create a synthesised copy of their voice for use well beyond the originally agreed amount. Much as with deepfakes, the original actor could be quite upset.
This could run into Breach of Contract issues if it was not clear at the outset that this was the studio’s intention. It could invite litigation from the original voice artist, and the same could apply to any creator whose work was used beyond the original scope of their agreement.
This is not unique to AI – Hollywood has been bringing actors back from the dead for a long time: https://screenrant.com/dead-actors-brought-back-with-cgi
How to mitigate the risks:
- Make sure any upscaled content has been checked for IP infringement.
- If synthesising beyond the original work, ensure the contract allows for this. If not, consider reaching out to the original creator to seek approval. They may want to be paid more, but this could be better for everyone than litigation and a potential PR problem.
- Consider getting insurance that includes Breach of Contract cover and IP infringement.
- Reach out to a lawyer for guidance if you are not sure whether a contract allows for synthesising the original work.
Phil Wildman is the Founder of GG Insurance Services, which has enabled him to combine his lifelong passion for games with over 20 years of insurance experience.
With clients across the world, GG Insurance advises and arranges policies for a broad variety of risks, from E&O, Contractual Liability, M&A, IP and Cyber, to name a few.
As a Lloyd’s Broker, GG have access to the world’s best insurers, and their hyper-focus on the games industry allows them to talk the same language as their clients and identify potential risks and liabilities that other brokers may have missed. Plus, they genuinely care!