We were fortunate enough to receive an invite to the OpenAI private beta a few weeks ago, which gave us access to their GPT-3 engine. Using AI for mob interactions within Written Realms had long been a goal of ours, and we jumped at the opportunity to play with it.
The OpenAI API provides a 'completion' endpoint (also available as a Python binding) for interacting with the GPT-3 engine.
Given a text prompt, the API attempts to complete the pattern you gave it. For example, if you give it an input of
1 2 4 8 and request 1 token back, you will get a response of:
16. The API recognized the pattern as a doubling function and gave the next result.
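As a concrete illustration, here is roughly what such a request looks like with the beta-era Python binding. The helper below just assembles the parameters; the engine name and the actual `openai.Completion.create` call (shown in comments) are based on the API as it existed at the time, and would require a valid API key to run.

```python
def build_request(prompt, max_tokens=1, temperature=0.0):
    """Assemble the parameters for a completion request.

    temperature=0 makes the output deterministic, which is what you
    want for pattern-completion tasks like the doubling sequence.
    """
    return {
        "engine": "davinci",  # engine name is an assumption for illustration
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

params = build_request("1 2 4 8", max_tokens=1)

# With a key set in OPENAI_API_KEY, the actual call would look like:
#   import openai, os
#   openai.api_key = os.environ["OPENAI_API_KEY"]
#   response = openai.Completion.create(**params)
#   print(response.choices[0].text)
```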
It's not limited to math. If you give as input
Barking is to dogs as meowing is to, you will get as a result:
cats.
Not only is GPT-3 good at reading comprehension, but it is also good at tailoring its output to match the style of the input. Consider the following prompt:
Tom went fishing all afternoon. He comes home to his wife at sunset, who asks him a question. Question: What did you do today? Answer:
The API returns
I went fishing.
It is possible to get a more creative answer by providing more information in the prompt and turning up the
temperature setting, which gives GPT-3 more creative license to make things up. For example, after adding
Tom responds enthusiastically, providing many details. to the prompt before the question and turning the temperature up to its maximum value, here's how the API responded:
I went fishing all day, and in the afternoon I fought the biggest fish that I have ever seen before. I just held on for as long as I could, because once I gave up there was no way I was getting the fish back again. I had to pull everything I could, and my catch ended up weighing 3 pounds of flounder and 4 pounds of bluegill.
Evidently, GPT-3 is quite proficient at making stuff up given the right temperature.
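To make the two fisherman runs concrete, a small helper can compose the prompt with or without the extra stage direction; the wording is taken from the examples above. The temperature values in the comment mirror the two runs described (the function and its flag name are our own, for illustration).

```python
def fisherman_prompt(enthusiastic=False):
    """Compose the fisherman prompt in the dialog pattern used above.

    Pass enthusiastic=True to add the stage direction that, combined
    with a high temperature (e.g. 1.0 instead of 0.0), pushed GPT-3
    toward the long, embellished answer.
    """
    context = ("Tom went fishing all afternoon. He comes home to his "
               "wife at sunset, who asks him a question.")
    if enthusiastic:
        context += " Tom responds enthusiastically, providing many details."
    # Ending with "Answer:" cues the model to complete Tom's reply.
    return context + "\nQuestion: What did you do today?\nAnswer:"
```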
For our first foray into GPT-3, we created a Historian NPC (Non-Player Character) who is able to answer arbitrary questions about Edeus, our flagship world. All of our lore is original and it can be challenging to impart it all to the players at once. An AI able to answer the organic questions of a player in real-time seemed like a perfect use-case.
We already had a feature called 'Reactions' that allows NPCs to respond to something said in their room with predetermined commands. The adjustment made here was to add a
gpt flag to the Historian which, at reaction time, forwards the user's input to OpenAI in order to get the response.
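In outline, a gpt-flagged reaction handler might look like the sketch below. This is our own illustrative pseudocode, not Written Realms' actual implementation: the `npc` dict, the `complete` callable, and the field names are all assumptions.

```python
def handle_reaction(npc, player_input, complete):
    """Respond to speech in the room.

    npc: dict describing the NPC (illustrative schema).
    complete: callable that sends a prompt string to the completion
              API and returns the generated text.
    """
    if npc.get("gpt"):
        # Forward the player's words to OpenAI, appended to the NPC's
        # base prompt in the Question/Answer dialog pattern.
        prompt = npc["base_prompt"] + f"\nQuestion: {player_input}\nAnswer:"
        reply = complete(prompt).strip()
    else:
        # Non-gpt NPCs fall back to a predetermined Reaction.
        reply = npc["canned_response"]
    return f"{npc['name']} says '{reply}'"
```

A fake `complete` function makes this easy to exercise without touching the API.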
As with the fisherman above, we can use the dialog pattern, and precede it with a certain context, which explains to GPT-3 what the background is for the interactions. In this case, we gave it an excerpt from a fictional history book written by the NPC in question, and then provided a little bit of background about the NPC himself. Here is a sample interaction with him:
You say 'hello there'
Grae the Historian says 'Greetings, traveler.'
You say 'Who are you?'
Grae the Historian says 'I am Historian Grae, a Priest of the Church of Thune.'
You say 'Where are we?'
Grae the Historian says 'You are in Bluefinch, a village in the realm of Saphrin.'
You say 'Is there anything I should be doing?'
Grae the Historian says 'You should be killing orcs.'
You say 'Why? What have the orcs ever done to us?'
Grae the Historian says 'The orcs are evil. They want to destroy us.'
You say 'Okay. Can you tell me more about our origin?'
Grae the Historian says 'Humans were created by Thune, the Allmighty God.'
You say 'Who rules this area?'
Grae the Historian says 'The King of Saphrin, Arctus Kerinin, rules this area.'
You say 'Thanks Grae! You've been very helpful'
Grae the Historian says 'You're welcome.'
If you want to test this out for yourself, Grae is located 3 north, 1 east of Bluefinch Square in Edeus. We're always tweaking his backstory and various settings so his outputs may (and should!) vary a bit.
The main issue we've been running up against is that it's difficult to constrain the AI's knowledge to that of a fictional world. It cannot help knowing things about the real world. For example:
You say 'When was JFK born?'
Grae the Historian says 'John Fitzgerald Kennedy was born on May 29, 1917.'
Ideally, the historian would not know about anything outside of Edeus. But even though we specify in the prompt's backstory that he only answers questions pertaining to his own world, that distinction is not always understood. It's difficult to exclude knowledge.
We plan to keep iterating and try to make the AI feel as human and authentic as possible. One issue is that currently each interaction with the NPC is its own isolated event. For example:
You say 'What is the name of this village?'
Grae the Historian says 'This village is called Bluefinch.'
You say 'Where did it get its name from?'
Grae the Historian says 'Edeus is named after the two titans that fought in the First Age. Thune and Raeshyk.'
Clearly the NPC didn't connect that the second question was a follow-up to the first. This is because each time we call OpenAI, we append the user's query to the end of the same base prompt, making each call a separate interaction. It could be improved by including a few of the previous back-and-forths when generating a new prompt, provided they came from the same player.
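One way to sketch that improvement is a small per-player conversation buffer that replays the last few exchanges ahead of the new question. Everything here (the class, the turn limit of 3) is an illustrative assumption, not what we actually run.

```python
from collections import deque

class Conversation:
    """Keep the last few exchanges with one player so follow-up
    questions carry context into the next prompt."""

    def __init__(self, base_prompt, max_turns=3):
        self.base_prompt = base_prompt
        # deque(maxlen=...) silently drops the oldest turn, keeping
        # the prompt short enough not to crowd out the backstory.
        self.history = deque(maxlen=max_turns)

    def build_prompt(self, question):
        """Base prompt, then recent Q/A pairs, then the new question."""
        lines = [self.base_prompt]
        for q, a in self.history:
            lines.append(f"Question: {q}\nAnswer: {a}")
        lines.append(f"Question: {question}\nAnswer:")
        return "\n".join(lines)

    def record(self, question, answer):
        self.history.append((question, answer))
```

With this in place, the follow-up about the village name would arrive in a prompt that still contains 'This village is called Bluefinch.'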
There are also entirely different ways we could leverage GPT-3. For example, it could be used to generate content like room, character, and item descriptions, or even dynamic quests. A text-based platform and a text-based AI have a natural synergy, and the possible applications seem endless to the imaginative.