UPDATE: This topic is outdated. There is a new model made specifically for chat; it uses the same API but requires a few changes to the code in this topic. Here is the article: OpenAI API
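If you want to try the newer chat models, here is a rough sketch of the parts that change (assuming the gpt-3.5-turbo chat endpoint); the rest of the script below stays the same:

local url = "https://api.openai.com/v1/chat/completions" --new endpoint
local data = {
--the chat models replace "prompt" with a "messages" array of role/content pairs
["model"] = "gpt-3.5-turbo",
["messages"] = {
{["role"] = "system", ["content"] = AIbackstory},
{["role"] = "user", ["content"] = message},
},
["max_tokens"] = 35,
["user"] = tostring(player.UserId),
}
--the reply is now found under choices[1].message.content instead of choices[1].text
local textResponse = decoded["choices"][1]["message"]["content"]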
In this topic, I will discuss my personal experience with implementing a chat bot AI in Roblox using OpenAI’s API. OpenAI’s API is, in my opinion, the most cost-effective and reliable option currently available. Although this tutorial is not comprehensive, I can expand it if there is interest. For example, I can teach you how to analyze token usage, provide tips and tricks, discuss monetization strategies, and explain how to make your bot remember messages without using too many tokens. While it’s true that most of these concepts can be figured out on your own, organizing them in one resource could be useful. Kindly show your support to help me determine whether to expand this tutorial.
This resource is not revolutionary; it only involves a basic HTTP request to OpenAI’s API and some prompt crafting on my part. To follow this tutorial, you will need an OpenAI account and an OpenAI API key (you can Google how to get one). You should also have a basic understanding of Luau and the Roblox API, and HTTP requests must be enabled for your game (Game Settings > Security > Allow HTTP Requests). The full code is included, and the tutorial is in the comments. If you have any questions, please ask in the replies.
task.wait(3)
local Players = game:GetService("Players")
local TextService = game:GetService("TextService")
local HttpService = game:GetService("HttpService")
local Chat = game:GetService("Chat")
--This key you get from the OpenAI website; look up how to get an OpenAI API key and paste yours here
--Your API key has to be in this format: Bearer[spacebar]pasteapikeyhere, example: Bearer sk-31UFEUFHUIAEAHEIA
local headers = {["Authorization"] = "Bearer API-KEY"}
--The AI's name
local botName = "AI"
--Starting string (the prompt prefix)
local startmessage = ""
--What the AI will see the player as; if you name this Bob, the AI will think the player's name is Bob
local PlayerName = "Player"
--The AI's "backstory" makes it respond to your messages in a certain way
local AIbackstory = botName.." is helpful, creative, clever, and very friendly."
local ChatService = require(game:GetService("ServerScriptService"):WaitForChild("ChatServiceRunner"):WaitForChild("ChatService"))
local systemMsg = ChatService:AddSpeaker(botName)
systemMsg:JoinChannel("All")
--Character limits for remembered messages (not used in this demo, but useful if you add message history)
local characterLimitForRememberingAI = 100 -- 4 char 1 token
local characterLimitForRememberingPlayer = 80 -- 4 char 1 token
--Character limit, for reference a max roblox message is 200, keep this low
local messageCharLimit = 120
--Keeps track of how many tokens the current payload will consume
local currentTokens = 0
--init
local part = Instance.new("Part")
part.Name = "AI"
part.Parent = workspace
local function sendMessageBot(Msg)
local server = ChatService:GetSpeaker(botName)
Chat:Chat(workspace:FindFirstChild("AI", true), Msg, Enum.ChatColor.Blue)
server:SayMessage(Msg, "All")
end
--Connects Chatted for every player already in the game; this is just for the demo, change it to something better suited for your game
for _, player in ipairs(Players:GetPlayers()) do
player.Chatted:Connect(function(message, _recipient)
--[[ Messages should be filtered like this, but for the tutorial we're not doing it; also, don't forget to filter the bot response
local success,result = pcall(function()
return TextService:FilterStringAsync(message, player.UserId, Enum.TextFilterContext.PrivateChat)
end)]]
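--[[ If you do enable the filtering above, the pcall returns a TextFilterResult; a minimal sketch of getting displayable text out of it:
if success then
message = result:GetNonChatStringForBroadcastAsync()
end
The same idea applies to the bot's reply further down before it is displayed. ]]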
--adds the backstory in front of the prompt
startmessage = AIbackstory.." "
if #message > messageCharLimit then
local botErrorText = "ERROR: Message Character Limit is... "..tostring(messageCharLimit)
sendMessageBot(botErrorText)
return false
end
local playerMessage = "\n\n"..PlayerName..": "..message
local combinedmessage = startmessage..playerMessage.." \n "..botName..":"
--[[Combined message looks something like this, which is what we're sending to the API:
AI is helpful, creative, clever, and very friendly.\n\nPlayerName: Hello how are you?\nAI:
Basically what we're doing here is crafting a prompt; this specific way of formatting your prompt
will result in an accurate response most of the time. I also recommend checking for and adding a period if nothing ends the player's sentence,
so the AI knows for sure not to auto-complete the player's message. I haven't done it here for the sake of brevity.
]]
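--[[ Optional: the period check mentioned above, a minimal sketch:
if not string.match(message, "[%.!%?]$") then
message = message.."."
end
]]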
print(combinedmessage)
local url = "https://api.openai.com/v1/completions"
local data = {
--[[text-davinci-003 is the most capable model, but curie is cheaper and still satisfactory depending on the prompt.
See what best suits you ]]
["model"] = "text-curie-001",
--the string we're sending
["prompt"] = combinedmessage,
--Maximum number of tokens the response can use; 1 token is roughly 4 characters, and 35 tokens is pretty good for Roblox
["max_tokens"] = 35,
--What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer.
["temperature"] = 0.9,
--An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
["top_p"] = 1,
--Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
["presence_penalty"] = 0.3,
--Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
["frequency_penalty"] = 0.5,
--Sending the player's user ID as well (it can only be sent as a string) so that OpenAI's moderation can step in and block the user if they're abusing the service
["user"] = tostring(player.UserId)}
local response = HttpService:PostAsync(url, HttpService:JSONEncode(data), Enum.HttpContentType.ApplicationJson, false, headers)
local decoded = HttpService:JSONDecode(response)
--You can use this to check whether your prompt is getting too big (too much history being stored) and decide when to wipe it; anything above 200 is a lot
currentTokens = decoded["usage"]["total_tokens"]
--response to the message we send
local textResponse = decoded["choices"][1]["text"]
print(textResponse)
--The message you display to the player, make sure to filter
--return textResponse
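--[[ As the comment above says, make sure to filter the response before showing it; a minimal sketch (here the triggering player's UserId is used as the filter author):
local ok, filtered = pcall(function()
return TextService:FilterStringAsync(textResponse, player.UserId, Enum.TextFilterContext.PublicChat):GetNonChatStringForBroadcastAsync()
end)
if ok then textResponse = filtered end
]]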
sendMessageBot(textResponse)
end)
end
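If you are adapting this beyond the demo, one option (a sketch, not the only way) is to connect players as they join instead of looping over the players already present when the script runs:

local Players = game:GetService("Players")
Players.PlayerAdded:Connect(function(player)
player.Chatted:Connect(function(message, _recipient)
--same body as in the loop above
end)
end)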
If you encounter any issues with the code, please let me know and I’ll update the tutorial accordingly. Although the code is relatively simple and I believe I have explained it clearly, there is a possibility of small errors, so please don’t hesitate to notify me.