OpenAI chat module!

Instructions:
Turn on HTTP requests (HttpService) in your game settings.

Replace sk_key in the module with your OpenAI secret key (without "Bearer"), which can be found here: Where do I find my Secret API Key? | OpenAI Help Center

Change ratelimit in the module to however many seconds you'd like between requests, as shown in the snippet below.
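
For reference, the settings at the top of the module end up looking something like this (the key below is just a placeholder):

-- settings (paste your own secret key here, without the "Bearer" prefix)
local sk_key = 'sk-...'
local ratelimit = 5 -- minimum seconds between requests per chatroom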

Documentation:

openai:new() creates a new chatroom
openai:sendmessage(message) sends a message to the chatbot and returns its reply
openai:cansend() checks whether the chatroom is currently rate limited (called internally by sendmessage())

Example:

-- getting module
local openai = require(script.OpenAI)

-- creating a new chatroom
local conversation = openai:new()

-- asking it a question
local response = conversation:sendmessage('What is the square root of 9?')
print(response)

Module:

-- settings
local sk_key = ''
local ratelimit = 5 --seconds

-- variables
local new_api = 'https://api.openai.com/v1/chat/completions'
local old_api = 'https://api.openai.com/v1/completions'

local openai = {}
openai.__index = openai

-- services
local http = game:GetService('HttpService')

-- functions
function openai:new()
	local self = setmetatable({
		lasttick = tick();
		history = {};
	},openai)
	
	self.lasttick -= ratelimit -- allow the first message to be sent immediately

	return self
end

function openai:cansend()
	local currenttick = tick()
	if currenttick - self.lasttick > ratelimit then
		self.lasttick = currenttick
		return true
	end
	return false
end

function openai:sendmessage(message)
	if not self:cansend() then
		return 'Sorry, you are currently being rate limited. Please try again later.'
	end
	
	local headers = {
		["Authorization"] = "Bearer " .. sk_key;
	}
	
	table.insert(self.history,{
		role = "user",
		content = message
	})
	
	local body = http:JSONEncode({
		model = "gpt-3.5-turbo",
		messages = self.history
	})
	
	local response = 'Response was unavailable.'
	pcall(function()
		response = http:PostAsync(new_api, body, Enum.HttpContentType.ApplicationJson, nil, headers)
	end)
	
	-- if the request or the decode failed, return the fallback text instead of erroring
	local ok, data = pcall(function()
		return http:JSONDecode(response)
	end)
	if not ok or not data.choices then
		return 'Response was unavailable.'
	end
	local text = data.choices[1].message.content
	
	table.insert(self.history,{
		role = 'assistant', -- the bot's replies are stored as the assistant, not the system
		content = text
	})
	
	return text
end

return openai

Please post any issues under this thread!

13 Likes

This module seems very useful, maybe I’ll try it later today!
Here’s a few things I think you could touch up on, though:

The 'Response was unavailable.' fallback will never be used. Did you mean to put the http call in a pcall? Otherwise, you should change it to this:

local response = http:PostAsync(new_api, body, Enum.Http ...

You can index directly like this: data.choices[1].message.content, but honestly that one is preference and it doesn’t matter.

I think to avoid confusion, you should tell the user to input a key without Bearer, in case an issue pops up related to that. Maybe they didn't know Bearer is already added to it.

3 Likes

Thank you for the feedback!
Response was unavailable. was indeed meant to be in a pcall; I somehow forgot to do so.
I think telling the user to input a bearer would be unnecessary; I don't believe that Bearer will become an issue, unless you're trying to tell me something I don't fully understand.

I’ve edited the post with the pcall fix.

4 Likes

What I mean is that you should tell the user that they don't need to put Bearer, as it is automatically prepended to the key where you define the request headers!
It’s fine if you don’t want to do that though!

3 Likes

Ah, I read your response wrong; I will put this in the instructions.

3 Likes

Make sure you filter the AI's response. Even though it shouldn't respond with anything terrible due to OpenAI's moderation and such, it's still good practice to run anything that's not from Roblox, and will be seen by players, through Roblox's own filter (a quick sketch is below).
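
For reference, a minimal sketch of what that could look like with TextService, assuming it runs on the server and that player is the player who asked the question (their UserId is used as the author for the filter request):

local TextService = game:GetService('TextService')

local function filterResponse(text, player)
	local ok, filtered = pcall(function()
		-- filter the bot's reply before it is ever shown to anyone
		local result = TextService:FilterStringAsync(text, player.UserId)
		return result:GetNonChatStringForBroadcastAsync()
	end)
	if ok then
		return filtered
	end
	return 'Response was unavailable.'
end

-- usage: local safeText = filterResponse(conversation:sendmessage(question), player)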

3 Likes

Absolutely, one of my friends made the chatbot say some really gross stuff. I highly recommend that you filter the response.

2 Likes

I modified the module to fit my style, and also added extra functionality.
It works well, thanks!

local Intel = require(script.Intel)

local Bot = Intel.new()
Bot:SetKey(script:GetAttribute("Key"))

print(Bot:SendMessage("How many cups are there in a gallon?"))
Bot:SetSpeechPattern("Every other letter capitalized") -- pretty unreliable
print(Bot:SendMessage("Solve 5 + 10"))

I'm not too experienced with AI, so I don't exactly know how to change how the AI talks; the SetSpeechPattern function just sets a prefix for the next message you send. If you have any information on that, I'd love to know!
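
For anyone wondering the same thing, here is a minimal sketch using the original module (assuming its history table is reachable as conversation.history): the chat API also accepts a "system" role message, which acts as a standing instruction for how the bot should respond.

local openai = require(script.OpenAI)
local conversation = openai:new()

-- steer the tone with a system instruction instead of prefixing the prompt
table.insert(conversation.history, {
	role = 'system',
	content = 'Respond with every other letter capitalized.'
})

print(conversation:sendmessage('Solve 5 + 10'))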

2 Likes

Just asking, what can I make with this module?

2 Likes

I, for one, recreated the OpenAI chatbot in Roblox using it, mainly to test whether it worked or not. But you can really do whatever you want with the module.

1 Like

Definitely checking this out when I get home. Seems like it will be quite interesting to mess around with.

2 Likes

This is cool and all, but I don't think AI is worth 20 dollars a month. I wish OpenAI's free version wasn't limited to a few months, but oh well.

2 Likes

I’ve had 300+ requests and it has cost me 14 cents.

1 Like

It would end in July regardless of how much of your free 5 dollars you ‘spent’. Also, considering you would need to test your AI a lot, the request count could climb quickly, and that's without a single user even touching it.

2 Likes

I doubt that testing your AI would be tight on money; 300 requests is a lot. I got the chatbot working with ~50 requests sent. I'm willing to pay the fees after my free trial expires.

1 Like

You don’t need premium if that’s what you are referring to.

1 Like

I’m getting an HTTP 429 (Too Many Requests) error even though I’m only doing one request.

1 Like

I was having the same issue when I first started making the module. I'm not too sure why it's happening, but it's probably because OpenAI is prioritizing their own hosting, meaning you either have to pay based on usage (like me) or wait until OpenAI is at lower capacity.
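
If the 429 turns out to be a temporary capacity or rate issue rather than a billing one, one option is to retry with a short backoff. A rough sketch, not part of the module, using HttpService:RequestAsync so the status code is visible (postWithRetry and maxRetries are illustrative names; the headers table would also need ["Content-Type"] = "application/json"):

local http = game:GetService('HttpService')

local function postWithRetry(url, body, headers, maxRetries)
	for attempt = 1, maxRetries do
		local ok, result = pcall(function()
			return http:RequestAsync({
				Url = url,
				Method = 'POST',
				Headers = headers,
				Body = body,
			})
		end)
		if ok and result.Success then
			return result.Body
		end
		if ok and result.StatusCode == 429 then
			task.wait(2 ^ attempt) -- back off before trying again
		else
			break -- some other failure; don't hammer the endpoint
		end
	end
	return nil
end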

1 Like

Ah, I understand now; I'll most likely just wait until OpenAI isn't at high capacity. Thanks for the help though.

2 Likes

For some reason the response is so slow; it takes like 7 seconds for it to respond.

1 Like