openai:new() creates a new chat conversation
openai:sendmessage() sends a message to the chatbot and returns its reply
openai:cansend() checks whether the user is currently being rate limited
Example:
-- getting module
local openai = require(script.OpenAI)
-- creating a new chatroom
local conversation = openai:new()
-- asking it a question
local response = conversation:sendmessage('What is the square root of 9?')
print(response)
Module:
-- settings
local sk_key = ''
local ratelimit = 5 --seconds
-- variables
local new_api = 'https://api.openai.com/v1/chat/completions'
local old_api = 'https://api.openai.com/v1/completions'
local openai = {}
openai.__index = openai
-- services
local http = game:GetService('HttpService')
-- functions
function openai:new()
	local self = setmetatable({
		lasttick = tick();
		history = {};
	}, openai)
	self.lasttick -= ratelimit
	return self
end
function openai:cansend()
	local currenttick = tick()
	if currenttick - self.lasttick > ratelimit then
		self.lasttick = currenttick
		return true
	end
	return false
end
function openai:sendmessage(message)
	if not self:cansend() then
		return 'Sorry, you are currently being rate limited. Please try again later.'
	end
	local headers = {
		["Authorization"] = "Bearer " .. sk_key;
	}
	table.insert(self.history, {
		role = "user",
		content = message
	})
	local body = http:JSONEncode({
		model = "gpt-3.5-turbo",
		messages = self.history
	})
	local response = 'Response was unavailable.'
	pcall(function()
		response = http:PostAsync(new_api, body, Enum.HttpContentType.ApplicationJson, nil, headers)
	end)
	-- decode inside pcall so a failed request (fallback string above) doesn't error
	local ok, data = pcall(function()
		return http:JSONDecode(response)
	end)
	if not ok or not data["choices"] then
		return response
	end
	local text = data["choices"][1]["message"]["content"]
	table.insert(self.history, {
		role = 'assistant', -- the chat API's role for model replies is 'assistant', not 'system'
		content = text
	})
	return text
end
return openai
This module seems very useful, maybe I’ll try it later today!
Here are a few things I think you could touch up, though:
The "Response was unavailable." fallback will never be used. Did you mean to put the HTTP call in a pcall? Otherwise, you should change it to this:
local response = http:PostAsync(new_api, body, Enum.Http ...
You can index directly like this: data.choices[1].message.content, but honestly that one is preference and it doesn’t matter.
I think, to avoid confusion, you should tell the user to input a key without the Bearer prefix, in case an issue pops up related to that. Maybe they didn’t know Bearer is already added to it.
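For illustration, here is one way to make that fallback actually reachable and keep the decode from erroring on it — just a sketch of the pattern, assuming the same `http`, `body`, and `headers` variables as the module:

```lua
-- Sketch: wrap both the request and the decode in pcall, so a failed
-- request returns the fallback instead of crashing on JSONDecode.
local ok, response = pcall(function()
	return http:PostAsync(new_api, body, Enum.HttpContentType.ApplicationJson, nil, headers)
end)
if not ok then
	return 'Response was unavailable.'
end

local decoded
ok, decoded = pcall(function()
	return http:JSONDecode(response)
end)
if not ok then
	return 'Response was unavailable.'
end
```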
Thank you for the feedback! "Response was unavailable." was indeed meant to be in a pcall; I somehow forgot to do so.
I think telling the user to input a key without Bearer would be unnecessary; I don’t believe Bearer will become an issue, unless you’re trying to tell me something I don’t fully understand.
What I mean is that you should tell the user that they don’t need to put Bearer in their key, as it is automatically prepended where you define the request headers!
It’s fine if you don’t want to do that though!
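If you did want to guard against it, a tiny sketch of that idea (the `gsub` pattern here is just one way to strip an accidentally pasted prefix):

```lua
-- Sketch: tolerate keys pasted with the "Bearer " prefix already attached.
local function makeAuthHeader(key)
	key = key:gsub("^Bearer%s+", "") -- strip the prefix if the user included it
	return { ["Authorization"] = "Bearer " .. key }
end
```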
Make sure you filter the AI’s response. Even though it shouldn’t respond with anything terrible due to OpenAI’s moderation and such, it’s still good practice to run anything that’s not from Roblox, and will be seen by players, through Roblox’s own filter.
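For reference, a minimal sketch of that filtering step using TextService (the player passed in is assumed to be whoever will see the text):

```lua
-- Sketch: filter the chatbot's reply before showing it to a player.
-- Wrapped in pcall because FilterStringAsync can throw if filtering is unavailable.
local TextService = game:GetService("TextService")

local function filterForPlayer(text, player)
	local ok, result = pcall(function()
		local filterResult = TextService:FilterStringAsync(text, player.UserId)
		return filterResult:GetNonChatStringForBroadcastAsync()
	end)
	if ok then
		return result
	end
	return "" -- if filtering failed, show nothing rather than unfiltered text
end
```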
I modified the module to fit my style, and also added extra functionality.
It works well, thanks!
local Intel = require(script.Intel)
local Bot = Intel.new()
Bot:SetKey(script:GetAttribute("Key"))
print(Bot:SendMessage("How many cups are there in a gallon?"))
Bot:SetSpeechPattern("Every other letter capitalized") -- pretty unreliable
print(Bot:SendMessage("Solve 5 + 10"))
Not too experienced with AI, so I don’t exactly know how to change how the AI talks. So the SetSpeechPattern function just sets a prefix for the next message you send. If you have information around that, I’d love to know!
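For what it’s worth, the usual way to steer how the model talks is to prepend a message with the "system" role to the conversation history, rather than prefixing each user message. A sketch, assuming a history table like the one in the original module (`setpersona` is a made-up name for illustration):

```lua
-- Sketch: steer the bot's style with a system message instead of a prefix.
-- self.history is assumed to be the same messages table sent to the API.
function openai:setpersona(instructions)
	-- insert at the front so it applies to the whole conversation
	table.insert(self.history, 1, {
		role = "system",
		content = instructions -- e.g. "Respond with every other letter capitalized."
	})
end
```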
This is cool and all, but I don’t think AI is worth 20 dollars a month. I wish OpenAI’s free trial didn’t expire after a few months, but oh well.
It would end in July regardless of how much of your free 5 dollars you ‘spent’. Also, considering you would need to test your AI a lot, the request count could climb quickly even without a single real user.
I doubt that testing your AI would be tight on money; 300 requests is a lot. I made the chatbot work with ~50 requests sent. I’m willing to pay the fees after my free trial expires.
I was having the same issue when I first started making the module. I’m not too sure why it’s happening, but it’s probably because OpenAI is prioritizing their own hosting, meaning that you either have to pay based on usage (like me) or wait until OpenAI is at lower capacity.