Giving LLM Text Vision and Self Awareness in Luau [OPEN SOURCE]

If that’s the case, definitely look into this.

It demonstrates how to use the Hugging Face APIs (free access with a daily limit). There are lots of open-source models to check out. I am currently using the local model I described, Zephyr 7B, and ChatGPT-4 together.

Which one do you recommend for what I want to achieve?

I use Zephyr, but I would recommend that or Mistral. mistralai/Mistral-7B-Instruct-v0.3 · Hugging Face
Mistral is more recent and has a larger context window. It is also a tool user that can make function calls, so I will likely be changing over to Mistral, since I have a bunch of tools already made for GPT-4 that Zephyr cannot use.

Ok, how do I incorporate this into Roblox?

I’m having some issues figuring it out right now due to the AI giving me a hard time. But the basic setup is this:

local module = {}
local HttpService = game:GetService("HttpService")
local endpoint = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3" -- Replace with your actual endpoint
local BearerKey = "Get your key from huggingface"
local apiKey = "Bearer "..BearerKey -- Replace with your actual API key

function module.query(input, system_message)
	-- system_message is not used yet in this first version
	local npcdata = {
		inputs = input,
		parameters = { -- generation settings are nested under "parameters" for the HF Inference API
			max_new_tokens = 512,
			do_sample = true,
			temperature = 0.7,
			top_k = 50,
			top_p = 0.95
		}
	}
	local response = HttpService:RequestAsync({
		Url = endpoint,
		Method = "POST",
		Headers = {
			["Content-Type"] = "application/json",
			["Authorization"] = apiKey
		},
		Body = HttpService:JSONEncode(npcdata),
	})
	if not response.Success then
		warn("Request failed: "..response.StatusCode.." "..response.StatusMessage)
		return nil
	end
	print(response.Body)
	return HttpService:JSONDecode(response.Body)
end

return module
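To show how this gets called, here is a minimal usage sketch; the module’s name and location are assumptions, and the response shape is the typical Inference API output (a JSON array whose first entry holds `generated_text`):

```lua
-- Server Script; assumes the module above is a ModuleScript named "ChatModule" in ServerScriptService
local ChatModule = require(game:GetService("ServerScriptService").ChatModule)

local result = ChatModule.query("Hello, who are you?", "You are a helpful NPC.")
-- The hosted Inference API typically returns: { { generated_text = "..." } }
if result and result[1] then
	print(result[1].generated_text)
end
```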

Is “input” where everything that is sent to the API endpoint goes? Also, what are temperature, top_k, and top_p? Do I need those for what I’m trying to achieve?

One moment. So I have figured it out and have potentially set it up so it can use tools, a system message, and input:

local module = {}
local HttpService = game:GetService("HttpService")
local endpoint = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3" -- Replace with your actual endpoint
local apiKey = "" -- Replace with your actual API key
local function format_response(str)
	-- find the assistant response in the string
	local start = string.find(str, "<|assistant|>", 1, true)
	if not start then
		return str -- no assistant marker found; return the raw string
	end
	-- "<|assistant|>" is 13 characters long, so the reply starts at start + 13
	return string.sub(str, start + 13)
end

-- Tool descriptions are injected into the prompt as plain text; they are not executed as Lua.
local tools = [[{
			["type"] = "function",
			["function"] = {
				name = "gotoLocation",
				description = "Move to a location specified by coordinates, call it when instructed to move somewhere",
				parameters = {
					type = "object",
					properties = {
						x = {
							type = "integer",
							description = "The X coordinate"
						},
						y = {
							type = "integer",
							description = "The Y coordinate"
						},
						z = {
							type = "integer",
							description = "The Z coordinate"
						}
					},
					required = {"x", "y", "z"}
				}
			}
		},
		{
			["type"] = "function",
			["function"] = {
				name = "setFollow",
				description = "Sets the 'follow' state to true or false as instructed",
				parameters = {
					type = "object",
					properties = {
						follow = {
							type = "boolean",
							description = "The follow state to set (true or false)"
						}
					},
					required = {"follow"}
				}
			}
		}

	]]
function module.query(input, system_message)
	local npcdata = {
		inputs = "<|system|>\n "..system_message.."</s>\n<|tools|>\n "..tools.."</s>\n<|user|>\n "..input.."</s>\n<|assistant|>",
		parameters = { -- generation settings are nested under "parameters" for the HF Inference API
			max_new_tokens = 512,
			do_sample = true,
			temperature = 0.7,
			top_k = 50,
			top_p = 0.95
		}
	}
	
	local response = HttpService:RequestAsync({
		Url = endpoint,
		Method = "POST",
		Headers = {
			["Content-Type"] = "application/json",
			["Authorization"] = apiKey
		},
		Body = HttpService:JSONEncode(npcdata),
	})
	if not response.Success then
		warn("Request failed: "..response.StatusCode.." "..response.StatusMessage)
		return nil
	end
	print(format_response(response.Body))
	return HttpService:JSONDecode(response.Body)
end

return module

I had some code lying around from working with a Zephyr chatbot, and it appears to work great!
" \n Yes, I have two tools available to me. The first one is "gotoLocation", it takes three arguments ‘x’, ‘y’, and ‘z’ which represent the coordinates of a specific location. This tool allows me to move to a specified location. The second tool is "setFollow", it takes a single argument ‘follow’ that can be set to either true or false. This tool allows me to enable or disable following another entity.“”

How do I edit this for what I’m trying to do, with just chatting and that’s it?

Temperature controls randomness; higher is more random. Top_k and top_p both limit which tokens the model can pick from, so lower values make the output more focused and less random. You should really look these up yourself, but in short:

  1. Top_p (Nucleus Sampling): It selects the most likely tokens from a probability distribution, considering the cumulative probability until it reaches a predefined threshold “p”. This limits the number of choices and helps avoid overly diverse or nonsensical outputs.
  2. Top_k (Top-k Sampling): It restricts the selection of tokens to the “k” most likely options, based on their probabilities. This prevents the model from considering tokens with very low probabilities, making the output more focused and coherent.
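For plain chatting the defaults in the code above are fine; as a rough illustration only (these exact values are my assumption, not a recommendation from the original post), the parameters might be tuned like this:

```lua
-- More focused, predictable replies (illustrative values)
local focused = {
	temperature = 0.3, -- low temperature favors high-probability tokens
	top_k = 20,        -- only the 20 most likely tokens are considered
	top_p = 0.9,       -- cumulative-probability (nucleus) cutoff
}

-- More varied, creative replies (illustrative values)
local creative = {
	temperature = 1.0,
	top_k = 100,
	top_p = 0.98,
}
```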

Like, what can I delete without the thing breaking?

You don’t need the tools. The system_message is very important, but you can just set that to a static value. You can change the <|user|> tag to the player’s name, e.g. <|ClientSide|>, and sometimes it’s a good idea to start the assistant prompt with the name of the character:

Wait, so what can I do to make it fit my needs?

local module = {}
local HttpService = game:GetService("HttpService")
local endpoint = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3" -- Replace with your actual endpoint
local BearerKey = "Get your key from huggingface"
local apiKey = "Bearer "..BearerKey -- Replace with your actual API key

local function format_response(str)
	-- find the assistant response in the string
	local start = string.find(str, "<|assistant|>", 1, true)
	if not start then
		return str -- no assistant marker found; return the raw string
	end
	-- "<|assistant|>" is 13 characters long, so the reply starts at start + 13
	return string.sub(str, start + 13)
end


function module.query(input, system_message)
	local npcdata = {
		inputs = "<|system|>\n "..system_message.."</s>\n<|user|>\n "..input.."</s>\n<|assistant|>",
		parameters = { -- generation settings are nested under "parameters" for the HF Inference API
			max_new_tokens = 512,
			do_sample = true,
			temperature = 0.7, -- higher is more random
			top_k = 50, -- only the k most likely tokens are considered; lower is less random
			top_p = 0.95 -- cumulative probability cutoff; lower is less random
		}
	}
	
	local response = HttpService:RequestAsync({
		Url = endpoint,
		Method = "POST",
		Headers = {
			["Content-Type"] = "application/json",
			["Authorization"] = apiKey
		},
		Body = HttpService:JSONEncode(npcdata),
	})
	if not response.Success then
		warn("Request failed: "..response.StatusCode.." "..response.StatusMessage)
		return nil
	end
	print(format_response(response.Body))
	return HttpService:JSONDecode(response.Body)
end

return module

Take the code above, fill in your API key and the inference endpoint shown initially, and place it in a ModuleScript. Then require the module and call it with Chatbotresponse = query(input, system_message).
This call can only be done on the server, so you can connect to it via:

Player.Chatted:Connect(function(input)
--insert text filtering moderation
--insert additional chat logic here to customize
local result = query(input, system_message)
-- the module returns decoded JSON; the reply text is in generated_text
local Chatbotresponse = result and result[1] and result[1].generated_text
local TextChat = game:GetService("TextChatService")
local chatgroup = TextChat.TextChannels.RBXSystem

--	pcall(function() chatgroup:DisplaySystemMessage("<font color=\""..rgbcolor.."\">"..npc.Humanoid.DisplayName..": </font> <font color=\"rgb(255,255,255)\">"..str.."</font> ") end)
if Chatbotresponse then
	-- note: DisplaySystemMessage only works from a LocalScript, so in practice fire the reply to the client
	chatgroup:DisplaySystemMessage("<font color=\"rgb(255,255,255)\">"..Chatbotresponse.."</font>")
end
end)

Boom, there you have it: a very easy and simple system to create a chatbot using an LLM.
The chat message is shown in the player’s text box and messages are sent via chat.
To make it only show up in the local player’s chatbox you can use a RemoteEvent, call FireClient(player, Chatbotresponse) on the server,
and run

ClientInvoker.OnClientEvent:Connect(function(Chatbotresponse)
	local TextChat = game:GetService("TextChatService")
	local chatgroup = TextChat.TextChannels.RBXSystem

	if Chatbotresponse then
		chatgroup:DisplaySystemMessage("<font color=\"rgb(255,255,255)\">"..Chatbotresponse.."</font>")
	end
end)
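The server-side half of that pattern isn’t shown above; a minimal sketch, assuming a RemoteEvent named ClientInvoker in ReplicatedStorage (both the name and the location are assumptions):

```lua
-- Server Script
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local ClientInvoker = ReplicatedStorage:WaitForChild("ClientInvoker") -- RemoteEvent

-- Fires the reply only to this player's client, where OnClientEvent displays it
local function sendReplyTo(player, Chatbotresponse)
	ClientInvoker:FireClient(player, Chatbotresponse)
end
```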

Where do I put the API key and the inference endpoint in the code? Can you put like a “Put API key here” or something?

Yeah, I edited the code in the above post to be like the other examples.

Ok, so I just put the top one in a ModuleScript and the bottom one in another script or something?

And is the API key one for the entirety of Hugging Face, or do I need to get one per model?

As I said earlier, the code should be run on the server to utilize HttpService. Then if you want to fire to the client you can use a RemoteEvent. Also, you only need one API key; the same Hugging Face token works across models.
Also, this chat template works with Zephyr 7B and some other models, but it doesn’t work with all of them.
“Take the code above, fill in your API key and the inference endpoint shown initially, and place it in a ModuleScript. Then require the module and call it with Chatbotresponse = query(input, system_message).
This call can only be done on the server, so you can connect to it via:”

local chatbotmod = location -- set this to your ModuleScript
local query = require(chatbotmod).query
game.Players.PlayerAdded:Connect(function(Player)
	Player.Chatted:Connect(function(input)
		--insert text filtering moderation
		--insert additional chat logic here to customize
		local result = query(input, system_message)
		-- the module returns decoded JSON; the reply text is in generated_text
		local Chatbotresponse = result and result[1] and result[1].generated_text
		local TextChat = game:GetService("TextChatService")
		local chatgroup = TextChat.TextChannels.RBXSystem

		if Chatbotresponse then
			chatgroup:DisplaySystemMessage("<font color=\"rgb(255,255,255)\">"..Chatbotresponse.."</font>")
		end
	end)
end)
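One more practical note: HttpService requests can fail (rate limits on the free tier, or a 503 while the model loads), so it’s worth wrapping the call in pcall. A hedged sketch around the module’s query function:

```lua
-- Wraps the module call so a failed HTTP request doesn't kill the Chatted handler
local function safeQuery(query, input, system_message)
	local ok, result = pcall(query, input, system_message)
	if not ok then
		warn("Chatbot request errored: "..tostring(result))
		return nil
	end
	-- the free Inference API can return an error table, e.g. { error = "Model ... is currently loading" }
	if type(result) == "table" and result.error then
		warn("API error: "..tostring(result.error))
		return nil
	end
	return result
end
```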

Can you like send me a screenshot of the setup and where everything goes?

Is this stuff already done within the script, and I just have to replace it with my API key?