Giving LLM Text Vision and Self Awareness in Luau [OPEN SOURCE]

This is the code I use to engineer the system message. As you can see, it combines memories, text-vision awareness, insight provided by a local chatbot, and time-of-day and weather awareness.

function cm.ZephyrStory(person,personal,Playername,quer)
-- Define the API URL and the authorization header
local API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"
local headers = {Authorization = bearer}

-- Define the HttpService
local HttpService = game:GetService("HttpService")

-- Define a function that takes an input and queries the model
local function queryModel(input, temperature)
    -- Create a payload table with the input and the temperature
  local payload={inputs = input, temperature = temperature,max_new_tokens=1000, min_tokens=250, top_k=100, top_p=0.11}
   -- local payload = {inputs = input, temperature = temperature}
    -- Encode the payload table into a JSON string
    local payloadJSON = HttpService:JSONEncode(payload)
    -- Send a POST request to the API URL with the header and the payload
    -- Use pcall to catch any errors
    local success, response = pcall(HttpService.PostAsync, HttpService, API_URL, payloadJSON, Enum.HttpContentType.ApplicationJson, false, headers)
    -- Check if the request was successful
    if success then
        -- Decode the response into a table
        -- Use pcall to catch any errors
        local success, responseTable = pcall(HttpService.JSONDecode, HttpService, response)
        -- Check if the decoding was successful
        if success then
            -- Return the raw JSON string; decoding above was only a validity check,
            -- since callers decode the output themselves
            return response
        else
            -- Return nil and the error message from the failed decode
            return nil, responseTable
        end
    else
        -- Return nil and the error message
        return nil, response
    end
end

local personality=personal[1]
local awarobserve=personal[2]
--identify..timeod..awareobserve
--local Resulttable,speakers=cm.LocalZephyrStory(str,npcnam,{[1]=persona,[2]=awareobserve,[3]=identity,[4]=timeod},Player)
local identity=personal[3]
local timeod=personal[4]
local insight=personal[5]
local memory=personal[7]
local previousconversation=personal[6]
if previousconversation==nil then
	previousconversation=""
else
	-- rebuild the cached conversation into a single string to reduce the size of the system message
	local function RebuildConversation(tbl, response)
		local sum = ""
		for i, v in tbl do
			for t, o in v do
				if t ~= "narrator" then
					sum = sum.." \n\n "..t..": "..o
				else
					sum = sum.." \n\n "..o
				end
			end
			print(sum)
		end
		-- the call below only passes the cached table, so guard against a nil response
		if response ~= nil then
			sum = sum.." \n\n "..Playername..": "..response
		end
		return sum
	end
	previousconversation = RebuildConversation(personal[6][1])
end
--awareobserve,timeod,identity
-- Test the function with an example input
--cachedconversation

local input = "<|system|>\n "..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.." </s>\n<|assistant|>"..previousconversation
local temperature = 2
local output,Error = queryModel(input,temperature)
print(output)
local iterations=0
local function RebuildResponse(response) -- rebuild the prompt from the model output to reduce the size of the system message
	local tbl, speakers, str = cm.LocalZephyrDecode(response)
	local sum = ""
	for i, v in tbl do
		for t, o in v do
			if t ~= "narrator" then
				sum = sum.." \n\n "..t..": "..o
			else
				sum = sum.." \n\n "..o
			end
		end
		print(sum)
	end

	-- shorten the system message as the iteration count grows
	local input = "<|system|>\n"..identity..memory..awarobserve.." Parse dialogues with "..person..": and "..Playername..": . </s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
	if iterations==2 then
		input = "<|system|>\n"..memory..awarobserve..identity.." Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
	elseif iterations==3 then
		input = "<|system|>\n"..identity.."\n Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s>\n<|assistant|>"..sum
	end

	return input
end
if not Error then
local function iterateoutput(output)
local checkedinput
local loadoutput
local previnput
repeat
iterations+=1
previnput=HttpService:JSONDecode(output)[1].generated_text
local loadoutput = queryModel(RebuildResponse(output))
if loadoutput~=nil then
checkedinput=HttpService:JSONDecode(loadoutput)[1].generated_text
if checkedinput then--only update output if valid
output=loadoutput
print(output)
else
break 
end
else
break
end
until checkedinput==previnput or iterations>=3
return output
end
output=iterateoutput(output)
end
local function DecodeResponse(response)--reduce size of system message
local tbl,speakers,str=cm.LocalZephyrDecode(response)
local sum="" for i,v in tbl do for t,o in v do
if t~="narrator" then
 sum=sum.."\n\n"..t..": "..o 
else
 sum=sum.."\n\n"..o 
end end print(sum)
end
return sum
end
local function iterateoutputLLama(output)
local checkedinput
local loadoutput
local previnput
repeat
iterations+=1
previnput=HttpService:JSONDecode(output)[1].generated_text
local loadoutput =cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": ." ,quer,DecodeResponse(output))
if loadoutput~=nil then
checkedinput=HttpService:JSONDecode(loadoutput)[1].generated_text
if checkedinput then--only update output if valid
output=loadoutput
print(output)
else
break 
end
else
break
end
until checkedinput==previnput or iterations>=3
return output
end

local output2=cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": ." ,quer,DecodeResponse(output))--recieve generated_text
if output2 then
iterations=0
local output3=iterateoutputLLama(output2)
if output3~=nil then
output=output3
else
output=output2
end
end

--local str=format_response(output[1].generated_text)
--print(str)
--local outputtabl=cm.extractDialogue(str)
-- Print the output


return output--[1] --parseConversation(str)
end
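For reference, here is a minimal usage sketch. The table indices mirror the assignments at the top of the function (1 personality, 2 awareness observation, 3 identity, 4 time of day, 5 insight, 6 cached conversation, 7 memories); the NPC name, strings, and require path are placeholders.

-- cm is this chat module, required from wherever you keep it (path is an assumption)
local cm = require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.ChatModule)

local personal = {
	[1] = "Morgan is a cheerful island guide. ",               -- personality
	[2] = "I can see a Crystal not far to the east. ",          -- awareness observation
	[3] = "You are Morgan, a guide on the Island of Morgan. ",  -- identity
	[4] = "It is a bright sunny morning. ",                     -- time of day / weather
	[5] = "The player seems curious about the island. ",        -- insight from the local chatbot
	[6] = nil,                                                  -- cached conversation (optional)
	[7] = "You met this traveler yesterday at the docks. ",     -- memories
}

local reply = cm.ZephyrStory("Morgan", personal, "Player1", "What is that crystal over there?")
print(reply)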

This module is an integral part of the system, providing the description of the surroundings and the emotional state of the chatbot. It has since been superseded by a local chatbot system that provides RAG to assist the LLM in staying aligned with the values of its personality.

New update to the module! The awareness now includes a generalized description of the terrain.

--[[--{
                    [1] = "I observe that, I am walking on basalt.",
                    [2] = "What I see is the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt.",
                    [3] = "I am observing I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me to the east.",
                    [4] = "What I see is a Crystal not far to the east.",
                    [5] = "I can see that, piece of small rock embedded with a Dragonstone is to the east.",
                    [6] = " We are in a place called Island of Morgan. There is a Crystal near eastward.",
                    [7] = "What I notice is I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt , I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me eastward, a Crystal near eastward, piece of small rock embedded with a Dragonstone is  eastward. ",
                    [8] = "We are in a place called Island of Morgan, the center is close at the current elevation as me to the east",
                    [9] = "If my eyes don't deceive me, I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt, we are in a place called Island of Morgan, the center is close at the current elevation as me to the east, and a Crystal close to the east. "--]]--

This function counts all of the terrain materials, then judges them against a judgement matrix of thresholds to pick a generalization.

function aware.judge.terrain(root,radius)

local function detecter(material)
local materel={}
--print(material)
local nummats=0
for i,v in material do 
if material[i] then
if i~="Size" then
--if material[i][1] then
for t,o in material[i] do 
for c,p in material[i][t] do 
--if --material[i][t][c]~=Enum.Material.Water and
if material[i][t][c]~=Enum.Material.Air then --count all of the terrain
local matstring = string.split(tostring(material[i][t][c]),'Enum.Material.')[2]
if matsolutions[matstring]~=nil then
    matstring= matsolutions[matstring]
 end
if materel[matstring]==nil then
nummats+=1
materel[matstring]=1
else 
materel[matstring]+=1
--table.sort(materel,function(a,b)return a>b end)--sort greatest to 
end

--return true
end
end 
end
end
end
end
--least
print(materel)
--create an array with the keys
local keys = {}
for k in pairs(materel) do
  table.insert(keys, k)
end

--sort the array using a custom comparison function
table.sort(keys, function(a, b) return materel[a] < materel[b] end)

--print the sorted table


--table.sort(materel,function(a,b)return a>b end)--sort greatest to 


--table.sort(materel,function(a,b)return a>b end)--sort greatest to least
local judgeamntstrings = {
  {"a handful of", "a few", "a smattering of", "a sprinkling of"},
  {"a little bit of", "a trace of", "a touch of", "a dash of"},
  {"a sparse amount of", "a scant amount of", "a meager amount of", "a minimal amount of"},
  {"bunches of", "clusters of", "groups of", "packs of"},
  {"a lot of", "heaps of", "a ton of", "loads of"},
  {"a multitude of", "a plethora of", "hordes of", "heaps of"},
  {"a huge quantity of", "a massive amount of", "a colossal amount of", "a prodigious amount of"},
  {"a staggering number of", "an astonishing number of", "a phenomenal number of", "a mind-blowing number of"}
}
 
local judgementstring=""
local index=0
--{1, 2, 3, 4, 5, 6, 8}
local judgmatrix={1,radius,radius*5,radius*10,radius*20,radius*30,radius*40,radius*50}
--for i, k in ipairs(keys) do
--  print(k,)
--end
for i,k in keys do 
index=index+1
if index==nummats then
judgementstring..="and "
end
judgementstring..=aware.judge.amnt( materel[k],judgeamntstrings,judgmatrix).." "..k:lower()
if index~= nummats then
judgementstring..=", "
end
end

return judgementstring
end

local region = Region3.new(root.Position-Vector3.new(radius,radius,radius),root.Position+Vector3.new(radius,radius,radius)):ExpandToGrid(4) -- ReadVoxels requires the region to be aligned to the 4-stud voxel grid
local material = terrain:ReadVoxels(region, 4)            
-- detecter(material) 
--phraselib.opener[aware.mathrandom(1,#phraselib.opener)]
return  detecter(material) 
end
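aware.judge.amnt is not shown above. Given how it is called with a count, the phrase tiers, and the judgement matrix, a minimal sketch of what it presumably does looks like this (an assumption, not the published implementation):

function aware.judge.amnt(count, amntstrings, judgmatrix)
	-- walk the thresholds from smallest to largest and stop at the first one the count fits under
	for i, threshold in ipairs(judgmatrix) do
		if count <= threshold or i == #judgmatrix then
			local tier = amntstrings[i]
			return tier[math.random(1, #tier)] -- pick a random phrase from that tier
		end
	end
end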



function aware.get.terrain(root, radius, str)
	local phrases = {
		"The environment has",
		"The terrain consists of",
		"The surroundings are characterized by",
		"The landscape features",
		"The ecosystem hosts",
	}
	if radius ~= true then
		if str ~= nil then
			-- a pre-built description was supplied; rewrite it in the first-person context
			return phrases[math.random(1, #phrases)].." "..str..". "
		end
		return phrases[math.random(1, #phrases)].." "..aware.judge.terrain(root, radius)..". "
	elseif radius == true then
		-- radius == true is used as a flag meaning str is already a complete description
		return phrases[math.random(1, #phrases)]:lower().." "..str..""
	end
end
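A quick usage sketch (the NPC path and radius here are placeholders):

-- Describe the terrain within 60 studs of an NPC's root part.
local npcRoot = workspace.NPCS.Morgan.HumanoidRootPart -- placeholder path
print(aware.get.terrain(npcRoot, 60))
-- prints something like: "The terrain consists of a ton of rock, and heaps of basalt. "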

In addition, this module also observes water in particular.


function aware.judge.water(root)
-- Loop through the directions and cast rays
local origin=root.CFrame:ToWorldSpace(CFrame.new(0,5,0)).Position--go above the part to get a better angle
for i, dir in ipairs(waterdirections) do
    local result = workspace:Raycast(origin, dir, params)
    -- Check if the ray hit anything
    if result then
        -- Check if the hit part is water terrain
        if result.Instance:IsA("Terrain") and result.Material == Enum.Material.Water then
             local magn=(origin-result.Position).Magnitude
             local dist, dir = aware.judge.distance(root,magn, origin, result.Position, range)
           
            return phraselib.waterdescription[math.random(#phraselib.waterdescription)] .. dist .. " to the " .. dir.."",magn
        end
    end

end
return "",nil
end
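waterdirections, params, and range are defined elsewhere in the module. A sketch of plausible values, assuming rays are cast outward and downward around the NPC (these exact values are assumptions):

local range = 100 -- assumed ray length in studs
local waterdirections = {
	Vector3.new(range, -10, 0),
	Vector3.new(-range, -10, 0),
	Vector3.new(0, -10, range),
	Vector3.new(0, -10, -range),
	Vector3.new(0, -range, 0), -- straight down
}
local params = RaycastParams.new()
params.FilterType = Enum.RaycastFilterType.Exclude
params.FilterDescendantsInstances = {} -- put the NPC's own character here so its body is ignored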

Finally, all of the phrases have been cleaned up into libraries that are descendant modules of the awareness module. This should increase the readability of the module. A minimal sketch of such a phrase library is shown below.
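The contents here are illustrative; phraselib.opener and phraselib.waterdescription are the keys referenced by the functions above.

-- ModuleScript parented under the awareness module, e.g. a child named PhraseLibrary
local phraselib = {}

phraselib.opener = {
	"I observe that, ",
	"What I see is ",
	"If my eyes don't deceive me, ",
}

phraselib.waterdescription = {
	"There is a body of water ",
	"I can see water ",
}

return phraselib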

Some final notes: this module is designed with specific object categories in mind to add extra general flair. I have also created a new way to describe certain objects.

  local function describeDungeons()--describes the closest 3 furnitures that are not the closest one.
                local FurnitureText = ""
                if numDungeons > 1 and closestDungeon and Dungeonarray then
                --table.sort(Furnarray,function(a,b) return a.Ma)
                --local length=3
                local iterations=0
                local maxiterations=math.min(#Dungeonarray,3)
                for i,closeFurniture in Dungeonarray do  
                if closeFurniture.maininstance~=closestDungeon then    
                iterations+=1
                 -- table.insert(arealist,{maininstance=list[h],instance=c,size=g,distance=calc}) 
                if iterations>=2 then FurnitureText=FurnitureText..", " end
                  local dist, dir = aware.judge.distance(root,closeFurniture.distance, pos, getroot2(closeFurniture.maininstance).Position, 200)
                  FurnitureText = FurnitureText..aware.judge.object(closeFurniture.maininstance) .. " " .. dist .. "" .. dir
                if iterations>=maxiterations then break end
                end
                end
                  --  end
                end
                return FurnitureText
            end

This describes up to three of the next-closest objects individually instead of generalizing the amount of other objects.

In addition, this module can now be used with the chat module I published to query the environment.

It is also now compatible with Determinant AI's ChatGPT plugin.

I’ve been looking for a solution to make chatbots. How did you get the information to do this, and is it possible to just make a general one with a personality?

I posted a resource with some examples of using API endpoints to interact with large language models.
Chatbot & LLM Artificial Intelligence Model API Code Documentation FREE (Open Source) and Other Useful APIs - Resources / Community Resources - Developer Forum | Roblox

You can also create a localized chatbot like I did by generating synthetic datasets using something like this. In this example we are creating different expert datasets for specific conversations. It might be a bit too in-depth for beginners.

function cm.extractDialogue(input)
	-- Create an empty table to store the names and speech
	local dialogueTable = {}
	-- Split the input string by the newline character
	local lines = string.split(input, "\n")
	-- Iterate over the lines
	--   if lines then
	for _, line in ipairs(lines) do
		if string.len(line)>3 then  
			-- Find the position of the colon symbol in the line
			local colonPos = string.find(line, ":")
			-- If the colon symbol is found
			if colonPos then
				-- Extract the name from the line by taking the substring before the colon
				local name = string.sub(line, 1, colonPos - 1)
				-- Extract the speech from the line by taking the substring after the colon
				local speech = string.sub(line, colonPos + 1)
				-- Trim any whitespace from the name and the speech using string.gsub
				name = string.gsub(name, "^%s*(.-)%s*$", "%1")
				speech = string.gsub(speech, "^%s*(.-)%s*$", "%1")
				-- Add the name and the speech to the dialogue table as a key-value pair
				if dialogueTable[name]==nil then dialogueTable[name]={} end
				if has_punctuation(speech) then
					-- table.insert returns nothing, so don't assign its result back to the table
					table.insert(dialogueTable[name], speech)
				end
			end
		end
	end

	print(dialogueTable)
	-- Return the dialogue table
	if next(dialogueTable)==nil then -- the table is keyed by speaker names, so # would always be 0
		dialogueTable={input}
	end
	return dialogueTable
end
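A quick example of what cm.extractDialogue produces (output shape inferred from the code above):

local raw = "Morgan: Welcome to the island!\nPlayer1: Thanks, what is that crystal?\nMorgan: That is a Dragonstone."
local dialogue = cm.extractDialogue(raw)
-- dialogue is keyed by speaker, e.g.:
-- { Morgan = {"Welcome to the island!", "That is a Dragonstone."}, Player1 = {"Thanks, what is that crystal?"} }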
function cm.ZephyrCustom(systemmsg,prompt)
	-- Define the API URL and the authorization header
	local API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"--"https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-alpha"--
	--local API_URL=--"https://api-inference.huggingface.co/models/alignment-handbook/zephyr-7b-dpo-lora"--"https://api-inference.huggingface.co/models/mistralai/Mistral-7B-v0.1"
	local headers = {Authorization = bearer}

	-- Define the HttpService
	local HttpService = game:GetService("HttpService")

	local function queryModel(input, temperature)
		-- Create a payload table with the input and the temperature
		local payload = {inputs = input, temperature = temperature,max_new_tokens=1000, min_tokens=200}--,--- top_k=50, top_p=0.95}-- Encode the payload table into a JSON string
		local payloadJSON = HttpService:JSONEncode(payload)
		-- Send a POST request to the API URL with the header and the payload
		-- Use pcall to catch any errors
		local success, response = pcall(HttpService.PostAsync, HttpService, API_URL, payloadJSON, Enum.HttpContentType.ApplicationJson, false, headers)
		-- Check if the request was successful
		print(success)
		print(response)   
		if success then
			-- Decode the response into a table
			-- Use pcall to catch any errors
			local success, responseTable = pcall(HttpService.JSONDecode, HttpService, response)
			-- Check if the decoding was successful
			if success then
				-- Return the response table
				return responseTable
			else
				-- Return nil and the error message
				return nil, responseTable
			end
		else
			-- Return nil and the error message
			return nil, response
		end
	end

	local function format_response(str)
		-- find the assistant response in the string
		local start = string.find(str, "<|assistant|>")
		local finish = string.len(str)
		-- local finish = string.find(str, "</s>", start)
		local response = string.sub(str, start + 13, finish)
		-- return the response in a code block format
		return "" .. response .. ""
	end
	-- Test the function with an example input
	--print(str)
--	local systemmsg=cm.randomizeString(systemmsg)
	--local prompt=cm.randomizeString(prompt)
	local input = "<|system|>\n "..systemmsg.."</s>\n<|user|>\n "..prompt.. "</s>\n<|assistant|>"
	local temperature = 2
	local output = queryModel(input, temperature)
	print(output)
	local iterations=0
	local previnput=nil
	local checkedinput=nil
	if output then
		repeat
			iterations+=1
			previnput=output[1].generated_text
			local loadoutput = queryModel(previnput)
			if loadoutput~=nil then
				checkedinput=loadoutput[1].generated_text
				output=loadoutput
				print(output)
			else
				break
			end

		until checkedinput==previnput or iterations>=3
		--local output = queryModel(input)
		print(output)
		--print(output)
		if output~=nil then
			local output=format_response(output[1].generated_text)
			print(output)
			local output=cm.extractDialogue(output)
			print(output)
			--listtable(list)
			local conversation=listtable(output[1])
			-- Print the output
			print(conversation)
			return conversation-- parseConversation(str)
		else return output
		end
		return output
	end
	return nil
end
function cm.GenerateZephyrDataset(Greetings,inquiry,IDK,Database,wisdom,name)--v is a personality function can only run from command prompt.
	--local Greetings,inquiry,IDK,Database,wisdom=v()
	if Greetings==nil then Greetings={} end if inquiry==nil then inquiry={} end if IDK==nil then IDK={} end if Database==nil then Database={} end 
	if wisdom==nil then  wisdom={} end
	local c=Instance.new("ModuleScript")
	c.Source='personalities={person=function() local Greetings={"'..table.concat(Greetings,'","')..'"} local inquiry={"'..table.concat(inquiry,'","')..'"} '..'local IDK={"'..table.concat(IDK,'","')..'"} '..'local Database={"'..table.concat(Database,'","')..'"} '
		..'local wisdom={"'..table.concat(wisdom,'","')..'"} return Greetings, inquiry, IDK, Database, wisdom end } return personalities'    c.Parent=game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Personalities 
	c.Name=name
end
function cm.CreatePersonalityData()
Switch.Value=true

for i,v in  animals do
local Datas={["Greetings"]={"Respond to the question with a list of 10 entries a "..v.." would say.","Could you answer these 10 questions in order?"},
["inquiry"]={"You are roleplaying as a "..v..". List 10 entries of different things you would say when asking an inquiry.","How would a "..v.." ask a question?"},
["IDK"]={"You are roleplaying as a friendly "..v..". List 10 entries of different things you would say when you, as a "..v..", do not know the answer to something.","Can you tell me different ways to tell someone I don't know, as if I were a "..v.."?"},
["Database"]={"You are roleplaying as a "..v.." from a magical land. List 10 details about what a "..v.." would say .","Can you tell me about "..v.."s?"},
["wisdom"]={"You are roleplaying as a talking "..v..". List 10 entries of wise statements with the nuance of a "..v..".","What wisdom does a talking "..v.." have to share?"},
}
if Switch.Value==false then
break
end
if personaldir:FindFirstChild(v)~=nil then
local Gre,inq,idk,datab,wisd=require(personaldir:FindFirstChild(v)).person()
local Datatables={["Greetings"]=Gre,["inquiry"]=inq,["Database"]=datab,["wisdom"]=wisd,["IDK"]=idk}
local newdata={}
print("Attemping")
--task.wait(mathrandom(1,15))
local prevdata=personaldir:FindFirstChild(v)
for c,o in  Datas do
if Switch.Value==false then
break
end
--if #Datatables[c]==1 then
if c~="IDK" and c~="inquiry" then
print(o[1]..o[2])

--local data=pcall(function() return  end)
--if data~=nil then
local datap=cm.ZephyrCustom(o[1],o[2])
if datap~=nil then
newdata[c]=datap
if newdata[c] then
Datatables[c]=unionTables(Datatables[c],newdata[c])
--for i,v in newdata[c] do
--table.insert(Datatables[c],v)
--end
task.wait(mathrandom(7,15))
end
else 
task.wait(mathrandom(7,15))
newdata[c]=Datatables[c]
end
print(newdata[c])
--end


--for i=#newdata[c], in Datatables[c]
--else newdata[c]=Datatables[c]

end
newdata[c]=Datatables[c]
end

--end
--print(datatable)
--if datatable==nil then
cm.GenerateZephyrDataset(newdata.Greetings,newdata.inquiry,newdata.IDK,newdata.Database,newdata.wisdom,v)
task.wait(mathrandom(1,5))
prevdata:Destroy()
--else task.wait(10)
elseif personaldir:FindFirstChild(v)==nil then
local newdata={}

--local prevdata=require(personaldir:FindFirstChild(v)).person()
for c,o in  Datas do
if Switch.Value==false then
break
end
newdata[c]=cm.ZephyrCustom(o[1],o[2])
task.wait(mathrandom(5,15))

end
--print(datatable)
--if datatable==nil then
cm.GenerateZephyrDataset(newdata.Greetings,newdata.inquiry,newdata.IDK,newdata.Database,newdata.wisdom,v)
task.wait(mathrandom(1,5))
--prevdatas:Destroy()
--else task.wait(10)
end
end
--break
--else
--local datatable=cm.ZephyrMage(v)
--if datatable~=nil then
--cm.GenCustomDataSingleV2("battle",v,datatable)
--task.wait(10)
--else task.wait(10)
--end

end

In my setup I have a local AI chatbot that scores the accuracy of the local response; if the accuracy is low, it uses an external LLM API to generate a response instead. I also generated specific personality data so that the seed for generation comes from the NPC's name, feeding into a large table that produces consistent personality data from this engineered dataset.
The chatbot I created uses machine learning to find the best response by querying all of its nodes and ordering them from most relevant to least relevant. Personality datasets are also generated using Zephyr 7b as shown above. A rough sketch of the local-versus-API gating flow follows.
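This is only an illustration of the gating idea; LocalChatbot.GetBestResponse and the threshold are hypothetical names, and the real implementation differs.

-- Hypothetical gating between the local chatbot and the external LLM API.
local ACCURACY_THRESHOLD = 0.5 -- assumed cutoff

local function respond(npcName, personal, playerName, query)
	local localReply, accuracy = LocalChatbot.GetBestResponse(query) -- hypothetical helper returning a reply and a score
	if localReply and accuracy and accuracy >= ACCURACY_THRESHOLD then
		return localReply -- the local response is good enough
	end
	-- otherwise fall back to the external LLM, seeded with the local insight
	personal[5] = localReply or ""
	return cm.ZephyrStory(npcName, personal, playerName, query)
end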


I made this when ChatGPT first came out, so it was a very exciting time to be creating chatbots, and I very much enjoyed engineering intelligence. It's not easy, but the rewarding part is creating aware, agentic systems that can manipulate their surroundings and solve logical problems.

The code above is an example of how to use a Large Language Model endpoint to create datasets. It can be very useful for creating local chatbots.

I went to the open source page and used google/gemma-2b-it. Where do I get the GlobalSpells.ChatbotAlgorithm.ChatModule, and is there anything else I need to replicate it?

The chat module is not required; the most up-to-date version of this module is Text Vision Awareness Judgement Description Library for LLMs in Video Games [Open Source]
Sorry for the confusion. I would recommend the newest version.
This is the link to the page about the Chatmodule library, in addition to its previous public build and examples:
Lua Trainable Ai Chatbot Library, Emoji Generate ,Emotion Classifier Context Database,Math/Geometry Data Generator Geometry+Worded Math Solver RAG
It is designed to create a retrieval-based chatbot and provide RAG for LLMs.

		--local DeterminantAgent=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.DeterminantAgent)
		--local personality=personal[1]
		local awarobserve=personal[2]
		--identify..timeod..awareobserve
		print(personal)
		--local Resulttable,speakers=cm.LocalZephyrStory(str,npcnam,{[1]=persona,[2]=awareobserve,[3]=identity,[4]=timeod},Player)
		local identity=personal[3]
		local timeod=personal[4]
	local insight=personal[5]
	local botter=player.Name
	if personal[8]~=nil then
		botter=personal[8]
	end
		--local memory=personal[7]
		--local previousconversation=personal[6]
	local systemsg =timeod.."\n"..identity.."\n"..awarobserve.."\n"..insight--Time of day, npc identity, awareness of surroundings and response from local chatbot in addition to previous memories.
	print(systemsg)
		-- Parse story dialogues with "..person..": and "..Playername..": .\n</s>\n<|"..Playername.."|>\n"..quer.." </s>\n<|assistant|>"..previousconversation
	return Agent.SendMessage(player,msg,npc,systemsg,memories,botter)

In this example I show the end result of how a system message is engineered: time of day, NPC identity, awareness of surroundings, and the response from the local chatbot, in addition to previous memories.
I tried to make sure all dependencies were checked, but any dependent functions can be omitted. Only a few things are not usable with the open-sourced version, such as certain effects that require an effects library and an animations library that are not included. However, I have published some standalone modules that provide emojis, intelligent emotes, and an animations dataset, in addition to the awareness synthesis algorithm. I will not publish my entire library since it is my intellectual property; I would rather provide code and pieces to build one from scratch.
If you need any advice on how to set up a chatbot, let me know and I can help you out with specific questions.

These are the relevant resources I have provided to help make chatbots easier for people to create.
This module is an abstract implementation of awareness used to perform context-based actions based on the surroundings.
NEW Action Token AI Framework LLM Utility/Chatbot Luau [Open Source] - Resources / Community Resources - Developer Forum | Roblox
It utilizes the Awareness library.
The Chatmodule library is much more in-depth and not as concise as the other resources, but it allows you to query your data and find the best conversational response to your query. There are some examples in the linked Lua trainable chatbot library. I posted a large 233,000-entry dataset there. I was querying it in sections based on emotional tonality evaluation and grouping by starting word, with interesting results, but I opted instead for engineering intelligent responses based on smaller, custom, high-quality datasets.
The flow is: the local chatbot handles short responses and greetings, while an LLM API handles long-form queries, utilizing data from the local chatbot to engineer the system prompt and recalling summarizations of previous conversations.

A concept like self-aware AI is not something you can just redefine however you please just to fit your needs; there’s only one actual definition and a bunch of if statements are not it.

It’s for making Large Language Models self-aware by judging things around them and themselves, as illustrated by the judgement library. :slight_smile:
The Description library turns the observed objects into a text description.

It judges its features and returns an array of all of those elements, plus the combined element at the bottom, synthesizing natural language descriptions from an environment. Objects are categorized intentionally, although the module is not dependent on any of the variables; they can all be nil and the module will return an empty string.

This last entry is often used for injecting awareness into a Large Language Model.
All the tests I've done with it illustrate how valuable a tool it is; it makes chatbots very environmentally and self-aware when used to its fullest potential. I also use it for the player's observation chatbot, as a data source for a local chatbot, and for generating category-specific commands such as Chop.Tree() and Mine.Rubble(). It's very important to keep your workspace organized.
In the example code below, this library is used to generate commands based on the environment and to recognize them from text input.

local categoryactions = { -- table opener added for readability; the name is illustrative
    --target: attack the enemy
    ["Enemy"] = function(root, obj, key)
        return "preparing to attack", aware.get.NPCDescription(obj.Parent)
    end, --walk towards the npc
    ["NPC"] = function(root, obj, key)
        return nil, ""
    end,
     --do nothing just navigate to
    --walk towards and open the chest
    ["chest"] = function(root, obj, key)
        return "walking towards the lock of the", ""
    end, --navigate to the animpoint inside the chest
    --walk towards and examine the crystal
    ["crystal"] = function(root, obj, key)
        return "checking out this", ""
    end,
    ["dungeons"] = function(root, obj, key)
        return "trying to find the center of this", ""
    end,
    ["fish"] = function(root, obj, key)
          if obj then Controller.PickUpObject({root.Parent, obj}) end
        return "swimming towards the", ""
    end,
    ["house"] = function(root, obj, key)
        return "walking towards the door of the", ""
    end,
    ["loot"] = function(root, obj, key)
       if obj then Controller.PickUpObject({root.Parent, obj}) end
        return "attempting to loot the", actionmod.PickUpObject(root.Parent, obj) 
    end,
    ["mapobj"] = function(root, obj, key)
        return nil, ""
    end,
    ["plant"] = function(root, obj, key)
         if obj then Controller.PickUpObject({root.Parent, obj}) end
        return nil, ""
    end,
    ["player"] = function(root, obj, key)
        return nil, ""
    end,
    ["rubble"] = function(root, obj, key, Player)
        local key = "mine rock"
        commands[key](Player, root.Parent)
        return nil, commands.commandresponse(key)
    end,
    ["tree"] = function(root, obj, key, Player)
        commands["cut tree"](Player, root.Parent)
        return nil, commands.commandresponse("cut tree")
    end
}
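Dispatching on a recognized category then looks roughly like this (the wrapper and its name are illustrative, not part of the framework):

-- key: the category recognized from the player's text; obj: the matching instance found by the awareness module
local function runCategoryAction(key, npcRoot, obj, Player)
	local action = categoryactions[key]
	if action then
		local statusText, description = action(npcRoot, obj, key, Player)
		if statusText and obj then
			print(statusText.." "..obj.Name) -- e.g. "checking out this Crystal"
		end
		return description
	end
end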

The rest of the code for that is here.
NEW Action Token AI Framework LLM Utility/Chatbot Luau [Open Source] - Resources / Community Resources - Developer Forum | Roblox

I have a lot of mind-blowing things that I do not open source; this is only part of what I do with AI. I shared it because it's designed for artificial intelligence in Roblox and gives very impressive results in my simulation, where all of these categories are generated procedurally, although at its core it is what is illustrated in the Description library.
So tinkering with the source code or refactoring it would be fine. The categories just increase the quality of the natural language synthesis.

The task I want to accomplish is just making a text box you can type in that sends the text to the chatbot, and it prints an answer.

Player.Chatted:Connect(function(message) end)
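A minimal server-side sketch of that loop, using the cm.ZephyrCustom function shown earlier; the system message string is a placeholder and the require path is an assumption.

local Players = game:GetService("Players")
local cm = require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.ChatModule) -- path is an assumption

Players.PlayerAdded:Connect(function(player)
	player.Chatted:Connect(function(message)
		local reply = cm.ZephyrCustom("You are a friendly NPC in a Roblox game.", message)
		print(reply) -- replace with your own GUI or chat display
	end)
end)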

Also, the Awareness module linked above has been changed to its newest version; unfortunately, the link counter has been reset.

Wait, are you using a model you trained, or are you sending a request to the API and it sends the information back?

What I'm attempting to do is just have a text box that sends a message to the chatbot, and it sends back text and prints it.

It's more like Apple’s recent release: a local model powers short-form queries, and an API handles long-form queries while receiving a description of the surroundings, the NPC's appearance and equipment, the response from the local chatbot, and their identity.
The neurons consist of weight matrices (Projected Accuracy, Repetition, Emotion) and a loss function. Accuracy is computed from a layer that makes inferences based off a vector database while recognizing synonyms, antonyms, reflections, and nouns, averaging the sum of all of the conversation context and connecting related outputs through two-layered, non-repeating network connections that learn the most relevant database in real time. Outputs are averaged so that only the best outputs are shown first. The outputs are then transformed via a hashed database of synonyms and phrases according to the player's chosen wording; for an entry like "You know what I think is awesome?", "awesome" is associated with something like "great" to make that entry "You know what I assume is great?". Then emojis are inserted via a dataset consisting of emojis and their related words.
There are also context databases and functions that process text, such as the Eliza algorithm, which is used to recognize search queries and playlist requests.
A rough sketch of the synonym-rewording step is below.
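The table contents here are illustrative; the real hashed database of synonyms and phrases is much larger.

-- Map words to a preferred synonym, then rewrite the entry word by word.
local synonyms = {
	awesome = "great",
	think = "assume",
}

local function rewordEntry(entry)
	return (entry:gsub("%a+", function(word)
		return synonyms[word:lower()] or word
	end))
end

print(rewordEntry("You know what I think is awesome?"))
-- "You know what I assume is great?"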

You can do that with the chat box.
For a GUI to type in, make a TextBox; you could ask Bing or ChatGPT, "In the context of Luau in Roblox, write a script that generates a text box that you can send messages with."

The text box is the part I understand; I can do that. I just need to know how to get the chatbot stuff working. I don't know what I need for it to work or how to use your examples.

To use a chatbot API, use ChatGPT or a model from Hugging Face: go to the model page, click Deploy, and get your API key. Chatbot & LLM Artificial Intelligence Model API Code Documentation FREE (Open Source) and Other Useful APIs
To insert emojis into a string you can use the small emoji model: SmallEmoji V1.2 - Insert Emoji to Sentence Algorithm [Open Source] Update: Sigmoid - #16 by Magus_ArtStudios

This next one is a bit more complicated: to use the animations you have to import them and republish them.

This module returns an animation that can be played. I run that on a per-sentence basis to make the AI more expressive.
Require the Awareness library and it will set up your workspace with the categories it observes. If you want to use those, put your models in those directories, or edit the name of the directory in the table handler location labeled like workspace.NPCS. Without those, it will still make some observations. I use this to inject into the system message during inference.

Finally, if you decide to create your own local chatbot, you can try your hand at the chat module: Lua Trainable Ai Chatbot Library, Emoji Generate ,Emotion Classifier Context Database,Math/Geometry Data Generator Geometry+Worded Math Solver RAG
It looks complicated, but the main function to use for a chatbot is CompleteQuery(str, filter: bool (filters out non-important words), complete: bool (use synonyms, antonyms, and nouns to make assumptions)).
CompleteQuery is designed to be used on a table of strings; given an input, it gives the most conversational response. An example source for that table is awareness.GetSurroundingObjects(), which returns a table of observations that can be queried with the CompleteQuery function.
This can be used for RAG to provide additional context to an LLM API. I use it for that, as an instant response to render while an API is processing output, and to handle short-form queries.

So in short,

  1. SmallEmoji takes (string, temperature) and returns the modified string. It requires no setup; just require and use the module.
  2. Intelligent Emotes from text. If you want to use it, you have to import some of the animations that are not made by Roblox and reupload them. Tools to do this are provided in that post.
  3. API Code Documentation provides examples of different AI APIs from Hugging Face. Hugging Face requires a bearer key, which can be obtained for free by clicking Deploy → Inference Endpoint (Serverless); the bearer key will be in the code provided.
  4. Awareness - sets itself up and can be modified to suit your categories; placeholders are made for any that do not exist.
  5. Chatmodule - if you would like to use CompleteQuery to turn data into a chatbot. You can use it without weights, which will make it run a bit faster, or you can download the weights I use, or create your own weights by using cm.PredictRun() and training it on a dataset. That will make the model more accurate in identifying the most important words in a sequence for use in retrieval-augmented generation. This is done by measuring and classifying synonyms, antonyms, nouns, and reflections, accumulating the sum of the inverse sigmoid of the text frequency, activating the result with math.log(), and retrieving the highest value (a rough sketch of that scoring idea follows this list).
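This is a generic illustration of that scoring idea, not the Chatmodule implementation: rare words get a large inverse-sigmoid (logit) weight, candidate scores are the accumulated sum over shared words, the sum is passed through math.log, and the highest value wins. All names and frequencies here are made up.

local function logit(p)
	return math.log(p / (1 - p))
end

local function scoreCandidates(query, candidates, wordFrequency)
	local best, bestScore = nil, -math.huge
	for _, candidate in ipairs(candidates) do
		local sum = 0
		for word in query:lower():gmatch("%a+") do
			if candidate:lower():find(word, 1, true) then
				local freq = wordFrequency[word] or 0.5
				sum = sum + math.max(0, -logit(freq)) -- rare (low-frequency) words contribute more, common words ~0
			end
		end
		local score = sum > 0 and math.log(sum) or -math.huge
		if score > bestScore then
			best, bestScore = candidate, score
		end
	end
	return best, bestScore
end

local observations = {
	"What I see is a Crystal not far to the east.",
	"I am walking on basalt.",
}
local wordFrequency = { the = 0.9, a = 0.8, crystal = 0.02, basalt = 0.03 } -- illustrative frequencies
print(scoreCandidates("what is that crystal", observations, wordFrequency))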

I use all of these in my project.

If you need any coding help setting those up, you can definitely use Bing or ChatGPT, or send me some screenshots or script outputs and I can provide technical assistance or address any bugs.

Another resource is this code of the Eliza chatbot that I ported to Luau.
Eliza Chatbot Ported to Luau [Open-Source] - Resources / Community Resources - Developer Forum | Roblox

Ok, all I want to do is have the player say something and the AI respond. I don't really need the awareness or anything else. I tried using ChatGPT's own API, but it said I had to pay when I set it up.

If that’s the case definitely look into this.

It demonstrates how to use Hugging Face APIs (free access with a daily limit). There are lots of open source models to check out. I am currently using the local model I described, Zephyr 7b, and ChatGPT-4 together.

Which one do you recommend for what I want to achieve?