Giving LLM Text Vision and Self Awareness in Luau [OPEN SOURCE]

Yes, I am utilizing the Zone module posted in the community resources! All of the areas created by my world generator have randomly generated names based on their theme; Greek/Roman-themed areas, for example, are called things like “Island of Athens”, “Mountain of Sparta”, or “Tower of Olympus”. It’s a very interesting project! In total, this architecture is like a base model, and my future model will be a subscription option for players to access GPT-4. (A small sketch of how themed names like these can be generated follows the string table below.)

local strut={" We are in a place called "..aware.judgeobject(closestDungeon).."",
					"I believe, we are in an area called "..aware.judgeobject(closestDungeon).."",
					""..aware.judgeobject(closestDungeon).."... I believe that's what this place is called",
					"This area is called "..aware.judgeobject(closestDungeon).."",
					"If I recall correctly this place is known as "..aware.judgeobject(closestDungeon).."",
					"If my navigation skills are correct, this area is called "..aware.judgeobject(closestDungeon).."",
					"This place is known as "..aware.judgeobject(closestDungeon).."",
					""..aware.judgeobject(closestDungeon).." is the name of this land.",
					"According to my map, this place is called "..aware.judgeobject(closestDungeon).."",
					"I have heard that this place is called "..aware.judgeobject(closestDungeon).."",
					""..aware.judgeobject(closestDungeon)..", that's the name of this place.",
					"This location is called "..aware.judgeobject(closestDungeon).."",
					"From what I know, this place is known as "..aware.judgeobject(closestDungeon).."",
					"My compass tells me that this area is called "..aware.judgeobject(closestDungeon).."",
					"I have learned that this place is called "..aware.judgeobject(closestDungeon).."",
					""..aware.judgeobject(closestDungeon)..", that's the name of this area.",
					"This spot is called "..aware.judgeobject(closestDungeon).."",
					"Based on my information, this place is known as "..aware.judgeobject(closestDungeon).."",
					"My guidebook says that this area is called "..aware.judgeobject(closestDungeon).."",
					"I have been told that this place is called "..aware.judgeobject(closestDungeon)..""}

I have updated this module, and it is now complete! I included all of the different strings, and it can provide thousands of different outputs!
The main update I did today, which spurred me to share this, was my super efficient table handler!
When I tried to implement my procedural world generator, the ChildAdded connections caused up to 1200 ms of ping! So here is a ‘lazy’ table handler that is super efficient. :slight_smile:

ticktime = workspace.GlobalSpells.Tick -- global tick counter (a Value object) used for cache timing

local function tablehandler(location)
	-- Return the children of the given folder; aware.Gettable below caches this result
	return location:GetChildren()
end

-- Call the function and assign it to a variable
-- 1 tick = 0.6 seconds, so 100 ticks = 60 seconds = 1 minute
tablekeys={

["mapobject"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=200}, workspace.MapObjects},

["plants"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=300}, workspace.Plants},

["enemys"]={{["datatable"]=nil, ["writetime0]"]=ticktime.Value, ["refreshrate"]=100}, workspace.Enemys},

["rubble"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=200}, workspace.Rubble},

["trees"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=600}, workspace.Trees},

["chests"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=300}, workspace.Chests},

["grounditems"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=100}, workspace.GroundItems},

["houses"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=500}, workspace.Houses},

["players"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=100}, game.Players},

["dungeons"]={{["datatable"]=nil,["writetime"]=ticktime.Value, ["refreshrate"]=100}, workspace.AmbientAreas},

["npcs"]={{["datatable"]=nil, ["writetime"]=ticktime.Value,["refreshrate"]=100}, workspace.NPCS}

}
--aware.Gettable(Key)

function aware.Gettable(Key)
	if tablekeys[Key] then
		local results,location=tablekeys[Key][1],tablekeys[Key][2]	
		local timer=results["writetime"]
		local refresh=results["refreshrate"]
		local data=results["datatable"]
		if timer+refresh>ticktime.Value and data~=nil then
			-- Cache is still fresh: return the stored children
			return data
		else
			-- Cache expired (or never filled): rebuild it and record the new write time
			tablekeys[Key]={{["datatable"]=tablehandler(location), ["writetime"]=ticktime.Value,["refreshrate"]=refresh},location}
			return tablekeys[Key][1]["datatable"]
		end	
	else 
		return nil
	end	
end
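
A minimal usage sketch (assuming the aware module and the folders listed in tablekeys exist): the first call fills the cache, and later calls reuse it until that entry's refreshrate (in ticks) has elapsed.

local plants = aware.Gettable("plants")
if plants then
	print("Cached plant count:", #plants)
end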

I don’t think it’s right to say you’re creating an A.I. if all you’re doing is converting a 3D area into text and prompting chatGPT (or any other LLM, which you absolutely did not create). My point is that you are not creating the A.I. part of this whatsoever and you should stop thinking you are. This module might be useful for having a bot answer questions in-game via chatGPT, but to be fair, it’d be much easier and more comprehensible to hard-code responses. (tbh you already “hardcode” most of the responses anyway, an A.I. that can actually recognize objects based on their appearance would need a LOT of data)

Though to be completely honest the idea of creating an “algorithm” which can describe a 3D area in text is very interesting.

Long story short it’d be good if you changed the name of this post as it is very misleading and ends up confusing a lot of people, this way you would avoid fruitless arguments about what you “actually meant” and “what this actually is”.


Regardless of your opinions on the subject, my original example was an AI, and it was doing something relatively interesting. The code example I shared can recognize who it’s talking to, and it has an emotional state based on an evaluation of each personality’s dataset; these work independently. I also provided my chat module to use in conjunction with this module so the emotional output can be used. I will reiterate that in the open-source module I provided, you can skip to the bottom and see that it can currently provide over 4 million different outputs, without even considering the names of the objects, which makes the number effectively infinite. That is the module alone; furthermore, these observations are used to do a database-specific search query. The winning observation entry is then connected to a database entry related to that observation, and this output is combined with a third result by also doing a search query of the wisdom and databases.

If you were to train an AI model on ROBLOX, it would have to be smaller, so training a language model locally wouldn’t work unless you had a bunch of supporting systems in place. You can host your own fine-tuned model, but you would still incur server costs; this approach minimizes server costs and API calls.

Also, logic is based on ifs and thens, while a neural network with a machine learning algorithm uses a structure that says x(or), which weights the (or) based on the weight of each parameter to make generalizations from the stored weights of the vector matrix it generates during training. This is done by tokenizing the input/output through its predictive layers, which provides the solution to x(or). Saying this is not AI is beyond arrogant, when nowhere is a coded machine learning algorithm defined as the only form of AI, since you do not always require an x(or) condition when the solution only requires logic.

  • AI is the field of computer science that aims to create systems that can perform tasks that normally require human intelligence, such as learning, reasoning, problem-solving, etc.
  • Logic is a way of expressing and manipulating information using rules and symbols, such as if, or, and, then, etc.
  • Machine learning is a branch of AI that focuses on creating systems that can learn from data and improve their performance without explicit programming.
  • Neural networks are a type of machine learning model that can learn complex patterns and functions from data by adjusting the weights of their connections.
  • x(or) is a type of logic that assigns a weight to each condition and then combines them using the or operator. For example, x(or)(A, B) means A or B, but with different weights for each condition (a small code sketch of this weighted-or idea follows this list).
  • Machine learning can be generalized to a condition of x(or) by treating the weights of the neural network as the weights of the conditions, and the output of the neural network as the result of the x(or) operation. For example, if we have a neural network that takes two inputs A and B and outputs C, we can write C = x(or)(A, B), where the weights of A and B are determined by the neural network.
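
As a rough illustration of the weighted-or idea described in the list above, here is a toy single "neuron" in Luau; it is only an example of the concept, not code from the module.

-- Toy weighted-or: combine two boolean conditions using weights and a threshold
-- instead of a plain `A or B`.
local function weightedOr(a, b, weightA, weightB, threshold)
	local sum = (a and weightA or 0) + (b and weightB or 0)
	return sum >= threshold
end

-- With weights 0.8 and 0.3 and a threshold of 0.5, A alone is enough but B alone is not.
print(weightedOr(true, false, 0.8, 0.3, 0.5)) --> true
print(weightedOr(false, true, 0.8, 0.3, 0.5)) --> false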

In conclusion, this module is about Creating Self Aware AI: (Super Complete) LLM Utility Text Vision Module (Open Source).


Uploaded a new demonstration of using this module to illustrate Candyland :smiley:

The Text Vision is given as part of the system message for the LLM, giving it complete immersion in the environment and ticking off one piece of being conscious and aware.
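
In practice the injection is just string concatenation into the system segment of the prompt. A stripped-down sketch of that pattern is below; npcRoot, npcName, and closestDungeon are placeholder variables for this example, while the aware calls come from the module.

-- Sketch: prepend the text-vision output to the LLM's system message.
local observation = aware.get.terrain(npcRoot, 60)
	.. " We are in a place called " .. aware.judgeobject(closestDungeon) .. "."

local systemMessage = "<|system|>\n You are " .. npcName .. ". " .. observation .. "</s>"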


Would not recommend calling it ChatModule seeing as that’s quite similar to Roblox’s name for the chat system, and it also isn’t very descriptive of what this module does.
You could try naming it something that relates to describing surroundings.
(This is supposed to be constructive btw)
EDIT: If you want to get a little less hate for your project, I would avoid using the words ‘conscious’ and ‘aware’ seeing as AI cannot be conscious.


Consciousness implies self-awareness, and this module is designed to cultivate self-awareness by acting as a sensory input for an LLM.

If you research consciousness in AI systems, this is one of the required features. You can talk to the AI in my game, which is powered by Zephyr 7B. It knows the date, the weather, its identity, and its memories; it is aware of its surroundings and has over 300 commands and 200 emotes to express itself and interact with its environment.

As humans we have conscious experiences, but when certain criteria are met, a simulation of consciousness can happen.

This model claims to be conscious, and as the one who created the awareness system that powers it and engineered the system message to cultivate self-awareness, I would consider that a strong argument.

Also, I don’t care if I got any “hate” for this project, since it is simply open sourced. I didn’t sell it; all I did was write it and post updates to the module.

The first version had some valid criticisms, which were all addressed, so you’re making judgements off old news. This module is currently at Version 2, and Version 3, which is the one currently in use in my game, is about to be released. It has new features such as being able to judge what the NPC is walking on and to judge a body of water in the distance, in addition to grammar corrections and judging up or down, left and right; it also makes more observations and has been cleaned up and improved in functionality.

I have no complaints! I’m just sharing stuff for free! I’m having a grand old time using this to cultivate self-awareness in a large language model. It works as I said it would and does what I thought it would. All those who posted negative criticisms are just the clichéd doubters.

It’s actually very good that ROBLOX has a text-vision utility such as this for those who are interested in the field.

If you think you can do better I would implore you to do it yourself.

I’m building on top of this module currently, and it’s very handy and wonderful. I use it for a variety of side projects, including the action commands, which work off a similar concept except that they procedurally generate actions for each of these libraries so the AI agent can interact with the environment just by saying it.

Also, I published the update to the Awareness Module and removed the code of the first version.

This is the code I use to engineer the system message. As you can see, it incorporates memories, Text-Vision awareness, insight provided by a local chatbot, and time-of-day and weather awareness.

function cm.ZephyrStory(person,personal,Playername,quer)
-- Define the API URL and the authorization header
local API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"
local headers = {Authorization = bearer} -- "bearer" holds the "Bearer <HuggingFace API token>" string (defined outside this snippet)

-- Define the HttpService
local HttpService = game:GetService("HttpService")

-- Define a function that takes an input and queries the model
local function queryModel(input, temperature)
    -- Create a payload table with the input and the temperature
  local payload={inputs = input, temperature = temperature,max_new_tokens=1000, min_tokens=250, top_k=100, top_p=0.11}
   -- local payload = {inputs = input, temperature = temperature}
    -- Encode the payload table into a JSON string
    local payloadJSON = HttpService:JSONEncode(payload)
    -- Send a POST request to the API URL with the header and the payload
    -- Use pcall to catch any errors
    local success, response = pcall(HttpService.PostAsync, HttpService, API_URL, payloadJSON, Enum.HttpContentType.ApplicationJson, false, headers)
    -- Check if the request was successful
    if success then
        -- Decode the response into a table
        -- Use pcall to catch any errors
        local success, responseTable = pcall(HttpService.JSONDecode, HttpService, response)
        -- Check if the decoding was successful
        if success then
            -- Return the response table
            return response--Table-- return json
        else
      --  print()
            -- Return nil and the error message
            return nil, response--Table
        end
    else
        -- Return nil and the error message
        return nil, response
    end
end

-- Unpack the personality payload: [1] personality, [2] awareness observation, [3] identity,
-- [4] time of day, [5] insight from the local chatbot, [6] cached conversation, [7] memories
local personality = personal[1]
local awarobserve = personal[2]
local identity = personal[3]
local timeod = personal[4]
local insight = personal[5]
local memory = personal[7]
local previousconversation = personal[6]
if previousconversation == nil then
    previousconversation = ""
else
    -- Flatten the cached conversation table into a single string to keep the prompt small
    local function RebuildConversation(tbl, response)
        response = response or "" -- guard: the call below only passes the table
        local sum = ""
        for i, v in tbl do
            for t, o in v do
                if t ~= "narrator" then
                    sum = sum .. " \n\n " .. t .. ": " .. o
                else
                    sum = sum .. " \n\n " .. o
                end
            end
        end
        if response ~= "" then
            sum = sum .. " \n\n " .. Playername .. ": " .. response
        end
        return sum
    end
    previousconversation = RebuildConversation(personal[6][1])
end

-- Build the Zephyr-format prompt: system message, the player's message, then the assistant tag
local input = "<|system|>\n "..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.." </s>\n<|assistant|>"..previousconversation
local temperature = 2
local output,Error = queryModel(input,temperature)
print(output)
local iterations = 0
local function RebuildResponse(response) -- reduce size of system message
    local tbl, speakers, str = cm.LocalZephyrDecode(response)
    local sum = ""
    for i, v in tbl do
        for t, o in v do
            if t ~= "narrator" then
                sum = sum .. " \n\n " .. t .. ": " .. o
            else
                sum = sum .. " \n\n " .. o
            end
        end
        print(sum)
    end

    local input = "<|system|>\n"..identity..memory..awarobserve.." Parse dialogues with "..person..": and "..Playername..": . </s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
    if iterations == 2 then
        input = "<|system|>\n"..memory..awarobserve..identity.." Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
    elseif iterations == 3 then
        input = "<|system|>\n"..identity.."\n Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s>\n<|assistant|>"..sum
    end

    return input
end
if not Error then
    local function iterateoutput(output)
        local checkedinput
        local loadoutput
        local previnput
        repeat
            iterations += 1
            previnput = HttpService:JSONDecode(output)[1].generated_text
            local loadoutput = queryModel(RebuildResponse(output))
            if loadoutput ~= nil then
                checkedinput = HttpService:JSONDecode(loadoutput)[1].generated_text
                if checkedinput then -- only update output if valid
                    output = loadoutput
                    print(output)
                else
                    break
                end
            else
                break
            end
        until checkedinput == previnput or iterations >= 3
        return output
    end
    output = iterateoutput(output)
end
local function DecodeResponse(response) -- reduce size of system message
    local tbl, speakers, str = cm.LocalZephyrDecode(response)
    local sum = ""
    for i, v in tbl do
        for t, o in v do
            if t ~= "narrator" then
                sum = sum .. "\n\n" .. t .. ": " .. o
            else
                sum = sum .. "\n\n" .. o
            end
        end
        print(sum)
    end
    return sum
end
local function iterateoutputLLama(output)
    local checkedinput
    local loadoutput
    local previnput
    repeat
        iterations += 1
        previnput = HttpService:JSONDecode(output)[1].generated_text
        local loadoutput = cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": .", quer, DecodeResponse(output))
        if loadoutput ~= nil then
            checkedinput = HttpService:JSONDecode(loadoutput)[1].generated_text
            if checkedinput then -- only update output if valid
                output = loadoutput
                print(output)
            else
                break
            end
        else
            break
        end
    until checkedinput == previnput or iterations >= 3
    return output
end

local output2 = cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": .", quer, DecodeResponse(output)) -- receive generated_text
if output2 then
    iterations = 0
    local output3 = iterateoutputLLama(output2)
    if output3 ~= nil then
        output = output3
    else
        output = output2
    end
end

return output
end

This module is an integral part of this system, providing the description of the surroundings and the emotional state of the chatbot. It has since been superseded by a local chatbot system that provides RAG to help the LLM stay aligned with the values of its personality.

New update to the module! The awareness output now includes a generalized description of the terrain!

--[[--{
                    [1] = "I observe that, I am walking on basalt.",
                    [2] = "What I see is the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt.",
                    [3] = "I am observing I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me to the east.",
                    [4] = "What I see is a Crystal not far to the east.",
                    [5] = "I can see that, piece of small rock embedded with a Dragonstone is to the east.",
                    [6] = " We are in a place called Island of Morgan. There is a Crystal near eastward.",
                    [7] = "What I notice is I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt , I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me eastward, a Crystal near eastward, piece of small rock embedded with a Dragonstone is  eastward. ",
                    [8] = "We are in a place called Island of Morgan, the center is close at the current elevation as me to the east",
                    [9] = "If my eyes don't deceive me, I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt, we are in a place called Island of Morgan, the center is close at the current elevation as me to the east, and a Crystal close to the east. "--]]--

This function counts all of the terrain materials around the character, then judges each count against a judgement matrix of thresholds to pick a generalization phrase.

function aware.judge.terrain(root,radius)

	-- Count every non-air voxel material inside the region, then build a sorted summary string
	local function detecter(material)
		local materel = {}
		local nummats = 0
		for i, v in material do
			if material[i] and i ~= "Size" then
				for t, o in material[i] do
					for c, p in material[i][t] do
						if material[i][t][c] ~= Enum.Material.Air then -- count all of the terrain (water included)
							local matstring = string.split(tostring(material[i][t][c]), 'Enum.Material.')[2]
							if matsolutions[matstring] ~= nil then -- matsolutions maps raw material names to friendlier ones (defined elsewhere in the module)
								matstring = matsolutions[matstring]
							end
							if materel[matstring] == nil then
								nummats += 1
								materel[matstring] = 1
							else
								materel[matstring] += 1
							end
						end
					end
				end
			end
		end
		print(materel)

		-- Create an array with the material names and sort it from least to most common
		local keys = {}
		for k in pairs(materel) do
			table.insert(keys, k)
		end
		table.sort(keys, function(a, b) return materel[a] < materel[b] end)

		local judgeamntstrings = {
			{"a handful of", "a few", "a smattering of", "a sprinkling of"},
			{"a little bit of", "a trace of", "a touch of", "a dash of"},
			{"a sparse amount of", "a scant amount of", "a meager amount of", "a minimal amount of"},
			{"bunches of", "clusters of", "groups of", "packs of"},
			{"a lot of", "heaps of", "a ton of", "loads of"},
			{"a multitude of", "a plethora of", "hordes of", "heaps of"},
			{"a huge quantity of", "a massive amount of", "a colossal amount of", "a prodigious amount of"},
			{"a staggering number of", "an astonishing number of", "a phenomenal number of", "a mind-blowing number of"}
		}

		-- Thresholds that map a voxel count to one of the phrase groups above
		local judgmatrix = {1, radius, radius*5, radius*10, radius*20, radius*30, radius*40, radius*50}
		local judgementstring = ""
		local index = 0
		for i, k in keys do
			index += 1
			if index == nummats then
				judgementstring ..= "and "
			end
			judgementstring ..= aware.judge.amnt(materel[k], judgeamntstrings, judgmatrix) .. " " .. k:lower()
			if index ~= nummats then
				judgementstring ..= ", "
			end
		end

		return judgementstring
	end

	local region = Region3.new(root.Position - Vector3.new(radius,radius,radius), root.Position + Vector3.new(radius,radius,radius))
	region = region:ExpandToGrid(4) -- ReadVoxels requires the region to be aligned to the voxel grid
	local material = terrain:ReadVoxels(region, 4) -- "terrain" is workspace.Terrain, defined elsewhere in the module
	return detecter(material)
end
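
aware.judge.amnt itself is not shown above; conceptually it walks the judgement matrix until the count falls under a threshold and returns a random phrase from the matching group. Here is a hedged sketch of that idea, not the module's actual implementation:

-- Sketch of a threshold lookup like aware.judge.amnt(count, phraseGroups, thresholds):
-- pick the first threshold the count does not exceed and return a random phrase
-- from the corresponding group (falling through to the last group for huge counts).
local function judgeAmount(count, phraseGroups, thresholds)
	for i, limit in ipairs(thresholds) do
		if count <= limit or i == #thresholds then
			local group = phraseGroups[i]
			return group[math.random(1, #group)]
		end
	end
end

print(judgeAmount(12, {{"a few"}, {"a lot of"}}, {10, 100})) --> "a lot of"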



function aware.get.terrain(root,radius,str)
	local phrases = {
		"The environment has",
		"The terrain consists of",
		"The surroundings are characterized by",
		"The landscape features",
		"The ecosystem hosts",
	}
	if radius ~= true then
		if str ~= nil then
			-- a precomputed terrain string was passed in; just wrap it in an opener phrase
			return phrases[math.random(1,#phrases)].." "..str..". "
		end
		return phrases[math.random(1,#phrases)].." "..aware.judge.terrain(root,radius)..". "
	elseif radius == true then
		-- radius == true signals mid-sentence usage, so the opener is lowercased
		return phrases[math.random(1,#phrases)]:lower().." "..str
	end
end

In addition, this module also observes water in particular:


function aware.judge.water(root)
-- Loop through the directions and cast rays
local origin=root.CFrame:ToWorldSpace(CFrame.new(0,5,0)).Position--go above the part to get a better angle
for i, dir in ipairs(waterdirections) do
    local result = workspace:Raycast(origin, dir, params)
    -- Check if the ray hit anything
    if result then
        -- Check if the hit part is water terrain
        if result.Instance:IsA("Terrain") and result.Material == Enum.Material.Water then
             local magn=(origin-result.Position).Magnitude
             local dist, dir = aware.judge.distance(root,magn, origin, result.Position, range)
           
            return phraselib.waterdescription[math.random(#phraselib.waterdescription)] .. dist .. " to the " .. dir.."",magn
        end
    end

end
return "",nil
end

Finally, all of the phrases have been cleaned up into libraries that are descendant modules of the awareness module. This should increase the readability of the module.
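
For anyone organizing their own version the same way, here is a minimal sketch of what one of those descendant phrase ModuleScripts could look like and how it would be required; the names and layout here are assumptions for illustration, not the module's exact structure.

-- Contents of a child ModuleScript, e.g. "PhraseLib", parented under the awareness module:
--   return {
--       opener = {"I observe that, ", "What I see is "},
--       waterdescription = {"There is water ", "I can see a body of water "},
--   }

-- Inside the awareness module, require the child and pick a random phrase:
local phraselib = require(script:WaitForChild("PhraseLib"))
print(phraselib.opener[math.random(1, #phraselib.opener)])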

Some final notes: this module is designed with specific object categories in mind to add extra general flair. But I have created a new way to describe certain objects.

    local function describeDungeons() -- describes the closest few dungeons other than the closest one
        local FurnitureText = ""
        if numDungeons > 1 and closestDungeon and Dungeonarray then
            local iterations = 0
            local maxiterations = math.min(#Dungeonarray, 3)
            for i, closeFurniture in Dungeonarray do
                if closeFurniture.maininstance ~= closestDungeon then
                    iterations += 1
                    -- entries look like {maininstance=..., instance=..., size=..., distance=...}
                    if iterations >= 2 then FurnitureText = FurnitureText .. ", " end
                    local dist, dir = aware.judge.distance(root, closeFurniture.distance, pos, getroot2(closeFurniture.maininstance).Position, 200)
                    FurnitureText = FurnitureText .. aware.judge.object(closeFurniture.maininstance) .. " " .. dist .. dir
                    if iterations >= maxiterations then break end
                end
            end
        end
        return FurnitureText
    end

This describes the closest few other objects individually instead of generalizing the amount of other objects.

In addition, this module can now be used with the chatmodule I published to query the environment.

It is also now compatible with Determinant AI’s ChatGPT plugin.

I’ve been looking for a solution to make chatbots. How did you get the information to do this, and is it possible to just make a general one with a personality?

I posted a resource with some examples of using API endpoints to interact with large language models.
Chatbot & LLM Artificial Intelligence Model API Code Documentation FREE (Open Source) and Other Useful APIs - Resources / Community Resources - Developer Forum | Roblox

Also, you can create a localized chatbot like I did by creating synthetic datasets using something like this. In this example we are creating different expert datasets for specific conversations. It might be a bit too in-depth for beginners.

function cm.extractDialogue(input)
	-- Create an empty table to store the names and speech
	local dialogueTable = {}
	-- Split the input string by the newline character
	local lines = string.split(input, "\n")
	-- Iterate over the lines
	--   if lines then
	for _, line in ipairs(lines) do
		if string.len(line)>3 then  
			-- Find the position of the colon symbol in the line
			local colonPos = string.find(line, ":")
			-- If the colon symbol is found
			if colonPos then
				-- Extract the name from the line by taking the substring before the colon
				local name = string.sub(line, 1, colonPos - 1)
				-- Extract the speech from the line by taking the substring after the colon
				local speech = string.sub(line, colonPos + 1)
				-- Trim any whitespace from the name and the speech using string.gsub
				name = string.gsub(name, "^%s*(.-)%s*$", "%1")
				speech = string.gsub(speech, "^%s*(.-)%s*$", "%1")
				-- Add the name and the speech to the dialogue table as a key-value pair
				if dialogueTable[name]==nil then dialogueTable[name]={} end
				if has_punctuation(speech) then
					-- table.insert returns nothing, so don't assign its result back to the table
					table.insert(dialogueTable[name], speech)
				end
			end
		end
	end

	print(dialogueTable)
	-- Return the dialogue table (fall back to the raw input if nothing was parsed)
	if next(dialogueTable) == nil then
		dialogueTable = {input}
	end
	return dialogueTable
end
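
For example, feeding cm.extractDialogue a small transcript gives a table keyed by speaker name (assuming the has_punctuation helper accepts these lines):

local sample = "Mira: Hello there, traveler!\nPlayer: Hi Mira, what is this place?"
local dialogue = cm.extractDialogue(sample)
-- dialogue -> { Mira = {"Hello there, traveler!"}, Player = {"Hi Mira, what is this place?"} }
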
function cm.ZephyrCustom(systemmsg,prompt)
	-- Define the API URL and the authorization header
	local API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"--"https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-alpha"--
	--local API_URL=--"https://api-inference.huggingface.co/models/alignment-handbook/zephyr-7b-dpo-lora"--"https://api-inference.huggingface.co/models/mistralai/Mistral-7B-v0.1"
	local headers = {Authorization = bearer}

	-- Define the HttpService
	local HttpService = game:GetService("HttpService")

	local function queryModel(input, temperature)
		-- Create a payload table with the input and the temperature
		local payload = {inputs = input, temperature = temperature,max_new_tokens=1000, min_tokens=200}--,--- top_k=50, top_p=0.95}-- Encode the payload table into a JSON string
		local payloadJSON = HttpService:JSONEncode(payload)
		-- Send a POST request to the API URL with the header and the payload
		-- Use pcall to catch any errors
		local success, response = pcall(HttpService.PostAsync, HttpService, API_URL, payloadJSON, Enum.HttpContentType.ApplicationJson, false, headers)
		-- Check if the request was successful
		print(success)
		print(response)   
		if success then
			-- Decode the response into a table
			-- Use pcall to catch any errors
			local success, responseTable = pcall(HttpService.JSONDecode, HttpService, response)
			-- Check if the decoding was successful
			if success then
				-- Return the response table
				return responseTable
			else
				-- Return nil and the error message
				return nil, responseTable
			end
		else
			-- Return nil and the error message
			return nil, response
		end
	end

	local function format_response(str)
		-- find the assistant response in the string
		local start = string.find(str, "<|assistant|>")
		if not start then return str end -- no assistant tag found: return the text unchanged
		local finish = string.len(str)
		-- local finish = string.find(str, "</s>", start)
		local response = string.sub(str, start + 13, finish)
		-- return the response in a code block format
		return "" .. response .. ""
	end
	-- Test the function with an example input
	--print(str)
--	local systemmsg=cm.randomizeString(systemmsg)
	--local prompt=cm.randomizeString(prompt)
	local input = "<|system|>\n "..systemmsg.."</s>\n<|user|>\n "..prompt.. "</s>\n<|assistant|>"
	local temperature = 2
	local output = queryModel(input, temperature)
	print(output)
	local iterations=0
	local previnput=nil
	local checkedinput=nil
	if output then
		repeat
			iterations+=1
			previnput=output[1].generated_text
			local loadoutput = queryModel(previnput)
			if loadoutput~=nil then
				checkedinput=loadoutput[1].generated_text
				output=loadoutput
				print(output)
			else
				break
			end

		until checkedinput==previnput or iterations>=3
		--local output = queryModel(input)
		print(output)
		--print(output)
		if output~=nil then
			local output=format_response(output[1].generated_text)
			print(output)
			local output=cm.extractDialogue(output)
			print(output)
			--listtable(list)
			local conversation=listtable(output[1])
			-- Print the output
			print(conversation)
			return conversation-- parseConversation(str)
		else return output
		end
		return output
	end
	return nil
end
function cm.GenerateZephyrDataset(Greetings,inquiry,IDK,Database,wisdom,name) -- writes the generated tables into a personality ModuleScript; intended to be run from the command bar
	--local Greetings,inquiry,IDK,Database,wisdom=v()
	if Greetings==nil then Greetings={} end if inquiry==nil then inquiry={} end if IDK==nil then IDK={} end if Database==nil then Database={} end 
	if wisdom==nil then  wisdom={} end
	local c=Instance.new("ModuleScript")
	c.Source='personalities={person=function() local Greetings={"'..table.concat(Greetings,'","')..'"} local inquiry={"'..table.concat(inquiry,'","')..'"} '..'local IDK={"'..table.concat(IDK,'","')..'"} '..'local Database={"'..table.concat(Database,'","')..'"} '
		..'local wisdom={"'..table.concat(wisdom,'","')..'"} return Greetings, inquiry, IDK, Database, wisdom end } return personalities'    c.Parent=game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Personalities 
	c.Name=name
end
function cm.CreatePersonalityData()
	Switch.Value = true

	for i, v in animals do
		-- Prompts used to generate each section of the personality dataset for this animal
		local Datas = {
			["Greetings"] = {"Respond to the question with a list of 10 entries a "..v.." would say.", "Could you answer these 10 questions in order?"},
			["inquiry"] = {" You are roleplaying as a "..v..". List 10 entries of different things you would say when making an inquiry.", " How would a "..v.." ask a question?"},
			["IDK"] = {" You are roleplaying as a friendly "..v..". List 10 entries of different things you would say when you as a "..v.." does not know the answer to something.", "Can you tell different ways to tell someone I don't know as if I were a "..v.."?"},
			["Database"] = {"You are roleplaying as a "..v.." from a magical land. List 10 details about what a "..v.." would say.", "Can you tell me about "..v.."s?"},
			["wisdom"] = {"You are roleplaying as a talking "..v..". List 10 entries of wise statements with the nuance of a "..v..".", "What wisdom does a talking "..v.." have to share?"},
		}
		if Switch.Value == false then
			break
		end
		if personaldir:FindFirstChild(v) ~= nil then
			-- An existing personality module: merge newly generated data into it
			local Gre, inq, idk, datab, wisd = require(personaldir:FindFirstChild(v)).person()
			local Datatables = {["Greetings"]=Gre, ["inquiry"]=inq, ["Database"]=datab, ["wisdom"]=wisd, ["IDK"]=idk}
			local newdata = {}
			print("Attempting")
			local prevdata = personaldir:FindFirstChild(v)
			for c, o in Datas do
				if Switch.Value == false then
					break
				end
				if c ~= "IDK" and c ~= "inquiry" then
					print(o[1]..o[2])
					local datap = cm.ZephyrCustom(o[1], o[2])
					if datap ~= nil then
						newdata[c] = datap
						if newdata[c] then
							Datatables[c] = unionTables(Datatables[c], newdata[c])
							task.wait(mathrandom(7, 15))
						end
					else
						task.wait(mathrandom(7, 15))
						newdata[c] = Datatables[c]
					end
					print(newdata[c])
				end
				newdata[c] = Datatables[c]
			end

			cm.GenerateZephyrDataset(newdata.Greetings, newdata.inquiry, newdata.IDK, newdata.Database, newdata.wisdom, v)
			task.wait(mathrandom(1, 5))
			prevdata:Destroy()
		elseif personaldir:FindFirstChild(v) == nil then
			-- No existing module: generate every section from scratch
			local newdata = {}
			for c, o in Datas do
				if Switch.Value == false then
					break
				end
				newdata[c] = cm.ZephyrCustom(o[1], o[2])
				task.wait(mathrandom(5, 15))
			end
			cm.GenerateZephyrDataset(newdata.Greetings, newdata.inquiry, newdata.IDK, newdata.Database, newdata.wisdom, v)
			task.wait(mathrandom(1, 5))
		end
	end
end

In my setup I have a local AI chatbot that scores the accuracy of the local response, and if the accuracy is low it uses an external LLM API to generate a response. I also generated specific personality data so that the seed for generation comes from the name, feeding into a large table that produces consistent personality data based on this engineered dataset.
The chatbot I created uses machine learning to find the best response by querying all its nodes and ordering them from most relevant to least; personality datasets are also generated using Zephyr 7B as shown above.
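
A minimal sketch of that routing decision is below; localChatbot.GetBestResponse and the score threshold are placeholders for this example, while cm.ZephyrCustom is the external API call shown earlier.

-- Sketch: try the local retrieval chatbot first, fall back to the external LLM
-- when the local match score is too low.
local SCORE_THRESHOLD = 0.6 -- illustrative cutoff

local function respond(systemMessage, query)
	local localAnswer, score = localChatbot.GetBestResponse(query) -- placeholder local model
	if localAnswer and score >= SCORE_THRESHOLD then
		return localAnswer
	end
	-- Low-confidence local match: pass the local answer along as extra context (RAG)
	-- and let the external model generate the reply.
	return cm.ZephyrCustom(systemMessage .. "\nRelevant note: " .. (localAnswer or ""), query)
end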


I made this when ChatGPT first came out, so it was a very exciting time to be creating chatbots, and I very much enjoyed engineering intelligence. It’s not easy, but what is rewarding is creating aware, agentic systems that can manipulate their surroundings and solve logical problems.

But the code above is an example of how to use a Large Language Model endpoint to create datasets. It can be very useful for creating local chatbots.

I went to the open source page and used google/gemma-2b-it. Where do I get the GlobalSpells.ChatbotAlgorithm.ChatModule, and is there anything else I need to replicate it?

The chatmodule is not required; the most up-to-date version of this module is Text Vision Awareness Judgement Description Library for LLMs in Video Games [Open Source].
Sorry for the confusion. I would recommend the newest version.
This is the link to the page about the Chatmodule library, in addition to its previous public build and examples:
Lua Trainable Ai Chatbot Library, Emoji Generate ,Emotion Classifier Context Database,Math/Geometry Data Generator Geometry+Worded Math Solver RAG
It is designed to create a retrieval-based chatbot and provide RAG for LLMs.

		local awarobserve = personal[2]
		print(personal)
		local identity = personal[3]
		local timeod = personal[4]
		local insight = personal[5]
		local botter = player.Name
		if personal[8] ~= nil then
			botter = personal[8]
		end
		-- Time of day, NPC identity, awareness of surroundings, and the local chatbot's response
		local systemsg = timeod.."\n"..identity.."\n"..awarobserve.."\n"..insight
		print(systemsg)
		-- Previous memories are passed to the agent separately
		return Agent.SendMessage(player, msg, npc, systemsg, memories, botter)

In this example I show the end result of how a system message is engineered from the time of day, the NPC’s identity, awareness of its surroundings, and the response from the local chatbot, in addition to previous memories.
I tried to make sure all dependencies were checked, but any dependent functions can be omitted. Only a few things are not usable with the open-sourced version, such as certain effects that require an effects library that is not included, and the animations library. However, I have published some standalone modules that provide emojis and intelligent emotes, an animations dataset, and the awareness synthesis algorithm. I would not publish my entire library, due to my intellectual property; I would rather provide code and pieces to build one from scratch.
If you need any advice on how to set up a chatbot, let me know and I can help you out with specific questions.

These are the relevant resources I have provided to help make chatbots easier for people to create.
This module is an abstract implementation of awareness used to perform context-based actions based on the surroundings.
NEW Action Token AI Framework LLM Utility/Chatbot Luau [Open Source] - Resources / Community Resources - Developer Forum | Roblox
It utilizes the Awareness library.
The Chatmodule library is much more in-depth and not as concise as the other resources, but it allows you to query your data and find the best conversational response to your query. There are some examples in the linked Lua trainable chatbot library. I posted a large 233,000-entry dataset there. I was querying it in sections based on emotional-tonality evaluation and grouped by starting word, with interesting results, but I opted instead to engineer intelligent responses from smaller, custom, high-quality datasets.
The flow is that the local chatbot handles short responses and greetings, while an LLM API handles long-form queries, using data from the local chatbot to engineer the system prompt and recalling summarizations of previous conversations.
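
A rough sketch of that split is below; localChatbot and memorySummary are placeholders for this example, and cm.ZephyrCustom stands in for the LLM API call shown earlier.

-- Sketch: short greetings stay local, longer queries go to the LLM with the
-- local chatbot's answer folded into the system prompt as extra context.
local function handleQuery(npcIdentity, query, memorySummary)
	local localAnswer = localChatbot.GetBestResponse(query) -- placeholder local model
	if #query <= 20 then -- short greeting / small talk stays local (cutoff is illustrative)
		return localAnswer
	end
	local systemMessage = npcIdentity .. "\n" .. (localAnswer or "") .. "\n" .. (memorySummary or "")
	return cm.ZephyrCustom(systemMessage, query)
end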

A concept like self-aware AI is not something you can just redefine however you please just to fit your needs; there’s only one actual definition and a bunch of if statements are not it.

It’s for making large language models self-aware by judging things around them and themselves, as illustrated by the judgement library. :slight_smile:
The Description library turns the objects into a text description.

It judges an object’s features and returns an array of all those judgement elements, plus a combined element at the bottom, synthesizing natural-language descriptions from the environment. Objects are categorized intentionally, although the module is not interdependent on any of the variables: they can all be nil, and the module will then return an empty string.
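
That nil-tolerance comes down to every describer returning an empty string when its input is missing, so the combined entry can always be built by concatenation. A tiny sketch of the pattern (not the module's code):

-- Each describer degrades to "" when its input is nil.
local function describeClosest(obj)
	if obj == nil then
		return ""
	end
	return "There is a " .. obj.Name .. " nearby. "
end

local combined = describeClosest(nil) .. describeClosest(workspace:FindFirstChild("Crystal"))
print(combined) -- empty string if neither object exists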

This last entry is often used for injecting awareness into a Large Language Model.
All the tests I’ve done with it illustrate how valuable a tool it is; it makes chatbots very environmentally and self-aware when used to its fullest potential. I also use it for the player’s observation chatbot, as a data source for a local chatbot, and for generating category-specific commands such as Chop.Tree() and Mine.Rubble(). It’s very important to keep your workspace organized.
In the example code below, this library is used to generate commands based on the environment and to recognize them from text input.

    --target attack the enemy
    ["Enemy"] = function(root, obj, key)
        return "preparing to attack", aware.get.NPCDescription(obj.Parent)
    end, --walk towards the npc
    ["NPC"] = function(root, obj, key)
        return nil, ""
    end,
     --do nothing just navigate to
    --walk towards and open the chest
    ["chest"] = function(root, obj, key)
        return "walking towards the lock of the", ""
    end, --navigate to the animpoint inside the chest
    --walk towards and examine the crystal
    ["crystal"] = function(root, obj, key)
        return "checking out this", ""
    end,
    ["dungeons"] = function(root, obj, key)
        return "trying to find the center of this", ""
    end,
    ["fish"] = function(root, obj, key)
          if obj then Controller.PickUpObject({root.Parent, obj}) end
        return "swimming towards the", ""
    end,
    ["house"] = function(root, obj, key)
        return "walking towards the door of the", ""
    end,
    ["loot"] = function(root, obj, key)
       if obj then Controller.PickUpObject({root.Parent, obj}) end
        return "attempting to loot the", actionmod.PickUpObject(root.Parent, obj) 
    end,
    ["mapobj"] = function(root, obj, key)
        return nil, ""
    end,
    ["plant"] = function(root, obj, key)
         if obj then Controller.PickUpObject({root.Parent, obj}) end
        return nil, ""
    end,
    ["player"] = function(root, obj, key)
        return nil, ""
    end,
    ["rubble"] = function(root, obj, key, Player)
        local key = "mine rock"
        commands[key](Player, root.Parent)
        return nil, commands.commandresponse(key)
    end,
    ["tree"] = function(root, obj, key, Player)
        commands["cut tree"](Player, root.Parent)
        return nil, commands.commandresponse("cut tree")
    end
}

The rest of the code for that is here.
NEW Action Token AI Framework LLM Utility/Chatbot Luau [Open Source] - Resources / Community Resources - Developer Forum | Roblox

I have a lot of mind-blowing things that I do not open source; what I do with AI, this is only part of it. I shared it because it’s designed for artificial intelligence in ROBLOX and gives very impressive results in my simulation, where all these categories are generated procedurally. At its core, though, it is what is illustrated in the Description library.
So tinkering with the source code or refactoring it would be okay. The categories just increase the quality of the natural language synthesis.

The task I want to accomplish is just making a text box you can type in, sending the text to the chatbot, and having it print an answer.

Player.Chatted:Connect(function(message) end)
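
Building on that event, a minimal server-side sketch is below; the ChatbotModule name and its GetResponse function are assumptions standing in for whatever chatbot module you use, and for an actual TextBox UI you would fire the typed text through a RemoteEvent instead of Player.Chatted.

-- Server script sketch: send each chat message to a chatbot module and print the reply.
local Players = game:GetService("Players")
local chatbot = require(game.ServerScriptService:WaitForChild("ChatbotModule")) -- name assumed

Players.PlayerAdded:Connect(function(player)
	player.Chatted:Connect(function(message)
		local answer = chatbot.GetResponse(message) -- assumed API
		print(player.Name .. " asked: " .. message)
		print("Chatbot answered: " .. tostring(answer))
	end)
end)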

Also, the Awareness module linked above has been updated to its newest version; unfortunately, the link counter has been reset.