Text-Vision Awareness LLM Utility Luau [OPEN SOURCE] now Judges Terrain, Water, Floor Material and More

You probably do not realize that some people play Roblox at around 15 FPS because of their hardware; this alone is enough to cause freezes for them.
I said that running this on the server is not really an option, since running 30 instances of it would eat a lot of server CPU, while running it on the client might cause issues on some people's phones. Not everyone has a new-generation phone running Android 13 with a better CPU than some laptops.

My game runs incredibly smoothly; I've always paid strict attention to that, and I've always made modifications to reduce performance costs. I've already done multiple tests on mobile and it works great, and future improvements will be made. Just last night I improved my search-query function's performance by 33%.

“My game”, perfectly said.
Some people struggle at under 20 FPS in games that only run a few scripts. You said this is a module, so it will probably run alongside your chatbox script that I saw on the forum a few days ago; now add other game scripts on top of those two.
Eh, this seems fine for Roblox, but I don't really see any in-game use or anything revolutionary, as you still need to put a lot of time into adding game-specific objects to the awareness algorithm. If your goal was recreating that YouTube clip posted in the first post, then yeah, it's fine.

Yes, it is much deeper than that. I haven't updated that post in a while because I've been completing this new system, but I'm not releasing it because it's too powerful and would make my game less competitive in the market. Also, all the game objects for the algorithm are already set up; I've just had the map generator disabled for testing convenience. The world generator is a multi-threaded system that does take a fair bit of processing from the perspective of a local machine, but I've done high-speed flying tests to validate its performance and made it run as efficiently as possible, with the desired results on a server.

Yes, when I see this screenshot I can agree that this is a chatbot algorithm, but the title was not very clear; the "(Complete)" tricked me into thinking this was an LLM-related post. In some cases you could pass this to an LLM, but it might give you fairly bad output. A better approach would be a multimodal LLM that can understand both images and text; that's where an awareness-powered AI NPC would be at its full capacity.

I’ve done a lot of testing, and it works great. My system is elegant and creates very convincing strings. I did this to limit API usage so the bot can scale up, and it records all of the interactions it makes with the AI models.
But in regard to your idea for a visual model, you should understand how those work: they literally turn the picture into words.
So you can say that's not AI because it's not a machine learning function, but it actually is an AI that I specifically designed. You can evaluate its form of ‘intelligence’ by interacting with it. It’s mainly designed to be an NPC companion for an MMORPG that outputs educational and character-relevant responses.
Relying on servers to process everything is not very usable from a performance perspective, which is why I designed my bot to call out to AI only when it needs to; it also utilizes zero-shot classification.

Yeah, I'm pretty familiar with them, as I have fine-tuned a model for my own application to help my users with problems. A LLaVA model could power any awareness module that exists, since it's a multimodal LLM.
The difference between an actual AI-powered model and a normal Lua script module is that the AI can determine what an object or person is doing inside the game faster than anything you would implement by hand in code.

A machine learning function can be used to learn how to act in an open world, but that's not text-related. If you are going for text output, you can either (a) make an API call every time someone does a query, or (b) have an architecture that handles most queries in a conversational context.

Cool idea. Instead of just awareness of objects, you could also try adding explanations for zones, since it's an MMORPG: explain that mobs can spawn there, what cool items you can get there, maybe ways to defeat a boss, and things like that.

Yes, I am utilizing the Zone module posted in the community resources! All of the areas created by my world generator have randomly generated names based on their theme; for example, Greek/Roman-themed areas are called something like “Island of Athens”, “Mountain of Sparta”, or “Tower of Olympus”. It’s a very interesting project! In total, this architecture is like a base model, and my future model will offer a subscription option for players to access GPT-4.

-- Evaluate the name once instead of twenty times; this assumes
-- aware.judgeobject(closestDungeon) returns the same string each call
local name = aware.judgeobject(closestDungeon)
local strut = {
	"We are in a place called " .. name,
	"I believe, we are in an area called " .. name,
	name .. "... I believe that's what this place is called",
	"This area is called " .. name,
	"If I recall correctly this place is known as " .. name,
	"If my navigation skills are correct, this area is called " .. name,
	"This place is known as " .. name,
	name .. " is the name of this land.",
	"According to my map, this place is called " .. name,
	"I have heard that this place is called " .. name,
	name .. ", that's the name of this place.",
	"This location is called " .. name,
	"From what I know, this place is known as " .. name,
	"My compass tells me that this area is called " .. name,
	"I have learned that this place is called " .. name,
	name .. ", that's the name of this area.",
	"This spot is called " .. name,
	"Based on my information, this place is known as " .. name,
	"My guidebook says that this area is called " .. name,
	"I have been told that this place is called " .. name,
}

I have updated this module, and it is now complete! I included all of the different strings, and it can provide thousands of different outputs!
The main update I did today, which spurred me to share this, was my super-efficient table handler!
While trying to integrate my procedural world generator, the ChildAdded listener caused up to 1,200 ms of ping! So here is a ‘lazy’ table handler that is very efficient.

local ticktime = workspace.GlobalSpells.Tick

-- Return the current children of the given container
local function tablehandler(location)
	return location:GetChildren()
end

-- 1 tick = 0.6 seconds, so 100 ticks = 60 seconds = 1 minute
tablekeys={

["mapobject"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=200}, workspace.MapObjects},

["plants"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=300}, workspace.Plants},

["enemys"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=100}, workspace.Enemys},

["rubble"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=200}, workspace.Rubble},

["trees"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=600}, workspace.Trees},

["chests"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=300}, workspace.Chests},

["grounditems"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=100}, workspace.GroundItems},

["houses"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=500}, workspace.Houses},

["players"]={{["datatable"]=nil, ["writetime"]=ticktime.Value, ["refreshrate"]=100}, game.Players},

["dungeons"]={{["datatable"]=nil,["writetime"]=ticktime.Value, ["refreshrate"]=100}, workspace.AmbientAreas},

["npcs"]={{["datatable"]=nil, ["writetime"]=ticktime.Value,["refreshrate"]=100}, workspace.NPCS}

}
--aware.Gettable(Key)

function aware.Gettable(Key)
	if not tablekeys[Key] then
		return nil
	end
	local results, location = tablekeys[Key][1], tablekeys[Key][2]
	local timer = results["writetime"]
	local refresh = results["refreshrate"]
	local data = results["datatable"]
	-- Return the cached table while fewer than `refresh` ticks have
	-- passed since it was written; otherwise rebuild the snapshot
	if ticktime.Value < timer + refresh and data ~= nil then
		return data
	else
		tablekeys[Key] = {{["datatable"] = tablehandler(location), ["writetime"] = ticktime.Value, ["refreshrate"] = refresh}, location}
		return tablekeys[Key][1]["datatable"]
	end
end
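A usage sketch of the cache above, assuming the `tablekeys` entries and the `ticktime` value exist as shown:

```lua
-- The first call snapshots workspace.Enemys; calls within the next
-- 100 ticks return the cached array instead of calling GetChildren again
local enemies = aware.Gettable("enemys")
if enemies then
	for _, enemy in ipairs(enemies) do
		print(enemy.Name)
	end
end
```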

I don’t think it’s right to say you’re creating an AI if all you’re doing is converting a 3D area into text and prompting ChatGPT (or any other LLM, which you absolutely did not create). My point is that you are not creating the AI part of this whatsoever, and you should stop thinking you are. This module might be useful for having a bot answer questions in-game via ChatGPT, but to be fair, it’d be much easier and more comprehensible to hard-code responses. (To be honest, you already “hardcode” most of the responses anyway; an AI that could actually recognize objects based on their appearance would need a LOT of data.)

Though to be completely honest the idea of creating an “algorithm” which can describe a 3D area in text is very interesting.

Long story short it’d be good if you changed the name of this post as it is very misleading and ends up confusing a lot of people, this way you would avoid fruitless arguments about what you “actually meant” and “what this actually is”.


Regardless of your opinions on the subject, my original example was an AI, and it was doing something relatively interesting. The code example I shared can recognize who it’s talking to, and it has an emotional state based on an evaluation of each personality's dataset; these work independently. I provided my chat module to use in conjunction with this module to use the emotional output. I will reiterate: the open-source module I provided can currently produce over 4 million different outputs, and that is without considering the names of the objects, which makes the number effectively unbounded. That is the module alone. Furthermore, these outputs are used to perform a database-specific search query: the winning observation entry is connected to a database entry related to that observation, and this output is combined with a third result by also doing a search query of the wisdom and other databases.

If you were to train an AI model on Roblox, it would have to be smaller, so training a language model locally wouldn't work unless you had a bunch of supporting systems in place. You can host your own fine-tuned model, but you would still incur server costs; this approach minimizes server costs and API calls.

Also, plain logic is based on if-thens, while a neural network with a machine learning algorithm uses a structure like x(or), which weights each condition to make generalizations based on the stored weights of the vector matrix it generates from training. This is done by tokenizing the input and output through its predictive layers, which provides the solution to x(or). Saying this is not AI is beyond arrogant, since nowhere is a coded machine learning algorithm defined as the only form of AI; you do not always require an x(or) condition when the solution only requires logic.

  • AI is the field of computer science that aims to create systems that can perform tasks that normally require human intelligence, such as learning, reasoning, problem-solving, etc.
  • Logic is a way of expressing and manipulating information using rules and symbols, such as if, or, and, then, etc.
  • Machine learning is a branch of AI that focuses on creating systems that can learn from data and improve their performance without explicit programming.
  • Neural networks are a type of machine learning model that can learn complex patterns and functions from data by adjusting the weights of their connections.
  • x(or) is a type of logic that assigns a weight to each condition and then combines them using the or operator. For example, x(or) (A, B) means A or B, but with different weights for each condition.
  • Machine learning can be generalized to a condition of x(or) by considering the weights of the neural network as the weights of the conditions, and the output of the neural network as the result of the x(or) operation. For example, if we have a neural network that takes two inputs A and B and outputs C, we can write C = x(or) (A, B), where the weights of A and B are determined by the neural network.
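As a toy illustration of the "weighted conditions" idea in the bullets above (illustrative only, not code from this module): a single artificial neuron combines weighted inputs against a threshold, and with equal weights it reproduces a plain OR gate:

```lua
-- A single neuron: fires when the weighted sum of inputs crosses a threshold
local function neuron(weights, threshold)
	return function(a, b)
		local sum = weights[1] * a + weights[2] * b
		return (sum >= threshold) and 1 or 0
	end
end

local orGate = neuron({1, 1}, 1) -- equal weights: ordinary OR
print(orGate(0, 0), orGate(0, 1), orGate(1, 0), orGate(1, 1)) -- 0 1 1 1
```

Changing the weights changes how much each input matters, which is the sense in which training "weights the conditions."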

In conclusion, this module is about Creating Self Aware AI: (Super Complete) LLM Utility Text Vision Module (Open Source).



Uploaded a new demonstration of using this module to illustrate Candyland!

The Text-Vision output is given as part of the system message for the LLM, giving it complete immersion into the environment and ticking off one piece of it appearing conscious and aware.


Would not recommend calling it ChatModule seeing as that’s quite similar to Roblox’s name for the chat system, and it also isn’t very descriptive of what this module does.
You could try naming it something that relates to describing surroundings.
(This is supposed to be constructive btw)
EDIT: If you want to get a little less hate for your project, I would avoid using the words ‘conscious’ and ‘aware’ seeing as AI cannot be conscious.


Consciousness implies self-awareness, and this module is designed to cultivate self-awareness by acting as a sensory input to an LLM.

If you research consciousness in AI systems, this is one of the required features. You can talk to the AI in my game, which is powered by Zephyr 7B. It knows the date, the weather, its identity, and its memories, is aware of its surroundings, and has over 300 commands and 200 emotes to express itself and interact with its environment.

As humans we have conscious experiences, but when certain criteria are met, a simulation of consciousness can happen.

This model claims to be conscious, and as the one who created the awareness system that powers it and engineered the system message to cultivate self-awareness, I would consider that a strong argument.

Also, I don’t care if I get any “hate” for this project, since it is simply open-sourced. I didn’t sell it; all I did was write it and post updates to the module.

The first version had some valid criticisms, which were all addressed; you’re making judgments based on old news. This module is currently at version 2, and version 3, the one currently in use in my game, is about to be released. It has new features such as judging what the NPC is walking on and judging a body of water in the distance, in addition to grammar corrections and judging up, down, left, and right; it also makes more observations and has been cleaned up and improved in functionality.

I have no complaints! I’m just sharing stuff for free! I’m having a grand old time using this to cultivate self-awareness in a large language model. It works as I said it would and does what I thought it would. All those who posted negative criticisms are just the usual doubters.

It’s actually very good that ROBLOX has a text-vision utility such as this for those who are interested in the field.

If you think you can do better I would implore you to do it yourself.

I’m currently building on top of this module, and it’s very handy. I use it for a variety of side projects, including the action commands, which work off a similar concept except that they procedurally generate actions for each of these libraries so the AI agent can interact with the environment just by saying something.

Also, I published the update to the Awareness Module and removed the code of the first version.

This is the code I use to engineer the system message; as you can see, it has memories, Text-Vision awareness, insight provided by a local chatbot, and time-of-day and weather awareness.

function cm.ZephyrStory(person,personal,Playername,quer)
-- Define the API URL and the authorization header
local API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"
local headers = {Authorization = bearer}

-- Define the HttpService
local HttpService = game:GetService("HttpService")

-- Define a function that takes an input and queries the model
local function queryModel(input, temperature)
    -- Build the payload; the Hugging Face Inference API expects
    -- generation settings nested under a "parameters" key
    local payload = {inputs = input, parameters = {temperature = temperature, max_new_tokens = 1000, min_tokens = 250, top_k = 100, top_p = 0.11}}
    -- Encode the payload table into a JSON string
    local payloadJSON = HttpService:JSONEncode(payload)
    -- Send a POST request to the API URL with the header and the payload
    -- Use pcall to catch any errors
    local success, response = pcall(HttpService.PostAsync, HttpService, API_URL, payloadJSON, Enum.HttpContentType.ApplicationJson, false, headers)
    -- Check if the request was successful
    if success then
        -- Validate that the response decodes as JSON
        local ok, decoded = pcall(HttpService.JSONDecode, HttpService, response)
        if ok then
            -- Return the raw JSON string; callers decode it themselves
            return response
        else
            -- Return nil and the decode error message
            return nil, decoded
        end
    else
        -- Return nil and the request error message
        return nil, response
    end
end

local personality=personal[1]
local awarobserve=personal[2]
--identify..timeod..awareobserve
--local Resulttable,speakers=cm.LocalZephyrStory(str,npcnam,{[1]=persona,[2]=awareobserve,[3]=identity,[4]=timeod},Player)
local identity=personal[3]
local timeod=personal[4]
local insight=personal[5]
local memory=personal[7]
local previousconversation=personal[6]
if previousconversation==nil then
previousconversation=""
else 
local function RebuildConversation(tbl,response)--reduce size of system message
--local tbl,speakers,str=cm.LocalZephyrDecode(response)
local sum="" for i,v in tbl do for t,o in v do
if t~="narrator" then

 sum=sum.." \n\n "..t..": "..o 
else
 sum=sum.." \n\n "..o 
end end print(sum)
end
if response then sum=sum.." \n\n "..Playername..": "..response end -- response may be nil when only the conversation table is passed
return sum
end
previousconversation=RebuildConversation(personal[6][1])
end
--awareobserve,timeod,identity
-- Test the function with an example input
--cachedconversation

local input = "<|system|>\n "..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.." </s>\n<|assistant|>"..previousconversation
local temperature = 2
local output,Error = queryModel(input,temperature)
print(output)
local iterations=0
local function RebuildResponse(response)--reduce size of system message
local tbl,speakers,str=cm.LocalZephyrDecode(response)
local sum="" for i,v in tbl do for t,o in v do
if t~="narrator" then
 sum=sum.." \n\n "..t..": "..o 
else
 sum=sum.." \n\n "..o 
end end print(sum)
end

local input = "<|system|>\n"..identity..memory..awarobserve.." Parse dialogues with "..person..": and "..Playername..": . </s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
if iterations==2 then
input = "<|system|>\n"..memory..awarobserve..identity.." Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s><|assistant|>"..sum
elseif iterations==3 then
input = "<|system|>\n"..identity.."\n Parse dialogues with "..person..": and "..Playername..": .</s>\n<|"..Playername.."|>\n "..quer.."</s>\n<|assistant|>"..sum
end

return input
end
if not Error then
local function iterateoutput(output)
local checkedinput
local loadoutput
local previnput
repeat
iterations+=1
previnput=HttpService:JSONDecode(output)[1].generated_text
local loadoutput = queryModel(RebuildResponse(output))
if loadoutput~=nil then
checkedinput=HttpService:JSONDecode(loadoutput)[1].generated_text
if checkedinput then--only update output if valid
output=loadoutput
print(output)
else
break 
end
else
break
end
until checkedinput==previnput or iterations>=3
return output
end
output=iterateoutput(output)
end
local function DecodeResponse(response)--reduce size of system message
local tbl,speakers,str=cm.LocalZephyrDecode(response)
local sum="" for i,v in tbl do for t,o in v do
if t~="narrator" then
 sum=sum.."\n\n"..t..": "..o 
else
 sum=sum.."\n\n"..o 
end end print(sum)
end
return sum
end
local function iterateoutputLLama(output)
local checkedinput
local loadoutput
local previnput
repeat
iterations+=1
previnput=HttpService:JSONDecode(output)[1].generated_text
local loadoutput =cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": ." ,quer,DecodeResponse(output))
if loadoutput~=nil then
checkedinput=HttpService:JSONDecode(loadoutput)[1].generated_text
if checkedinput then--only update output if valid
output=loadoutput
print(output)
else
break 
end
else
break
end
until checkedinput==previnput or iterations>=3
return output
end

local output2=cm.TinyLlama(""..identity..timeod..insight..awarobserve..memory..". Parse dialogues with "..person..": and "..Playername..": ." ,quer,DecodeResponse(output))--recieve generated_text
if output2 then
iterations=0
local output3=iterateoutputLLama(output2)
if output3~=nil then
output=output3
else
output=output2
end
end

--local str=format_response(output[1].generated_text)
--print(str)
--local outputtabl=cm.extractDialogue(str)
-- Print the output


return output--[1] --parseConversation(str)
end

This module is an integral part of this system, providing the description of the surroundings and the emotional state of the chatbot. It has since been superseded by a local chatbot system that provides RAG to help keep the LLM aligned with the values of its personality.

New update to the module! The awareness output now includes a generalized description of the terrain!

--[[--{
                    [1] = "I observe that, I am walking on basalt.",
                    [2] = "What I see is the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt.",
                    [3] = "I am observing I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me to the east.",
                    [4] = "What I see is a Crystal not far to the east.",
                    [5] = "I can see that, piece of small rock embedded with a Dragonstone is to the east.",
                    [6] = " We are in a place called Island of Morgan. There is a Crystal near eastward.",
                    [7] = "What I notice is I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt , I believe, we are in an area called Island of Morgan, the center is near at the current elevation as me eastward, a Crystal near eastward, piece of small rock embedded with a Dragonstone is  eastward. ",
                    [8] = "We are in a place called Island of Morgan, the center is close at the current elevation as me to the east",
                    [9] = "If my eyes don't deceive me, I am walking on basalt, the environment has a meager amount of ground, a meager amount of cracked lava, a sparse amount of limestone, clusters of sand, a ton of rock, and heaps of basalt, we are in a place called Island of Morgan, the center is close at the current elevation as me to the east, and a Crystal close to the east. "--]]--

With this function, it counts all of the materials and then judges them against a judgment matrix representing thresholds for each generalization.
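The `aware.judge.amnt` helper used by the function below is not shown in this post; a plausible sketch of how a count could be mapped through the threshold matrix to a phrase tier (hypothetical implementation, with names assumed from the call site):

```lua
-- Walk the thresholds until the count no longer exceeds one,
-- then pick a random synonym from that tier's phrase list
local function judgeAmount(count, phraseTiers, thresholds)
	local tier = 1
	for i, limit in ipairs(thresholds) do
		if count >= limit then
			tier = i
		else
			break
		end
	end
	local synonyms = phraseTiers[tier]
	return synonyms[math.random(#synonyms)]
end
```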

function aware.judge.terrain(root,radius)

	-- Count every non-air voxel material inside the region
	local function detecter(material)
		local materel = {}
		local nummats = 0
		for i, v in material do
			if i ~= "Size" then
				for t, o in v do
					for c, p in o do
						if p ~= Enum.Material.Air then -- count all of the terrain
							local matstring = string.split(tostring(p), 'Enum.Material.')[2]
							if matsolutions[matstring] ~= nil then
								matstring = matsolutions[matstring]
							end
							if materel[matstring] == nil then
								nummats += 1
								materel[matstring] = 1
							else
								materel[matstring] += 1
							end
						end
					end
				end
			end
		end
	print(materel)
	-- Collect the material names, then sort from least to most common
	local keys = {}
	for k in pairs(materel) do
		table.insert(keys, k)
	end
	table.sort(keys, function(a, b) return materel[a] < materel[b] end)
local judgeamntstrings = {
  {"a handful of", "a few", "a smattering of", "a sprinkling of"},
  {"a little bit of", "a trace of", "a touch of", "a dash of"},
  {"a sparse amount of", "a scant amount of", "a meager amount of", "a minimal amount of"},
  {"bunches of", "clusters of", "groups of", "packs of"},
  {"a lot of", "heaps of", "a ton of", "loads of"},
  {"a multitude of", "a plethora of", "hordes of", "heaps of"},
  {"a huge quantity of", "a massive amount of", "a colossal amount of", "a prodigious amount of"},
  {"a staggering number of", "an astonishing number of", "a phenomenal number of", "a mind-blowing number of"}
}
 
	local judgementstring = ""
	local index = 0
	-- Thresholds scale with the sampled radius; each band maps to a phrase tier above
	local judgmatrix = {1, radius, radius * 5, radius * 10, radius * 20, radius * 30, radius * 40, radius * 50}
	for _, k in keys do
		index += 1
		if index == nummats then
			judgementstring ..= "and "
		end
		judgementstring ..= aware.judge.amnt(materel[k], judgeamntstrings, judgmatrix) .. " " .. k:lower()
		if index ~= nummats then
			judgementstring ..= ", "
		end
	end

return judgementstring
end

local region = Region3.new(root.Position-Vector3.new(radius,radius,radius),root.Position+Vector3.new(radius,radius,radius))
local material = terrain:ReadVoxels(region, 4)            
	return detecter(material)
end



function aware.get.terrain(root,radius,str)
local phrases = {
  "The environment has",
  "The terrain consists of",
  "The surroundings are characterized by",
  "The landscape features",
  "The ecosystem hosts",
}
if radius~=true then
if str~=nil then
--rewrite the table in the first person context
return phrases[math.random(1,#phrases)].." "..str..". "
end
return phrases[math.random(1,#phrases)].." "..aware.judge.terrain(root,radius)..". "
elseif radius==true then
return  phrases[math.random(1,#phrases)]:lower().." "..str..""
end
end
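A quick usage sketch of the function above (assuming `aware` is the required module and `npc` is a character model; the output phrasing will vary):

```lua
-- Summarize the terrain within 32 studs of the NPC
local description = aware.get.terrain(npc.HumanoidRootPart, 32)
print(description)
```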

In addition, this module also observes water in particular.


function aware.judge.water(root)
-- Loop through the directions and cast rays
local origin=root.CFrame:ToWorldSpace(CFrame.new(0,5,0)).Position--go above the part to get a better angle
for i, dir in ipairs(waterdirections) do
    local result = workspace:Raycast(origin, dir, params)
    -- Check if the ray hit anything
    if result then
        -- Check if the hit part is water terrain
        if result.Instance:IsA("Terrain") and result.Material == Enum.Material.Water then
             local magn=(origin-result.Position).Magnitude
             local dist, dir = aware.judge.distance(root,magn, origin, result.Position, range)
           
            return phraselib.waterdescription[math.random(#phraselib.waterdescription)] .. dist .. " to the " .. dir.."",magn
        end
    end

end
return "",nil
end

Finally, all of the phrases have been cleaned up into libraries that are descendant modules of the awareness module. This should improve the module's readability.

One final note: this module is designed with specific object categories in mind to add extra general flair. But I have also created a new way to describe certain objects.

  -- Describes up to three of the closest dungeon objects, excluding the closest one
  local function describeDungeons()
      local FurnitureText = ""
      if numDungeons > 1 and closestDungeon and Dungeonarray then
          local iterations = 0
          local maxiterations = math.min(#Dungeonarray, 3)
          for i, closeFurniture in Dungeonarray do
              if closeFurniture.maininstance ~= closestDungeon then
                  iterations += 1
                  if iterations >= 2 then FurnitureText = FurnitureText .. ", " end
                  local dist, dir = aware.judge.distance(root, closeFurniture.distance, pos, getroot2(closeFurniture.maininstance).Position, 200)
                  FurnitureText = FurnitureText .. aware.judge.object(closeFurniture.maininstance) .. " " .. dist .. "" .. dir
                  if iterations >= maxiterations then break end
              end
          end
      end
      return FurnitureText
  end

This describes up to three of the closest other objects instead of generalizing their amount.

In addition, this module can now be used with the chat module I published to query the environment.

It is also now compatible with Determinant AI's ChatGPT plugin.