[Updated] DeterminantAI - AI Agent-powered characters that you can customize!

We are working on a solution to rate limits.

4 Likes

Roblox doesn't have a computer vision component, so I guess it uses raycasts and then feeds the hit info in as input, like the material of the car and part names such as "window", "wheel", and so on.
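Something like this, maybe (a rough sketch of my guess, not the plugin's actual code; describeSurroundings is just a name I made up):

local function describeSurroundings(npc, maxDistance)
	-- Sweep rays in a circle around the NPC's head and describe whatever
	-- they hit by part name and material
	local origin = npc.Head.Position
	local params = RaycastParams.new()
	params.FilterDescendantsInstances = { npc }
	params.FilterType = Enum.RaycastFilterType.Exclude

	local descriptions = {}
	for angle = 0, 350, 10 do
		local direction = CFrame.Angles(0, math.rad(angle), 0).LookVector * maxDistance
		local result = workspace:Raycast(origin, direction, params)
		if result then
			-- e.g. "window (Glass)" or "wheel (SmoothPlastic)"
			table.insert(descriptions, result.Instance.Name .. " (" .. result.Material.Name .. ")")
		end
	end
	return descriptions
end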

We’re going to deploy some changes to enhance our service. To make this happen, we’ll be temporarily pausing the service shortly, and we’ll keep the downtime as brief as possible. Thank you for your patience!

Please update the plugin; it’s working now. If you encounter any issues, please let me know. The plugin will be frequently updated over the next few days. Please set it to “Auto Update” in “Manage Plugins” so you won’t miss out on any new features.

NEW Update >>
We’ve fixed the issue with the AI not responding due to usage limits. Now all users will be able to use the AI for up to 100 requests per day!

We’re also working on some new and exciting features where you can ask the NPC to do things such as go to places, pick up objects, etc. Please stay tuned!

Hey, so this is a pretty neat plugin. Sadly, at least for me, I can’t seem to make it work; it keeps saying this error in the console and the NPC does nothing:


Any solution for this? I really want to test it out. Besides that, it looks promising so far, ngl. Awesome job!

Edit: OK, I’m actually dumb and forgot you have to talk to the NPC first, LOL. It works as intended, cool job mate! If anything, I would say you should look at that infinite yield, but for now it works great. The AI itself sometimes fails to get what the surroundings actually do, but I guess that’s common considering GPT does that from time to time, so yeah, it’s pretty good for now.

2 Likes

I’m glad to hear you find the plugin neat! As for the infinite yield issue, I’ll definitely fix it. Also, I appreciate your feedback on the AI’s understanding of surroundings – it’s an area we’re continually working to improve. Thanks for testing out the plugin and for your supportive words!

1 Like

Good job! I was trying to make something similar, but one that takes in-game screenshots and audio recordings and then sends them, combined with text, to a chatbot API. Either way, great job! (I hope I can make my thing work later too :p)

The only minor thing is that I can see you’re using Microsoft Azure AI, which, if I remember correctly, is paid. So eventually you’ll have to make it so that others put their own key in. (Your endpoint link can also be abused by others via non-Roblox products, etc.)

2 Likes

This is amazing.

Upon testing I got the following bugs:

-When multiple players are in the game with multiple NPCs: only the first player’s NPC talks, no matter who else is talking to their own chosen NPC. That first chosen NPC answers everybody who chose to talk to an NPC. The other NPCs will follow their own individual players, but only that first chosen NPC will talk, to everybody.

-When a player leaves the game, nobody else can choose to talk to the NPC they left behind. It’s just standing there like a lost puppy. It’d be awesome if the NPCs could wander around until chosen; then, when the player who chose them leaves the game, the NPC goes back to wandering around until chosen by another player.

-When the NPC is killed, the body stays there and all of the above still applies. That first chosen NPC’s severed head is the only thing still talking, lol. It would be epic if it respawned back where it started and went back to wandering around, waiting to be chosen and spoken to by another player again.

Either way, this is pretty amazing. Looking forward to seeing how this evolves.

2 Likes

This is very good, I hope you keep updating it. BTW, what do you want the credit to be given as? Like “credits to (yourname) for character AI”?

1 Like

It would be great to have a way to stop NPCs from talking to you and following you, and it would also be great to have different NPCs with different characteristics.

2 Likes

This is really cool. Nice work!

1 Like

I’ve messed around with the tool for a while and it works surprisingly well for a free plugin. Thank you, and the rest of the community resources posters as well.

1 Like

:fire::fire: NEW FEATURE - COMMAND AI GO TO LOCATION :fire::fire:

We’ve launched a new feature that will allow AI NPCs to go to locations specified by the player. For example, a player can ask the NPC to “go to the bus stop” and the NPC would move there :running_man:. Say ‘Stop following me’ :stop_sign:, or ‘Leave me alone’ :no_good_man:, and the NPC will stop following you!

Behind the scenes, the NPC is “thinking” :brain: about what the right action is. It’s not keyword matching; instead it’s “understanding” the ask, which means if you tell the NPC to “go wait for the bus,” it will also go to the bus stop.
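Conceptually, it’s something like this (a simplified sketch of the idea, not our exact production prompt or code; the action names and locations are illustrative only):

-- We ask the model itself to pick the action, instead of matching keywords.
local prompt = table.concat({
	"You control an NPC. Choose exactly one action for the player's request.",
	"Available actions: GOTO <location>, FOLLOW, STOP_FOLLOWING, NONE",
	"Known locations: bus stop, shop, fountain",
	'Player: "go wait for the bus"',
	"Action:",
}, "\n")

-- The model replies with something like "GOTO bus stop", which gets parsed
-- and handed to the movement code (e.g. Humanoid:MoveTo on that target).
local function parseAction(reply)
	local action, argument = reply:match("^([%u_]+)%s*(.*)$")
	return action, argument -- e.g. "GOTO", "bus stop"
end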

Check out this video for a demo: https://www.youtube.com/watch?v=RkMEVsUI1GI

The feature is still in beta, so please bear with us! Happy creating! :sparkles::sparkles:

3 Likes

This is some interesting code! I’m glad to see others are interested in the subject as well.

I integrated your AI into my system.

These are 3 responses: one from my local chatbot I shared, the second from your API, and the third from an experiment I did combining the response from your API with Zephyr 7B.
I also integrated my Awareness module to create the same array as yours, except it grabs the 3 closest objects of each object type and combines them into one table.

function aware.get.nearestModels(root, radius, getone, mode) -- get the nearest objects of each type
	local objectmegaarray = {}
	for i, v in aware.near do
		local _, _, _, _, objectarray = aware.near[i](root, radius, getone, mode)
		objectmegaarray[i] = objectarray
	end

	local organizedobjects = {}
	for i, objectarray in objectmegaarray do
		for t, o in objectarray do
			if t <= 3 then -- get 3 objects max
				table.insert(organizedobjects, o)
			else
				break
			end
		end
	end
	table.sort(organizedobjects, function(a, b) return a.distance < b.distance end)
	return organizedobjects
end

This was really interesting diving into. If you’re interested in seeing some of the changes I can send you copy of the modified module.

One of the main things I did was implement multiple personalities based on the queried NPC. Each NPC has its own message table.

It was something like this: basically, I turned all the local NPC variables into tables where the NPC’s name is the hash lookup for the state. If you want to see the modularized code, shoot me a PM.
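First, for context, the per-NPC state ends up looking roughly like this (a sketch; the values are just examples, and the real definitions live elsewhere in the module):

-- Every formerly-local variable becomes a table indexed by the NPC's
-- display name (values below are illustrative).
local npcStates = { following = "following", idle = "idle", leave = "leave" }
local currentstate = {} -- currentstate["Kahlani"] = npcStates.idle
local hiredPlayer = {} -- hiredPlayer["Kahlani"] = the Player who hired them
local followDistance = 5 -- studs to keep behind the player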

function DeterminantAgent.followloop(npc)
	local heartbeatloop = nil
	local timeSinceLastUpdate = 0
	local updateInterval = 0.5 -- Update every 0.5 seconds
	local isPlayerInWavingRange = false
	-- this function follows the player
	heartbeatloop = RunService.Heartbeat:Connect(function(deltaTime)
		timeSinceLastUpdate = timeSinceLastUpdate + deltaTime

		if timeSinceLastUpdate >= updateInterval then
			timeSinceLastUpdate = 0 -- Reset the timer

			if currentstate[npc.Humanoid.DisplayName] == npcStates.following then
				if hiredPlayer[npc.Humanoid.DisplayName] ~= nil and hiredPlayer[npc.Humanoid.DisplayName].Character and hiredPlayer[npc.Humanoid.DisplayName].Character:FindFirstChild("HumanoidRootPart") then
					local playerPosition = hiredPlayer[npc.Humanoid.DisplayName].Character.HumanoidRootPart.Position
					local npcPosition = npc.HumanoidRootPart.Position

					-- Calculate the direction vector from the NPC to the player
					local direction = (playerPosition - npcPosition).Unit

					-- Calculate the target position for the NPC, maintaining the follow distance
					local targetPosition = playerPosition - direction * followDistance

					-- Check the distance between the NPC and the target position
					if (targetPosition - npcPosition).Magnitude > 1 then -- '1' is a threshold to avoid jittery movement
						npc.Humanoid:MoveTo(targetPosition)
					end
				end
			elseif currentstate[npc.Humanoid.DisplayName] == npcStates.idle then
				if hiredPlayer[npc.Humanoid.DisplayName] ~= nil and hiredPlayer[npc.Humanoid.DisplayName].Character and hiredPlayer[npc.Humanoid.DisplayName].Character:FindFirstChild("HumanoidRootPart") then
					local distance = (npc.HumanoidRootPart.Position - hiredPlayer[npc.Humanoid.DisplayName].Character.HumanoidRootPart.Position).Magnitude
					if distance < 10 then -- Assuming 10 units as the waving distance
						-- Trigger waving animation
						-- Make sure to replace 'waveAnimationId' with the ID of your actual animation
						if not isPlayerInWavingRange then
							-- Player just entered the range, wave
							--isPlayerInWavingRange = true

							--emoteBindableFunction:Invoke("wave")
						end
					else
						isPlayerInWavingRange = false
						local npcPosition = npc.HumanoidRootPart.Position
						local playerPosition = hiredPlayer[npc.Humanoid.DisplayName].Character.HumanoidRootPart.Position

						local direction = Vector3.new(playerPosition.X - npcPosition.X, 0, playerPosition.Z - npcPosition.Z)
						npc.HumanoidRootPart.CFrame = CFrame.lookAt(npcPosition, npcPosition + direction)
					end
				end
			elseif currentstate[npc.Humanoid.DisplayName] == npcStates.leave then
				heartbeatloop:Disconnect()
				-- do something else
			end
		end
	end)
end

I also inject the conventional awareness into the AI model, in addition to the position data for the function calls.
I use the same pipeline as Zephyr 7B, so it’s connected to Zephyr’s context window via its memories.

function registermemory(player, memory, npc)
	if player and npc then
		if cacheofmemories[player] == nil then -- register a table for the player
			cacheofmemories[player] = {}
		end
		if cacheofmemories[player][npc] == nil then -- create nested tables for the npc
			cacheofmemories[player][npc] = {}
			cacheofmemories[player][npc .. "memories"] = {}
			return ""
		end
		if memory ~= nil then
			local memorystring = cm.summarrization(memory) -- summary of the memory
			if memorystring then
				print("Created the memory")
				print(memorystring)
				table.insert(cacheofmemories[player][npc .. "memories"], memory) -- add complete memory entry
				table.insert(cacheofmemories[player][npc], memorystring) -- add summarized memory entry
			end
			task.delay(3, function()
				if #cacheofmemories[player][npc] > 3 then -- if summaries exceed 3
					local quantizedmemory = table.concat(cacheofmemories[player][npc .. "memories"], " ") -- join the non-summarized memories
					local memorystring = cm.summarrization(quantizedmemory) -- summarize the joined sections
					if memorystring then -- if response
						print("Quantized the memory")
						print(memorystring)
						if #cacheofmemories[player][npc .. "memories"] >= 7 then -- 6-memory window for Zephyr
							-- save the old quantized memories: concat everything into one
							-- entry, then keep the most recent raw memories after it
							local newtbl = { table.concat(cacheofmemories[player][npc .. "memories"]) }
							for i, v in cacheofmemories[player][npc .. "memories"] do
								if i > 2 then
									table.insert(newtbl, v)
								end
							end
							cacheofmemories[player][npc .. "memories"] = newtbl
						end
						cacheofmemories[player][npc] = { cacheofmemories[player][npc][2], cacheofmemories[player][npc][3] } -- trim the summary cache
						table.insert(cacheofmemories[player][npc], memorystring) -- cache now has 3 entries again
					end
				end
			end)
		end
		if #cacheofmemories[player][npc] > 0 then
			return "I remember " .. table.concat(cacheofmemories[player][npc], ". ")
		end
	end
end

I think this, in conjunction with the normal context window, could allow for long-term memories.
Also, in this registermemory function I am using a small 400M model on Hugging Face to summarize the conversation.

function cm.summarrization(inputq, mode)
	-- Summarize text with a small Hugging Face inference model
	-- (HttpService and Bearerkey are defined elsewhere in the module)
	-- https://huggingface.co/facebook/bart-large-cnn
	local API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
	if mode then
		API_URL = "https://api-inference.huggingface.co/models/slauw87/bart_summarisation"
	end

	local headers = {
		["Authorization"] = Bearerkey,
	}

	-- Encode the payload as a JSON string
	local payloadJSON = HttpService:JSONEncode({ inputs = inputq })

	local function request()
		return HttpService:RequestAsync({
			Url = API_URL,
			Method = "POST",
			Headers = headers,
			Body = payloadJSON,
		})
	end

	-- Send the request and get the response
	local success, response = pcall(request)
	if not success then
		return inputq -- request failed; fall back to the raw input
	end

	local responseJSON = HttpService:JSONDecode(response.Body)

	-- If the model is still loading, Hugging Face returns an error object
	-- with an estimated_time; wait that long and retry once
	if responseJSON.error ~= nil and responseJSON.estimated_time ~= nil then
		task.wait(responseJSON.estimated_time)
		success, response = pcall(request)
		if not success then
			return inputq
		end
		responseJSON = HttpService:JSONDecode(response.Body)
	end

	-- A successful response is an array whose first element holds the summary
	if responseJSON[1] and responseJSON[1].summary_text then
		return responseJSON[1].summary_text
	end

	return inputq -- no summary returned; fall back to the raw input
end

I’m also reducing API usage by saving each response in a personality-specific database. But I may have to include the awareness in the direct query match (exact personality, exact query, and exact surroundings equal the same response), which would be interesting to see develop over time.
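A rough sketch of that caching idea (the names are mine, not from the actual module; callAPI stands in for whatever performs the real request):

local responseCache = {} -- personality -> composite key -> saved response

local function getResponse(personality, query, surroundings, callAPI)
	responseCache[personality] = responseCache[personality] or {}
	-- exact personality + exact query + exact surroundings = same response
	local key = query .. "|" .. (surroundings or "")
	local cached = responseCache[personality][key]
	if cached then
		return cached -- cache hit: no API call needed
	end
	local response = callAPI(personality, query, surroundings)
	responseCache[personality][key] = response
	return response
end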

3 Likes

Hey Magus!

Thanks so much for trying our plugin and putting time into extending it! I’ve seen you around on the forums talking about AI-related things.

I’ll read through your suggestions and come up with a more in-depth response.

1 Like

Some main points: you can expand the max context range by using summarization and an algorithm for handling memories.
I have a library of about 100 expressive emotes, labeled with keyword descriptions of each emote. The algorithm leverages synonyms, antonyms, and nouns to get the likeliest emote (synonym example: synonyms = {“Hello”, “Hi”, “Hey”, “Greetings”}, antonyms = {“goodbye”, “farewell”}), scoring each entry by looking for only one example of each synonym group. Thus we can query the database with the sentence “Hey there, nice to meet you. I’m very excited to go on an adventure.” and the NPC will wave hello; then, as it says the next sentence, it will find the emotes tagged with “excited” and “adventure”. Since both tags match equally well, adding noise to the algorithm gives it a 50/50 chance of executing an emote tagged with either “adventure” or “excited”.
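Roughly, the scoring works like this (a condensed sketch of the approach, not my module’s full code):

-- Each emote entry has synonym groups (and antonym groups); a group
-- contributes at most one point no matter how many of its words appear,
-- and antonym groups subtract a point.
local emotes = {
	wave = {
		synonyms = { { "hello", "hi", "hey", "greetings" } },
		antonyms = { { "goodbye", "farewell" } },
	},
	-- ...about 100 entries in the real library
}

local function scoreEmote(entry, words)
	local score = 0
	for _, group in entry.synonyms do
		for _, word in group do
			if words[word] then score += 1; break end -- one point per group, max
		end
	end
	for _, group in entry.antonyms do
		for _, word in group do
			if words[word] then score -= 1; break end
		end
	end
	return score
end

local function likeliestEmote(sentence)
	local words = {}
	for word in sentence:lower():gmatch("%a+") do
		words[word] = true
	end
	local bestName, bestScore = nil, 0
	for name, entry in emotes do
		local score = scoreEmote(entry, words)
		if score > bestScore then
			bestName, bestScore = name, score
		end
	end
	return bestName -- nil when nothing matched
end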

Some main advice: good AI requires good data!

I’d be willing to share the library of emotes I haven’t open-sourced to speed up your project. I understand if you wouldn’t want to use my chat module library to leverage all those elements of the English language (synonyms, antonyms, reflections, and nouns) to query such a database.
But the code is in the open-sourced module (it just doesn’t ship with the library), if you were interested in seeing how it was implemented.
This was done by displaying one sentence at a time on a word-by-word basis (or character by character if FPS > 30), then processing the sentence to determine the emote (and now, actions based on my 170 action commands), along the lines of the sketch below.
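Something like this (a simplified sketch, not the module’s actual code):

-- Reveal the sentence word by word, then score it once fully shown.
local function displaySentence(label, sentence)
	label.Text = ""
	for word in sentence:gmatch("%S+") do
		label.Text = label.Text .. word .. " "
		task.wait(0.15) -- per-word delay; character-by-character just uses a shorter wait
	end
	-- the sentence is fully displayed; now determine the emote/action for it
end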
Here is a demonstration video of what I’m talking about. I also used a similar system to create a library of atmospheric particle effects and audio samples based on emotional tonality.
Here’s a demonstration video.

Also, I just did an experiment where I inject the response as the starting string for Zephyr 7B, and got this output.


   [["Kahlani: Good day to you, traveler! I am Duchess Kahlani; it's my honor to make your acquaintance. How may I assist you on this fine morning?  

 ArtStudios: I am in search of a rare artifact, rumored to be hidden in this very place. Do you happen to know anything about it?  

 Kahlani: I'm afraid I'm not privy to such information, traveler. However, I do know that there are a few chests scattered around this area, some of which may contain items of interest. Would you care to join me in exploring this island?  

 ArtStudios: That would be most gracious of you, Duchess. I would be honored to accompany you on this quest.  

 Kahlani: Very well, let us set off then. But first, let us take a moment to orient ourselves. Based on my instincts, I believe we are currently near a Broadleaf tree to the southwest, and a chest is nearby. There are also a couple of locked chests in the vicinity, but I'm afraid I don't have the key to them. Shall we begin our search?  

 ArtStudios: Absolutely, Duchess. Lead the way!  

 Kahlani: As you wish, traveler. Let us proceed with caution and vigilance, for we never know what dangers may lie ahead. But with your skills and my intuition, I'm confident we'll find what we're looking for! "]]

In this example, I have Zephyr acting as a storyteller: after your AI’s response is displayed, I start Zephyr with it, and Zephyr then simulates a conversation between the player and the NPC.

Then the conversation is quantized into a memory for Zephyr, and perhaps it should also become a memory for the other model, so the two could be seamlessly integrated.

Roleplaying Zephyr 7B Luau API Documentation (Free API) System Message - Resources / Community Resources - Developer Forum | Roblox

Eliza Chatbot Ported to Luau [Open-Source] - Resources / Community Resources - Developer Forum | Roblox

Artificial Intelligence Model APIs FREE (Open Source) and Other Useful APIs - Resources / Community Resources - Developer Forum | Roblox

Also, one modification I would make is something like this in the perception module!

if primaryPart then
	local pos = Vector3.new(
		math.floor(primaryPart.Position.X),
		math.floor(primaryPart.Position.Y),
		math.floor(primaryPart.Position.Z)
	)
	local description = name .. " at " .. tostring(pos)
	table.insert(descriptions, description)
end

This rounds down the floating-point components of each position, which should reduce token usage. It may also make it easier for the AI to call functions with those coordinates.

2 Likes

This is so cool! Could you share how this was made? I kinda wanna dig into it and try to expand it, and possibly even provide my own API key and such to avoid the rate limits as they pop up.

1 Like

If you encounter “Script Injection permissions” issues, please check the screenshot added in the first post.

1 Like

I think it’s important to emphasize that you’re running this on your own Azure API, probably on a credit grant.

If the module gets used (or abused) a lot, you’ll end up with a bill sooner or later, which will break the whole module. Every game built on it will stop working.

If you want your module to be future-proof, include an explanation of how to self-host the API and a rough estimate of the cost per month.
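Even just a config stub where each developer drops in their own deployment would go a long way. A sketch of what that could look like (all names here are illustrative, not the module’s actual config; the URL shape follows Azure OpenAI’s documented REST format):

-- Each game supplies its own Azure OpenAI deployment and key, so one
-- shared credit grant can't be drained or abused by everyone at once.
local DeterminantConfig = {
	Endpoint = "https://<your-resource>.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=2023-05-15",
	ApiKey = "<your-azure-key>",
}

return DeterminantConfig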

2 Likes