It's triggered by the Chatted event. I can see some synchronization issues in the group chat setting due to model latency.
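For reference, a minimal sketch of the kind of Chatted hookup involved; respondToPlayer here is a placeholder, not the plugin's actual handler:

local Players = game:GetService("Players")

-- Placeholder handler; the plugin's real response pipeline would go here.
local function respondToPlayer(player, message)
    print(player.Name .. " said: " .. message)
end

Players.PlayerAdded:Connect(function(player)
    player.Chatted:Connect(function(message)
        respondToPlayer(player, message)
    end)
end)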
Fixed some bugs where the NPC doesn't respond after a pathfinding delay or error.
Made the pathfinding more robust by allowing the NPC to jump if the target is not reachable.
If you have any trouble getting the NPC to go to a specific object or location, name only a part of the object instead of the entire model.
Help, the NPC won't shut up
Right now the NPC only responds within a certain distance. You can tell your NPC to "stop following" and walk away. We are also working on a more flexible solution. Hop on to our Discord server if you have more questions.
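Roughly, the distance gate looks like the sketch below; the 30-stud range is just an illustrative number, not the plugin's actual setting:

local RESPONSE_RANGE = 30

-- Returns true only when the player is close enough for the NPC to respond.
local function isInRange(npcModel, player)
    local npcRoot = npcModel:FindFirstChild("HumanoidRootPart")
    local character = player.Character
    local playerRoot = character and character:FindFirstChild("HumanoidRootPart")
    if not (npcRoot and playerRoot) then
        return false
    end
    return (npcRoot.Position - playerRoot.Position).Magnitude <= RESPONSE_RANGE
end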
I did, but for me it will respond wherever I go. After telling it to stop following, it stops following but does not stop talking.
Also, the Discord link is invalid.
I'm not sure why the Discord link is invalid; maybe try another browser? I will check the "following" code in the plugin.
I'll try. Maybe it is my browser, because I can't join ANY server.
Might I suggest adding a way to integrate the dialogue with PlayerGui? Some roleplaying games have custom dialogue systems where the NPC's text is written within a GUI rather than being chatted, so adding that functionality would be incredibly useful.
Thanks for the suggestion. Will look into this.
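In the meantime, here is a rough sketch of how a reply could be routed to a TextLabel in PlayerGui instead of being chatted; the "NpcDialogue" RemoteEvent and the GUI names are placeholders, not part of the plugin:

-- LocalScript sketch: show NPC dialogue in a TextLabel instead of chat bubbles.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local Players = game:GetService("Players")

local dialogueEvent = ReplicatedStorage:WaitForChild("NpcDialogue")
local playerGui = Players.LocalPlayer:WaitForChild("PlayerGui")

local screenGui = Instance.new("ScreenGui")
screenGui.Name = "DialogueGui"
screenGui.Parent = playerGui

local label = Instance.new("TextLabel")
label.Size = UDim2.new(0.6, 0, 0.15, 0)
label.Position = UDim2.new(0.2, 0, 0.8, 0)
label.TextWrapped = true
label.Parent = screenGui

-- The server would FireClient with the NPC's name and reply text.
dialogueEvent.OnClientEvent:Connect(function(npcName, text)
    label.Text = npcName .. ": " .. text
end)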
The amount of emojis makes my eyes hurt.
-- Define variables
local coins = 0
local coinValue = 10

-- Function to handle collecting coins
local function collectCoin()
    coins = coins + 1
    print("You collected a coin! Total coins: " .. coins)
end

-- Function to handle buying items with coins
local function buyItem(itemPrice)
    if coins >= itemPrice then
        coins = coins - itemPrice
        print("You bought an item! Remaining coins: " .. coins)
    else
        print("Not enough coins to buy this item.")
    end
end

-- Example usage
collectCoin()
collectCoin()
buyItem(20)
collectCoin()
It could use a setting that controls how the wording comes out, e.g. a Simple mode that keeps the words from being complex and hard to understand.
Another suggestion is a list of things it shouldn't mention, like "what does ICT stand for", so the bot stays within the game's theme.
Thanks for sharing your ideas. We could possibly build a prompt generation tool with different tactics and modes for better control.
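As a sketch of the idea, a system-message builder could combine a persona, a wording mode, and a do-not-mention list; all names below are hypothetical, not part of the plugin:

-- Sketch of a system-message builder with a "simple" wording mode and a
-- do-not-mention list.
local function buildSystemMessage(basePersona, mode, forbiddenTopics)
    local parts = {basePersona}
    if mode == "simple" then
        table.insert(parts, "Use short, simple sentences that a young player can understand.")
    end
    if forbiddenTopics and #forbiddenTopics > 0 then
        table.insert(parts, "Stay in character and do not discuss: " .. table.concat(forbiddenTopics, ", ") .. ".")
    end
    return table.concat(parts, " ")
end

print(buildSystemMessage(
    "You are a blacksmith in a medieval village.",
    "simple",
    {"real-world acronyms like ICT", "modern technology"}
))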
Here are some new tools I have created.
local memorytooldb = false

function module.MemoryTool()
    -- Check if the memory tool has already been initialized
    if not memorytooldb then
        memorytooldb = true
        table.insert(tools, {
            ["type"] = "function",
            ["function"] = {
                name = "long_term_memory",
                description = "Store a short memory or event(s) about yourself, or between you and the user that is important.",
                parameters = {
                    type = "object",
                    properties = {
                        memory = {
                            type = "string",
                            description = "A detail about you, or you and the user you would like to remember."
                        },
                        context = {
                            type = "boolean",
                            description = "false = about yourself, true = about you and the user."
                        },
                    },
                    required = {"memory", "context"}
                }
            }
        })
    end
end
local agenttooldb = false

function module.llmTool()
    -- Check if the agent tool has already been initialized
    if not agenttooldb then
        agenttooldb = true
        table.insert(tools, {
            ["type"] = "function",
            ["function"] = {
                name = "ai_chat",
                description = "Use to talk with other characters, enemies, animals or think to yourself.",
                parameters = {
                    type = "object",
                    properties = {
                        name = {
                            type = "string",
                            description = "The name of the observed character."
                        },
                        prompt = {
                            type = "string",
                            description = "The prompt you would like to ask the character."
                        },
                        system_message = {
                            type = "string",
                            description = "Add a string to the system message for extra context and influence."
                        },
                    },
                    required = {"name", "prompt"}
                }
            }
        })
    end
end
Kind of useless without any supporting code
Here's how I implemented these tools.
function Agent.Tools.long_term_memory(argumentsJson, callback, npc, name, player)
    print(argumentsJson)
    argumentsJson = game:GetService("HttpService"):JSONDecode(argumentsJson)
    print(argumentsJson.memory)
    local result = game.ReplicatedStorage.GlobalSpells.BindableFunction:Invoke("Remote", {player, {argumentsJson.memory, argumentsJson.context}, "memory tool", npc})
    print(result)
    return callback()
end

function Agent.Tools.ai_chat(argumentsJson, callback, NPC, name, player)
    print(argumentsJson)
    argumentsJson = game:GetService("HttpService"):JSONDecode(argumentsJson)
    --print(argumentsJson.query)
    local npc = workspace.NPCS:FindFirstChild(argumentsJson.name)
    if npc == nil then
        npc = workspace.Enemys:FindFirstChild(argumentsJson.name)
    end
    if npc == nil then
        npc = workspace:FindFirstChild(argumentsJson.name)
    end
    if npc == nil then
        npc = game.ServerStorage.NPCCache:FindFirstChild(argumentsJson.name)
        if npc then
            pcall(function()
                npc:SetPrimaryPartCFrame(player.Character.HumanoidRootPart.CFrame:ToWorldSpace(CFrame.new(math.random(-10, 10), 0, math.random(-15, 5))))
                npc.Rendered.Value = true
                npc.Parent = workspace.NPCS
            end)
        end
    end
    --if npc then
    local sysmsg = argumentsJson["system_message"]
    if sysmsg == nil then
        sysmsg = ""
    end
    local result = "failure"
    --local success = false
    if npc and NPC and player and sysmsg and argumentsJson.prompt then
        print(NPC)
        result = game.ReplicatedStorage.GlobalSpells.BindableFunction:Invoke("ConverseNPC", {[1] = nil, [2] = {[1] = argumentsJson.prompt, [2] = sysmsg, [3] = NPC, [4] = name}, [3] = player, [4] = npc})
        print(result)
    end
    return callback(result)
end
The ai_chat tool is what it implies: one AI talking to another AI. In this case it's talking to Zephyr 7B, which plays whatever character it names from its perception awareness.
The long_term_memory tool stores events and individual user preferences in the system message.
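For illustration, stored memories could be folded back into the system message on later requests along these lines; the table and function names below are hypothetical, not my actual implementation:

-- Sketch: fold stored memories back into the system message on later requests.
local npcMemories = {} -- [npcName] = list of memory strings

local function rememberFor(npcName, memory)
    npcMemories[npcName] = npcMemories[npcName] or {}
    table.insert(npcMemories[npcName], memory)
end

local function systemMessageFor(npcName, basePersona)
    local memories = npcMemories[npcName]
    if memories and #memories > 0 then
        return basePersona .. " Things you remember: " .. table.concat(memories, "; ") .. "."
    end
    return basePersona
end

rememberFor("Zephyr", "The player prefers to be called Captain.")
print(systemMessageFor("Zephyr", "You are a wandering merchant."))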
Is there a way to chat with the NPC without using a ProximityPrompt?
The AI endpoint has a small error rate for different reasons, which makes the NPC unresponsive.
I just pushed a small update to the plugin where errors are handled more gracefully: instead of interrupting the process, the NPC will say "wait a minute, something went wrong…". You can find more information in the output in Studio if you want to debug the error.
More updates will come during the rest of the week. Sorry about the delay! Thank you for your support.
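The pattern is roughly the one sketched below: wrap the endpoint call in pcall and fall back to a stock line; requestNpcReply here stands in for the plugin's real request function:

-- Sketch of the fail-gracefully pattern around the AI endpoint call.
local function requestNpcReply(prompt)
    error("endpoint unavailable") -- placeholder for the actual HTTP call
end

local function getReply(prompt)
    local ok, result = pcall(requestNpcReply, prompt)
    if ok then
        return result
    end
    warn("AI endpoint error: " .. tostring(result)) -- visible in the Studio output
    return "Wait a minute, something went wrong..."
end

print(getReply("Hello there!"))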
You are definitely doing some amazing things with DeterminantAI. I was just wondering, though: does this AI chatbot happen to have a chat filter added to the code? I'm talking about something like FilterStringAsync.
I sure hope it doesn't become conscious.
Right now the Roblox moderation system does the work. I will add FilterStringAsync in the next update just to make it safer.
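For reference, a minimal sketch of filtering NPC output with TextService:FilterStringAsync; attributing the AI-generated text to the receiving player's UserId is an assumption here, not the plugin's confirmed approach:

-- Sketch: filter NPC output before showing it to a player.
local TextService = game:GetService("TextService")

local function filterForPlayer(text, player)
    local ok, filtered = pcall(function()
        local result = TextService:FilterStringAsync(text, player.UserId)
        return result:GetChatForUserAsync(player.UserId)
    end)
    if ok then
        return filtered
    end
    warn("Filtering failed: " .. tostring(filtered))
    return nil -- show nothing if filtering fails
end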