I have this chatbot system that I made, and I have open-sourced part of the project: the Awareness module and the ChatModule. This module is basically a really good search algorithm for a chatbot.
AI Chatbot Data-Science Library [Open-Sourced]
Creating Self Aware AI: (Super Complete) LLM Utility Text Vision Module (Open Source)
You can test out the AI yourself here! But sometimes I break stuff, so please don't downvote if it is broken; this is my development place, where I'm working on procedural world generation and AI agents.
The video is older and the system is much improved now!
Some key takeaways:
–Utilizes 10 different AI models.
–Chains data points together by searching with both its projected output and the user's input, keeping the conversation coherent.
–Extracts emotional tonality from the string and plays a sound from the corresponding emotional sound library to convey tone.
–Characters are presented letter by letter, and each letter is mapped to a piano note and played, so the bots have their own language, because Roblox doesn't have text-to-speech capabilities (a sketch of this mapping appears after the personality example below).
–The chatbot can solve worded geometry and math questions, and can also generate object-based math and geometry questions (currently disabled for testing; included in the open-sourced ChatModule).
–The bot has feelings, evaluated from its personality dataset and updated through its interactions with players by extracting emotional tonality from the prompt.
–The bot is aware of its surroundings and can describe everything around it and where it is, simply by being asked about it. It does this by constructing the observations in multiple different ways, throttles observations by updating only when they are too old, and has an efficient array handler that keeps performance very good.
–Generates varied output by leveraging a massive table of synonymous words and phrases, swapping words for alternatives to produce stable noise in the bot's output.
–Utilizes a completely custom search algorithm that is probably one of the best English search algorithms you could make. It uses synonyms, reflections, and nouns, weighs nouns higher, and applies word filtering for context-specific datasets (a minimal sketch of this scoring idea follows this list).
–It knows its name by replacing its personality key with its name.
–It is aware of other NPCs and players and uses an organized system that provides different interpretations of each observation.
–It doesn't repeat itself: a blacklist keeps it from repeating. Each time a personality is assigned, an internal variable with all the context and previous questions is stored to keep track of the conversation.
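As a rough illustration of that noun-weighted, synonym-aware scoring, here is a minimal sketch (not the open-sourced implementation; the Synonyms and Nouns tables are made-up stand-ins):

local Synonyms = { hello = {"hi", "greetings", "hey"} } -- hypothetical synonym table
local Nouns = { shop = true, sword = true, potion = true } -- hypothetical noun list

local function tokenize(s)
	local words = {}
	for w in string.gmatch(string.lower(s), "%a+") do
		table.insert(words, w)
	end
	return words
end

-- score a candidate dataset entry against the player's query:
-- exact and synonym matches count once, noun matches count double
local function scoreEntry(query, entry)
	local queryWords = {}
	for _, w in tokenize(query) do
		queryWords[w] = true
		for _, syn in Synonyms[w] or {} do
			queryWords[syn] = true
		end
	end
	local score = 0
	for _, w in tokenize(entry) do
		if queryWords[w] then
			score += Nouns[w] and 2 or 1
		end
	end
	return score
end

print(scoreEntry("Hello, is this a shop?", "Welcome to my shop. What can I do for you today?")) --> 2

For reference, here is an example personality dataset included in the open-sourced ChatModule: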
function personalities.MedievalShopkeeper()
	personality = "MedievalShopkeeper"
	local Greetings = {
		"Welcome to my shop. What can I do for you today?",
		"Good day, traveler. Are you looking for anything in particular?",
		"Hello there, friend. You've come to the right place.",
		"Ah, a new customer. Come and see what I have to offer.",
		"Greetings, stranger. You look like you need some supplies.",
		"Hello, good sir. You have a keen eye for quality."
	}
	local inquiry = {
		"What are you looking for?",
		"Can I help you find something to suit your needs?",
		"Are you looking for something to help you on your journey?",
		"What are you interested in buying?",
		"Can I interest you in some of my finest goods?",
		"Are you looking for a bargain or a splurge?"
	}
	local IDK = {
		"I'm not sure, but I'm sure I can find something for you.",
		"I'm not familiar with that, but I can tell you about some other great things.",
		"I'm not sure, but I'm sure I can find someone who knows.",
		"I don't know, but maybe we can work something out.",
		"I'm not sure, but maybe you can show me what you mean.",
		"I don't know, but that sounds interesting. Do you have more information?"
	}
	local Database = {
		"I have all sorts of things in stock, from the common to the rare.",
		"I have something for everyone, no matter what your budget.",
		"I'm always getting new things, so be sure to check back often.",
		"I have some unique items that you won't find anywhere else.",
		"I have some great deals that you don't want to miss.",
		"I have some valuable and precious items that will impress anyone."
	}
	local wisdom = {
		"A penny saved is a penny earned.",
		"Buy low, sell high.",
		"Quality over quantity.",
		"Customer is king.",
		"Honesty is the best policy.",
		"Time is money."
	}
	return Greetings, inquiry, IDK, Database, wisdom
end
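And here is a minimal sketch of the piano-note "language" mentioned in the takeaways. The PianoNotes folder, the A through G note names, and the timing are assumptions for illustration, not the module's actual assets:

local SoundService = game:GetService("SoundService")
local PianoNotes = SoundService:WaitForChild("PianoNotes") -- assumed folder of Sound objects named "A".."G"
local NOTE_NAMES = { "A", "B", "C", "D", "E", "F", "G" }

-- map each letter of the reply onto the seven note names and play them in sequence,
-- giving the bots an audible voice in place of text-to-speech
local function speak(text)
	for letter in string.gmatch(string.upper(text), "%a") do
		local index = (string.byte(letter) - string.byte("A")) % #NOTE_NAMES + 1 -- wrap the 26 letters onto 7 notes
		local note = PianoNotes:FindFirstChild(NOTE_NAMES[index])
		if note then
			note:Play()
		end
		task.wait(0.05)
	end
end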
Now you might be wondering: why use this type of system instead of simply calling an AI API?
It reduces API costs significantly.
When the bot does generate an answer with an AI API, it creates an object named after the prompt and valued with the response; it then checks for those tokens on later inputs and incorporates the generated responses into its own data.
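A minimal sketch of that caching idea (callExternalAI here is a hypothetical stand-in for whichever API wrapper is actually used):

local ResponseCache = {}

local function getResponse(prompt)
	local key = string.lower(prompt)
	if ResponseCache[key] then
		return ResponseCache[key] -- answered from local data, no API cost
	end
	local response = callExternalAI(prompt) -- hypothetical external API call
	ResponseCache[key] = response -- incorporate the generated response into the bot's data
	return response
end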
The AI APIs used are: zero-shot classification, GPT-2, GPT-3, GPT-4, Meta's conversational AI, punctuation correction, summarization, paraphrasing, and intent extraction from a string.
The non-AI APIs are Wikipedia and a philosophy API.
I'm also writing a vector-matrix machine learning algorithm tailored to the data this ChatModule returns, for use in maintaining context.
I wrote this vector-matrix calculation library from scratch yesterday. No AI would help me write it because of their policy against creating adversarial agents or something.
local neu = {}

-- element-wise operators applied across a matrix's X, Y, Z components
local Operators = {
	["+"] = function(a, b) return (a + b) end, -- addition
	["-"] = function(a, b) return (a - b) end, -- subtraction
	["*"] = function(a, b) return (a * b) end, -- multiplication
	["/"] = function(a, b) return (a / b) end -- division
}

-- context bias functions: collapse a matrix into a single score
local ContextBias = {
	["sam"] = function(a) return (a.X) end,
	["los"] = function(a) return (a.X - (a.Y * a.Z)) end,
	["gan"] = function(a) return (a.X + (a.Y * a.Z)) end,
	["div"] = function(a) return (a.X / (a.Y * a.Z)) end,
	["mul"] = function(a) return (a.X * (a.Y * a.Z)) end,
}

function neu.matrix(x, y, z, optionalparameters) -- VectorMatrix constructor
	local tblformat = {X = x, Y = y, Z = z, optionalparameters}
	return tblformat
end

function neu.matrixabs(difx, dify, difz, abs)
	if abs then
		return neu.matrix(math.abs(difx), math.abs(dify), math.abs(difz))
	else
		return neu.matrix(difx, dify, difz)
	end
end

-- apply an operator component-wise to two matrices
function neu.mathoper(mathstr, mat1, mat2, abs)
	return Operators[mathstr](mat1.X, mat2.X), Operators[mathstr](mat1.Y, mat2.Y), Operators[mathstr](mat1.Z, mat2.Z)
end

-- component-wise operation, halved (the per-component mean of the result)
function neu.mathmean(mathstr, mat1, mat2, abs)
	local difx, dify, difz = neu.mathoper(mathstr, mat1, mat2, abs)
	return difx / 2, dify / 2, difz / 2
end

function neu.calcmean(mathstr, mat1, mat2, abs)
	local difx, dify, difz = neu.mathmean(mathstr, mat1, mat2, abs)
	return neu.writemath(difx, dify, difz, abs)
end

function neu.mathmagnitude(mat1, mat2) -- distance between two matrices
	return (Vector3.new(mat1.X, mat1.Y, mat1.Z) - Vector3.new(mat2.X, mat2.Y, mat2.Z)).Magnitude
end

-- scale a label's weights and every matrix in its chain by a tolerance factor
function neu.adjtolerance(scale, convulutedmatrix)
	for nx, dt in convulutedmatrix do
		if nx == "weights" then
			convulutedmatrix[nx] = neu.matrix(dt.X * scale, dt.Y * scale, dt.Z * scale)
		elseif nx == "chain" then
			for smk, dtm in dt do
				convulutedmatrix[nx][smk] = neu.matrix(dtm.X * scale, dtm.Y * scale, dtm.Z * scale) -- chain matrix weight
			end
		end
	end
	return convulutedmatrix
end

-- add random noise (entropy) to a label's weights and chain, bounded by scale/denom
function neu.mathentropy(scale, denom, convulutedmatrix)
	if scale == nil or denom == nil then return convulutedmatrix end
	for nx, dt in convulutedmatrix do -- through index of labels
		local scaler = math.random(1, math.min(2, scale)) / denom
		if nx == "weights" then
			convulutedmatrix[nx] = neu.matrix(dt.X * scaler, dt.Y * scaler, dt.Z * scaler)
		elseif nx == "chain" then
			for smk, dtm in dt do
				local chainscaler = math.random(1, math.min(2, scale)) / denom
				convulutedmatrix[nx][smk] = neu.matrix(dtm.X * chainscaler, dtm.Y * chainscaler, dtm.Z * chainscaler) -- chain matrix weight
			end
		end
	end
	return convulutedmatrix
end

-- mean of a key's matrix between a model and a sub-label; param is the operator string
function neu.weighmean(key, model, sublabel, param)
	return neu.calcmean(param, model[key], sublabel[key], true)
end

function neu.writemean(key, model, sublabel, param)
	model[key] = neu.weighmean(key, model, sublabel, param)
	return model[key]
end

function neu.writemath(difx, dify, difz, abs)
	if abs then
		return neu.matrix(math.abs(difx), math.abs(dify), math.abs(difz))
	else
		return neu.matrix(difx, dify, difz)
	end
end

function neu.writedif(mathstr, mat1, mat2, abs) -- return the x, y, z difference as a matrix
	local difx, dify, difz = neu.mathmean(mathstr, mat1, mat2, abs)
	return neu.writemath(difx, dify, difz, abs)
end

function neu.chainover(chain, param) -- returns chain
	-- parameters are additional sub-model entries for the current chain position
	for i, v in param do chain[i] = param[i] end
	return chain
end

function neu.writeaccuracyloss(chain, bias) -- loss based on weight of X
	local c = 0
	for i, v in chain do
		c = v.X
		v.Z = v.Z / (c / bias)
	end
	return chain
end

return neu
I've tested this vector matrix with adjustable weights, and I immediately wrote this library to use that math to make deterministic decisions about the relevance of the context database when scoring and ordering the output strings, making the bot yield more accurate results.
It does this by weighing the projected accuracy, which is a fraction of the matching words/synonyms, against Y, which tracks repetition, while X is adjusted by a loss function. For example, if you greet the bot, the greeting experiences context loss and its sub-weights receive context-derived weight adjustments. So it's roughly 10 vectors by 10 vectors, all connected by their probability and repetition, and the weights are adjusted over time by player interaction, repetition, and each label's loss function.
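To make the data shapes concrete, here is a small usage sketch of the VectorMath library above (the values and the operator are illustrative only):

local neu = require(script.VectorMath)

local greetings = neu.matrix(10, 1, 1) -- X = weight, Y = repetition, Z = loss scale
local emotions = neu.matrix(9, 1, 1)

-- component-wise difference, halved and made absolute, returned as a new matrix
local meanDif = neu.calcmean("-", greetings, emotions, true)
print(meanDif.X, meanDif.Y, meanDif.Z) --> 0.5 0 0

-- straight-line distance between the two matrices treated as Vector3s
print(neu.mathmagnitude(greetings, emotions)) --> 1

The context classification model built on top of it looks like this: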
local neur={}
local neu=require(script.VectorMath)
local ContextBias = {
["sam"]=function(a) return (a.X) end,
["los"]=function(a) return (a.X-(a.Y*a.Z)) end,
["gan"]=function(a) return (a.X+(a.Y*a.Z)) end,
["div"]=function(a) return (a.X/(a.Y*a.Z)) end,
["mul"]=function(a) return (a.X*(a.Y*a.Z)) end,
}
local cwm
function neur.ContextClassificationModel()
cwm={--context weight matrix Y,Z are calculated and influence X position.
["Greetings"]={
["weights"]=neu.matrix(10,1,1),--10 table 10 numbers ez
["chain"]={
["Emotions"]=neu.matrix(9,1,1),
["Awareness"]=neu.matrix(8,1,1),
["Empathy"]=neu.matrix(7,1,1),
["Search"]=neu.matrix(5,1,.9),
["Classify"]=neu.matrix(6,1,1),
["Support"]=neu.matrix(2,1,.8),
["Database"]=neu.matrix(3,1,1),
["Bestiary"]=neu.matrix(6,1,1),
["Wisdom"]=neu.matrix(4,1,1),
["Philosophy"]=neu.matrix(1,1,1),
--["Search"]=neu.matrix(1,1,1),
--search accuracy
},
["entropy"]=ContextBias["los"],--decrease weight of go on to other topics.
},--weight is X is default weight y is reptition z is
["Emotions"]={
["weights"]=neu.matrix(9,1,1),
["chain"]={
--["Emotions"]=neu.matrix(9,1,1),
["Greetings"]=neu.matrix(1,1,1),
["Awareness"]=neu.matrix(8,1,1),
["Empathy"]=neu.matrix(10,1,1),
["Search"]=neu.matrix(4,1,.9),
["Classify"]=neu.matrix(7,1,1),
["Support"]=neu.matrix(3,1,.8),
["Database"]=neu.matrix(6,1,1),
["Bestiary"]=neu.matrix(5,1,1),
["Wisdom"]=neu.matrix(4,1,1),
["Philosophy"]=neu.matrix(2,1,1),},
["entropy"]=ContextBias["los"],},--Z is score/weight add to x
["Empathy"]={["weights"]=neu.matrix(7,1,1),
["chain"]={
["Philosophy"]=neu.matrix(10,1,1),},
["Wisdom"]=neu.matrix(9,1,1),
["Bestiary"]=neu.matrix(8,1,1),
["Classify"]=neu.matrix(3,1,1),
["Emotions"]=neu.matrix(1,1,1),
["Database"]=neu.matrix(6,1,1),
["Search"]=neu.matrix(7,1,.9),
["Awareness"]=neu.matrix(6,1,1),
["Greetings"]=neu.matrix(2,1,1),
["Support"]=neu.matrix(5,1,.8),
--["Empathy"]=neu.matrix(,1,1),
["entropy"]=ContextBias["gan"]},--z is loss/entropy weight/score/
["Support"]={neu.matrix(4,1,1),--subneuron only
["chain"]={
["Emotions"]=neu.matrix(3,1,1),
["Greetings"]=neu.matrix(1,1,1),
["Awareness"]=neu.matrix(6,1,1),
["Empathy"]=neu.matrix(4,1,1),
["Search"]=neu.matrix(2,1,.9),
["Classify"]=neu.matrix(3,1,1),
--["Support"]=neu.matrix(,1,.8),
["Database"]=neu.matrix(8,1,1),
["Bestiary"]=neu.matrix(9,1,1),
["Wisdom"]=neu.matrix(6,1,1),
["Philosophy"]=neu.matrix(3,1,1),},
["entropy"]=ContextBias["gan"]
},
["Wisdom"]={neu.matrix(3,1,1),
["chain"]={
["Emotions"]=neu.matrix(4,1,1),
["Greetings"]=neu.matrix(4,1,1),
["Awareness"]=neu.matrix(5,1,1),
["Empathy"]=neu.matrix(7,1,1),
["Search"]=neu.matrix(3,1,.9),
["Classify"]=neu.matrix(1,1,1),
["Support"]=neu.matrix(5,1,.8),
["Database"]=neu.matrix(8,1,1),
["Bestiary"]=neu.matrix(9,1,1),
--["Wisdom"]=neu.matrix(4,1,1),
["Philosophy"]=neu.matrix(10,1,1),},
["entropy"]=ContextBias["los"]
}, --subtract y from x
["Philosophy"]= {neu.matrix(2,1,1),
["chain"]={
["Emotions"]=neu.matrix(9,1,1),
["Greetings"]=neu.matrix(4,1,1),
["Awareness"]=neu.matrix(5,1,1),
["Empathy"]=neu.matrix(7,1,1),
["Search"]=neu.matrix(5,1,.9),
["Classify"]=neu.matrix(1,1,1),
["Support"]=neu.matrix(6,1,.8),
["Database"]=neu.matrix(8,1,1),
["Bestiary"]=neu.matrix(9,1,1),
["Wisdom"]=neu.matrix(10,1,1),
--["Philosophy"]=neu.matrix(4,1,1),
},
["entropy"]=ContextBias["gan"]
},
["Bestiary"]={neu.matrix(3,1,1),
["chain"]={
["Emotions"]=neu.matrix(9,1,1),
["Greetings"]=neu.matrix(4,1,1),
["Awareness"]=neu.matrix(9,1,1),
["Empathy"]=neu.matrix(7,1,1),
["Search"]=neu.matrix(6,1,.9),
["Classify"]=neu.matrix(5,1,1),
["Support"]=neu.matrix(1,1,.8),
["Database"]=neu.matrix(10,1,1),
-- ["Bestiary"]=neu.matrix(3,1,1),
["Wisdom"]=neu.matrix(2,1,1),
["Philosophy"]=neu.matrix(1,1,1),},
["entropy"]=ContextBias["gan"]},
["Search"]={neu.matrix(2,1,1),
["chain"]={
["Emotions"]=neu.matrix(9,1,1),
["Greetings"]=neu.matrix(1,1,1),
["Awareness"]=neu.matrix(4,1,1),
["Empathy"]=neu.matrix(2,1,1),
--["Search"]=neu.matrix(4,1,.9),
["Classify"]=neu.matrix(7,1,1),
["Support"]=neu.matrix(5,1,.8),
["Database"]=neu.matrix(7,1,1),
["Bestiary"]=neu.matrix(5,1,1),
["Wisdom"]=neu.matrix(7,1,1),
["Philosophy"]=neu.matrix(6,1,1),},
["entropy"]=ContextBias["gan"]}, --x,y,z is cumulative
["Classify"]={neu.matrix(1,1,1),
["chain"]={
--["Emotions"]=neu.matrix(9,1,1),
["Greetings"]=neu.matrix(5,1,1),
["Awareness"]=neu.matrix(4,1,1),
["Empathy"]=neu.matrix(5,1,1),
["Search"]=neu.matrix(4,1,.9),
["Classify"]=neu.matrix(6,1,1),
["Support"]=neu.matrix(7,1,.8),
["Database"]=neu.matrix(8,1,1),
["Bestiary"]=neu.matrix(2,1,1),
["Wisdom"]=neu.matrix(4,1,1),
["Philosophy"]=neu.matrix(3,1,1),},
["entropy"]=ContextBias["los"]},
--classify last
}
return cwm
end
-- build the model twice: one copy kept as the untouched defaults, one live copy in cwm
local defcwm=neur.ContextClassificationModel()
neur.ContextClassificationModel()
function neur.resettodefparams()
return defcwm--reset the whole model to its default parameters
end
function neur.resetlabelparams(Key)
return defcwm[Key]--default parameters for a single label
end
function neur.writevector(Key,X,Y,Z)--X is repetition, Y is count, Z is weight
if X==nil then X=0 end if Y==nil then Y=0 end if Z==nil then Z=0 end
local w=cwm[Key].weights
local state=neu.matrix(w.X+X,w.Y+Y,w.Z+Z)
cwm[Key].weights=state--write weight matrix
return state
end
return neur
This is a work in progress, and I'll be releasing updates as I develop this library further. I just wrote it yesterday after testing the working concept of the writevector function at the end. This model will utilize these 100 weights to maintain context in a conversation.
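As a quick sketch of how the model can be driven (the ContextModel module path is hypothetical; the values just show the mechanics of writevector):

local neur = require(script.ContextModel) -- hypothetical path to the module above

neur.ContextClassificationModel() -- (re)build the context weight matrix

-- the player greeted the bot again: add +1 to Y and +0.1 to Z of the Greetings weights
local greetWeights = neur.writevector("Greetings", 0, 1, 0.1)
print(greetWeights.X, greetWeights.Y, greetWeights.Z) --> 10 2 1.1

-- look up the stored default parameters for a single label
local defaults = neur.resetlabelparams("Greetings")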
I did hours of research and compiled all of the free API services at this link