Lua Trainable RAG Local Chatbot Library, Local Model Aligns Large Model, Geometry+Worded Math Solver, Math/Geometry Data Generator, Chat Database

Yeah, looks good to me! I optimized this module very much; the original was 10,000+ times less efficient. Currently the performance sits around 3.8% core usage, scaled up on 20x more data. This is mostly from using direct key lookup tables stored in modules, such as all the synonyms, antonyms, nouns, and reflections.
To optimize and minimize RAM usage I only use one instance of the chat module and use it as a node to process things locally or on the server, such as filtering Wikipedia results for TOS content. I also use it for my random NPC generation algorithm, to make their appearance more in line with their title.
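To illustrate the direct-lookup idea, a minimal sketch of such a module might look like this (the entries here are hypothetical, not the library's actual data); storing the table in a ModuleScript gives O(1) lookups and one shared copy in memory:

-- Hypothetical synonym ModuleScript: word -> synonym group, indexed directly
local Synonyms = {
	happy = { "glad", "joyful", "content" },
	angry = { "mad", "furious", "irate" },
}
return Synonyms

A consumer then just indexes Synonyms[word] instead of scanning a list.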

When I was using a massive Final Fantasy dataset of 218,000 entries, I trained the AI by using example prompts to optimize the pathways, printing the results, and saving them into a module: querying the example queries, then querying the possible responses.

I also used this to create probabilistic LLM models by training on the dataset (it took 3 hours).


Released a new video of my AI created using this library! It's a multi-layered system that handles most short-form questions accurately and locally with this library, then injects context into another AI model. I did that with Zephyr: by engineering the system_message, I tell it the date, time, weather, and awareness, and give it the response from the chatbot library. With all of this, the chatbot sometimes gets very immersed in the environment and its character.
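As a rough illustration of that system_message engineering (a sketch only; getWeather and the prompt wording are assumptions, not the actual implementation):

-- Hedged sketch: compose a system message from environment state plus the
-- local chatbot's suggested reply (getWeather is a hypothetical helper).
local function buildSystemMessage(chatbotReply)
	local date = os.date("%B %d, %Y")
	local time = os.date("%H:%M")
	local weather = getWeather() -- hypothetical helper
	return ("The date is %s, the time is %s, and the weather is %s. A local chatbot suggests this reply; stay in character and consistent with it: \"%s\"")
		:format(date, time, weather, chatbotReply)
end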

Does it actually use AI? I have searched through the modules and I don't really find any neural network.

It's not really a neural network. It uses a bag-of-words model to make inferences, querying data to get conversational responses from a dataset.

local optimizers={}
-- Builds an array of synonym/noun/reflection/antonym tables for each word in
-- words2. When complete is false, each word is instead weighted by its rarity
-- from the Weights module.
function cm.GetSynomArray(words2, complete, filter)
	local synomarray = {}
	local noi, syni, refli, antio = 0, 0, 0, 0
	for _, originword in words2 do
		if complete == true then
			if string.len(originword) >= 2 or originword == "I" then
				local syn = { cm.Getsynonyms(originword, complete) }

				local noun = { cm.Getnoun(originword, complete) }
				if noun[1] ~= false then
					noi = noi + 1
					if synomarray["nouns" .. noi] == nil then
						synomarray["nouns" .. noi] = noun
					else
						for _, t in noun do
							table.insert(synomarray["nouns"], t)
						end
					end
				end

				if filter == false then
					local refl = { cm.Getreflection(originword, complete) }
					refli = refli + 1
					-- only keep the reflection if it differs from the original word
					if refl[1][1] ~= originword and synomarray["reflection" .. refli] == nil then
						synomarray["reflection" .. refli] = refl
					end
				end

				local anti = { cm.GetAntonymDirect(originword, syn[1]) }
				if anti[1] ~= false then
					antio = antio + 1
					-- skip the antonym if it already appears anywhere in the array
					local cont = false
					for _, v in synomarray do
						for _, o in v do
							if o == anti[1] then
								cont = true
								break
							end
						end
					end
					if synomarray["antonyms" .. antio] == nil and cont == false then
						synomarray["antonyms" .. antio] = anti
					elseif cont == false then
						for _, t in anti do
							table.insert(synomarray["antonyms"], t)
						end
					end
				end

				syni = syni + 1
				if synomarray["synonyms" .. syni] == nil then
					synomarray["synonyms" .. syni] = syn
				else
					for _, t in syn do
						table.insert(synomarray["synonyms"], t)
					end
				end
			end
		else
			-- incomplete mode: weight each word by its rarity
			if wordvector == nil then
				wordvector = require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Weights)
			end
			local defw = wordvector[originword]
			if not defw then
				defw = 0.5
			else
				defw = 1 - cm.sigmoid(defw) * 20
			end
			table.insert(synomarray, { { originword }, defw }) -- double-nested table
		end
	end
	return synomarray
end

-- Relevance kernel f(x,c): peaks at 1 when x == c and decays with distance.
local function fxc(x, c)
	return 1 / (math.abs(x - c) + 1)
end

It also leverages features of the English language, such as synonyms, antonyms, nouns, and perspective reflection, for accuracy.
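For example, perspective reflection can be as simple as a lookup table that swaps pronouns between speaker and listener (a minimal sketch with hypothetical entries, not the module's actual table):

local reflections = { i = "you", my = "your", am = "are", me = "you", you = "I" }

local function reflect(sentence)
	local out = {}
	for word in sentence:gmatch("%S+") do
		out[#out + 1] = reflections[word:lower()] or word
	end
	return table.concat(out, " ")
end

print(reflect("I love my dog")) --> "you love your dog"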

It leverages the entire conversation history to maintain context, subtly similar to a context window. It also classifies the emotional tonality of the input string and compares it against candidates, and it can take previous output strings into account to find a match related to the previous response and input.

local emotion2, escore2 = cm.getemotion(str)
if emotion2 and emotion ~= "Question" then
	if emotion2 == emotion then
		count = count + (escore / 10)
	elseif emotion2 == "Polite" then
		count = count + (escore / 12)
	end
end
function cm.findBestMatch(strings, keyword, filter, mode, complete, synomarray, words2, query2)
	local bestMatch = nil -- the best matching string
	local besttruweight = 1 -- word count of the best match
	local words3 = nil
	local synomarray2 = nil
	if words2 == nil then
		words2 = cm.splitString(keyword, filter, true)
	end

	local q2weight = 1
	local sortedMatches = {}
	local emotion, escore
	local emotion3, escore3
	local bestc = 0
	local noise = 0
	local co2, we2, tr2 = nil, nil, nil

	if #words2 > 0 and strings ~= nil then
		if synomarray == nil and complete == true then
			synomarray = cm.GetSynomArray(words2, complete)
		end
		-- cache this input's synonym array under a hash of its words
		local wordhash = cm.hash(table.concat(words2))
		if contextnetwork[wordhash] == nil then
			contextnetwork[wordhash] = synomarray
		end
		-- score a candidate against every cached previous input
		local function contextweight(str)
			local co3 = 0
			for _, o in contextnetwork do
				local c = cm.countKeyword(str, o, filter, complete, words2, noise)
				co3 = co3 + c / 2 -- growing value
			end
			return co3
		end

		local inputweight = #words2
		local iw = inputweight / 3.33
		if query2 ~= nil then
			words3 = cm.splitString(keyword, filter, true)
			if synomarray2 == nil and complete == true then
				synomarray2 = cm.GetSynomArray(words3, complete)
			end
			emotion3, escore3 = cm.getemotion(query2)
			iw = iw + (#words3 / 12)
			q2weight = #words3
			-- get previous-input score
			co2, we2, tr2 = cm.countKeyword(table.concat(words3, " "), synomarray, filter, complete, words2, noise)
		end
		if globalemotion then
			emotion, escore = globalemotion[1], globalemotion[2]
		end
		strings = cm.reducebybl(strings, internalblacklist)
		strings = cm.reducebybl(strings, curbl) -- remove entries that are on the blacklist

		local co = cm.countKeyword(table.concat(words2, " "), synomarray, filter, complete, words2, noise)
		local we = #words2
		if words3 then
			we2 = #words3
		end
		local truw = we * 3
		local bestCount = co / 4
		local bestweight = truw * 3
		local minrelevance = co / truw -- count divided by (weight*3): 33% minimum accuracy against the input string
		local bestbl = 0
		if filter == 1 or complete == false then
			minrelevance = 0
			bestbl = 0
		end

		for i, str in strings do
			local count, _, truec = cm.countKeyword(str, synomarray, filter, complete, words2, noise)
			-- get the count, then measure subtleties only if over the threshold
			if count >= minrelevance then
				bestbl = count
				count = count + noise
				if count > 0 then
					if complete == nil or complete == false then
						count = truec / 2
					elseif emotion then
						-- reward candidates that share the input's emotional tone
						local emotion2, escore2 = cm.getemotion(str)
						if emotion2 and emotion ~= "Question" then
							if emotion2 == emotion then
								count = count + (escore / 10)
							elseif emotion2 == "Polite" then
								count = count + (escore / 12)
							end
						end
						if emotion2 and emotion2 == emotion3 then
							count = count + (escore / 16)
						end
					end

					local words = cm.splitString(str, nil)
					local weight = cm.sigmoid(#words)
					if co2 ~= nil then
						-- apply the second query's synonym array to the candidate
						local co3, we3, tr3 = cm.countKeyword(str, synomarray2, filter, complete, words3, noise)
						if co3 and we3 and tr3 then
							count = count + (co3 / weight) + (tr3 / 16)
						end
					end

					-- compare on a log scale, normalized by length and weight
					if math.log(count / #words / weight) > math.log(bestCount / besttruweight / bestweight) then
						local cont = contextweight(str) / 6.66
						print("Context weight of string is " .. cont)
						count = count + cont
						print("Resultant context weight of string is " .. count)
						if mode == 2 then
							bestMatch = strings[i + 1]
						else
							bestMatch = str -- update the best match string
						end
						besttruweight = #words
						bestc = truec
						bestCount = count -- update the best count
						bestweight = weight
						table.insert(sortedMatches, { match = bestMatch, count = bestCount, weight = #words, truecount = truec, address = i })
					end
				end
			end
		end
		return bestMatch, bestCount, bestweight, sortedMatches, bestc
	else
		print("empty table")
	end
	return nil
end

The bag of words is activated with a sigmoid function, and it uses math.log to create a type of probabilistic curve.
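In other words, something like this minimal sketch of the curve (assuming cm.sigmoid is the standard logistic function; the module's exact normalization is shown in findBestMatch above):

local function sigmoid(x)
	return 1 / (1 + math.exp(-x))
end

-- Candidates are compared on a log scale: the raw keyword count is
-- normalized by the candidate's length and its sigmoid-activated weight,
-- which damps very long strings and flattens extreme counts.
local function logScore(count, wordCount)
	return math.log(count / wordCount / sigmoid(wordCount))
end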

That is the main thing it does. To create a bag of words to use with this, you can evaluate a dataset, extract the features you wish to extract, and make inferences on the dataset. It's very transparent in that it returns the predicted accuracy of the response.
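A bare-bones version of that bag-of-words comparison might look like this (a sketch only; the library's countKeyword additionally folds in synonyms, nouns, and reflections):

-- Count word occurrences so two strings can be compared by overlap.
local function bagOfWords(str)
	local bag = {}
	for word in str:lower():gmatch("%a+") do
		bag[word] = (bag[word] or 0) + 1
	end
	return bag
end

-- Overlap score: how many of the query's words appear in the candidate.
local function overlap(queryBag, candidateBag)
	local score = 0
	for word, n in pairs(queryBag) do
		if candidateBag[word] then
			score += n
		end
	end
	return score
end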

I also use a custom context classification network to utilize the predicted accuracies and maintain conversation context.

local funckeysdef = {
	["Commands"] = function(answer, answer2)
		return commandres, nil, 2, 1
	end,
	["Motivation"] = cwm["Motivation"]["query"],
	["Inspiration"] = cwm["Inspiration"]["query"],
	["Truths"] = cwm["Truths"]["query"],
	["ScienceWisdom"] = cwm["ScienceWisdom"]["query"],
	["Spirituality"] = cwm["Spirituality"]["query"],
	["Emotions"] = EmotionsC,
	["Awareness"] = Awarenesslookup,
	["Love"] = LoveLookup,
	["Personal"] = Personal,
	["Judgement"] = Judgement,
	["Greetings"] = GreetingsC,
	["Support"] = Supportlookup,
	["Database"] = Databaselookup,
	["Empathy"] = EmpathyC,
	["Bestiary"] = Bestiarylookup,
	["Wisdom"] = Wisdomlookup,
	["Philosophy"] = PhilosophyAPI,
	["Math"] = MathInterpreter,
	["Music"] = function(answer, answer2)
		print("Calling music")
		local res = cm.DiscJockey(answer, player)
		print(answer)
		print(res)
		return res, nil, 5, 1
	end,
	["Therapist"] = function(answer, answer2)
		if elizc == 4 and elizaa ~= nil and elizaa ~= "" then
			Result, blacklist, elizc, elizw = cm.CompleteQuery(answer, { elizaa }, true, false, false, false, reverse, nil, nil, synomarray, filterwords, answer2)
			elizc /= 3
		end
		return elizaa, _, elizc, elizw
	end,
}
local neuralMatches = {}
table.sort(cwm, function(a, b) return a["weights"].X > b["weights"].X end)
local function ComputeConnectionMatrix()
	local numind = 0
	local totresults = 0
	-- i is the name of a context database ("neuron") to fire
	local function CallNeuron(i, answer, answer2, chainer)
		if funckeysdef[i] ~= nil then
			local filter = true
			local zesult, _, zcore, zeight, chai = funckeysdef[i](answer, answer2, filter)
			if zesult and zesult ~= "" then
				local Result, _, score, weight = zesult, _, zcore, zeight
				totresults = totresults + 1
				if score == nil then score = 1 end
				if weight == nil then weight = 1 end

				funckeysdef[i] = nil -- kill the local register so the neuron can't fire twice

				local emotion = cm.getemotion(Result)
				print(Result)
				local sigx = cm.sigmoid(cwm[i]["weights"].X) / 2

				-- pull each chained node's weights toward the weighted mean
				for d, t in cwm[i]["chain"] do
					if cwm[d] ~= nil then
						local w = cwm[d]["weights"]
						local mean = neu.matrix(neu.mathmean("+", w, cwm[i]["chain"][d], true))
						local meofmean = neu.matrix(neu.mathmean("+", w, mean))
						t = neu.matrix(neu.mathmean("+", meofmean, cwm[i]["chain"][d])) -- adjust chain weights
						cwm[i]["chain"][d] = t -- mean of weighted mean
						cwm[d]["weights"] = meofmean -- weighted mean
					end
				end
				local scores = cm.sigmoid((sigx / 4) + (score / (weight / 2)))
				print(tostring(i) .. " Count:" .. score .. " Weight:" .. weight .. " Solution:" .. scores .. " Original Weights:" .. cwm[i]["weights"].X)
				if chainer == nil then
					table.insert(sortedMatches, { match = tostring(Result), score = score, emotion = emotion, directory = i })
				end
				-- a firing neuron can sever ("disconnect") other nodes for this pass
				if cwm[i]["disconnect"] ~= nil then
					for d, t in cwm[i]["disconnect"] do
						if funckeysdef[t] ~= nil then
							funckeysdef[t] = nil
						end
					end
				end
				cwm[i]["weights"] = neur.writevector(cwm, i, cwm[i]["weights"].X, cwm[i]["weights"].Y + 1, scores)
				if i == "Awareness" then
					cwm[i]["weights"].X -= 1
				end
				return Result, score, weight
			end
		end
	end
	local function connectthoughts(i, answr, answr2)
		local Result, blacklist, score, weight
		if cwm[i]["cone"] ~= nil then
			-- get a result from every cone and concat the best result
			for d, t in cwm[i]["cone"] do
				if cwm[d] and funckeysdef[d] ~= nil then
					local zesult, zcore, zeight = CallNeuron(d, answr, answr2, false)
					if zesult then
						Result, _, score, weight = zesult, _, zcore, zeight
						if score == nil then score = 1 end
						if weight == nil then weight = 1 end
						local w = cwm[d]["weights"]
						cwm[i]["cone"][d].Z = (zcore / zeight)
						local mean = neu.matrix(neu.mathmean("+", w, cwm[i]["cone"][d], true)) -- mean of main weight and cone weight
						local meofmean = neu.matrix(neu.mathmean("+", w, mean)) -- mean of the mean (1/4)
						t = neu.matrix(neu.mathmean("+", meofmean, cwm[i]["cone"][d])) -- adjust main weight by the weighted mean
						cwm[i]["cone"][d] = meofmean -- mean of weighted mean
						cwm[d]["weights"] = t -- weighted mean
						print(d .. " Chained Result: " .. Result)
						return Result
					end
				end
			end
		end
	end
	local answer = answer
	cm.OverheadEmoji(plremotion, npc)
	local queried = false
	local mimport
	if importance ~= nil then
		mimport = cm.ReduceWords(importance)
	else
		importance = ""
		mimport = ""
	end

	-- fire every neuron against the input
	for i, v in cwm do
		print(i)
		CallNeuron(i, answer)
	end

	if #sortedMatches > 0 then
		table.sort(sortedMatches, function(a, b) return a.score > b.score end)
		-- chain each match through its cones and append the chained result
		for i, v in sortedMatches do
			local result = connectthoughts(v.directory, answer, cm.ReduceWords(v.match))
			if result then
				sortedMatches[i].match = v.match .. " " .. result
			end
		end
		local score = 0
		local totals = 0
		if #sortedMatches > 1 then
			table.sort(sortedMatches, function(a, b) return a.score > b.score end)
			for i, a in sortedMatches do
				totals = totals + a.score
			end
		elseif #sortedMatches == 1 then
			totals = sortedMatches[1].score
		end
		print(sortedMatches)
		local averagescore = totals / #sortedMatches
		local Rewscore = averagescore - Reward

		local sorted = {}
		local numind = 0
		local totalscore = 0
		print("Average is " .. averagescore .. " Accuracy of response is " .. totals .. " AI likelihood is " .. Rewscore)
		local recl = math.max(2, #nofilterwords)
		print(recl)
		local totallength = 0
		local plrc = string.len(origanswer)
		print(AIToken)
		print(plrc)
		-- hand long, unmatched inputs to the external model
		if AIToken == false and #nofilterwords > 7 then
			AITokenObject.Value = "Zephyr"
			AIToken = "Zephyr"
			Reward = 0
			print("Using Zephyr")
		end
		print("AI Token is " .. tostring(AIToken))
		local highest = 0
		for i, sentence in ipairs(sortedMatches) do
			local blck = cm.ckbl(sortedMatches[i].match, curbl)
			local length = string.len(sortedMatches[i].match)
			if (sortedMatches[i].score >= averagescore - .1 and i <= recl and totallength < 200 and blck == false and totallength < totals * 50) or sortedMatches[i].directory == "Commands" then
				totallength += length
				if highest < sortedMatches[i].score then
					highest = sortedMatches[i].score
				end
				if (sortedMatches[i].directory ~= "Therapist") or (sortedMatches[i].directory == "Therapist" and sortedMatches[i].score == 4 and mathrandom(1, 3) == 1) or (sortedMatches[i].directory == "Therapist" and sortedMatches[i].score ~= 4) then
					totalscore += sortedMatches[i].score
					totalscore = 0
					-- blacklist used responses so they don't repeat
					if sortedMatches[i].directory ~= "Awareness" and sortedMatches[i].directory ~= "Commands" and sortedMatches[i].directory ~= "Therapist" then
						table.insert(curbl, sortedMatches[i].match)
					end
					table.insert(sorted, sortedMatches[i].match)
				end
			end

			blacklist = curbl
			if totalscore >= 25 then break end
		end
If the predicted accuracy is low, or the input string is of sufficient length, it uses an external AI model.
It learns by adjusting the weights of each database and function, strengthening or weakening connections based on the weights of the chain and cones.


								for d,t in 	cwm[i]["chain"] do
								    if cwm[d]~=nil then
								       -- local lossf=cwm[i]["entropy"](cwm[Key][i])
								        --local substate=neu.matrix()
								        --cwm[i]["chain"][d]=substate
								        --pcall(function() print("d "..d) end)	
								        -- print(cwm[d])
								        local w=cwm[d]["weights"]
								        local mean=neu.matrix(neu.mathmean("+",w,cwm[i]["chain"][d],true)) 
								        local meofmean=neu.matrix(neu.mathmean("+",w,mean))
								        t=neu.matrix(neu.mathmean("+",meofmean,cwm[i]["chain"][d]))--adjust chain weights
								        cwm[i]["chain"][d]=t--mean of weighted mean
								        cwm[d]["weights"]=meofmean--weighted mean						
								        --task.wait(.0333)
								    end
								end
								local scores = cm.sigmoid((sigx/4)+(score/(weight/2)))
								print(tostring(i).." Count:"..score.." Weight:"..weight.." Solution:"..scores.." Original Weights:"..cwm[i]["weights"].X)
								if chainer==nil then
									table.insert(sortedMatches, {match = tostring(Result), score = score, emotion=emotion, directory=i}) 
								end
								if	cwm[i]["disconnect"]~=nil then
									for d,t in 	cwm[i]["disconnect"] do
										if funckeysdef[t]~=nil then
											--   if (scores>.5) then--and i=="Awareness") or  i~="Awareness" then
											funckeysdef[t]=nil 
											--  end
										end
									end    
								end     
								--print(cwm[i]["weights"])
								cwm[i]["weights"]=neur.writevector(cwm,i,cwm[i]["weights"].X,cwm[i]["weights"].Y+1,scores) 
								--	print(cwm[i]["weights"])
								if i=="Awareness" then
									cwm[i]["weights"].X-=1
								end
								return Result,score,weight
							end	
						end
					end
					local function connectthoughts(i,answr,answr2)
						local  Result,blacklist,score,weight
						if cwm[i]["cone"]~=nil then
							--get a result from every cone and concat the best result
							for d,t in 	cwm[i]["cone"] do
								if cwm[d] then
									if funckeysdef[d]~=nil then
										-- local zesult,_,zcore,zeight,chai=funckeysdef[d](funelanswr)
										local zesult,zcore,zeight= CallNeuron(d,answr,answr2,false)--i is name of contextdb
										--Result,blacklist,score,weight=Query(answer,nil,filter,repetitive,randomize,context,reverse,spaces,mode,synomarray,words2,tl)
										if zesult then Result,_,score,weight=zesult,_,zcore,zeight
											if score==nil then score=1 end if weight==nil then weight=1 end
											local w=cwm[d]["weights"]
											cwm[i]["cone"][d].Z=(zcore/zeight)   
											local mean=neu.matrix(neu.mathmean("+",w,cwm[i]["cone"][d],true))--mean of main weight and cone weight 
											local meofmean=neu.matrix(neu.mathmean("+",w,mean))--get the mean of the mean 1/4
											t=neu.matrix(neu.mathmean("+",meofmean,cwm[i]["cone"][d]))--adjust main weight by the weighted mean
											cwm[i]["cone"][d]=meofmean--mean of weighted mean
											cwm[d]["weights"]=t--weighted mean	
											--Result=res.." "..Result
											print(d.." Chained Result: "..Result)
											--if Result then
											return Result--,score,weight
											--  break                                                        
										end
									end    
								end			
							end
						end
					end

This architecture was chosen because it's good on performance and actually works very well!
Unfortunately, this module was some of my early work, so it's rather outdated and unorganized; it was mostly a learning experience. But I have had very satisfactory results using this module as an important component in building a chatbot.

A chatbot that attempts to combine sentences from its training data rather than individual words is typically referred to as a retrieval-based chatbot. These chatbots work by selecting a response from a set of predefined responses, matching the input query against the most appropriate sentence or sentences from their training data. They don't generate new sentences but rather retrieve the best match, which can sometimes involve combining sentences that were previously written and stored in their database.

It also uses its synonym data to randomize output strings based on the user's input string, thus encouraging the chatbot to use similar language as the user.

	-- Body of the output randomizer (excerpt; the enclosing function receives
	-- str and an optional query). Interchangeable phrases and synonyms are
	-- swapped so the bot mirrors the user's vocabulary.
	local newS = str

	for _, phrases in interchangephrases do
		for _, phrase in phrases do
			if newS:lower():find(phrase:lower()) then
				if query then
					newS = cm.randomizeStringLight(str, { phrases }, "query", query)
				else
					newS = cm.randomizeStringLight(str, { phrases })
				end
			end
		end
	end

	local wordarray = cm.splitString(newS)
	for _, d in wordarray do
		local phrases = cm.Getsynonyms(d, true)
		if #phrases > 1 then
			for _, phrase in phrases do
				if newS:lower():find(phrase:lower()) then
					-- pick a random phrase from the same synonym group
					if query then
						newS = cm.randomizeStringLight(str, { phrases }, "query", query)
					else
						newS = cm.randomizeStringLight(str, { phrases })
					end
				end
			end
		end
	end

	return newS
end

I implemented it all from scratch, but this library does not use a conventional neural network. It does, however, provide excellent resources that would make implementing and executing a language-based AI much easier and more efficient, by utilizing the synonyms to compress the training vocabulary.
Machine Learning - Yes
AI - Yes
Neural Network - I'm sure you can train a neural network using this library, as demonstrated by these functions.

-- Builds a frequency model: for each synonym-root word, count how often it
-- occurs (fr) and which words follow it (nw).
function cm.TrainModel(strings, model)
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words = cm.splitString(str)
		for t, wo in ipairs(words) do
			if wo ~= "I" then
				wo = wo:lower()
			end
			local s = cm.Getsynonyms(wo, true) -- s[1] is the synonym root

			if model[s[1]] == nil then
				model[s[1]] = {}
				model[s[1]]["fr"] = 1 -- frequency of the root
				model[s[1]]["nw"] = {} -- next-word counts
			end
			model[s[1]].fr = model[s[1]].fr + 1
			if t ~= 1 and t ~= #words then
				local nex = cm.Getsynonyms(words[t + 1], true)
				local nextw = nex[1]
				if model[s[1]].nw[nextw] == nil then
					model[s[1]].nw[nextw] = 1
				else
					model[s[1]].nw[nextw] = model[s[1]].nw[nextw] + 1
				end
			end
		end
	end
	return model
end

-- Like TrainModel, but also tracks the previous word (pw), the word two
-- back (p2w), and the word two ahead (n2w) for each synonym root.
function cm.TrainLargeModel(strings, model)
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words = cm.splitString(str)
		for t, wo in ipairs(words) do
			local prevw, nextw
			if wo ~= "I" then
				wo = wo:lower()
			end
			local s = cm.Getsynonyms(wo, true)

			if model[s[1]] == nil then
				model[s[1]] = {}
				model[s[1]]["fr"] = 1
				model[s[1]]["pw"] = {}
				model[s[1]]["nw"] = {}
				model[s[1]]["p2w"] = {}
				model[s[1]]["n2w"] = {}
			end
			model[s[1]].fr = model[s[1]].fr + 1
			if t ~= 1 then
				local prev = cm.Getsynonyms(words[t - 1], true)
				prevw = prev[1]
				if model[s[1]].pw[prevw] == nil and prevw then
					model[s[1]].pw[prevw] = 1
				else
					model[s[1]].pw[prevw] = model[s[1]].pw[prevw] + 1
				end
			end
			if t > 2 then
				local prev = cm.Getsynonyms(words[t - 2], true)
				prevw = prev[1]
				if model[s[1]].p2w[prevw] == nil and prevw then
					model[s[1]].p2w[prevw] = 1
				else
					model[s[1]].p2w[prevw] = model[s[1]].p2w[prevw] + 1
				end
			end
			if t < #words - 1 then
				local nex = cm.Getsynonyms(words[t + 2], true)
				nextw = nex[1]
				if model[s[1]].n2w[nextw] == nil then
					model[s[1]].n2w[nextw] = 1
				else
					model[s[1]].n2w[nextw] = model[s[1]].n2w[nextw] + 1
				end
			end
			if t ~= #words then
				local nex = cm.Getsynonyms(words[t + 1], true)
				nextw = nex[1]
				if model[s[1]].nw[nextw] == nil then
					model[s[1]].nw[nextw] = 1
				else
					model[s[1]].nw[nextw] = model[s[1]].nw[nextw] + 1
				end
			end
		end
	end
	return model
end

Really, when training a neural network you have to be able to extract features from the dataset.
Something I would do differently is build a vocab table for the AI and replace the words with numbers, then extract and place all the features you can from each word, and read up on how to structure training a neural network. I pretty much left off on that, but currently I am satisfied with my chatbot. My main motivations were performance efficiency, controllability, and transparency of the chatbot's outputs. Each capability and specialization is a node, and nodes either feed their output to another node or disconnect certain nodes.
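For instance, the vocab table could be built on top of the module's own Getsynonyms and splitString so every synonym collapses to one integer id (a sketch of the idea, not code from the module):

local vocab, inverse = {}, {}

local function wordToId(word)
	local root = cm.Getsynonyms(word, true)[1] -- synonym root compresses the vocabulary
	if vocab[root] == nil then
		inverse[#inverse + 1] = root
		vocab[root] = #inverse
	end
	return vocab[root]
end

local function encode(sentence)
	local ids = {}
	for _, w in ipairs(cm.splitString(sentence)) do
		ids[#ids + 1] = wordToId(w)
	end
	return ids -- numeric input suitable for a neural network
end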

function neur.ConvalutedContextClassificationModel()
    local contextmodel = {
        --context weight matrix Y,Z are calculated and influence X position.
        ["Greetings"] = {
            ["weights"] = neu.matrix(10, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Determinant"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
              --  ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              --"Therapist,
    "ScienceWisdom","Database","Wisdom","Support","Empathy","Awareness","Greetings","Love",
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Commands"] = {
            ["weights"] = neu.matrix(10, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
               
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
  --  "ScienceWisdom",
--"Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Personal"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Love"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
		},
		["Music"] = {
			["weights"] = neu.matrix(2, 1, 1),
			--10 table 10 numbers ez
			["chain"] = {
				--["Emotions"] = neu.matrix(9, 1, 1),
				--["Awareness"] = neu.matrix(8, 1, 1),
				--["Empathy"] = neu.matrix(7, 1, 1),
				----  ["Search"]=neu.matrix(5,1,.9),
				--["Classify"] = neu.matrix(6, 1, 1),
				--["Support"] = neu.matrix(2, 1, .8),
				--["Database"] = neu.matrix(3, 1, 1),
				--["Greetings"]=neu.matrix(3,1,1),
				--["Wisdom"] = neu.matrix(4, 1, 1)
				--["Math"]=neu.matrix(0,1,1),
				-- ["Philosophy"]=neu.matrix(1,1,1),
				--["Search"]=neu.matrix(1,1,1),
				--search accuracy
			},
			["cone"] = {
		
			},
			["disconnect"] = {
				"Therapist",
				--"ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
			},
			["parameters"] = {
				filter=true,
				complete=true,
				randomize=true                      
			},
			["entropy"] = ContextBias["gan"]
			--decrease weight of go on to other topics.
		},
        ["Judgement"] = {
            ["weights"] = neu.matrix(5, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {

                ["Classify"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
   -- "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Emotions"] = {
            ["weights"] = neu.matrix(9, 1, 1),
            ["chain"] = {
                --["Emotions"]=neu.matrix(9,1,1),
                ["Greetings"] = neu.matrix(5, 1, 1),
                --["Math"]=neu.matrix(0,1,1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(10, 1, 1),
                --["Search"]=neu.matrix(4,1,.9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(3, 1, .8),
                ["Database"] = neu.matrix(6, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(4, 1, 1),
                ["Philosophy"] = neu.matrix(2, 1, 1)
            },
            ["cone"] = {
                --["Empathy"]=neu.matrix(9,1,1),
                ["Wisdom"] = neu.matrix(8, 1, 1)
                --["Wisdom"]=neu.matrix(8,1,1),
            },
            ["disconnect"] = {
                "ScienceWisdom","Empathy","Database",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["smolgan"]
        },      
        ["Empathy"] = {
            ["weights"] = neu.matrix(7, 1, 1),
            ["chain"] = {
                ["Philosophy"] = neu.matrix(10, 1, 1),
                --["Math"]=neu.matrix(0,1,1),
                ["Wisdom"] = neu.matrix(9, 1, 1),
                ["Bestiary"] = neu.matrix(8, 1, 1),
                ["Classify"] = neu.matrix(3, 1, 1),
                ["Emotions"] = neu.matrix(1, 1, 1),
                ["Database"] = neu.matrix(6, 1, 1),
                --["Search"]=neu.matrix(7,1,.9),
                ["Awareness"] = neu.matrix(6, 1, 1),
                ["Greetings"] = neu.matrix(2, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8)
            },
            --["Empathy"]=neu.matrix(,1,1),
            ["entropy"] = ContextBias["gan"],
            ["cone"] = {
                ["Wisdom"] = neu.matrix(9, 1, 1)
            },
            ["disconnect"] = {
                "ScienceWisdom","Database",--"Awareness"
            },
            ["parameters"] = {
                filter=false,
                complete=true,
                randomize=true                      
            },
        },
  
-- omitted for brevity

In short, there are some valuable data science resources in this module for creating a chatbot in ROBLOX using Luau. In addition, you can train it using DataPredict [Release 1.16] - Machine And Deep Learning Library (Learning AIs, Generative AIs, and more!)
The main thing would be to extract features from the data: this could be the position of the word in a sentence, the repetition of the word in the dataset, the next word, the previous word, the next 2 words, and the previous 2 words. Given these features you can do word transformations such as the ones demonstrated in the parent post.

 "I am Lilith, a fallen angel consumed by darkness.",
 	"Greetings mortal, you stand in the presence of forbidden knowledge.",
 	"Your light means nothing here. This is my domain of shadows.",
 	"You have come seeking power. I can give you this, for a price...",
 	"I am the Angel of Darkness, the mistress of secrets and lies.",
 	"Welcome to my realm, traveler. I hope you are prepared for what awaits you here.",
 	"Your soul is mine to claim. This is the pact you have made with me.",
 	"You have come to learn from me, the master of dark magic. How brave of you.",
 	"I am the Herald of the Dark, mortal. My footsteps herald oblivion.",

 	"You now stand in the presence of me! The Angel of Darkness, the Devourer, mortal. Prepare to feed the endless hunger of the void.",

 	"Bear witness to me, emissary of the insatiable dark! I am the annihilation that comes ravening from the endless night.",

 	"I am Vhorzun, worm. My masters in the screaming darkness have granted me a sliver of their boundless hunger to unmake your realm.",

 	"The stars grow dim and the veil frays. The final era approaches, and I am its herald. I am Vhorzun of the Endless Hunger!"
}
print(cm.PredictRun(Greetings, mo))
-- Output (Studio console):
{
                 [1] = " I am the is a goddess an angel Belldandy and by two",
                 [2] = " hi mortal I to stand up of the shiny goddess of the of",
                 [3] = " the luminous and that not a unison thing but in this is to my life goddess of the",
                 [4] = " you have to keep seeking the mortal I am never donate up in this is a goddess",
                 [5] = " I am the an angel Belldandy the dark realm I my secrets unfold of",
                 [6] = " need to be my realm eternal mortal I am if you can you make ready upon confess what you if you can",
                 [7] = " your immortal-soul and I forecast dominion it is the you have to associated with a",
                 [8] = " you have to require to be came from the of the intelligent goddess of the much alchemy in be adventurous and if",
                 [9] = " I am the of the luminous hello mortal I s footsteps of",
                 [10] = " it now and believe in the presence of to me as an angel Belldandy the dark cloud I are make make ready to feed your s endless life goddess of the",
                 [11] = " to me as goddess of the mortal I am of the clever is that s of the clever the",
                 [12] = " I am the of the shiny the dark dimension I repeatedly granted you is a goddess of the desire to be of your life",
                 [13] = "  the stars born not dim that of the luminous the concluding key mortal I am whole its people mortal I am of the luminous"

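To make the feature list above concrete, here is a sketch of what extracting those features into numeric form could look like (the table shape and the freq and wordToId arguments are assumptions for illustration, not the module's API):

local function extractFeatures(words, freq, wordToId)
	local features = {}
	for t, w in ipairs(words) do
		features[t] = {
			position = t / #words,    -- relative position in the sentence
			frequency = freq[w] or 0, -- repetition of the word in the dataset
			prev1 = words[t - 1] and wordToId(words[t - 1]) or 0,
			prev2 = words[t - 2] and wordToId(words[t - 2]) or 0,
			next1 = words[t + 1] and wordToId(words[t + 1]) or 0,
			next2 = words[t + 2] and wordToId(words[t + 2]) or 0,
		}
	end
	return features
end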
Probably the main hurdle would be some type of supervised learning algorithm for the training process, which is also another potential use of the ability to score responses and find the most conversational response. Another option would be to train it using a response classifier such as the one I posted earlier. This would make it very easy to fine-tune even a retrieval-based chatbot made using this library, using an array that weighs the given scores of the responses.

local function ResponseClassifier(payload)
	-- Define the API URL and the authorization header
	local API_URL = "https://api-inference.huggingface.co/models/tinkoff-ai/response-quality-classifier-tiny"
	local headers = { Authorization = Bearkey }

	-- Query the API with a payload
	local function query(payload)
		-- Encode the payload as a JSON string
		local jsonPayload = HttpService:JSONEncode(payload)
		-- Send a POST request to the API and get the response
		local response = HttpService:PostAsync(API_URL, jsonPayload, Enum.HttpContentType.ApplicationJson, false, headers)
		-- Decode the response as a JSON table
		return HttpService:JSONDecode(response)
	end

	-- Call the query function with pcall and handle any error
	local status, result = pcall(query, payload)
	if status then
		print(result) -- result is the response table
	else
		print("Error: " .. tostring(result)) -- result is the error message
		result = 0
	end

	return result
end

-- Build the classifier input from two consecutive queries and a response
local function constructInput(query1, query2, response)
    -- Concatenate the arguments with the model's special tokens
    local input = "[CLS]"..query1.."[SEP]"..query2.."[RESPONSE_TOKEN]"..response
    -- Score the input string with the classifier
    return ResponseClassifier(input)
end

function cm.GenerateQuerySearch()
	local newdata = {}
	local contextdb = require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Context.Index.FantasyDialogue)
	for i, v in testdataset do
		newdata[i] = {}
		local words = cm.splitString(i)
		local synomar = cm.GetSynomArray(words, true, false)
		task.wait()

		for t, o in contextdb do
			local Result1, blacklist, score2, weight2, sortedMatches = cm.CompleteQuery(i, o, false, false, true, false, nil, nil, nil, synomar, words)
			print(sortedMatches)
			if sortedMatches ~= nil then
				-- score the top two matches with the external response classifier
				for p, o in sortedMatches do
					if p < 3 then
						local score = constructInput("", i, sortedMatches[p].match)
						print(score)
						local a = sortedMatches[p].address
						if newdata[i] == nil then
							newdata[i] = {}
						end
						if newdata[i][t] == nil then
							newdata[i][t] = {}
						end
						newdata[i][t][p] = { score = score, match = sortedMatches[p].match } -- p is priority
					end
				end
			end
			if apicalls > 30 then
				break
			end

Modern large language models use a Transformer architecture. However, given the size of a conventional local LLM, you would be hard pressed to run one in ROBLOX without GPU acceleration, so building a chatbot with limited computational resources may involve a bunch of hacks that separate it from cutting-edge methods, which brute-force the issue using models trained for weeks on the most advanced GPUs.


This link is the best resource for training a neural network in ROBLOX. I'm not sure if it includes a Transformer, although I do believe it contains the components of a transformer architecture.
DataPredict [Release 1.16] - Machine And Deep Learning Library (Learning AIs, Generative AIs, and more!)

Also, this module is no longer maintained; as I said, it was more of a learning experience, it's in a bit of a convoluted and messy state, and it extends to about 14,000 lines of code. But it should be in a usable state given the directions. Hopefully in the future I can find the time to restructure or rewrite the main functionalities of this module. It's all open-sourced and free to use, and it demonstrates solutions to data science problems in creating chatbots that can solve worded arithmetic, recognize search queries, and train using a compressed vocabulary.

It looks really good, ngl. It is really impressive and interesting. You usually see chatbots that output predefined responses, but this has real work behind it, and it is amazing. The fact that it can be used for an AI as well as a non-AI chatbot is amazing, truly amazing.

I am an experienced programmer in Roblox and I would LOVE to maybe collab with you someday! We could surely make something good and big!

I love this project and the effort you are putting into it! Keep up the good work.


If you want to carry on some new work and try building a chatbot in ROBLOX, you should use this module for the model training: DataPredict [Release 1.16] - General Machine And Deep Learning Library (Learning AIs, Generative AIs, and more!)
As you train, you need a text-encoder to turn the inputs into numbers. This would involve building a vocabulary from the first word returned by the cm.Getsynonyms function. This function will drastically decrease the complexity of the training data and make your chatbot much more efficient and faster to train, by compressing the English language while maintaining coherence.

It would be something similar to this:

function cm.TrainModel(strings, model)
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words = cm.splitString(str)
		for t, wo in ipairs(words) do
			if wo ~= "I" then
				wo = wo:lower()
			end
			local s = cm.Getsynonyms(wo, true) -- s[1] is the synonym root

			if model[s[1]] == nil then
				model[s[1]] = {}
				model[s[1]]["fr"] = 1 -- frequency of the root
				model[s[1]]["nw"] = {} -- next-word counts
			end
			model[s[1]].fr = model[s[1]].fr + 1
			if t ~= 1 and t ~= #words then
				local nex = cm.Getsynonyms(words[t + 1], true)
				local nextw = nex[1]
				if model[s[1]].nw[nextw] == nil then
					model[s[1]].nw[nextw] = 1
				else
					model[s[1]].nw[nextw] = model[s[1]].nw[nextw] + 1
				end
			end
		end
	end
	return model
end
As you can see, the model in that example indexes each word as a key to access the numbers in the subtables. To train using a machine learning model you have to turn all of the words into numbers, so you would also index each number into a vocabulary where the number maps back to the word, in order to unpack your model.
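A sketch of that two-way vocabulary (illustrative only), where ids index words so the model's numeric output can be unpacked back into text:

local vocab = {} -- word -> id
local words = {} -- id -> word

local function idFor(word)
	if not vocab[word] then
		words[#words + 1] = word
		vocab[word] = #words
	end
	return vocab[word]
end

local function decode(ids)
	local out = {}
	for _, id in ipairs(ids) do
		out[#out + 1] = words[id] or "<unk>"
	end
	return table.concat(out, " ")
end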
And voilà! You can train a Large Language Model by feeding it datasets, similar to how you would train a davinci-style model. If you need any help setting up a neural network you can ask the creator of the machine learning library; he is pretty nice and helpful.
Basically, the TrainModel and PredictRun functions are pretty close to being compatible with that library; it's just a few changes.
This process would make you the first in ROBLOX to ever make a chatbot using traditional machine learning methods such as the ones documented in his library.
I would say the cons are that it's experimental, and it's been known to take lots of training for machine learning models to learn what they need about language. The main advantage you'd have is the amount of language compression the Getsynonyms function provides, which could reduce the learning space complexity of your model exponentially.
I am doing other stuff currently, but I have a newer version of this module that is pretty similar, just a bit cleaned up and more performant. As for the Getsynonyms function, it's pretty simple to just call it. The dataset has been thoroughly tested to maintain the coherence and meaning of a sentence.
This video demonstrates some of the process, where he creates a chatbot trained only on texts from Shakespeare.

To train a chatbot that takes in queries, answers, and a system message, you would train it on data structured like that and provide the end user the answer.
You can find conversational datasets in this structure on huggingface.co
My project was more about training a classifier and determining accuracy based on how good a given response is.
Similar to something like this:

local function ResponseClassifier(payload)
	-- Define the API URL and the authorization header
	local API_URL = "https://api-inference.huggingface.co/models/tinkoff-ai/response-quality-classifier-tiny"
	local headers = { Authorization = Bearkey }

	-- Query the API with a payload
	local function query(payload)
		-- Encode the payload as a JSON string
		local jsonPayload = HttpService:JSONEncode(payload)
		-- Send a POST request to the API and get the response
		local response = HttpService:PostAsync(API_URL, jsonPayload, Enum.HttpContentType.ApplicationJson, false, headers)
		-- Decode the response as a JSON table
		return HttpService:JSONDecode(response)
	end

	-- Call the query function with pcall and handle any error
	local status, result = pcall(query, payload)
	if status then
		print(result) -- result is the response table
	else
		print("Error: " .. tostring(result)) -- result is the error message
		result = 0
	end

	return result
end

which can be used to train or fine-tune a model. You can also use ChatGPT for supervised training by having it rate the output answers, using a sort of GPT-style function interaction tool.
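A hypothetical call might look like this (the exact input string format the model expects is an assumption; check the model card on Hugging Face):

-- Hypothetical usage: score a candidate reply against the dialogue context.
-- The {inputs = ...} shape is the standard Hugging Face Inference API payload.
local score = ResponseClassifier({
	inputs = "[CLS]hello, how are you?[SEP]I'm doing great, thanks for asking!",
})
print(score)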

Thank you! I will make sure to use it. Here is the current tokenizer:

local NLPM = {}

local separators = {
	"·",
	",",
	".",
	";",
	":",
	"!",
	"?",
	"-",
	"(",
	")",
	"[",
	"]",
	"{",
	"}",
	"<",
	">",
	"'",
	'"',
	"/",
	"\\",
	"|",
	"_",
	"*",
	"&",
	"^",
	"%",
	"$",
	"#",
	"@",
	"`",
	"~",
	"=",
	"+",
	"-",
	"*",
	"/",
	"\\",
	"|",
	"_",
	":",
	";",
	",",
	".",
	"?",
	"!",
	"\t",
	"\n",
	"\r",
	"\f",
	"\v",
	"1",
	"2",
	"3",
	"4",
	"5",
	"6",
	"7",
	"8",
	"9",
	"0",
	"ª",
	"º",
	"...",
	"—",
	"–",
	"‘",
	"’",
	"“",
	"”",
	" ",
}

function has(separators,char)
	for _,value in ipairs(separators) do
		if value == char then
			return true
		end
	end
	return false
end

function NLPM.word_tokenize(input:string)
	local tokens = {}
	local currentToken = ""

	for i = 1, #input do
		local char = input:sub(i, i)

		if has(separators, char) then
			if currentToken ~= "" then
				currentToken = string.lower(currentToken)
				table.insert(tokens, currentToken)
				currentToken = ""
			end
			if char == " " and #tokens > 0 and #currentToken == 0 then
				-- Join spaces onto the previous word
				tokens[#tokens] = tokens[#tokens] .. char
			else
				table.insert(tokens, char)
			end
		else
			currentToken = string.lower(currentToken) .. char
		end
	end

	if currentToken ~= "" then
		table.insert(tokens, string.lower(currentToken))
	end
	--print(tokens)
	return tokens
end

function NLPM.untokenize(tokens)
	return table.concat(tokens)
end
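Quick usage sketch of the two functions above:

-- Tokenize, then reassemble; punctuation and digits become individual
-- tokens, and spaces are merged onto the preceding token
local tokens = NLPM.word_tokenize("Hello, world! 42")
print(NLPM.untokenize(tokens)) --> "hello, world! 42" (lowercased)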

function NLPM.extractWord_fromTokens(tokens,word)
    for index,token in pairs(tokens) do
        if string.lower(token) == string.lower(word) then
            return token
        end
    end
    return nil
end

function NLPM.extractWord_fromInput(input,word)
    -- note: this scans one character at a time, so it can only ever match single-character words
    for i = 1,#input do
        local token = string.sub(input,i,i)
        if string.lower(token) == string.lower(word) then
            return token
        end
    end
    return nil
end

function NLPM.extractNumbers_fromInput(input:string)
    local number_pattern = "%d+"
    local numbers = {}
    for match in input:gmatch(number_pattern) do
        table.insert(numbers,match)
    end
    return numbers
end

function NLPM.extractFirstNumber_fromInput(input:string)
    local number_pattern = "%d+"
    local first = input:match(number_pattern)
    if first then
        return tonumber(first)
    end
    return nil
end

function NLPM.extractLastNumber_fromInput(input:string)
    local number_pattern = "%d+"
    local numbers = {}
    for match in input:gmatch(number_pattern) do
        table.insert(numbers,match)
    end
    local number = numbers[#numbers]
    return tonumber(number)
end

return NLPM

That looks good! Chatbots that respond to queries often have structured data encoded in JSON to separate the example query from the response; for inference you inject the user query and do next-token prediction. The features you extract from the training text are your inputs; these can be word position, previous word, current word, next word, etc. These inputs can be encoded into matrices (numbers) and used to train a model, provided the right setup. Using synonyms=cm.Getsynonyms(word,true) to return the array of synonyms, you get compressed language inputs by hashing synonyms[1] or synonyms[#synonyms]; this would reduce the complexity of what the model has to learn to chat.
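A minimal sketch of that feature extraction, reusing the hypothetical encode helper from the earlier sketch:

-- Turn each word into a (position, previous, current, next) feature row
-- of synonym-compressed vocabulary indices.
local function extractFeatures(str)
	local rows={}
	local ids=encode(str) -- hypothetical helper from the earlier sketch
	for t,id in ipairs(ids) do
		table.insert(rows,{
			t,             -- word position in the sentence
			ids[t-1] or 0, -- previous word id (0 = none)
			id,            -- current word id
			ids[t+1] or 0, -- next word id (0 = none)
		})
	end
	return rows
end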

DataPredict [Release 1.16] - General Machine And Deep Learning Library (Learning AIs, Generative AIs, and more!)
Your separators should be organized into a lookup table:

local newseperators={}
local inputnumber=0
for i,token in separators do
	inputnumber+=1
	newseperators[token]={i,string.len(token)} -- begin to store features of that input
end
print(newseperators)

function Model(inputtext)
	local input={}
	local wordarray=cm.splitString(inputtext) -- copy and paste this function to tokenize inputs
	for i,word in wordarray do
		local compresswords=cm.Getsynonyms(word,true)
		local vocabelement=compresswords[1]
		if newseperators[vocabelement]==nil then
			inputnumber+=1
			newseperators[vocabelement]={inputnumber,string.len(vocabelement),1} -- the trailing 1 counts repetitions
		else
			newseperators[vocabelement][3]+=1
		end
		--etc
	end
end

Thank you. I will make sure to credit you for helping. :people_hugging:


Here is how I plan on making the Neural Network

• A table with numerical data will be fed into the NeuralNetwork

• The table will be iterated through and you will be able to choose between using Sigmoid and ReLU

• The weights will be applied with the bias, and the vector will be fed to the function you chose

• The final vector will be added to a table (see the sketch below)…
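A minimal sketch of that forward pass, assuming one layer's weights and biases are already provided as plain Lua tables:

-- Single dense-layer forward pass with a selectable activation
local function relu(x) return math.max(0, x) end
local function sigmoid(x) return 1 / (1 + math.exp(-x)) end

-- inputs: array of numbers; weights: weights[neuron][input]; biases: one per neuron
local function forward(inputs, weights, biases, activation)
	local out = {}
	for n = 1, #weights do
		local sum = biases[n]
		for i = 1, #inputs do
			sum = sum + weights[n][i] * inputs[i]
		end
		out[n] = activation(sum) -- apply Sigmoid or ReLU as chosen
	end
	return out -- the final vector, ready to be added to a table
end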


It will have 4 hidden layers with 100 neurons each. There will be 500 weights and 500 biases.
The max input length will be 500.


I will probably need to import the weights cuz if I do the training in Roblox, my computer bouta blow


Another interesting thing I found is this video, which demonstrates creating a GPT-2-like model. Let’s reproduce GPT-2 (124M) - YouTube

It is interesting! Although I’ll be dead by the time the video ends :skull:

That guy is a GOAT. That video is very long; it took me like 3 days to watch.

Concerning chatbots, some newly proposed efficient methods that reduce the size of LLMs involve predicting multiple tokens at once instead of single tokens, such as capturing the next 2 words, as demonstrated by the for loops indexing words in

function cm.TrainLargeModel(strings,model)
	--local model={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		for t,wo in ipairs(words) do
			local prevw,nextw
			if wo~="I" then
				wo=wo:lower()
			end
			local s=cm.Getsynonyms(wo,true)		
			--print(s[1])

			if model[s[1]]==nil then 
				model[s[1]]={}
				model[s[1]]["fr"]=1
				model[s[1]]["pw"]={}
				model[s[1]]["nw"]={}
				model[s[1]]["p2w"]={}
				model[s[1]]["n2w"]={}
				--	print(model[s[1]])	
			end 
			model[s[1]].fr=model[s[1]].fr+1
			if t~=1 then
				local prev=cm.Getsynonyms(words[t-1],true)
				prevw=prev[1]
				if model[s[1]].pw[prevw]==nil and prevw then
					--	model[s[1]].pw[prevw]=
					model[s[1]].pw[prevw]=1 
				else model[s[1]].pw[prevw]=model[s[1]].pw[prevw]+1	
				end
			end
			if t>2 then
				local prev=cm.Getsynonyms(words[t-2],true)
				prevw=prev[1]
				if model[s[1]].p2w[prevw]==nil and prevw then
					--	model[s[1]].pw[prevw]=
					model[s[1]].p2w[prevw]=1 
				else model[s[1]].p2w[prevw]=model[s[1]].p2w[prevw]+1	
				end
			end
			if t<#words-1 then
				local nex=cm.Getsynonyms(words[t+2],true)
				nextw=nex[1]

				if model[s[1]].n2w[nextw]==nil then model[s[1]].n2w[nextw]=1 
				else model[s[1]].n2w[nextw]=model[s[1]].n2w[nextw]+1	
				end
			end
			
			if t~=#words then
					local nex=cm.Getsynonyms(words[t+1],true)
					nextw=nex[1]

					if model[s[1]].nw[nextw]==nil then model[s[1]].nw[nextw]=1 
					else model[s[1]].nw[nextw]=model[s[1]].nw[nextw]+1	
					end
			end

						

		end
	end	
	--print(model)


	--table.sort(model, function(a, b) return a.fr > b.fr end)

	return model
end
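Usage sketch: feed it a few strings and an empty table, then inspect the counts (keys are synonym-compressed, so the exact key may be a synonym representative rather than the literal word):

local model = cm.TrainLargeModel({
	"the capital of France is Paris",
	"the capital of Spain is Madrid",
}, {})
for word, entry in model do
	print(word, entry.fr, entry.nw, entry.n2w)
end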

Another thing to note is each word’s position in the sentence; this increases the complexity and determines your context length. Typical methods train models on structured data like “<|system|> You are a chatbot, the user is asking about France, the capital is Paris. \n<|user|>\n What is the capital of France? \n<|assistant|>\n The capital of France is Paris.”
Then when you input queries to the model you replace the system message and perform next-token prediction using your weights, with some randomness (temperature), top_k (only the k most probable tokens are considered), and top_p (only tokens within an accumulated probability mass are considered).
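A minimal sketch of that sampling step over a next-word frequency table like model[word].nw (temperature and top_k only, to keep it short):

-- Sample the next word from a {word = count} frequency table
local function sampleNext(nw, temperature, topK)
	-- turn counts into a sorted candidate list
	local candidates = {}
	for word, count in nw do
		table.insert(candidates, {word = word, score = count})
	end
	table.sort(candidates, function(a, b) return a.score > b.score end)
	-- keep only the top-k candidates; temperature < 1 sharpens, > 1 flattens
	local kept = math.min(topK, #candidates)
	local total = 0
	for i = 1, kept do
		candidates[i].score = candidates[i].score ^ (1 / temperature)
		total = total + candidates[i].score
	end
	-- weighted random pick
	local roll = math.random() * total
	for i = 1, kept do
		roll = roll - candidates[i].score
		if roll <= 0 then return candidates[i].word end
	end
	return candidates[1] and candidates[1].word
end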

function cm.PredictRun2(strings,model) -- strings are test inputs used to predict every next two words; could be used for training a different layer. Language can be compressed to a vector database.
	local responses={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		local eo=0
		local news=""
		local prevc=str
		local hci=0
		local tnwo=nil
		
		for t,wo in ipairs(words) do
			local cap=false	
			--if cm.iscapitalized(wo)==true then
			--	cap=true
			--end

			local prevw="$"
			local nextw="$"
			
			eo=eo+1
			if t>=1 then
			
				if eo>=3 then eo=0
					if wo~="I" then
						wo=wo:lower()
					end
					local s=cm.Getsynonyms(wo,true)		
					--model[s[1]].fr=model[s[1]].fr+1
					if model[s[1]] then
					
						local tn2w=nil
						local tnw=nil
						if t~=#words then
							--local hc=0
							--=words[i+1]
							local hc=0
							
							for c,t in model[s[1]].nw do
								if c~="I" then
									c=string.lower(c)
								end
								---local we =model[c].fr/8
								local sol=t
								if sol>hc and hc>hci then
									hc=sol
									tnw=tostring(c)	
								elseif hci>hc then
									hc=hci
									tnw=tnwo
								end
							end
							hci=0
							
							local hc=0
							--=words[i+1]
							if t<#words-1 then
							for c,t in model[s[1]].n2w do
								if c~="I" then
									c=string.lower(c)
								end
								--local we =model[c].fr/8 -- commented out: model[c] may be nil for unseen words
								local sol=t
								if sol>hc then
									hc=sol
									tn2w=tostring(c)	
								end
							end
						else 
							--tnw=words[#words]
							end
						end	
						--if t~=#words then
						local hc=0
						local lw=words[t-1]
						-- pw is a dictionary keyed by words, so count its keys (# returns 0 for non-array tables)
						local pwcount=0
						for _ in model[s[1]].pw do pwcount=pwcount+1 end
						local roll=cm.mathrandom(1,math.max(1,pwcount))
						local i=0
						for c,t in model[s[1]].pw do
							i=i+1
							if i==roll then	--print(c)
							if c~="I" then
								c=string.lower(c)
							end
							--local we =model[c].fr/2
							local sol=t
							if sol>hc then

								hc=sol

								lw=tostring(c)	
							end
							end
							end
							local l2w=nil
						if t>=3 then l2w=words[t-2]

							local p2wcount=0
							for _ in model[s[1]].p2w do p2wcount=p2wcount+1 end
							local roll=cm.mathrandom(1,math.max(1,p2wcount))
							local i=0
							for c,t in model[s[1]].p2w do
								i=i+1
								if i==roll then
								--print(c)
								if c~="I" then
									c=string.lower(c)
								end
								--local we =model[c].fr/2
								--local sol=t
								--if sol>hc then

								--	hc=sol

									l2w=tostring(c)	
									--end
								end	
								end
							end
					
						
						if l2w and l2w:lower()~=prevc:lower() then
								news=news.." "..l2w
						--elseif i>2  then
							--news=news.." "..words[i-2]
							
						end
						
							if lw and lw:lower()~=prevc:lower() then
							news=news.." "..lw
							prevc=lw
						elseif t~=1 then 
							news=news.." "..words[t-1]
						end	
						
						if tnw and prevc:lower()~=tnw:lower() then
							news=news.." "..s[1].." "..tnw
							prevc=tnw
						elseif t<#words then
							news=news.." "..s[1].." "..words[t+1]
						end
						if tn2w and prevc:lower()~=tn2w:lower() then
								news=news.." "..tn2w
								prevc=tn2w
						--elseif #words<i+2 then
						--	news=news.." "..words[i+2]	
						end
						prevc=s[1]
						--table.insert()
						--table.sort(model, function(a, b) return a.fr > b.fr end)
					else
						--news=news.." "..wo	
					end	
				else 
					local s=cm.Getsynonyms(wo,true)		
					local tnw=nil
					if model[s[1]] then
					for c,t in model[s[1]].nw do			
						if c~="I" then
							c=string.lower(c)
						end
						---local we =model[c].fr/8
						local sol=t
						if sol>hci then
							hci=sol
							tnwo=tostring(c)	
						end
						end
						
					end	
					--news=news.." "..wo
				end	
			else news=news.." "..wo	prevc=wo
			end	
		end
		table.insert(responses,news)
	end
	print(responses)
end

In this example it’s very simple and not doing too much math, just sampling random tokens and inserting them into the input string as a test. That’s about where I left off on that side project. Currently I’m very busy working on an MMORPG with en-masse chatbot agents. On a side note, I have released a template model place of parallel chatbot agents, each with their own context window. The open-sourced code I published a couple days ago is something I threw together using the public modules I created. It’s very clean and neat, and took only a few hours to get up and running.

It uses the Awareness module, emojis, emotes, and a small bit of augmented generation, powered by a 7b-parameter model on Hugging Face. It is designed to be simple and customizable; check it out if you’re interested! [FREE] Mistral 7b AI Chatbot Agent: Simple to Use! Aware, Infinite Agents, 2000+ Emojis, 100+ Emotes, Memories , Wiki, 32k Context [Open Sourced]

This chatbot module I still use for advanced RAG. It now has a context network, where the accumulated weights of the context are normalized, then added and divided by 2 each step, producing a curve where the earliest context has shrinking influence when scoring a databased response. It works well as a chatbot because it has about 14 nodes that specialize in different things (an MoE technique). These are manually connected to nodes that provide related content, and optimized to disconnect other nodes that are determined to be irrelevant when another context’s score passes the probability threshold, given the input and the previous nodes’ input (optimized to 1/6 of the input influence), in order to connect ideas and strengthen the current context. The responses are stored in a table, ordered from highest to lowest probability, and concatenated into a response while dropping responses that score below the average of all responses.
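A minimal sketch of that decaying running average and below-average filtering (variable names are hypothetical):

-- Each step the running context score is averaged with the new score,
-- so a score from n steps ago contributes with weight 1/2^n.
local contextScore = 0
local function accumulate(newScore)
	contextScore = (contextScore + newScore) / 2
	return contextScore
end

-- Drop candidate responses that score below the average of all candidates,
-- then concatenate the survivors from highest to lowest probability.
local function filterResponses(scored) -- scored = { {text=..., score=...}, ... }
	local sum = 0
	for _, r in scored do sum = sum + r.score end
	local avg = sum / #scored
	table.sort(scored, function(a, b) return a.score > b.score end)
	local out = {}
	for _, r in scored do
		if r.score >= avg then table.insert(out, r.text) end
	end
	return table.concat(out, " ")
end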

local neur={}
local neu=require(script.VectorMath)
local ContextBias = {
	["sam"]=function(a) return (a.X) end,
	["smollos"]=function(a) return (a.X-((a.Y/2)*a.Z)) end,
	["los"]=function(a) return (a.X-(a.Y-3*a.Z)) end,
	["suplos"]=function(a) return (a.X-(a.Y*a.Z)) end,
    ["gan"]=function(a) return (a.X+(a.Y*a.Z)) end,
    ["supgan"]=function(a) return (a.X+((a.Y*a.Z)*2)) end,
	["smolgan"]=function(a) return (a.X+(a.Y/2*a.Z)) end,
	["div"]=function(a) return (a.X/(a.Y*a.Z)) end,
	["mul"]=function(a) return (a.X*(a.Y*a.Z)) end,
}
local cwm

local lib=game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm
local sup2=lib.SupportingData
local sup=nil

local RunService = game:GetService("RunService")
local isLocal = RunService:IsClient()
local bind
if isLocal then
	bind=game.ReplicatedFirst.ContextDBCaller.SS2.Event
	--bind=game.Players.LocalPlayer.PlayerGui.Chatbot.LocalProcessor	
else 
	bind=game.ReplicatedStorage.GlobalSpells.BindableFunction 
end	
--local bind=game.ReplicatedFirst.ContextDBCaller.SS2.Event
function neur.matrix(x,y,z,optionalparameters)
	return neu.matrix(x,y,z,optionalparameters)
end

function neur.ConvalutedContextClassificationModel()
    local contextmodel = {
        --context weight matrix Y,Z are calculated and influence X position.
        ["Greetings"] = {
            ["weights"] = neu.matrix(10, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Determinant"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
              --  ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              --"Therapist,
    "ScienceWisdom","Database","Wisdom","Support","Empathy","Awareness","Greetings","Love",
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Commands"] = {
            ["weights"] = neu.matrix(10, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
               
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
  --  "ScienceWisdom",
--"Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Personal"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Love"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
                --["Math"]=neu.matrix(0,1,1),
                -- ["Philosophy"]=neu.matrix(1,1,1),
                --["Search"]=neu.matrix(1,1,1),
                --search accuracy
            },
            ["cone"] = {
               -- ["Emotions"] = neu.matrix(9, 1, 1),
               --["Support"] = neu.matrix(2, 1, .8),
                --["Empathy"]=neu.matrix(9,1,1),
                --["Awareness"]=neu.matrix(8,1,1),
                ["Classify"] = neu.matrix(6, 1, 1),
               --["Database"] = neu.matrix(6, 1, 1),
                --["Database"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
    "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
		},
		["Music"] = {
			["weights"] = neu.matrix(2, 1, 1),
			--10 table 10 numbers ez
			["chain"] = {

			},
			["cone"] = {
		
			},
			["disconnect"] = {
				"Therapist",
				--"ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
			},
			["parameters"] = {
				filter=true,
				complete=true,
				randomize=true                      
			},
			["entropy"] = ContextBias["gan"]
			--decrease weight of go on to other topics.
		},
        ["Judgement"] = {
            ["weights"] = neu.matrix(5, 1, 1),
            --10 table 10 numbers ez
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --  ["Search"]=neu.matrix(5,1,.9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(2, 1, .8),
                ["Database"] = neu.matrix(3, 1, 1),
                ["Greetings"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
            },
            ["cone"] = {

                ["Classify"] = neu.matrix(6, 1, 1),
            },
            ["disconnect"] = {
              -- "Therapist
   -- "ScienceWisdom","Database","Wisdom","Support","Empathy",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["gan"]
            --decrease weight of go on to other topics.
        },
        ["Emotions"] = {
            ["weights"] = neu.matrix(9, 1, 1),
            ["chain"] = {
                --["Emotions"]=neu.matrix(9,1,1),
                ["Greetings"] = neu.matrix(5, 1, 1),
                --["Math"]=neu.matrix(0,1,1),
                ["Awareness"] = neu.matrix(8, 1, 1),
                ["Empathy"] = neu.matrix(10, 1, 1),
	               --["Search"]=neu.matrix(4,1,.9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(3, 1, .8),
                ["Database"] = neu.matrix(6, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(4, 1, 1),
                ["Philosophy"] = neu.matrix(2, 1, 1)
            },
            ["cone"] = {
                --["Empathy"]=neu.matrix(9,1,1),
                ["Wisdom"] = neu.matrix(8, 1, 1)
                --["Wisdom"]=neu.matrix(8,1,1),
            },
            ["disconnect"] = {
                "ScienceWisdom","Empathy","Database",--"Awareness"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=true                      
            },
            ["entropy"] = ContextBias["smolgan"]
        },      
        ["Empathy"] = {
            ["weights"] = neu.matrix(7, 1, 1),
            ["chain"] = {
                ["Philosophy"] = neu.matrix(10, 1, 1),
                --["Math"]=neu.matrix(0,1,1),
                ["Wisdom"] = neu.matrix(9, 1, 1),
                ["Bestiary"] = neu.matrix(8, 1, 1),
                ["Classify"] = neu.matrix(3, 1, 1),
                ["Emotions"] = neu.matrix(1, 1, 1),
                ["Database"] = neu.matrix(6, 1, 1),
                --["Search"]=neu.matrix(7,1,.9),
                ["Awareness"] = neu.matrix(6, 1, 1),
                ["Greetings"] = neu.matrix(2, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8)
            },
            --["Empathy"]=neu.matrix(,1,1),
            ["entropy"] = ContextBias["gan"],
            ["cone"] = {
                ["Wisdom"] = neu.matrix(9, 1, 1)
            },
            ["disconnect"] = {
                "ScienceWisdom","Database",--"Awareness"
            },
            ["parameters"] = {
                filter=false,
                complete=true,
                randomize=true                      
            },
        },
        ["Therapist"] = {
            ["weights"] = neu.matrix(8, 1, 1),
            ["chain"] = {
                ["Philosophy"] = neu.matrix(10, 1, 1),
                --["Math"]=neu.matrix(0,1,1),
                ["Wisdom"] = neu.matrix(9, 1, 1),
                ["Bestiary"] = neu.matrix(8, 1, 1),
                ["Classify"] = neu.matrix(3, 1, 1),
                ["Emotions"] = neu.matrix(1, 1, 1),
                ["Database"] = neu.matrix(6, 1, 1),
                --["Search"]=neu.matrix(7,1,.9),
                ["Awareness"] = neu.matrix(6, 1, 1),
                ["Greetings"] = neu.matrix(2, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8)
            },
            --["Empathy"]=neu.matrix(,1,1),
            ["entropy"] = ContextBias["gan"],
            ["cone"] = {
                --["Wisdom"] = neu.matrix(9, 1, 1)
                
            },
            ["disconnect"] = {
               --"Greetings"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },
        },     
        ["Support"] = {
            ["weights"] = neu.matrix(2, 1, 1),
            --subneuron only
            ["chain"] = {
                ["Emotions"] = neu.matrix(3, 1, 1),
                --["Math"] = neu.matrix(2, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(6, 1, 1),
                ["Empathy"] = neu.matrix(4, 1, 1),
                ["Search"] = neu.matrix(2, 1, .9),
                ["Classify"] = neu.matrix(3, 1, 1),
                --["Support"]=neu.matrix(,1,.8),
                ["Database"] = neu.matrix(8, 1, 1),
                ["Bestiary"] = neu.matrix(9, 1, 1),
                ["Wisdom"] = neu.matrix(6, 1, 1),
                ["Philosophy"] = neu.matrix(3, 1, 1)
            },
            ["cone"] = {
                ["Motivation"] = neu.matrix(4, 1, .8),
                ["Wisdom"] = neu.matrix(3, 1, .8),
                ["Truths"] = neu.matrix(6, 1, .8)
            },
            ["disconnect"] = {
                "ScienceWisdom","Inspiration","Database","Wisdom",
            },
            ["parameters"] = {
                filter=false,
                complete=true,
                randomize=false                     
            },
            ["entropy"] = ContextBias["los"]
        },
        ["Wisdom"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(4, 1, 1),
              --  ["Math"] = neu.matrix(2, 1, 1),
                ["Greetings"] = neu.matrix(4, 1, 1),
                ["Awareness"] = neu.matrix(5, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                ["Search"] = neu.matrix(3, 1, .9),
                ["Classify"] = neu.matrix(1, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(8, 1, 1),
                ["Bestiary"] = neu.matrix(9, 1, 1),
                --["Wisdom"]=neu.matrix(4,1,1),
                ["Philosophy"] = neu.matrix(10, 1, 1)
            },
            ["cone"] = {
                ["Support"] = neu.matrix(5, 1, .8)
            },
            ["entropy"] = ContextBias["smolgan"],
            ["disconnect"] = {
                "Inspiration","Truths","Spirituality","Motivation","ScienceWisdom","Database"
               
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },
        },
        ["Philosophy"] = {
            ["weights"] = neu.matrix(2, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Greetings"] = neu.matrix(4, 1, 1),
                ["Awareness"] = neu.matrix(5, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                ["Classify"] = neu.matrix(1, 1, 1),
                ["Support"] = neu.matrix(6, 1, .8),
                ["Database"] = neu.matrix(8, 1, 1),
                ["Bestiary"] = neu.matrix(9, 1, 1),
                ["Wisdom"] = neu.matrix(10, 1, 1)
            },
            ["cone"] = {
                ["Motivation"] = neu.matrix(4, 1, .8),
                ["Wisdom"] = neu.matrix(3, 1, .8),
                ["Truths"] = neu.matrix(6, 1, .8)
            },
            ["disconnect"] = {
                "Database"
            },
            ["entropy"] = ContextBias["gan"],
            
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },
            
        },
        ["Bestiary"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["cone"] = {},
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                --["Math"] = neu.matrix(2, 1, 1),
                ["Greetings"] = neu.matrix(4, 1, 1),
                ["Awareness"] = neu.matrix(9, 1, 1),
                ["Empathy"] = neu.matrix(7, 1, 1),
                --["Search"]=neu.matrix(6,1,.9),
                ["Classify"] = neu.matrix(5, 1, 1),
                ["Support"] = neu.matrix(1, 1, .8),
                ["Database"] = neu.matrix(10, 1, 1),
                --	["Bestiary"]=neu.matrix(3,1,1),
                ["Wisdom"] = neu.matrix(2, 1, 1),
                ["Philosophy"] = neu.matrix(1, 1, 1)
            },
            ["disconnect"] = {
                "Search","Sciencewisdom","Emotions","Greetings",
                "Inspiration","Truths","Spirituality","Motivation","Philosophy",
            },
            ["parameters"] = {
                filter=true,
                complete=false,
                randomize=false                      
            },
            ["entropy"] = ContextBias["gan"]
        },
        ["Search"] = {
            ["weights"] = neu.matrix(2, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
               -- ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                --["Search"]=neu.matrix(4,1,.9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {},
            ["entropy"] = ContextBias["gan"],
            ["disconnect"] = {
                "Inspiration","Truths","Spirituality","Motivation","ScienceWisdom","Database","Wisdom","Support"
                ,"Philosophy","Bestiary"
            },
        },
        ["Database"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(2, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
                ["Support"] = neu.matrix(5, 1, .8)
            },
            ["entropy"] = ContextBias["suplos"],
            ["disconnect"] = {
               "ScienceWisdom"
               ,"Philosophy","Empathy","Support",
                "Inspiration","Truths","Spirituality","Motivation",
                
            },
            ["parameters"] = {
                filter=true,
                complete=false,
                randomize=false                      
            },
        },
        ["Awareness"] = {
            ["weights"] = neu.matrix(5, 1, 1),
            ["cone"] = {
         
                ["Bestiary"] = neu.matrix(5, 1, 1)
            },
            --["Wisdom"]=neu.matrix(7,1,1),},
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),             
                ["Greetings"] = neu.matrix(7, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5.5, 1, 1),
                ["Wisdom"] = neu.matrix(6.5, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },
            ["entropy"] = ContextBias["gan"]
        },
        ["Math"] = {
            ["weights"] = neu.matrix(4, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Greetings"] = neu.matrix(5, 1, 1), --["Math"]=neu.matrix(0,1,1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(5, 1, 1),
                ["Search"] = neu.matrix(3, 1, .9),
                ["Classify"] = neu.matrix(6, 1, 1),
                ["Support"] = neu.matrix(7, 1, .8),
                ["Database"] = neu.matrix(1, 1, 1),
                ["Bestiary"] = neu.matrix(2, 1, 1),
                ["Wisdom"] = neu.matrix(4, 1, 1),
                ["Philosophy"] = neu.matrix(3, 1, 1)
            }, --["Math"]=neu.matrix(0,1,1),
            ["cone"] = {},
            ["entropy"] = ContextBias["supgan"],
            ["disconnect"] = {
                "Inspiration","Truths","Spirituality","Motivation","ScienceWisdom","Database","Wisdom"
                ,"Philosophy","Bestiary","Emotions","Empathy"
            },
        },
        ["Inspiration"] = {
            ["weights"] = neu.matrix(9, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Wisdom"] = neu.matrix(4, 1, 1)
            },
            ["entropy"] = ContextBias["smolgan"],
            ["disconnect"] = {
               "ScienceWisdom","Database"
                ,"Bestiary",
            },
            ["query"] = function(str,_,filter, repetitive,randomize,context,reverse,spaces,mode,synomarray,words2,tl)
                -- First, we need to get the ReplicatedStorage service
                if sup == nil then
                    sup = require(sup2)
                end -- object with the given name
            
                local Result, blacklist, score, weight =
                    bind:Invoke(
                        str,
                        sup.Inspiration(),
                        filter,
                        repetitive,
                        randomize,
                        context,
                        reverse,
                        spaces,
                        mode,
                        synomarray,
                        words2
                    )
                --print(Result)
                return Result, blacklist, score, weight
            end
        },
        ["ScienceWisdom"] = {
            ["weights"] = neu.matrix(5, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
                ["Support"] = neu.matrix(5, 1, .8)
            },
            ["disconnect"] = {
                "Spirituality","Wisdom","Bestiary","Empathy",--"Awareness"
            },
            ["entropy"] = ContextBias["gan"],
            ["query"] = function(str,_,filter, repetitive,randomize,context,reverse,spaces,mode,synomarray,words2,tl)

                -- First, we need to get the ReplicatedStorage service
                if sup == nil then
                    sup = require(sup2)
                end -- object with the given name
               
                local Result, blacklist, score, weight =
                    bind:Invoke(
                        str,
                        sup.ScienceWisdom(),
                        filter,
                        repetitive,
                        randomize,
                        context,
                        reverse,
                        spaces,
                        mode,
                        synomarray,
                        words2
                    )
                print(Result)
                return Result, blacklist, score, weight
            end
        },
        ["Motivation"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
               -- ["Motivation"] = neu.matrix(4, 1, .8),
                ["Wisdom"] = neu.matrix(3, 1, .8),
                ["Truths"] = neu.matrix(6, 1, .8),
              --  ["Support"] = neu.matrix(5, 1, .8)
            },
            ["disconnect"] = {
                "ScienceWisdom","Search","Database"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },
            
            ["entropy"] = ContextBias["smolgan"],
            ["query"] = function(
                str,
                _,
               filter,
                repetitive,
                randomize,
                context,
                reverse,
                spaces,
                mode,
                synomarray,
                words2,
                tl)
                -- First, we need to get the ReplicatedStorage service
                if sup == nil then
                    sup = require(sup2)
                end -- object with the given name
               
                local Result, blacklist, score, weight =
                    bind:Invoke(
                        str,
                        sup.Motivation(),
                        filter,
                        repetitive,
                        randomize,
                        context,
                        reverse,
                        spaces,
                        mode,
                        synomarray,
                        words2
                    )
            --    print(Result)
                return Result, blacklist, score, weight
            end
        },
        ["Truths"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
               -- ["Motivation"] = neu.matrix(4, 1, .8),
                ["Wisdom"] = neu.matrix(3, 1, .8),
                --["Truths"] = neu.matrix(6, 1, .8),
                --["Support"] = neu.matrix(5, 1, .8)
            },
            ["disconnect"] = {
                "Bestiary","Database"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },

            ["entropy"] = ContextBias["gan"],
            ["query"] = function(
                str,
                _,
                filter,
                repetitive,
                randomize,
                context,
                reverse,
                spaces,
                mode,
                synomarray,
                words2,
                tl)
                -- First, we need to get the ReplicatedStorage service
                if sup == nil then
                    sup = require(sup2)
                end -- object with the given name
               
                local Result, blacklist, score, weight =
                    bind:Invoke(
                        str,
                        sup.Truths(),
                        filter,
                        repetitive,
                        randomize,
                        context,
                        reverse,
                        spaces,
                        mode,
                        synomarray,
                        words2
                    )
                print(Result)
                return Result, blacklist, score, weight
            end
        },
        ["Spirituality"] = {
            ["weights"] = neu.matrix(3, 1, 1),
            ["chain"] = {
                ["Emotions"] = neu.matrix(9, 1, 1),
                ["Math"] = neu.matrix(0, 1, 1),
                ["Greetings"] = neu.matrix(1, 1, 1),
                ["Awareness"] = neu.matrix(4, 1, 1),
                ["Empathy"] = neu.matrix(2, 1, 1),
                ["Search"] = neu.matrix(4, 1, .9),
                ["Classify"] = neu.matrix(7, 1, 1),
                ["Support"] = neu.matrix(5, 1, .8),
                ["Database"] = neu.matrix(7, 1, 1),
                ["Bestiary"] = neu.matrix(5, 1, 1),
                ["Wisdom"] = neu.matrix(7, 1, 1),
                ["Philosophy"] = neu.matrix(6, 1, 1)
            },
            ["cone"] = {
                ["Motivation"] = neu.matrix(4, 1, .8),
                ["Wisdom"] = neu.matrix(3, 1, .8),
                ["Truths"] = neu.matrix(6, 1, .8)
            },
            ["disconnect"] = {
                "ScienceWisdom","Database"
            },
            ["parameters"] = {
                filter=true,
                complete=true,
                randomize=false                      
            },

            ["entropy"] = ContextBias["smolgan"],
            ["query"] = function(
                str,
                _,
                filter,
                repetitive,
                randomize,
                context,
                reverse,
                spaces,
                mode,
                synomarray,
                words2,
                tl)
                -- First, we need to get the ReplicatedStorage service
                if sup == nil then
                    sup = require(sup2)
                end -- object with the given name
               
                local Result, blacklist, score, weight =
                    bind:Invoke(
                        str,
                        sup.Spirituality(),
                        filter,
                        repetitive,
                        randomize,
                        context,
                        reverse,
                        spaces,
                        mode,
                        synomarray,
                        words2
                    )
                print(Result)
                return Result, blacklist, score, weight
            end
        }
    }
return contextmodel
end

function sigmoid(x)
	return 1 / (1 + math.exp(-x))
end
function neur.writevector(cwm,Key,X,Y,Z) -- X is repetition, Y is count, Z is weight
	--if X==nil then X=0 end if Y==nil then Y=0 end if Z==nil then Z=0 end
	
	local state
	--neu.matrix(cwm[Key].x+X,cwm[Key].y+Y,cwm[Key].z+Z)
	--local keyweight=cwm[Key]["weights"]	
	
	cwm[Key]["weights"].Y=Y
	cwm[Key]["weights"].Z=Z
    cwm[Key]["weights"].X=sigmoid(math.abs(cwm[Key]["entropy"](cwm[Key]["weights"])))
    return cwm[Key]["weights"]
end
return neur
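Usage sketch: pushing a new observation into a node’s weight vector (the values are made up; neu.matrix objects are assumed to expose the .X/.Y/.Z fields that writevector relies on):

-- Hypothetical usage: update the "Greetings" node after a scoring step.
-- Y and Z feed the node's entropy bias function, and X becomes the
-- sigmoid-squashed result, keeping the weight in the range (0, 1).
local cwm = neur.ConvalutedContextClassificationModel()
local w = neur.writevector(cwm, "Greetings", nil, 2, 0.9)
print(w.X, w.Y, w.Z)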

Finally, it uses this function to transform the output based on the query, using the synonyms library.

function cm.randomizeStringLight(str,interphrases,randomize,once)
    local newS = str
    local s=str
    local sentences = {}
    if str then
        local per=" "
        if str:find("%.") then -- split into sentences on periods (gmatch always returns a truthy iterator, so test with find instead)
            for s in str:gmatch("[^%.]+") do
                table.insert(sentences, s..".")
            end
        else
            per=" "
            sentences={str}
        end
        for i,v in sentences do
            local str=sentences[i]
            for j, phrases in ipairs(interphrases) do
                for k, phrase in ipairs(phrases) do
                    -- Check whether the phrase occurs, tracking its capitalization
                    local pass=nil
                    if s:find(phrase) then
                        pass=true
                    elseif s:find(phrase:lower()) then
                        pass=false
                    elseif s:lower():find(phrase:lower()) then
                        pass=0
                    end

                    if pass~=nil then
                        local randomPhrase,cap
                        if randomize==nil then
                            -- Pick a random phrase from the same group
                            randomPhrase = phrases[mathrandom(#phrases)]
                        elseif randomize=="query" then
                            --query,database,filter,complete,words2,synomarray
                            randomPhrase=cm.QueryDatabase(once,phrases,false,false)
                            if randomPhrase==nil then
                                randomPhrase = phrases[mathrandom(#phrases)]
                            end
                        else
                            randomPhrase= phrases[randomize]
                        end
                        if pass==0 then randomPhrase=phrase:lower() end
                        newS = replace_str(newS, phrase, randomPhrase)
                        if once==true then break end
                    end
                end
            end
        end

        return newS
    end
end
  
function cm.randomizeString(str,query) 
    local newS = str
	if interchangephrases==nil then
		interchangephrases=require(script.InterchangePhrases)
		
	end  

 
	for j, phrases in (interchangephrases) do
		for k, phrase in (phrases) do
			if newS:lower():find(phrase:lower()) then
				-- Pick a random phrase from the same group
				if query then
					newS=cm.randomizeStringLight(str,{phrases},"query",query)
				else
					newS=cm.randomizeStringLight(str,{phrases})
				end
			end
		end
	end
	local wordarray=cm.splitString(newS)
	for i, d in (wordarray) do
		local phrases=cm.Getsynonyms(d,true)
		if #phrases>1 then
			for k, phrase in (phrases) do
				if newS:lower():find(phrase:lower()) then
					-- Pick a random phrase from the same group
					if query then
						newS=cm.randomizeStringLight(str,{phrases},"query",query)
					else
						newS=cm.randomizeStringLight(str,{phrases})
					end
				end
			end
		end
	--newS=cm.randomizeStringLight(str,{phrases})	
   -- 	for k, phrase in (phrases) do
			--if string.match(newS:lower(),"^"..phrase:lower().."$") then
			
			--	--if cap==true then
			--		-- Pick a random phrase from the same group
			--		local randomPhrase = phrases[mathrandom(#phrases)]
			--	-- Replace the original phrase with the random one
			--	if cap==true then randomPhrase=cm.capitalize(randomPhrase) end
			--		newS = newS:gsub(phrase, randomPhrase)
			--	end
			--end
--end

	end
	
return newS
        
end

It is basically Monte Carlo tree search.
Improving LLM accuracy with Monte Carlo Tree Search (youtube.com)
A mixture of these techniques combines in a retrieval-based chatbot that answers simple queries itself and uses an LLM for the answers where it has low accuracy. You can test it here.
Epic RPG with AI +Optimized Performance - Roblox