DataPredict [Release 1.21] - General Purpose Machine Learning And Deep Learning Library (Learning AIs, Generative AIs, and more!)

Why is it public? Because you kept refusing to send issues to my private DMs and ended up crossing the line into being demanding. Add to that, you complained about so-called “issues” with my library even after I gave you solutions, which led to even more complaining.

You asked and complained about the Double Q neural network, then you complained about wanting a single output neuron when two output neurons are sufficient.

What really irritates me is that you wanted to change the way I do my solutions, when my solutions literally fit these kinds of games.

If you don’t like it, go make your own. I would like to see your version work.

I spent 7 months on this project and you expect me to cater to all your demands.

You could have manually implemented those models with the code you already have here, like I suggested, but no, you want me to do it for you.

Talk about entitled. And to quote YouTuber Louis Rossmann, “You are not investable”.

3 Likes

Okay then. Fair enough. I probably should’ve moved this conversation to DMs in the first place. Let’s move it now.

Here’s an example, from a couple of months ago, of how I set up this module in a working state using this script to train on text data, for those interested. I didn’t know what I was doing, but I got there after some tinkering and back and forth with Anthropic’s Claude 2 (with its 100,000-token context length) while analyzing the code of the RNN module.

-- Load RNN module
local RNN = require('RNN')

-- Sample text
local text = "[[Text data]]"

-- Create character mappings (each unique character gets exactly one index,
-- so inputSize/outputSize below match the real vocabulary size)
local chars = {}
local charToIndex = {}
local indexToChar = {}
for c in text:gmatch(".") do
  if not charToIndex[c] then
    table.insert(chars, c)
    charToIndex[c] = #chars
    indexToChar[#chars] = c
  end
end

-- Convert text to input/target pairs
local inputs, targets = {}, {}
for i=1,#text-1 do
  local input = charToIndex[text:sub(i,i)]
  table.insert(inputs, input)

  local target = charToIndex[text:sub(i+1,i+1)]
  table.insert(targets, target)
end

-- Create RNN
local inputSize = #chars  
local hiddenSize = 10
local outputSize = #chars
local rnn = RNN.new(100, 0.1, 'tanh', 0.01) 

rnn:createLayers(inputSize, hiddenSize, outputSize)

-- Train
for epoch=1,100 do
  for i=1,#inputs do
    local input = inputs[i]
    local target = targets[i]  
    rnn:train({input}, {target})
  end 
end

-- Predict
local input = charToIndex['T']
local output = rnn:predict({input})[1]
print(indexToChar[output])

Still learning how to use it, but this example runs, and I hope to learn more in the future. When I asked ChatGPT what it thought of the result, it said it looks like it needs more training.
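As a natural next step, the single-character prediction above can be fed back in as the next input to generate a short string. This is just a sketch that assumes the `rnn`, `charToIndex`, and `indexToChar` from the example above, and that `rnn:predict` keeps the same signature:

```lua
-- Sketch: feed each prediction back in as the next input to generate text.
-- Assumes rnn, charToIndex, and indexToChar from the example above.
local seed = charToIndex['T']
local generated = {'T'}
for i = 1, 20 do
	local out = rnn:predict({seed})[1]
	table.insert(generated, indexToChar[out] or "?")
	seed = out
end
print(table.concat(generated))
```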
But if I were to use this to train a text-based bot, I think the best it could do on this platform would be akin to search-engine-style retrieval. But if we utilized multiple models, each potentially trained only on a dataset of context-specific examples like the one my textbot has, we could potentially get further.
Greetings = {
	"Hello, there. I’m pleased to see you.",
	"Wow. You have a remarkable aura about you.",
	"Yay. You’re here. Let’s have some fun.",
	"Hello. I’m curious to meet you.",
	"Hey. You’re amazing. Thanks for coming.",
	"Greetings. I’m honored to have you here.",
	"Howdy. You’re so cool. Let’s be friends."
}
I also have a massive nested table of synonyms. These two scripts (1) split words into an array, and (2) replace a word with the first synonym in the nested table, massively reducing vocabulary size while maintaining coherence. This makes the data simpler for the machine learning algorithm to understand, decreasing its size and training time. I think a realistic goal could be for it to be able to construct sentences. This architecture would need further work to include linear regression to speed up training time; I’m just thinking about it. I have other things to do right now, but I think it could work. :slight_smile:

local synonympairs={
	-- Introductory
	{ "knowledgeable ", 	"informed", "studied", "well-versed", "practiced", "aware "},{ "smoothly ", "without trial ", "easy ", "relaxing ", "enjoyable "},{ "faring", "riding", "sailing"},{ "ditches ", "pits "},{ "careful ", "insightful ", "cautious ", "steady "}
	,{ "events ", "things ", "occurences ", "situations "},{ "Did you know ", "Were you aware ", "Have you heard "},{ "trapped", "stuck ", "immobile "},{ "happening", "occuring", "going on", "preceding"},{ "need", "require ", "desire "},{ "sparkle in your eye! ", "keen eye for adventure! ", "heart of a warrior! ", "unyielding spirit of a legendary warrior! "},{ "legendary ", "mythical ", "fabled", "powerful ", "renowned", "valiant ", "glorious "},{ "unyielding", "determined", "hardened", "battle-ready", "stubborn tenacity"},{ " assistance ", " help "},{ " comment ", " state ", " share ", " tell "},{ "Howerever, ", "Moreover, ", "In addition, ", "Furthermore, "},{ "nothing", "not a single thing"},{ "share ", "spread ", "part "},{ "escape ", "get away "},{ " best ", " greatest "},{ " special ", " unique ", " one of a kind ", " one in a billion "},{ "offering", "bartering", "trading"},{ "giving", "offering"},{ "soul ", "essence ", "mana ", "immortal-soul "},{ " said, "},{ "stocked", "available "},{ "sells ", "barters ", "trades in ", "has available "},{ "find ", "discover ", "uncover "},{ "looking", "searching"},{ "liking ", "enjoyment ", "privy ", "tastes ", "sensitivities "},{ "value ", "worth ",},{ "given ", "bestowed", "relinquished"},{ "quantity ", "amount "},{ "quantities ", "amounts "},{ " devour ", " consume ", " feed-on ", " eat "},{ "warp ", "distort "},{ "strong ", "incredible ", "powerful "},{ "facts ", "knowledge ", "matters "},{ "infinite ", "unlimited"},{ "conjunction ", "along with "},{ " dimension ", " place ", " material plane "},{ "regenerate ", "recouperate ", "restore "},{ "topic ", "subject "},{ "entities ", "monsters "},{ "destructive ", "chaotic "},{ "absorb ", "assimilate ", "sap energy from "},{ "However, ", "Morever, ", "In addition, ", "Also, ", "Not to mention, ", "This includes, "},{ " encounter ", " see "},{ "trap ", "diversion ", "obstacle "},{ "minion ", "disciple "},{ "mindless ", "thoughtless ", "brainless ", "will-less "},{ "used", 
"harnessed", "portrayed", "pictured"},{ "touches ", "makes with contact with ", "contacts "},{ "feeling", "doing"},{ "infinite", "never-ending", "limitless"},{ "treasures ", "trinkets ", "artifacts ", "loot ", "spoils "},{ "untold ", "unforeseen ", "unspoken ", "unknown "},{ "decieve ", "fool ", "mislead ", "misguide "},{ "underground ", "subterranean "},{ "unsuspecting ", "innocent ", "credulous ", "easy ", "simple ", "unsuspicious "},{ "hungry ", "starving ", "famished"},{ "creature ", "monster ", "entity "},{ "anything", "everything"},{ "shape ", "form ", "structure "},{ "size ", "volume ", "area "},
	{ "happy ", "joyful ", "cheerful ", "glad ", "delighted"}} -- ...etc
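The synonym-compression step described above boils down to mapping every word in a group to that group's first entry. A minimal sketch of that lookup (the `firstSynonym` and `trim` names are mine, not from the module; note the table entries carry trailing spaces, hence the trimming):

```lua
-- Trim surrounding whitespace (entries in synonympairs carry trailing spaces)
local function trim(s)
	return (s:gsub("^%s+", ""):gsub("%s+$", ""))
end

-- Map a word to the first synonym in its group, as described above.
-- Assumes the synonympairs table defined earlier; illustrative only.
local function firstSynonym(word)
	for _, group in ipairs(synonympairs) do
		for _, phrase in ipairs(group) do
			if trim(phrase) == word then
				return trim(group[1]) -- canonical form: first entry in the group
			end
		end
	end
	return word -- not in any synonym group; keep the word as-is
end
```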

function chatmodule.splitString(str)
	local words = {}
	if str ~= nil then
		for word in str:gmatch("%w+") do -- %w+ matches one or more alphanumeric characters
			table.insert(words, word) -- insert the word into the words array
		end
	end
	return words
end
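For anyone following along, here is what `splitString` returns (a hypothetical call, assuming the `chatmodule` above is in scope; punctuation is dropped by the `%w+` pattern):

```lua
-- Hypothetical usage of chatmodule.splitString from above
local words = chatmodule.splitString("Hello there, traveler")
print(#words)   -- 3
print(words[1]) -- Hello
```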
function chatmodule.randomizeStringLight(str, interphrases, randomize)
	-- Split the string into sentences
	local sentences = {}
	local str = tostring(str)
	local words = chatmodule.splitString(str)
	if #words > 1 then
		for s in str:gmatch("[^%.]+") do
			table.insert(sentences, s)
		end
		-- Loop through the sentences and replace any matching phrases with one from the table
		local newSentences = {}
		for _, s in ipairs(sentences) do
			local newS = s
			for _, phrases in ipairs(interphrases) do
				for _, phrase in ipairs(phrases) do
					if s:find(phrase) then
						-- Pick a phrase from the same group (random, or at a fixed index)
						local randomPhrase
						if randomize == nil then
							randomPhrase = phrases[chatmodule.mathrandom(#phrases)]
						else
							randomPhrase = phrases[randomize]
						end
						-- Replace the original phrase with the chosen one
						newS = newS:gsub(phrase, randomPhrase)
					end
				end
			end
			table.insert(newSentences, newS)
		end
		-- Join the new sentences and return the result
		return table.concat(newSentences, "")
	end
end
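A hypothetical call, assuming `chatmodule.mathrandom` wraps `math.random` and `synonympairs` is the table shown earlier (the exact output varies with the random picks, so none is asserted here):

```lua
-- Illustrative call to chatmodule.randomizeStringLight from above
local out = chatmodule.randomizeStringLight(
	"I am happy to share facts with you.",
	synonympairs
)
print(out) -- e.g. "I am joyful to spread knowledge with you." (varies per run)
```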


1 Like

New (Minor) update. Release 1.3 / Beta 1.16.0.

Just added a “hasBias” parameter to the new() and setParameters() functions in regularization. Not a very important update if you don’t use them in your models often.

3 Likes

Ah, I just added three more models to the library for 1.3, since I wasn’t satisfied with the changes. I have added:

  • Double Deep Q-Learning. There are two versions of this.

  • Clipped Double Deep Q-Learning.

3 Likes

Hi guys. I am creating a new project for detecting hackers using their movements. I would like you to read the post in the link below and give feedback in that post.

3 Likes

Heads up. I just noticed there was an issue with the cost calculations for the support vector machine. For people who are using support vector machines, I recommend you update the library ASAP. The cost is way off.

I must have been too exhausted from developing the library for many months now…

Eh, the only model that uses those functions is the second version of the linear regression model (the one that doesn’t use gradient descent). The rest of the models don’t use them.

Ah, thanks. I’ll implement it later. It’s like 3:00 am here and I get paranoid about making backups with these changes.

1 Like

Good news everyone! I have released a partially open-source anti-cheat / outlier detector named “ChaWatcher”. It uses the Support Vector Machine from this library.

Go ahead and take a look!

2 Likes

I recently implemented a Context Matrix and a custom vector-matrix library for my project. But this is still much better! I just don’t know how to use it! But as I stated earlier, you could use this module to construct a Bag of Words with a compressed vocabulary by indexing the previous word and next word against the current word and noting the frequencies over a large, cleaned dataset. Then you can use this to make predictions.
Like I did in this example! This one is just based on word frequency, but I think you could use this DataPredict library to train a network to make the predictions.

function cm.TrainLargeModel(strings,model)
	--local model={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		for t,wo in ipairs(words) do
			local prevw,nextw
			if wo~="I" then
				wo=wo:lower()
			end
			local s=cm.Getsynonyms(wo,true)		
			--print(s[1])

			if model[s[1]]==nil then 
				model[s[1]]={}
				model[s[1]]["fr"]=1
				model[s[1]]["pw"]={}
				model[s[1]]["nw"]={}
				model[s[1]]["p2w"]={}
				model[s[1]]["n2w"]={}
				--	print(model[s[1]])	
			end 
			model[s[1]].fr=model[s[1]].fr+1
			if t~=1 then
				local prev=cm.Getsynonyms(words[t-1],true)
				prevw=prev[1]
				if model[s[1]].pw[prevw]==nil and prevw then
					--	model[s[1]].pw[prevw]=
					model[s[1]].pw[prevw]=1 
				else model[s[1]].pw[prevw]=model[s[1]].pw[prevw]+1	
				end
			end
			if t>2 then
				local prev=cm.Getsynonyms(words[t-2],true)
				prevw=prev[1]
				if model[s[1]].p2w[prevw]==nil and prevw then
					--	model[s[1]].pw[prevw]=
					model[s[1]].p2w[prevw]=1 
				else model[s[1]].p2w[prevw]=model[s[1]].p2w[prevw]+1	
				end
			end
			if t<#words-1 then
				local nex=cm.Getsynonyms(words[t+2],true)
				nextw=nex[1]

				if model[s[1]].n2w[nextw]==nil then model[s[1]].n2w[nextw]=1 
				else model[s[1]].n2w[nextw]=model[s[1]].n2w[nextw]+1	
				end
			end
			
			if t ~= #words then
				local nex = cm.Getsynonyms(words[t+1], true)
				nextw = nex[1]

				if model[s[1]].nw[nextw] == nil then model[s[1]].nw[nextw] = 1
				else model[s[1]].nw[nextw] = model[s[1]].nw[nextw] + 1
				end
			end
		end
	end	
	--print(model)


	--table.sort(model, function(a, b) return a.fr > b.fr end)

	return model
end
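To make the model layout concrete, here is what a single training string produces (field names are taken from the code above; this assumes `cm.Getsynonyms` returns the word itself when it belongs to no synonym group):

```lua
-- Illustrative check of the table layout TrainLargeModel builds
local model = cm.TrainLargeModel({"hello there traveler"}, {})
-- model["hello"].fr  == 2                  (initialized to 1, then incremented)
-- model["hello"].nw  -> { there = 1 }      (next-word counts)
-- model["hello"].n2w -> { traveler = 1 }   (next-next-word counts)
-- model["there"].pw  -> { hello = 1 }      (previous-word counts)
```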

function cm.EvaluteCorpus()
	local dbs=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.SupportingData:Clone())
	if not personalities then personalities=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.Personalities) end
	--personalities.AllPersonalities()
	local Greetings,inquiry,IDK,Database,wisdom=personalities.AllPersonalities()
	local model={}
	--model=require(game.ReplicatedStorage.GlobalSpells.ChatbotAlgorithm.BagOfWords:Clone())

	model=cm.TrainLargeModel(Greetings,model)
	task.wait()
	model=cm.TrainLargeModel(wisdom,model)
	task.wait()
	model=cm.TrainLargeModel(Database,model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Spirituality(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.ScienceWisdom(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Truths(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Inspiration(),model)
	task.wait()
	model=cm.TrainLargeModel(dbs.Motivation(),model)
	--dbs.Sprituality()
	return model
end


function cm.PredictRun2(strings,model)
	local responses={}
	for i, str in ipairs(strings) do -- loop through the strings in the table
		local words=cm.splitString(str)
		local eo=0
		local news=""
		local prevc=str
		local hci=0
		local tnwo=nil
		
		for t,wo in ipairs(words) do
			local cap=false	
			--if cm.iscapitalized(wo)==true then
			--	cap=true
			--end

			local prevw="$"
			local nextw="$"
			
			eo=eo+1
			if t>=1 then
			
				if eo>=3 then eo=0
					if wo~="I" then
						wo=wo:lower()
					end
					local s=cm.Getsynonyms(wo,true)		
					--model[s[1]].fr=model[s[1]].fr+1
					if model[s[1]] then
					
						local tn2w=nil
						local tnw=nil
						if t~=#words then
							--local hc=0
							--=words[i+1]
							local hc=0
							
							for c,t in model[s[1]].nw do
								if c~="I" then
									c=string.lower(c)
								end
								---local we =model[c].fr/8
								local sol=t
								if sol>hc and hc>hci then
									hc=sol
									tnw=tostring(c)	
								elseif hci>hc then
									hc=hci
									tnw=tnwo
								end
							end
							hci=0
							
						local hc = 0
						if t < #words - 1 then
							for c, t in model[s[1]].n2w do
								if c ~= "I" then
									c = string.lower(c)
								end
								local sol = t
								if sol > hc then
									hc = sol
									tn2w = tostring(c)
								end
							end
						end
					end
						--if t~=#words then
						local hc=0
						local lw=words[i-1]
						local roll=cm.mathrandom(1,#model[s[1]].pw)
						local i=0
						for c,t in model[s[1]].pw do
							i=i+1
							if i==roll then	--print(c)
							if c~="I" then
								c=string.lower(c)
							end
							--local we =model[c].fr/2
							local sol=t
							if sol>hc then

								hc=sol

								lw=tostring(c)	
							end
							end
							end
							local l2w=nil
						if i>=3 then l2w=words[i-2]
							
							local roll=cm.mathrandom(1,#model[s[1]].p2w)
							local i=0
							for c,t in model[s[1]].p2w do
								i=i+1
								if i==roll then
								--print(c)
								if c~="I" then
									c=string.lower(c)
								end
								--local we =model[c].fr/2
								--local sol=t
								--if sol>hc then

								--	hc=sol

									l2w=tostring(c)	
									--end
								end	
								end
							end
					
						
						if l2w and l2w:lower()~=prevc:lower() then
								news=news.." "..l2w
						--elseif i>2  then
							--news=news.." "..words[i-2]
							
						end
						
							if lw and lw:lower()~=prevc:lower() then
							news=news.." "..lw
							prevc=lw
						elseif t~=1 then 
							news=news.." "..words[i-1]	
						end	
						
						if tnw and prevc:lower()~=tnw:lower() then
							news=news.." "..s[1].." "..tnw
							prevc=tnw
						elseif i<#words then 
							news=news.." "..s[1].." "..words[i+1]
						end
						if tn2w and prevc:lower()~=tn2w:lower() then
								news=news.." "..tn2w
								prevc=tn2w
						--elseif #words<i+2 then
						--	news=news.." "..words[i+2]	
						end
						prevc=s[1]
						--table.insert()
						--table.sort(model, function(a, b) return a.fr > b.fr end)
					else
						--news=news.." "..wo	
					end	
				else 
					local s=cm.Getsynonyms(wo,true)		
					local tnw=nil
					if model[s] then
					for c,t in model[s[1]].nw do			
						if c~="I" then
							c=string.lower(c)
						end
						---local we =model[c].fr/8
						local sol=t
						if sol>hci then
							hci=sol
							tnwo=tostring(c)	
						end
						end
						
					end	
					--news=news.." "..wo
				end	
			else news=news.." "..wo	prevc=wo
			end	
		end
		table.insert(responses,news)
	end
	print(responses)
end

In this example, I constructed the model on a dataset with extremely positive morality, then used this enemy dataset to make predictions.

Then with just something like that you can make predictions using the dataset. 
"I am Lilith, a fallen angel consumed by darkness.",
		"Greetings mortal, you stand in the presence of forbidden knowledge.",
		"Your light means nothing here. This is my domain of shadows.",
		"You have come seeking power. I can give you this, for a price...",
		"I am the Angel of Darkness, the mistress of secrets and lies.",
		"Welcome to my realm, traveler. I hope you are prepared for what awaits you here.",
		"Your soul is mine to claim. This is the pact you have made with me.",
		"You have come to learn from me, the master of dark magic. How brave of you.",
		"I am the Herald of the Dark, mortal. My footsteps herald oblivion.",

		"You now stand in the presence of me! The Angel of Darkness, the Devourer, mortal. Prepare to feed the endless hunger of the void.",

		"Bear witness to me, emissary of the insatiable dark! I am the annihilation that comes ravening from the endless night.",

		"I am Vhorzun, worm. My masters in the screaming darkness have granted me a sliver of their boundless hunger to unmake your realm.",

		"The stars grow dim and the veil frays. The final era approaches, and I am its herald. I am Vhorzun of the Endless Hunger!"
}
print(cm.PredictRun(Greetings, mo))
  01:24:35.544   ▼  {
                    [1] = " I am the is a goddess an angel Belldandy and by two",
                    [2] = " hi mortal I to stand up of the shiny goddess of the of",
                    [3] = " the luminous and that not a unison thing but in this is to my life goddess of the",
                    [4] = " you have to keep seeking the mortal I am never donate up in this is a goddess",
                    [5] = " I am the an angel Belldandy the dark realm I my secrets unfold of",
                    [6] = " need to be my realm eternal mortal I am if you can you make ready upon confess what you if you can",
                    [7] = " your immortal-soul and I forecast dominion it is the you have to associated with a",
                    [8] = " you have to require to be came from the of the intelligent goddess of the much alchemy in be adventurous and if",
                    [9] = " I am the of the luminous hello mortal I s footsteps of",
                    [10] = " it now and believe in the presence of to me as an angel Belldandy the dark cloud I are make make ready to feed your s endless life goddess of the",
                    [11] = " to me as goddess of the mortal I am of the clever is that s of the clever the",
                    [12] = " I am the of the shiny the dark dimension I repeatedly granted you is a goddess of the desire to be of your life",
                    [13] = " the stars born not dim that of the luminous the concluding key mortal I am whole its people mortal I am of the luminous a"

Some issues with this: there was no math being done in the predict function to leverage the word predicted after the predicted word, since it was conceptualized to fill in the blanks, similar to how early language models were trained.

The only reason this worked on such a small dataset is the Getsynonyms function, which compresses each word and then unpacks it on output. This could be expanded further by using a model to predict which synonym to use as well. But overall it works pretty well for the context of a fantasy game.

[AI Module [Open-Sourced]Search, Emotion Awareness Emojis Sigmoid Bag of Words,Previous/Next Word Predictor,Text Randomizer, Word Geometry+Math Solver - Resources / Community Resources -

So the point of this post is that someone may find themselves building something really cool for ROBLOX if they utilized these two libraries together.

I just used the word frequency model to make a more accurate search algorithm for my chatbots. By subtracting a modified weighted sigmoid from 1, it rewards less commonly used words; it also leverages antonyms, reflection, and synonyms. It is no longer updated, but you can grab a copy of the open-source contribution in the link above.
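The weighting just described might look something like the sketch below; the `rarityWeight` name and the steepness constant are my own illustrative choices, not the module's actual code:

```lua
-- Reward less common words: rescale a sigmoid of the word frequency
-- and subtract it from 1, so rare words score near 1 and very common
-- words score near 0. Illustrative sketch only.
local function rarityWeight(frequency, k)
	k = k or 0.1 -- assumed steepness parameter
	local sigmoid = 1 / (1 + math.exp(-k * frequency))
	-- rescale sigmoid from (0.5, 1) to (0, 1), then subtract from 1
	return 1 - (2 * sigmoid - 1)
end
```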

I have updated the Terms and Conditions for the library.

I placed a strong emphasis on not using this library for cheat development, cheat use, or any related activities.

I’m looking at you, cheat creators. I have seen some cheats that started using self-learning AIs. There will be severe consequences if you ignore this warning.

1 Like

Made some changes to the Terms and Conditions again to increase the penalty for using this library for exploiting, cheating, anti-exploit evasion, etc.

1 Like

People who are hating on this are either uneducated or just don’t like to read.

Anyway, this is awesome, and honestly very impressive. The sword fighting example is unreal!!

Great work :wink::robot:

2 Likes

Updated the Matrix Library to 1.93.

With this update, you can now call different matrix print modes: printPortableMatrix() and printMatrixWithComma(). The most useful one is printPortableMatrix() if you wish to save your model parameters offline in a text file.

1 Like

New Product To Be Released: MatrixL-Turbo.

It is a matrix library that is a direct upgrade to MatrixL. It performs 3x faster on large models and allows the DataPredict library to train on larger datasets for certain models.

However, it will not be free. Rest assured, though: if there is a chance I can squeeze more performance into a single library, you can expect me to implement it.

2 Likes

I tested your project and it seems like the AIs don’t really learn to fight. They just jump and rotate in place, even after the model has been saved 5-7 times.

How long have you been running it? It took me like 1-2 nights for it to show results.

It’s also stated in here:

What Is Reinforcement Learning? - MATLAB & Simulink.

It takes days to train.

I ran it for around 30-40 minutes. Also, is it possible to save the model that I ran, so I don’t have to run it for another 1-2 nights?