Neural Network Library Modified

Alright, so @Kironte made an epic NN library. Link: here

It’s extremely complicated for my not-very-math-talented brain, so I couldn’t make many improvements to the algorithm itself. To be honest, I am in awe of what he has accomplished.

But I digress. What I modified was the interface: I made the library a lot simpler to use and improved the general readability of the code.

The differences between the original and the modified version are listed below.

Kironte's:
local module=require(workspace.KNNLibrary) --Activating and getting the library

local network = module.createNet(2,2,3,1,"LeakyReLU") 	--Creating a network with 2 inputs, 2 hidden layers, 3 nodes per hidden layer, 1 output node,
														--and with LeakyReLU as the activation function
local vis = module.getVisual(network) 	--Creating the visual
vis.Parent = game.StarterGui			--and parenting it to StarterGui so you can see it as the Server
										
local counter=100000	--We will train for 100,000 iterations
local tim=tick()
for i=1, counter do
	local xCoo,yCoo=math.random(-400,400)/100,math.random(-400,400)/100 --For this test, our precision is between -4.00 and +4.00
	local correctAnswer=1
	if 2*(xCoo)^2+xCoo^3<yCoo then 			--The function we're using for this test is x^3 + 2x^2. We want the network to tell us whether or not
		correctAnswer=0						--a set of X,Y coordinates is above or below the function's line
	end
	module.backwardNet(network,0.01,{xCoo,yCoo},{correctAnswer})
	if tick()-tim>=0.1 then  --To avoid timeouts, we add a wait() every 0.1 seconds the script runs. This is to make sure even potato processors will work
		tim=tick()
		wait()
		print(i/counter*(100).."% trained.")
		module.updateVisualState(network,vis)
	end
end
print(module.saveNet(network)) --We print out the network just for fun and demonstration
local wins=0
local tim=tick()
for i=-400,399 do				--Here, we cycle through every coordinate between -4.00,-4.00 and +4.00,+4.00
	for d=-400,399 do
		local xCoo,yCoo=(d)/100,(i)/100
		local answer=module.forwardNet(network,{xCoo,yCoo})[1] --Since the output is an array, we have to index the number we want, in this case, 1
		local out=1
		if 2*(xCoo)^2+xCoo^3<yCoo then
			out=0
		end
		if math.abs(answer-out)<=0.4 then 	--Though this bit is completely up to the user, I set a tolerance of +-0.4
			wins=wins+1						--If the output is 0.6 while the answer is 1, I mark it correct.
		end
		--[[If you want that really cool fading effect like what I demoed, enable this code. Note that it will never finish
			testing with it.
			 
		if d%5==0 then
			module.updateVisualActive(network,vis,{xCoo,yCoo},1)
			wait()
		end	
		
		]]
	end
	if tick()-tim>=0.5 then --Let's add a wait() here too so we don't lag up too much
		tim=tick()
		print("Testing... "..(i+400)/(8).."%")
		wait()
		module.updateVisualActive(network,vis,{math.random(-400,400)/100,math.random(-400,400)/100},1)
	end
end

print((wins/640000*100).."% correct!") 	--This tells us overall, how accurate our network is at calculating whether points are above or
										--below the x^3+2x^2 cubic function
Mine:
local module = require(5654635113) --Activating and getting the library

local network = module.new(2, 2, 3, 1, "LeakyReLU")

local vis = network:getVisual()
vis.Parent = game.StarterGui

local counter = 100000	--We will train for 100,000 iterations
local tim = tick()
for i = 1, counter do
	local xCoo, yCoo = math.random(-400, 400) / 100, math.random(-400, 400) / 100
	local correctAnswer = 1
	if 2 * (xCoo) ^ 2 + xCoo ^ 3 < yCoo then
		correctAnswer = 0
	end
	network:backwardNet(0.01, {xCoo, yCoo}, {correctAnswer})
	if tick() - tim >= 0.1 then
		tim = tick()
		wait()
		print(i / counter * (100).."% trained.")
		network:updateVisualState(vis)
	end
end

print(network:saveNet()) --The colon call already passes the network as self, so it doesn't need to be passed again

local wins = 0
local tim = tick()
for i = -400, 399 do
	for d = -400, 399 do
		local xCoo, yCoo = (d) / 100, (i) / 100
		local answer = network:forwardNet({xCoo, yCoo})[1]
		local out = 1
		if 2 * (xCoo) ^ 2 + xCoo ^ 3 < yCoo then
			out = 0
		end
		if math.abs(answer - out) <= 0.4 then
			wins = wins + 1
		end
	end
	if tick() - tim >= 0.5 then
		tim = tick()
		print("Testing... "..(i + 400) / (8).."%")
		wait()
		network:updateVisualActive(vis, {math.random(-400, 400) / 100, math.random(-400, 400) / 100}, 1)
	end
end

print((wins / 640000 * 100).."% correct!")

The main difference is:

Kironte:

local network = module.createNet(2,2,3,1,"LeakyReLU")
local Result = module.forwardNet(network,{1,2})[1] -- Content right here
module.backwardNet(network,0.01,{1,2},{1}) -- just an example, this won't work

Mine:

local network = module.new(2, 2, 3, 1, "LeakyReLU")
local Result = network:forwardNet({1, 2})[1] -- Content right here
network:backwardNet(0.01, {1, 2}, {1}) -- just an example, this won't work

It’s a bit neater and slightly more readable.
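The colon calls work because of Lua’s method-call sugar: `network:forwardNet(...)` is shorthand for `network.forwardNet(network, ...)`, so the network table is passed as `self` automatically. Here’s a minimal sketch of how a wrapper like that can be built with a metatable — this is just an illustration of the pattern, not the actual Neural-Network-Modified source, and the names here are placeholders:

```lua
-- Sketch of an OOP-style wrapper using a metatable.
-- Network.new builds an object; methods declared with a colon
-- receive the object as an implicit `self` parameter.
local Network = {}
Network.__index = Network

function Network.new(inputs, layers, nodes, outputs, activation)
	-- A real implementation would construct the weight tables here;
	-- we just store the configuration for demonstration.
	local self = setmetatable({}, Network)
	self.config = {inputs, layers, nodes, outputs, activation}
	return self
end

function Network:forwardNet(inputValues)
	-- `self` arrives implicitly from the colon call, so callers no
	-- longer pass the network table in as the first argument.
	return {#inputValues} -- placeholder output, not a real forward pass
end

local net = Network.new(2, 2, 3, 1, "LeakyReLU")
print(net:forwardNet({1, 2})[1]) --> 2
```

The same trick is what lets `module.forwardNet(network, ...)` become `network:forwardNet(...)` without changing the underlying math.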

My module here: Neural-Network-Modified

I hope this helps y’all. Kironte is probably working on a better version right now.

All credit goes to @Kironte

(New to posting in devforum, if I did anything wrong, please tell me in DM’s or replies.)


Nice! I cringe when I look at my old pre-dictionary code. I am indeed working on a new library but I hope this module helps educate you and anyone who tries it about machine learning.
