Neural Network Library 2.0

Does this basically generate a path for the AI like PathfindingService? Does it detect nearby objects and check whether they're players? If not, that's fine, but if it does, then it's big :brain: and it just makes the NPC or AI smart?

2 Likes

It shouldn’t. I didn’t write that it should anywhere but if you encounter a situation where I did write that it does or the library produces incorrect outputs, let me know.

LeakyReLU is better because it discourages negative activations while not completely cutting them off. It is necessary for some network designs in order to reduce the risk of dead nodes.
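For reference, here is a minimal sketch of LeakyReLU next to plain ReLU in plain Lua. The 0.01 slope is a common default, not necessarily what this library uses:

```lua
-- LeakyReLU: pass positive values through unchanged, and scale
-- negative values by a small slope instead of zeroing them.
local function leakyReLU(x, slope)
    slope = slope or 0.01
    if x > 0 then
        return x
    end
    return x * slope
end

-- Plain ReLU for comparison: negative inputs are cut to 0,
-- which is what can cause "dead" nodes.
local function reLU(x)
    return math.max(0, x)
end

print(leakyReLU(2))   -- 2
print(leakyReLU(-2))  -- -0.02
print(reLU(-2))       -- 0
```

Because LeakyReLU's negative side still has a nonzero slope, gradients keep flowing through nodes that would otherwise be stuck at 0.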

I didn’t write that you shouldn’t use TensorFlow; in fact, you should, if possible. It’s just that TensorFlow is for Python, not Lua. If you want an ML library for Lua, you have to use Torch. But, because Torch isn’t made for Luau and Roblox in general, a custom library like this is necessary.

3 Likes

I’m not really sure what you’re talking about. This ML library has nothing to do with generating paths like PathfindingService, unless you’re talking about self-driving vehicles, in which case, sort of.
Self-driving vehicles that use simple neural networks don’t generate a path; they just navigate through their immediate environment in the best possible way they were taught. This doesn’t replace pathfinding, but it can complement it.
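As a toy illustration of "navigating the immediate environment" (this is not the library's API; the weight is hand-picked purely for the example, where a trained network would learn it):

```lua
-- A single-neuron steering policy. Inputs are obstacle distances on
-- each side; the output is in [-1, 1], positive meaning "steer left"
-- (toward the side with more clearance), negative "steer right".
local function steer(leftDist, rightDist)
    local weight = 0.1 -- hand-picked for illustration
    local activation = weight * (leftDist - rightDist)
    -- clamping acts as a crude activation function here
    return math.max(-1, math.min(1, activation))
end

print(steer(10, 2)) -- 0.8: right side is blocked, steer left
print(steer(2, 10)) -- -0.8: left side is blocked, steer right
```

Each frame the vehicle only reacts to what it currently senses; there is no global route, which is why this complements rather than replaces pathfinding.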

2 Likes

Just thought I’d point out something in the example code that I think is a typo:

--2 layers with 2 nodes each, and 1 output 'out'.
local net = FeedforwardNetwork.new({"x","y"},2,3,{"out"},setting)

This code creates two layers with 3 nodes each, but the comment says there are two hidden layers with two nodes each. I got stumped on this and had to look at the code to figure it out, but others may find it confusing, so I thought I’d mention it.

side note:
A feature request: an unlimited number of layers, each with a selected number of nodes, would be nice. Ex:

-- 5 layers: the 1st being the inputs x, y; 2-4 being
-- hidden layers with 6, 4, and 3 nodes; and the 5th being the output.
local net = FeedforwardNetwork.new({"x", "y"}, {6, 4, 3}, {"out"}, settings)

I don’t think it would make much of a difference but if it’s possible, it would be nice.

2 Likes

I just have another question to improve my understanding: in your example code it’s basically just for loops, and when the counter reaches the number of generations it ends. How would I make it infinite and train the AI forever?

2 Likes

Hey, I don’t have much NN experience, but I want to ask: what’s the purpose of this function?

function isAboveFunction(x,y)
    if x^3 + 2*x^2 < y then
        return 0
    end
    return 1
end

And are there any resources where I can learn about Neural Networks?

2 Likes

He is summing the delta weights, I think. I usually just do weights * previousLayer, but I saw this method on 3Blue1Brown when working with biases.

3 Likes

But what’s its purpose? Is it necessary?

2 Likes

Yes, in machine learning at least, we are simply tweaking these weights to get a desired result. It is not alive; rather, it is finding the minima of a data model. This is why I cringe when I hear Elon talking about AI, btw :wink:
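"Finding the minima" can be shown in a few lines with gradient descent on a toy loss function. A sketch in plain Lua, nothing library-specific:

```lua
-- Minimize loss(w) = (w - 3)^2 by repeatedly stepping against the
-- gradient d(loss)/dw = 2 * (w - 3). The minimum sits at w = 3.
local w = 0                 -- initial "weight"
local learningRate = 0.1

for _ = 1, 100 do
    local gradient = 2 * (w - 3)
    w = w - learningRate * gradient -- step downhill
end

print(w) -- approaches 3
```

A real network does the same thing, just with thousands of weights and a loss computed from training data instead of a formula.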

4 Likes

Just forward propagate and backward propagate in an infinite loop, updating the weights each loop. I don’t really understand what you mean; maybe I am giving you an extremely oversimplified explanation.

3 Likes

Like @Kironte said, what a neural network actually does is receive inputs and produce outputs by passing them through activation functions. The data that is recorded, or trained in a sense, is the weights: the synapses, so to speak, that connect the neurons/nodes. In a sense, we are finding the minima to achieve our target goal from our data model.
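At its smallest, a "neuron" is just a weighted sum pushed through an activation function. A minimal sketch (sigmoid is one common choice of activation):

```lua
-- Sigmoid activation squashes any number into (0, 1).
local function sigmoid(x)
    return 1 / (1 + math.exp(-x))
end

-- One neuron: multiply each input by its weight (the "synapse"),
-- add a bias, then apply the activation function.
local function neuron(inputs, weights, bias)
    local sum = bias
    for i = 1, #inputs do
        sum = sum + inputs[i] * weights[i]
    end
    return sigmoid(sum)
end

-- 1*0.5 + 2*(-0.25) + 0 = 0, and sigmoid(0) = 0.5
print(neuron({1, 2}, {0.5, -0.25}, 0)) -- 0.5
```

Training is then nothing more than nudging `weights` and `bias` until the outputs match the targets.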

3 Likes

What kind of weight did you mean? This?

local weights = 100

for _ = 1, weights, 1 do
   -- ...
end
2 Likes

All this function does is determine whether the given XY coordinates are above the cubic function. To do this, it calculates the cubic’s Y at the given X and compares that Y to the given Y. Whether 0 or 1 means ‘above’ or ‘below’ doesn’t matter; what matters is that the labeling stays consistent and there is a distinct difference between the two possible outputs.
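Restating that with concrete numbers (the function is copied here so the snippet stands alone):

```lua
local function isAboveFunction(x, y)
    -- x^3 + 2*x^2 is the cubic's Y at this X; if the given Y is
    -- larger, the point (x, y) lies above the curve.
    if x^3 + 2*x^2 < y then
        return 0 -- point is above the cubic
    end
    return 1 -- point is on or below the cubic
end

-- At x = 1 the cubic's value is 1 + 2 = 3:
print(isAboveFunction(1, 10)) -- 0, since y = 10 is above 3
print(isAboveFunction(1, 0))  -- 1, since y = 0 is below 3
```

The function acts as the labeler for the training data: the network is then trained to reproduce these 0/1 labels from raw (x, y) inputs.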

3 Likes

The weights aren’t that simple to retrieve, especially in an OOP implementation where you need to grab them from every individual node. If you haven’t, you should check out the introduction page of the website to make sure you understand how weights work.
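For illustration, here is roughly what "grabbing them from every individual node" looks like against a hypothetical layer/node layout. This is not this library's actual structure, just a sketch of the idea:

```lua
-- Hypothetical structure: the network is a list of layers, each
-- layer a list of nodes, each node holding its incoming weights.
local network = {
    { {weights = {0.2, -0.4}}, {weights = {0.1, 0.3}} }, -- layer 1
    { {weights = {0.7, -0.9}} },                         -- layer 2
}

-- Walk every layer and every node to flatten all the weights
-- into a single array.
local function collectWeights(net)
    local all = {}
    for _, layer in ipairs(net) do
        for _, node in ipairs(layer) do
            for _, w in ipairs(node.weights) do
                all[#all + 1] = w
            end
        end
    end
    return all
end

print(#collectWeights(network)) -- 6
```

In an OOP implementation the nodes would be objects with accessor methods, but the traversal is the same: every layer, every node, every weight.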

1 Like

It is indeed a typo, I changed it afterward and forgot to change the comment.

This is actually already a feature, I just completely forgot to document it on the website as I didn’t think people would really need it. I’ll put it up when I get around to it!

2 Likes

No, like, take your model’s forward propagate and back propagate functions and just constantly update those weights (which are an array of numbers ranging from 0 to 1; I don’t know his model, but that’s how I do it).

local weights = {
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0
}

game:GetService("RunService").Heartbeat:Connect(function()
    forwardPropagate()  -- This tells us our outputs and how far off they were from the target
    backwardPropagate() -- This applies that delta change to the previous node layers
end)
1 Like

How would infinite node layers work? Oh nvm, I’m stupid; I thought you were trying to create a sigma (summation) with no end, which would mess up the ratios lol

1 Like

If I pass a vector X and a vector Z, I would expect a value for those vectors, but it gives random-looking values like 0.472. Can you tell me what the output actually represents?

1 Like

How can I implement reinforcement learning using this? The example uses supervised learning.

3 Likes

For this, you would use the GeneticAlgorithm class.
There is some example code for this here!
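To sketch the idea behind it (this is a generic genetic-algorithm loop in plain Lua, not this library's GeneticAlgorithm API): keep a population of candidate weights, score each with a fitness function, keep the best, and mutate copies of it for the next generation.

```lua
math.randomseed(42)

-- Toy fitness: best possible score is 0, achieved at w = 5.
-- In reinforcement learning this would be the reward earned by
-- a network while acting in the environment.
local function fitness(w)
    return -(w - 5)^2
end

-- Random initial population of candidate "weights".
local population = {}
for i = 1, 20 do
    population[i] = math.random() * 10
end

for generation = 1, 50 do
    -- Pick the fittest individual.
    local best = population[1]
    for _, w in ipairs(population) do
        if fitness(w) > fitness(best) then
            best = w
        end
    end
    -- Next generation: mutated copies of the best...
    for i = 1, #population do
        population[i] = best + (math.random() - 0.5) * 0.5
    end
    population[1] = best -- ...plus the best itself, unchanged (elitism)
end

print(fitness(population[1]))
```

No gradients or labeled examples are needed, which is why this style of training fits reinforcement-learning setups where you can only score behavior after the fact.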

3 Likes