50 is probably enough.
For this AI training place, the server moves the goal post (pun intended) when the AI reaches the goal (sorry).
I’d rather focus on making them stand upright first, though.
Are you giving the AIs information on all parts, or raycasting eyes?
Yes, though just a straight line forward, not sideways.
Do they have a humanoid inside?
Yes. Similar to a player’s character.
Done, inspired by AI Learns to Walk (deep reinforcement learning) - YouTube
ai playground.rbxl (67.0 KB)
Scripted this a bit as well.
Ah, thanks. Though the training might take a while due to work. Sorry about that.
would it be possible to use this for something like ai learning how to speak and understand things or whatever
Short answer:
No, if you are referring to self-learning AIs learning to speak. The self-learning AI can only be based on decision making. However, for the “understanding things”, it really depends. What do you exactly want it to understand?
Long answer:
If you really want the AI to learn how to speak, you need a lot of data. Use LSTM or RNN to train your AI. But even then, I doubt you can collect enough data for training.
Self-learning AIs with the capability to speak are not possible (yet). This library only covers self-learning AIs with decision-making skills that take in the environment and output certain actions.
For understanding things, I mean something like telling the AI to do something and then it does exactly what you want.
Not possible with this library. Though you could say something like it already exists: ChatGPT.
Don’t expect me to add that. I am employed right now and I don’t really have time for large scale projects like that.
For something like that you would need to prompt an AI model to write code. ChatGPT is the best API for this. You can send a system message telling it to only respond in code format, then receive the string from the API and execute it with `loadstring(response.response_text)()`. But you would need something like this:
function GPT4(inputText)
	-- Import the HttpService
	local HttpService = game:GetService("HttpService")
	-- Define the API URL
	local API_URL = ""
	-- Define the headers with your authorization key (define API_KEY yourself)
	local headers = {
		["Authorization"] = "Bearer " .. API_KEY,
	}
	-- Define a function to query the API with a payload
	local function query(payload)
		-- Encode the payload as a JSON string
		local jsonPayload = HttpService:JSONEncode(payload)
		-- Send a POST request to the API URL with the headers and the payload
		local response = HttpService:PostAsync(API_URL, jsonPayload, Enum.HttpContentType.ApplicationJson, false, headers)
		-- Decode the response as a JSON table and return it
		return HttpService:JSONDecode(response)
	end
	-- Query the API with your input text as the inputs field
	local output = query({
		["system_message"] = "You are a Lua coding assistant in the context of ROBLOX Studio. Only provide your response in code block format and only provide explanations in comments.",
		["inputs"] = chatmodule.randomizeString(inputText),
	})
	-- Extract and return the generated string
	local generatedText = output[1]["generated_text"]
	return generatedText
end
You can then save this string, if it passes, into a function key array. You can then use a function like the one below to input the prompt and store the generated command in an array. This is just a concept; I haven’t done it myself, but I’ve looked into it. I’m working on a library of action functions right now, but a more advanced version of this concept would be good for creating self-learning AIs that write their own code using an LLM.
-- Example of a table of labeled functions
local Labeldb = {
	["search query"] = wikidb,
}
-- Example of writing an input into the existing table
-- and receiving API output to store as a function in the array
function WriteData(playerinput)
	local savedfunction = GPT4(playerinput)
	-- loadstring compiles the received code string; store it so the
	-- function can be executed on demand
	Labeldb[playerinput] = function() return loadstring(savedfunction)() end
	-- Example of calling the stored command
	Labeldb[playerinput]()
end
He never mentioned the self-learning AI writing its own code.
It’s not about seeking approval. I’m just trying to explain my side of the story. If you don’t have an argument or don’t want to listen, just say so.
If you need to explain your side of your story AND post it in public where everyone can see, that is seeking approval. Otherwise you wouldn’t explain it here since you would be confident in your own opinion.
Yes, I think there is a bug in the library. The main reason why I think that is because the model failed to do a linear regression problem with only one data point. The cost ended up increasing instead of decreasing.
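For reference, with a sane learning rate the cost on a single data point should fall monotonically, which is why the increase looks like a bug. A minimal gradient-descent sketch in plain Lua, independent of the library (the point (2, 4) and learning rate are made up for illustration):

```lua
-- Fit y = w * x to the single point (x, y) = (2, 4) by gradient descent.
-- The cost (pred - y)^2 should shrink on every step.
local w, lr = 0, 0.1
for step = 1, 5 do
	local pred = w * 2
	local err = pred - 4
	local cost = err * err
	print(step, cost) -- cost decreases each iteration
	-- d/dw of (w*2 - 4)^2 is 2 * err * 2
	w = w - lr * (2 * err * 2)
end
```

If the same setup makes the cost climb instead, the gradient sign or the update rule in the library is the first place to look.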
I still don’t see how posting something in public equates to seeking approval. If you said something like it brings unnecessary attention, or that it’s making the thread too long, I would agree. You also shouldn’t ignore the fact that you also posted your stuff in public as well. Also, whether you post in public or not doesn’t determine how confident you are in your opinion.
At this point, believe what you want. I’m getting quite tired of this. I was never trying to prove that you’re 100% right or 100% wrong but rather both.
Why is it public? It was because you decided not to send issues to my private DMs and ended up crossing the line into being demanding. Add to that, you complained about so-called “issues” with my library when I had given you solutions, which led to even more complaining.
You asked and complained for a Double Q neural network, then you complained for a single output neuron when two output neurons are sufficient.
What really irritates me is that you wanted to change the way I do my solutions when my solutions are literally fit for these kinds of games.
If you don’t like it, go make your own. I would like to see your version work.
I spent 7 months on this project and you expect me to cater to all your demands.
You could have manually implemented those models with the code that you already have here, like I suggested, but no, you want me to do it for you.
Talk about entitled. And to quote YouTuber Louis Rossmann, “You are not investable”.
Okay then. Fair enough. I probably should’ve moved this conversation to DMs in the first place. Let’s move it now.
Here’s an example of how I set up this module in a working state, using this script to train on text data, from a couple of months ago, for those interested. I didn’t know what I was doing, but I got it running after some tinkering, some back and forth with Anthropic’s Claude 2 (with its 100,000-token context length), and analyzing the code of the RNN module.
-- Load RNN module
local RNN = require('RNN')

-- Sample text
local text = "[[Text data]]"

-- Create character <-> index mappings (unique characters only,
-- so duplicates don't inflate the vocabulary or overwrite indices)
local charToIndex = {}
local indexToChar = {}
local numChars = 0
for c in text:gmatch(".") do
	if not charToIndex[c] then
		numChars = numChars + 1
		charToIndex[c] = numChars
		indexToChar[numChars] = c
	end
end

-- Convert text to input/target pairs (each character predicts the next)
local inputs, targets = {}, {}
for i = 1, #text - 1 do
	table.insert(inputs, charToIndex[text:sub(i, i)])
	table.insert(targets, charToIndex[text:sub(i + 1, i + 1)])
end

-- Create RNN
local inputSize = numChars
local hiddenSize = 10
local outputSize = numChars
local rnn = RNN.new(100, 0.1, 'tanh', 0.01)
rnn:createLayers(inputSize, hiddenSize, outputSize)

-- Train
for epoch = 1, 100 do
	for i = 1, #inputs do
		rnn:train({inputs[i]}, {targets[i]})
	end
end

-- Predict the character that follows 'T'
local input = charToIndex['T']
local output = rnn:predict({input})[1]
print(indexToChar[output])
Still learning how to use it, but this example runs, and I hope to learn more in the future. When I asked ChatGPT what it thought of the result, it said it looks like the model needs to train more.
But if I were to use this to train a text-based bot, I think the best it could do on this platform would be akin to search engine optimization. But if we utilize multiple models, each trained only on a dataset of context-specific examples like the one my textbot has, it could potentially work. Here is one of my textbot’s category tables:
Greetings = {
	"Hello, there. I'm pleased to see you.",
	"Wow. You have a remarkable aura about you.",
	"Yay. You're here. Let's have some fun.",
	"Hello. I'm curious to meet you.",
	"Hey. You're amazing. Thanks for coming.",
	"Greetings. I'm honored to have you here.",
	"Howdy. You're so cool. Let's be friends."
}
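As a rough sketch of how a context-specific category table like this might be used, here is a hypothetical picker function (not part of any library in this thread):

```lua
-- Hypothetical helper: pick a random response from a category table.
local Greetings = {
	"Hello, there. I'm pleased to see you.",
	"Greetings. I'm honored to have you here.",
}

local function pickResponse(category)
	-- math.random(n) returns a random integer in [1, n]
	return category[math.random(#category)]
end

print(pickResponse(Greetings)) -- prints one greeting at random
```

A model trained per category would then only need to choose the category, not generate the full sentence.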
I also have a massive table of nested synonyms. These two scripts (1) split words into an array and (2) replace a word with the first synonym in the nested table, thus massively reducing vocabulary size while maintaining coherence. This would make the data simpler for the machine-learning algorithm to understand, thus decreasing its size and training time. I think a realistic goal could be for it to be able to construct sentences. This architecture would need to be worked on to include linear regression to speed up training time. I’m just thinking about it; I have other things to do right now, but I think it could work.
local synonympairs={
-- Introductory
{ "knowledgeable ", "informed", "studied", "well-versed", "practiced", "aware "},{ "smoothly ", "without trial ", "easy ", "relaxing ", "enjoyable "},{ "faring", "riding", "sailing"},{ "ditches ", "pits "},{ "careful ", "insightful ", "cautious ", "steady "}
,{ "events ", "things ", "occurences ", "situations "},{ "Did you know ", "Were you aware ", "Have you heard "},{ "trapped", "stuck ", "immobile "},{ "happening", "occuring", "going on", "preceding"},{ "need", "require ", "desire "},{ "sparkle in your eye! ", "keen eye for adventure! ", "heart of a warrior! ", "unyielding spirit of a legendary warrior! "},{ "legendary ", "mythical ", "fabled", "powerful ", "renowned", "valiant ", "glorious "},{ "unyielding", "determined", "hardened", "battle-ready", "stubborn tenacity"},{ " assistance ", " help "},{ " comment ", " state ", " share ", " tell "},{ "Howerever, ", "Moreover, ", "In addition, ", "Furthermore, "},{ "nothing", "not a single thing"},{ "share ", "spread ", "part "},{ "escape ", "get away "},{ " best ", " greatest "},{ " special ", " unique ", " one of a kind ", " one in a billion "},{ "offering", "bartering", "trading"},{ "giving", "offering"},{ "soul ", "essence ", "mana ", "immortal-soul "},{ " said, "},{ "stocked", "available "},{ "sells ", "barters ", "trades in ", "has available "},{ "find ", "discover ", "uncover "},{ "looking", "searching"},{ "liking ", "enjoyment ", "privy ", "tastes ", "sensitivities "},{ "value ", "worth ",},{ "given ", "bestowed", "relinquished"},{ "quantity ", "amount "},{ "quantities ", "amounts "},{ " devour ", " consume ", " feed-on ", " eat "},{ "warp ", "distort "},{ "strong ", "incredible ", "powerful "},{ "facts ", "knowledge ", "matters "},{ "infinite ", "unlimited"},{ "conjunction ", "along with "},{ " dimension ", " place ", " material plane "},{ "regenerate ", "recouperate ", "restore "},{ "topic ", "subject "},{ "entities ", "monsters "},{ "destructive ", "chaotic "},{ "absorb ", "assimilate ", "sap energy from "},{ "However, ", "Morever, ", "In addition, ", "Also, ", "Not to mention, ", "This includes, "},{ " encounter ", " see "},{ "trap ", "diversion ", "obstacle "},{ "minion ", "disciple "},{ "mindless ", "thoughtless ", "brainless ", "will-less "},{ "used", 
"harnessed", "portrayed", "pictured"},{ "touches ", "makes with contact with ", "contacts "},{ "feeling", "doing"},{ "infinite", "never-ending", "limitless"},{ "treasures ", "trinkets ", "artifacts ", "loot ", "spoils "},{ "untold ", "unforeseen ", "unspoken ", "unknown "},{ "decieve ", "fool ", "mislead ", "misguide "},{ "underground ", "subterranean "},{ "unsuspecting ", "innocent ", "credulous ", "easy ", "simple ", "unsuspicious "},{ "hungry ", "starving ", "famished"},{ "creature ", "monster ", "entity "},{ "anything", "everything"},{ "shape ", "form ", "structure "},{ "size ", "volume ", "area "},
{ "happy ", "joyful ", "cheerful ", "glad ", "delighted"}} -- ...etc
function chatmodule.splitString(str)
	local words = {}
	if str ~= nil then
		for word in str:gmatch("%w+") do -- %w+ matches one or more alphanumeric characters
			table.insert(words, word) -- insert the word into the words array
		end
	end
	return words
end
function chatmodule.randomizeStringLight(str, interphrases, randomize)
	local str = tostring(str)
	local words = chatmodule.splitString(str)
	if #words > 1 then
		-- Split the string into sentences
		local sentences = {}
		for s in str:gmatch("[^%.]+") do
			table.insert(sentences, s)
		end
		-- Loop through the sentences and replace any matching phrase
		-- with one from the same synonym group
		local newSentences = {}
		for _, s in ipairs(sentences) do
			local newS = s
			for _, phrases in ipairs(interphrases) do
				for _, phrase in ipairs(phrases) do
					if newS:find(phrase) then
						-- Pick a random phrase from the same group,
						-- or a fixed index when randomize is given
						local randomPhrase
						if randomize == nil then
							randomPhrase = phrases[chatmodule.mathrandom(#phrases)]
						else
							randomPhrase = phrases[randomize]
						end
						-- Replace the original phrase with the chosen one
						newS = newS:gsub(phrase, randomPhrase)
					end
				end
			end
			table.insert(newSentences, newS)
		end
		-- Join the new sentences and return the result
		return table.concat(newSentences, "")
	end
	return str
end
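A usage sketch, assuming the `chatmodule` above and the `synonympairs` table (the input sentence is made up, and the first call's output varies because replacement is random):

```lua
local line = "Did you know this dimension is infinite?"
-- Replace matched phrases with random synonyms from their groups
print(chatmodule.randomizeStringLight(line, synonympairs))
-- Passing an index picks deterministically from each group instead;
-- index 1 maps every phrase to the first entry of its group,
-- which is the normalization step that shrinks the vocabulary
print(chatmodule.randomizeStringLight(line, synonympairs, 1))
```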