I’m open-sourcing a plugin I built that writes and implements code in any experience, using external LLMs and the context of the experience’s existing code.
It’s currently working; this is not a “Please build XYZ for me” request.
If you’d like to contribute, collaborate, discuss, or use it for your own purposes, the GitHub link is below.
GitHub Link:
This plugin is different from the Roblox Assistant in two major ways:
Its code changes take into account the context of the experience’s scripts.
It can complete multiple tasks from a single prompt (changing or creating multiple scripts).
Link to try it now:
How it works:
The plugin reads all of the context of your game (the source of its scripts) to start a prompt. (A rough sketch of this flow is included after this list.)
It adds your natural-language prompt (whatever you asked it to do) to that game context.
It sends that combined prompt to the API model of your choosing (OpenAI GPT-4o, Anthropic Claude 3.5 Sonnet, or Google Gemini 1.5 Pro)
It uses your API key for that service.
It receives the response from the model via API, and breaks it into individual changes to complete.
It understands which script to make each change in, and places the changes for you.
It logs each change for you to review and either “Go-To” to investigate, or “Undo” to remove.
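To make the flow concrete, here’s a rough, simplified sketch of the read-context / build-prompt / call-the-API part in Luau. This is not the plugin’s actual code; it assumes OpenAI’s chat completions endpoint as the chosen provider, and the function names are made up for illustration.

```lua
-- Simplified sketch of the context-gathering and API-call flow (illustrative only).
-- Studio will ask for HTTP permission the first time the plugin makes a request.
local HttpService = game:GetService("HttpService")

local API_KEY = "YOUR_API_KEY" -- supplied by the user in the plugin's UI

-- Read the context of the experience: the source of every script.
local function gatherScriptContext(): string
	local chunks = {}
	for _, inst in ipairs(game:GetDescendants()) do
		if inst:IsA("LuaSourceContainer") then
			table.insert(chunks, ("-- %s\n%s"):format(inst:GetFullName(), inst.Source))
		end
	end
	return table.concat(chunks, "\n\n")
end

-- Combine that context with the user's natural-language request and send it to the model.
local function requestChanges(userPrompt: string): string
	local body = HttpService:JSONEncode({
		model = "gpt-4o",
		messages = {
			{ role = "system", content = "You edit Roblox scripts. Here is the experience:\n" .. gatherScriptContext() },
			{ role = "user", content = userPrompt },
		},
	})
	local response = HttpService:RequestAsync({
		Url = "https://api.openai.com/v1/chat/completions",
		Method = "POST",
		Headers = {
			["Content-Type"] = "application/json",
			["Authorization"] = "Bearer " .. API_KEY,
		},
		Body = body,
	})
	assert(response.Success, "API request failed: " .. response.StatusCode)
	return HttpService:JSONDecode(response.Body).choices[1].message.content
end
```

The remaining steps (splitting the reply into individual changes, placing them in the right scripts, and logging them) build on top of a response like this.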
This. is. insane. The problem with asking ChatGPT or any other AI is that they don’t have all the information or scripts, but this completely fixes it! I can’t wait to try this!
Yup! This will essentially do the copy/pasting of scripts to AI models for you, and then implement the code changes too.
Let me know if you have any challenges; the setup does call for getting your own API key to use. Google usually gives you $300 of free API credit when you sign up for a key. If this is your first time doing that, you can go here: https://ai.google.dev/ and hit the “Get API key in studio” button.
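If you want to sanity-check your key before pointing the plugin at it, something like this in the Studio command bar should work. It assumes Gemini’s v1beta generateContent endpoint (adjust the model name if Google has changed it), and you may need to allow HTTP requests or approve the domain when prompted.

```lua
-- Quick sanity check of a Google API key from the Studio command bar (illustrative).
local HttpService = game:GetService("HttpService")

local key = "PASTE_YOUR_KEY_HERE"
local url = "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-pro:generateContent?key=" .. key

local response = HttpService:RequestAsync({
	Url = url,
	Method = "POST",
	Headers = { ["Content-Type"] = "application/json" },
	Body = HttpService:JSONEncode({
		contents = { { parts = { { text = "Reply with the word pong." } } } },
	}),
})

if response.Success then
	local data = HttpService:JSONDecode(response.Body)
	print(data.candidates[1].content.parts[1].text) -- the key works if this prints a reply
else
	warn("Request failed:", response.StatusCode, response.Body)
end
```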
You could’ve saved 70% of the writing if you’d just bundled the UI as a model inside the plugin, but I guess writing it manually is a better idea for readability and easier maintenance.
Adding to your little comment:
-- It could be nice to be able to save API provider selections and API keys across user sessions.
The amount of tokens wasted is wild. If we’re making AI products that require the end user to foot the bill, we want to optimize the request to waste as few tokens as possible. Your request alone is ~1,300 tokens, and once you add the code supplied, god knows how many.
1,300 tokens might not seem like much (even though it is), but if you read the request body you’ll realize there’s too much yapping. Given how good LLMs are at reading and understanding code, we could cut at least 600 tokens from that same request and expect an answer that’s just as good, if not better.
LLMs don’t even know about Roblox’s API. OpenAI’s 4o provides internet access; have you not thought about supplying the Creator Documentation as a source? Huh, that could’ve solved many problems. Never mind, that’s only ChatGPT, not the native API model.
Even Roblox’s own Generative AI is already bad at what it does, and no one has made an AI coding plugin that comes remotely close to Roblox’s, at least not yet. If you want a proper AI solution, GitHub’s Copilot already does it better.
The GUI is all written in code because, as you said, it makes it much easier to share with AI models for maintenance. Ripping the structure and properties of GUI elements out of the Explorer is tedious and awkward when working with LLMs.
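For anyone wondering what “GUI written in code” means in practice, here’s a trimmed-down illustration of a plugin widget built that way. It’s not the plugin’s actual UI code, and the widget id and element names are made up.

```lua
-- Illustrative example of a code-defined plugin widget.
local widgetInfo = DockWidgetPluginGuiInfo.new(
	Enum.InitialDockState.Float, -- initial dock state
	true,  -- initially enabled
	false, -- don't override the previously saved enabled state
	320, 240, -- default floating size
	200, 150  -- minimum size
)
local widget = plugin:CreateDockWidgetPluginGui("AICoderWidget", widgetInfo)
widget.Title = "AI Coder"

local promptBox = Instance.new("TextBox")
promptBox.Size = UDim2.new(1, -20, 0, 60)
promptBox.Position = UDim2.new(0, 10, 0, 10)
promptBox.PlaceholderText = "Describe the change you want..."
promptBox.ClearTextOnFocus = false
promptBox.TextWrapped = true
promptBox.Parent = widget

local runButton = Instance.new("TextButton")
runButton.Size = UDim2.new(1, -20, 0, 30)
runButton.Position = UDim2.new(0, 10, 0, 80)
runButton.Text = "Generate changes"
runButton.Parent = widget
```

Since the whole widget is plain text like this, it can be pasted into a prompt along with the rest of the code.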
I’ve implemented the Plugin:SetSetting() functionality; it’s exactly what I was looking for. Good recommendation!
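For anyone else looking to do the same, the persistence boils down to something like this (the setting names here are illustrative, not the plugin’s actual keys):

```lua
-- Persist the provider choice and API key across Studio sessions via plugin settings.
local PROVIDER_KEY = "AICoder_Provider"
local API_KEY_KEY = "AICoder_ApiKey"

-- Restore the previous session's choices, if any.
local provider = plugin:GetSetting(PROVIDER_KEY) or "OpenAI"
local apiKey = plugin:GetSetting(API_KEY_KEY)

-- Save whenever the user changes them in the UI.
local function saveSettings(newProvider: string, newKey: string)
	plugin:SetSetting(PROVIDER_KEY, newProvider)
	plugin:SetSetting(API_KEY_KEY, newKey)
end
```

One caveat: as far as I know, plugin settings are saved in plain text on the user’s machine, so it’s worth telling users that their key is stored locally.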
Agreed that the token length of my prompt is very long. It got that long by growing through testing, though, and it has basically beaten the models into reliable submission, so any cutting could lead to issues. I’m happy to try any specific recommendations there; however, at $3/million tokens, cutting 600 tokens is cutting about 1/5th of a cent. The majority of the cost comes from the contents of the game you’re including; my current requests are about 200k tokens.
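To put numbers on it: at the quoted $3 per million tokens, 600 tokens comes out to roughly $0.0018 (about a fifth of a cent), while a 200k-token request runs roughly $0.60, so the game context dominates the cost, not the instruction prompt.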
I disagree with your statement that Roblox’s Generative AI coding is better than OpenAI’s or Anthropic’s models. Those models are likely (almost certainly) trained on all of the documentation and DevForum posts, and in my experience I get phenomenal answers from their APIs that are better than what the Roblox Assistant can provide, especially because the Roblox Assistant doesn’t take into account the context of your experience.
GitHub Copilot is a valid option; however, I think getting to a dev environment setup where you’d be using GitHub Copilot is not a reasonable expectation for most Roblox developers (the target audience of this plugin).