I think that games should have an optional “Developer Console Enabled” setting.
Currently, the MicroProfiler shows spikes that come specifically from the messages the Developer Console prints. It’d be nice if there were a setting I could enable/disable that would prevent the console from even being loaded into the game.
Alongside eliminating the spikes, having this ability could make it more difficult for remote events to be reverse engineered, and make exploiting harder in games that haven’t been tested against it. Those users wouldn’t be able to load the developer console, and as a result they’d have to either create or load a custom one.
I have no need for the console to be enabled unless I’m debugging, which is not every time I play my game.
Sounds like the solution is to make the developer console not create performance spikes, rather than to give an option to disable it. Note that the performance spikes only begin once the developer console is opened for the first time, so as long as it’s never opened in the first place, the result is the same as it being disabled.
Exploiters do not use the developer console to reverse engineer remote events. It’s a really bad tool to use for that purpose.
Oftentimes they print out anything that the remote events receive, especially if their exploit can’t decompile scripts. They also tend to print out variables and such when testing their scripts. (Every time something is printed, it creates a spike.)
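For context, a minimal sketch of the kind of logging I mean: a LocalScript that prints everything a RemoteEvent delivers (“ExampleRemote” is a made-up name for illustration):

```lua
-- Hypothetical LocalScript: logging everything a RemoteEvent delivers.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local remote = ReplicatedStorage:WaitForChild("ExampleRemote") -- made-up name

remote.OnClientEvent:Connect(function(...)
	-- Every print() here adds a console message, which is where
	-- the per-message spikes come from.
	print("ExampleRemote received:", ...)
end)
```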
I actually considered just never opening it, except the issue still arises that there are always system errors, HTTP errors, unhandled errors (specifically from admin scripts taken from free models, and errors that simply haven’t been fixed yet), and warnings that pop up and create an annoying spike every time.
Also, I have a bad tendency of pressing F9 sometimes out of habit, and it’s rather annoying having to rejoin every time I do that.
The exploiter would just make their own developer console, or trick the client into thinking that the developer console should be available even though it is disabled. The real problem here seems to be that the developer console should not be creating GUIs or doing any other processing until it is enabled. There was a recent rewrite of the developer console; maybe a bug was introduced?
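As a rough sketch of what “make their own” would look like: LogService carries the same message stream the built-in console displays, so a setting that hides the console wouldn’t hide the messages themselves (rendering into a ScreenGui is left out here):

```lua
-- Sketch of a roll-your-own console: LogService exposes the same
-- message stream the built-in console reads.
local LogService = game:GetService("LogService")

local history = {}

LogService.MessageOut:Connect(function(message, messageType)
	-- A custom console would render these entries into its own ScreenGui;
	-- here we just collect them to show the hook point.
	table.insert(history, { text = message, kind = messageType })
end)
```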
If your RemoteEvents and RemoteFunctions can be successfully reverse engineered merely from the developer console, you’re really not putting enough security on them. However, I agree it shouldn’t be producing as much lag as it does.
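For example, a sketch of the kind of server-side checking I mean (the “BuyItem” remote and the item catalog are hypothetical):

```lua
-- Hypothetical server Script: treat RemoteEvent arguments as untrusted input.
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local buyItem = ReplicatedStorage:WaitForChild("BuyItem") -- made-up remote

local PRICES = { Sword = 100, Shield = 150 } -- made-up catalog

buyItem.OnServerEvent:Connect(function(player, itemId)
	-- Validate the type and the game state on the server; never trust the client.
	local price = typeof(itemId) == "string" and PRICES[itemId]
	if not price then
		return -- malformed or unknown request, ignore it
	end
	-- ...check the player’s currency and grant the item (omitted)...
end)
```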
For me, the lag spikes stop if I change away from the tab that is printing a lot. It’s not ideal though, as normally when I print a lot it’s because I need to see what is being printed.
I do think the solution is to improve the performance of the developer console. Allowing it to be disabled would only solve the issue in cases where seeing what is being printed doesn’t matter, and realistically, you should only be printing if it’s important.
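For instance, a common pattern is gating debug output behind a flag so live servers stay quiet (dprint is just a hypothetical helper, not a Roblox API):

```lua
-- dprint prints only while DEBUG is on, so shipped code stays silent
-- and the console stays cheap.
local DEBUG = false -- flip to true only while actively debugging

local function dprint(...)
	if DEBUG then
		print(...)
	end
end

dprint("state:", "only visible while debugging")
```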