Cannot work on place - Roblox Studio fails to save and publish, crashes on open/close

Place Link: Testing - Roblox

Starting a few days ago, I would commonly get “Failed to Autosave. Would you like to temporarily disable autosave?” after about 5-15 minutes of developing. Once this error appeared, attempting to download a copy or publish to Roblox would both fail (the file window would not appear at all). This happened to both me and my co-developer on Team Create. Downloading a copy of the place still gives “Failed to autosave” errors.

These problems became more and more frequent over the past few days, and now I can barely open the place at all. Most recently I was given this when opening the place, with Roblox crashing with an “unexpected error” shortly after:

Another common error I am seeing in the output is “Couldn’t create buffer: 8007000e”. I also recently got this when opening the place [It IS in a game]:

The only similar post I could find was this one: Here
Though it is not quite the same as what is happening to us. I also moved around and duplicated large part-count models recently, before the bug started to happen. (One time Ctrl+D on a large model caused Studio to crash outright; when I rejoined, there was no duplicate of the model.)

Also, I am aware the place file is very large and there is a huge number of parts, but I’ve worked on similarly sized places with no problems at all.

29 Likes

Is Team Create enabled?

4 Likes

Yes, this is a Team Create place, and it happens to anyone who attempts to download a copy. I’ve overwritten the place with a recent backup that seems stable when opened outside of Team Create, but it still leads to failed autosaves and an inability to download a copy after the game has been open for a short amount of time.

10 Likes

I’ve been trying everything over the past few days to fix this, and we’ve promised fans a big update soon. If anybody has any experience with issues like this, I would greatly appreciate any advice.

Adding a few more details from the past few days:

  • Problems have only gotten worse; now even backups from weeks ago refuse to open. They load all the way to 100% (hanging at 42% for a long time) and then give the error: “Local Library failed with Error 8: Not enough storage is available to process the command.”

  • Occasionally, after attempting to open one of the places, my display drivers will crash, resulting in a black screen for a few seconds. I’ve googled this, and it can apparently happen when there are memory problems.

  • I’ve tried reinstalling Studio, removing plugins, clearing the cache, etc., but none of this has had any effect.

  • There is a large number of unions (some very old); similar posts often reference corrupted unions.

  • The files will not upload to the DevForum.

  • Even backups from before any problems showed up cannot be opened.

  • This is not unique to my PC; my co-dev is experiencing the same thing.

  • I am able to play the place in play mode from the website.

9 Likes

The most recent Studio update seems to have improved things greatly. I can now reliably open backups of the game and save/autosave on them. I attempted to revert to the most recent Team Create place, but I was still not able to download a copy or autosave from that.
I’ve uploaded a backup over the place from before the corruption began, and it now seems to be working properly in TC, but I am still testing things to make sure it is stable.

9 Likes

I’ve been testing place files for the past couple of days and things have been mostly good, but I still got a “Failed to Autosave” today. Here is what happened:

I published a ‘clean’ copy to Team Create from before the issues ever appeared. When I tested it, I was able to autosave, save, and publish multiple times without problems. The ‘clean’ copy I used was, however, missing content that was added in the days after the issue appeared.
I attempted to transfer that content using a middleman place, going from the modern file > an empty TC place > the clean copy. This did work and I was able to save and publish, but I DID receive a “Failed to autosave” message after bringing over a decent chunk of stuff. Upon reopening the clean TC place after restarting my computer, it appeared fine, and I was able to save/autosave.

So things have gotten better, but I would still greatly appreciate some contact from staff on this issue to at least let me know what the cause of this was and if it is even safe to transfer from a post-issue place.

On one hand I am very happy that I have some working files, but I’ve been in the dark about this for almost a week at this point with no word from Roblox. It is very scary to go forward with this update when in the back of my head all I am thinking is that my future work may be corrupted too.

8 Likes

Thanks for reporting that. As far as I can see from your messages, Studio is complaining that it’s unable to allocate memory (RAM) from time to time. I’ve loaded your place (it’s quite large!), enabled Team Create, and noticed that with your place open, Studio alone is taking ~4 GB. Depending on how much memory you have on your machine and what other applications you’re running, this can be quite a lot. When you see these errors, please open Task Manager (Activity Monitor on Mac) and check how much memory is taken.
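As a rough cross-check alongside Task Manager, something like the command-bar sketch below can print the memory Roblox itself is tracking via the Stats service. This is only an approximate figure and will not match Task Manager exactly, since the OS counts overhead that Studio does not track.

```lua
-- Run in the Studio command bar: prints the total memory Roblox itself is tracking.
-- Note: this will be lower than what Task Manager reports, which includes OS-level overhead.
local Stats = game:GetService("Stats")
print(("Tracked memory usage: %.1f MB"):format(Stats:GetTotalMemoryUsageMb()))
```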

8 Likes

Could this place be loaded in a less memory-heavy way, such as a more basic version via editing rendering settings and the like? I’ve experienced this same issue before and would have loved to split the place file up so it is not as memory-heavy. I hope all is not lost :frowning:

1 Like

I do not have a place that reliably replicates the first crash where the terrain textures fail to load, but with the most unstable copy I have, my memory usage went from 3.56/16 GB at the desktop to 7.14/16 GB with the place open. Trying to download a copy from that place results in an instant “Error saving file - please try again or save with a different filename”, but my memory usage stays consistent at 7.14 GB during this. I also did not notice the memory going up during loading; it only jumped to 7.14 GB once the place had loaded.

My resource monitor is still showing a lot of “standby” and “free” memory available. I know the place is quite large, but outside of these issues we are able to build on it without any lag or anything of that sort. It is also a StreamingEnabled place, so we didn’t think the large size would affect players too much.

Is it possible it is hitting some sort of Studio limit rather than a hardware limit on our machines? Would there be a way for me to give Studio additional memory to work with?

8 Likes

There are no limits in Studio other than those imposed by the OS. I suspect that for some reason Studio is unable to get enough memory to work. Maybe it has something to do with crossing 8 GB. Let us experiment a bit more and see if we can reproduce the problem.

2 Likes

OK, thanks a lot. The place currently published to “Testing” is usually fairly stable, but there are later copies that fail to save pretty much every time. Version 4586 is one of the ones like that, if you have access to it. I could also revert Testing to that version if you wanted to look at it from there. I can’t get the files to upload to the DevForum, though; they may be too large, but they never complete the upload when I try.

8 Likes

After opening the game, the memory usage for Studio goes up to 3+ GB. Since Studio on Windows is a 32-bit application, 4 GB is the theoretical upper limit, so 3+ GB is already in dangerous territory: Studio may fail randomly when some memory allocation fails.

Sorry about that; for now, this seems to be a hard limit for Studio on Windows.

By the way, Studio on Mac is 64-bit. If possible, could you try opening the large game on a Mac?

I don’t have access to any Mac computers unfortunately.

Could you provide some advice on what to do going forward? Is there a way to reduce memory usage in the place? It was suggested that I load models in and out of ReplicatedStorage using a plugin, but this doesn’t seem to actually affect memory usage. Splitting the maps into different places is sort of a last resort if there is any other possible way.

Info on what uses the most memory, or things to look out for that could be eating up a lot of it, would be helpful. There are months of work in these files, so I would really like to get this working in some way or another, even if it means scaling some things back.
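One rough way to see what a place is mostly made of is to count instances by class from the command bar, as in the sketch below. It only reports instance counts, not actual memory, but large numbers of UnionOperations and MeshParts are usually the heaviest content, so it can hint at where to trim.

```lua
-- Command-bar sketch: count instances by ClassName across the whole place.
-- Counts are a proxy, not a memory measurement; unions/meshes tend to cost the most per instance.
local counts = {}
for _, inst in ipairs(game:GetDescendants()) do
	counts[inst.ClassName] = (counts[inst.ClassName] or 0) + 1
end

-- Sort classes by how many instances they have and print the top 15.
local sorted = {}
for className, count in pairs(counts) do
	table.insert(sorted, { className, count })
end
table.sort(sorted, function(a, b)
	return a[2] > b[2]
end)

for i = 1, math.min(15, #sorted) do
	print(sorted[i][1], sorted[i][2])
end
```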

4 Likes

After opening the game, activate the 3D viewport and press Ctrl+F6; you will see the MicroProfiler. Then go to Mode → Counters, where some memory usage figures are available.

I checked the game; the MicroProfiler result is not very helpful for locating which part of the data model is using the most memory, though it does show that undo/redo is taking too much memory. I am analyzing the code to see if we can improve the undo/redo design in Studio.
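For a rough per-category breakdown without opening the MicroProfiler, a command-bar sketch along these lines should also work; it walks the developer memory tags via Stats:GetMemoryUsageMbForTag, and the numbers should be treated as approximate.

```lua
-- Command-bar sketch: print memory usage per developer memory tag,
-- roughly the same categories shown in the Counters view.
local Stats = game:GetService("Stats")
for _, tag in ipairs(Enum.DeveloperMemoryTag:GetEnumItems()) do
	-- pcall in case a particular tag is not reported in this context.
	local ok, mb = pcall(function()
		return Stats:GetMemoryUsageMbForTag(tag)
	end)
	if ok then
		print(("%-30s %.1f MB"):format(tag.Name, mb))
	end
end
```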

Thanks a lot for your report! It is very helpful for us in analyzing the issue. :-)

Are there any plans to update the Windows version to 64-bit as well?

It’s really discouraging to know that we’re limited by outdated tech, whereas other game engines (Unity, Unreal Engine) are 64-bit.

1 Like

Alright, thanks so much.

If there is any amount of reduction from you guys it should help immensely, seeing as we are sort of on the ‘edge.’

Seeing as the game is StreamingEnabled, I take it the memory usage shouldn’t affect players in-game? (As in hitting the limit and causing failures, not general lag.)
If that is the case, maybe there could be some way to selectively disable some Studio features, such as disabling undo entirely? I don’t really know if such a thing is possible, but it was just an idea I had. I would be happy to work in a reduced mode of Studio if it meant the players could get a larger place.

5 Likes

@lll_xyz is researching it as we speak. We’re planning to convert it to 64-bit as soon as practical (we’ll plan the work once the research is complete).

5 Likes

Maybe it’s worth trying to write a plugin that automatically disables undo using ChangeHistoryService?
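That could be a very small local plugin; a minimal sketch is below. The toolbar and button names are just placeholders, and note that, as far as I understand it, disabling ChangeHistoryService also drops the existing waypoint stack, which is presumably where the memory savings would come from.

```lua
-- Minimal local-plugin sketch: a toolbar button that toggles Studio's undo/redo
-- history on and off via ChangeHistoryService. Toolbar/button names are arbitrary.
local ChangeHistoryService = game:GetService("ChangeHistoryService")

local toolbar = plugin:CreateToolbar("Memory Helpers")
local button = toolbar:CreateButton("Toggle Undo History", "Enable or disable undo/redo history", "")

local enabled = true

button.Click:Connect(function()
	enabled = not enabled
	ChangeHistoryService:SetEnabled(enabled)
	print("Undo/redo history enabled:", enabled)
end)
```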

1 Like

That’s super exciting to hear!

2 Likes