Hello! Thanks for the great feedback! I totally agree that at least some form of Editable > Fixed would be very useful, and it should be available soon, likely early next year.
Since an unfixed EditableMesh can currently allocate and deallocate vertices and triangles, we can't predict its exact memory usage. Therefore, we're currently taking a conservative approach and assuming the worst-case scenario based on the potential size of an editable mesh. This will become less and less conservative over time. (That said, on some lower-end devices the limit is less than 8, so if you are creating an experience for all devices, make sure to always check myEditableObject == nil after calling the create API :) ).
Although FixedSize is not an available option for createEditableMesh yet, createEditableMeshAsync defaults FixedSize to true, which costs far less memory. Alternatively, you can always add faces/verts to a single editable mesh instead of spawning a new editable mesh every time.
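For anyone following along, here's a minimal sketch of the fixed-size async path plus the nil check mentioned above. The asset ID is a placeholder, and the exact failure behavior (error vs. nil return) may vary, so it's wrapped defensively:

```lua
local AssetService = game:GetService("AssetService")

-- CreateEditableMeshAsync defaults FixedSize to true, which reserves far
-- less memory than an unfixed editable mesh. Asset ID below is a placeholder.
local ok, editableMesh = pcall(function()
	return AssetService:CreateEditableMeshAsync(Content.fromUri("rbxassetid://1234567"))
end)

-- On lower-end devices the create API can come back empty when the memory
-- budget is exhausted, so always check before using the result.
if not ok or editableMesh == nil then
	warn("EditableMesh could not be created (memory budget exceeded?)")
	return
end
```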
I would like to see some sort of CreateMeshPartAsync property that auto-centers the bounding box while maintaining the original mesh size. I'm using this for procedural content and don't know either of them at the start, so right now I have to manually calculate a bounding box at the end, offset all of the vertex positions in my list, and resize the MeshPart after creation to the size of the calculated bounding box.
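For reference, the manual workaround described above looks roughly like this. This is an untested sketch that assumes an already-created `editableMesh`, and the method names follow the current EditableMesh API as I understand it:

```lua
local AssetService = game:GetService("AssetService")

-- Compute the mesh's bounding box from its vertices.
local minV = Vector3.new(math.huge, math.huge, math.huge)
local maxV = Vector3.new(-math.huge, -math.huge, -math.huge)
for _, id in editableMesh:GetVertices() do
	local p = editableMesh:GetPosition(id)
	minV = minV:Min(p)
	maxV = maxV:Max(p)
end

-- Offset every vertex so the bounding box is centered on the origin.
local center = (minV + maxV) / 2
for _, id in editableMesh:GetVertices() do
	editableMesh:SetPosition(id, editableMesh:GetPosition(id) - center)
end

-- Create the part, then resize it to the computed bounding box.
local part = AssetService:CreateMeshPartAsync(Content.fromObject(editableMesh))
part.Size = maxV - minV
```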
I currently have a script generating a mesh with a bunch of duplicate vertices. I thought that fixing that (welding the duplicates) would result in a performance gain, and in my case it went from 7000 vertices to 2000, but it caused the normals to be all wonky. I therefore used SetFaceNormals to make the normals correct for each face again, but the function DynamicGeometryManager::transcodeVerticesAndCalculateBounds (seen in the profiler) now appears to take 250 ms instead of the 3 ms it took before, which I find weird. And it is all due to the SetFaceNormals call: leaving the normals untouched keeps the function at the regular 3 ms, but manually changing them with SetFaceNormals makes it take 250 ms.
Given that my part of the code only takes 8 ms, this 250 ms delay is pretty ridiculous.
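For context, the pattern that triggers the slowdown for me looks roughly like this (a sketch assuming the usual AddNormal/SetFaceNormals flow; the flat-shading approach is just one way to restore per-face normals after welding):

```lua
-- After welding duplicate vertices, give each face its own flat normal
-- instead of the averaged ones the weld produced.
for _, faceId in editableMesh:GetFaces() do
	local vertexIds = editableMesh:GetFaceVertices(faceId)
	local a = editableMesh:GetPosition(vertexIds[1])
	local b = editableMesh:GetPosition(vertexIds[2])
	local c = editableMesh:GetPosition(vertexIds[3])
	local n = (b - a):Cross(c - a).Unit

	-- One normal ID shared by all three corners of the face (flat shading).
	local nid = editableMesh:AddNormal(n)
	editableMesh:SetFaceNormals(faceId, { nid, nid, nid })
end
```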
How exactly does this memory budgeting work? For scaling to lower-end clients, is the memory allocation based on the lowest-memory client? Is it just based on server memory? If I set my experience to only be available on a specific platform (such as PC), will I get a higher memory budget?
I can’t seem to find more info on the memory budgeting anywhere. Also, the documentation should specify that FixedSize is supposed to be passed to the :CreateEditableMesh() function.
Could you give a rough ETA on this? Is this something being actively worked on, or something more long-term? I’m using EditableMeshes for a large feature in my game, and I’m wondering whether it’s worth continuing to work on it or waiting until there’s further news.
Hello! It is definitely on the roadmap, but the exact timeline is not yet clear. We are currently investigating the performance of the unbounded EditableMesh to finalize the execution details. I expect to have a more detailed answer/ETA next month. I will keep you updated as we move forward.
Hello, memory budgeting depends on the client’s device, not the server (RCC) or the lowest-memory client in an experience. For example, an editable object is only created if doing so won’t negatively impact the running device’s performance or cause a crash. This is why it’s important to always check myEditableObject == nil after calling the create API. For CreateEditableMesh(), we’re currently taking a conservative approach and assuming the worst-case scenario based on the potential size of an editable mesh. This will become less conservative and more accurately reflect actual memory usage as we move forward.
These spikes add up quickly, causing total delays of 1 ms or more, which can severely degrade performance. This inconsistency makes it harder to rely on the method for realtime use.
What is causing this?
It appears to be repeatable too, it is the exact same each time.
I’m not sure if this is the right place to say this, but I’m noticing a lot of disappearing among editable meshes, which gets worse the further the pivot of the MeshPart is from the actual mesh. In the video, the pivot of each chunk is a few tens of studs below the actual mesh, but this still happens even when it’s closer.
Oh, and it also seems that if the Y coordinate is the same for all vertices in an editable mesh, then it just doesn’t render? The lakes in my generation all have a water height of 0, so when a chunk is entirely made up of a lake, all the vertices in that chunk have a height of 0 and they disappear…
Could someone please explain how FindVerticesWithinSphere works? I am trying to create my own Subtract async for MeshParts, but only for spherical intersectors, and an issue I am facing is that I have absolutely no clue how the function works. There is zero documentation on how you’re actually meant to calculate the size of the sphere, or whether it’s in object space or world space.
I tried it, but when I test it out it doesn’t work. I have the actual mesh at full size and another sphere mesh which I use to detect everything within it. I can put the sphere directly over a vertex and it just won’t be registered.
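In case it helps anyone else hitting this: from what I can tell, FindVerticesWithinSphere takes its center in the editable mesh’s own object space, not world space, and any render scaling from the MeshPart’s Size has to be undone manually. A hedged sketch of the conversion (assumes roughly uniform scaling; `findVerticesNearWorldPoint` is just my own helper name):

```lua
-- Convert a world-space sphere into the editable mesh's object space
-- before querying FindVerticesWithinSphere.
local function findVerticesNearWorldPoint(meshPart, editableMesh, worldCenter, worldRadius)
	-- World space -> part-local space.
	local localCenter = meshPart.CFrame:PointToObjectSpace(worldCenter)

	-- Undo the part's render scaling (Size vs. the mesh's native MeshSize).
	-- Assumes roughly uniform scaling; handle per-axis scale otherwise.
	local scale = meshPart.Size.X / meshPart.MeshSize.X
	return editableMesh:FindVerticesWithinSphere(localCenter / scale, worldRadius / scale)
end
```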
Would it be allowed to store an image in the datastore if it’s only ever shown to the user who made it?
For example, would it be allowed to make a drawing game that utilizes an EditableImage as the canvas, where the user can save their creation to the datastore ONLY so they can load it again later to work on it?