I tried this out for EditableImage:WritePixels and EditableImage:ReadPixels, and it actually doesn’t speed up the general use cases: even though WritePixels and ReadPixels themselves are much faster, reading and writing the data from a buffer is much slower than from a table. Use cases where a developer already wants to use a buffer are, of course, much faster because there is no conversion from buffer to table. We might add these APIs for flexibility, but developers would need to profile their individual use cases to see if they are faster.
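To make the tradeoff concrete, a rough sketch (not a benchmark): the first path uses today’s WritePixels signature, while the WritePixelsBuffer call at the end is hypothetical and only there for comparison.

```lua
-- Rough illustration of the tradeoff, not a benchmark.
local editableImage = Instance.new("EditableImage") -- created the way the current beta allows
editableImage.Size = Vector2.new(64, 64)

local pixelCount = editableImage.Size.X * editableImage.Size.Y

-- Table path: filling the table from Luau is fast; the WritePixels call is the slow part.
local pixels = table.create(pixelCount * 4)
for i = 0, pixelCount - 1 do
	pixels[i * 4 + 1] = 1 -- R
	pixels[i * 4 + 2] = 0 -- G
	pixels[i * 4 + 3] = 0 -- B
	pixels[i * 4 + 4] = 1 -- A
end
editableImage:WritePixels(Vector2.zero, editableImage.Size, pixels)

-- Buffer path: the write call itself would be faster, but filling the buffer
-- element-by-element via buffer.writef32 is slower than filling a table, so it
-- only wins when your pixel data already lives in a buffer.
local buf = buffer.create(pixelCount * 4 * 4) -- 4 channels x 4 bytes, assumed f32 layout
for i = 0, pixelCount - 1 do
	buffer.writef32(buf, i * 16, 1)      -- R
	buffer.writef32(buf, i * 16 + 4, 0)  -- G
	buffer.writef32(buf, i * 16 + 8, 0)  -- B
	buffer.writef32(buf, i * 16 + 12, 1) -- A
end
-- editableImage:WritePixelsBuffer(Vector2.zero, editableImage.Size, buf) -- hypothetical, not shipped
```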
There should really be a Thickness option for the DrawLine method, because things like drawing systems look very odd with a 1px line connecting each dot; it also kind of makes DrawLine useless in my current work, as its only job is to create a line that is 1px thick.
Right now, I’m resorting to a pretty hacky method where I emulate a thicker line by filling it in with the DrawCircle method (rough sketch below). It would be nice to have a straightforward option for this.
Here’s a video of this implementation:
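And for reference, the workaround boils down to something like this, a minimal sketch assuming DrawCircle’s current (center, radius, color, transparency) signature:

```lua
-- Emulate a thick line by stamping overlapping circles along the segment.
local function drawThickLine(image: EditableImage, p1: Vector2, p2: Vector2, thickness: number, color: Color3)
	local length = (p2 - p1).Magnitude
	local radius = thickness / 2
	-- Space the circles no more than one radius apart so no gaps show between them.
	local steps = math.max(1, math.ceil(length / math.max(radius, 1)))
	for i = 0, steps do
		local point = p1:Lerp(p2, i / steps)
		image:DrawCircle(point, radius, color, 0)
	end
end
```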
Would you be able to paint editable meshes with materials/textures as well? Or is it locked to a single color value?
The TextureID property of an EditableMesh’s parent MeshPart does not preview on the EditableMesh, but an EditableImage can be previewed on an EditableMesh. Are there plans to allow the parent’s TextureID to preview on the EditableMesh?
I’d like to use EditableMeshes to procedurally map the UVs according to a texture atlas I’ve uploaded as an image. This is currently not possible unless I manually recreate the texture atlas using an EditableImage.
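The UV side of it would look something like this — a sketch assuming SetUV(vertexId, uv) keeps its current shape and a simple square atlas of tilesPerRow × tilesPerRow tiles; the missing piece is being able to point the mesh at the uploaded atlas rather than an EditableImage copy of it:

```lua
-- Remap a vertex's 0..1 UV into one tile of a square texture atlas.
local function mapToAtlasTile(mesh: EditableMesh, vertexId: number, uv: Vector2, tileIndex: number, tilesPerRow: number)
	local tileSize = 1 / tilesPerRow
	local col = tileIndex % tilesPerRow
	local row = math.floor(tileIndex / tilesPerRow)
	-- Squeeze the original UV into the chosen tile's sub-rectangle.
	local atlasUV = Vector2.new((col + uv.X) * tileSize, (row + uv.Y) * tileSize)
	mesh:SetUV(vertexId, atlasUV)
end
```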
With the new vertex API overhaul, is it possible to have different materials/textures per vertex for use in texture blending? I’d love to be able to have blending textures without having to use the limiting and often finicky default terrain.
Calling editableMesh:GetAdjacentTriangles(triangleId) just returns {0, 0, 0} for me. Also, when the planet model in the attached file is scaled to over 100, editableMesh:RaycastLocal() becomes hit-and-miss, eventually not working at all and returning nil at scales over 120. See the repro file below:
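For reference, the failing calls boil down to this — triangleId, rayOrigin, and rayDirection are placeholders taken from the planet mesh in the attached place:

```lua
-- triangleId is a valid triangle id from the planet mesh.
local adjacent = editableMesh:GetAdjacentTriangles(triangleId)
print(adjacent) --> {0, 0, 0} instead of the neighbouring triangle ids

-- Once the planet model is scaled past ~100 this starts failing intermittently,
-- and past ~120 it returns nil every time.
print(editableMesh:RaycastLocal(rayOrigin, rayDirection))
```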
GetAdjacentTriangles Bug.rbxl (174.8 KB)
Thanks for the repro, I’ll take a look
Is there currently a way to export an EditableMesh? I can see the changes after parenting it to the original mesh, but I have no way of exporting it and inspecting it within Blender.
Did you manage to take a look yet? No worries if you haven’t. I’d just love to make progress on this project of mine, and EditableMesh plays a huge role in it.
I’ve been using EditableMesh to create a procedural world system and I thought I would share my feedback on it.
Normals
I find that the above behaviour causes more problems than it solves. It would be much better (in my opinion) to have a helper method that, when called, would generate automatic normals instead.
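Something along these lines, with a made-up method name just to illustrate:

```lua
-- Hypothetical helper: recompute smooth normals once, only when asked,
-- instead of the automatic behaviour described above.
editableMesh:ComputeNormals()
```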
Collisions
I have messed around a bit with EditableMesh collisions via the FFlagSimEnableEditableMeshCreateMeshPartAsync fflag. My only complaint would be that if I make a slight modification to an EditableMesh, calling :CreateMeshPartAsync() recalculates the collision geometry for the entire mesh instead of only the regions of the mesh that have changed.
Maybe this could be solved by a new CollisionMesh Instance, which would allow developers fine-grained control over collision geometry.
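To make that concrete, here is roughly the flow I mean — the CreateMeshPartAsync call is the one behind the fflag above, and the CollisionMesh lines are purely hypothetical:

```lua
local editableMesh = Instance.new("EditableMesh") -- however you already build your mesh
local vertexId = editableMesh:AddVertex(Vector3.zero)

-- A small, local edit...
editableMesh:SetPosition(vertexId, Vector3.new(0, 1, 0))

-- ...but rebuilding the part recomputes collision geometry for the whole mesh,
-- not just the region around the edited vertex.
local meshPart = editableMesh:CreateMeshPartAsync()

-- Hypothetical CollisionMesh instance (does not exist, just illustrating the suggestion):
-- local collisionMesh = Instance.new("CollisionMesh")
-- collisionMesh:UpdateRegion(editedRegion) -- recompute only what changed
```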
Materials
How does that work with an editable mesh planet? I’m really interested to see how that works in Roblox :o
Adding onto collisions: maybe it would be better if EditableMeshes got a ‘CanCollide’ property to allow for dynamic updates to an EditableMesh’s collision geometry.
Hey, as of today (31/3/2024) I have started to experience this issue, specifically when using a SurfaceAppearance.
Are you planning on releasing a resample mode property of sorts for the EditableImage? Right now my low-resolution textures are blurry.
I could reupload them at a higher resolution, but that would be bad for performance, and I would run into issues due to your imposed 1024x1024 resolution limit. (I use one EditableImage to load multiple textures, and then use UV coordinates to select which one for different areas of the mesh.)
EDIT:
It also leads to the textures leaking at the edges (I store all the textures for the editable mesh in one large EditableImage).
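A mitigation that helps with the bleeding (though not with the blurriness) is insetting each tile’s UV rectangle by half a texel, so bilinear sampling never reads the neighbouring tile. A minimal sketch, with atlasSize standing in for the EditableImage resolution in pixels:

```lua
-- Pull a tile's UV rectangle inward by half a pixel on every side.
local function insetTileUV(tileMin: Vector2, tileMax: Vector2, atlasSize: number): (Vector2, Vector2)
	local halfTexel = 0.5 / atlasSize
	local inset = Vector2.new(halfTexel, halfTexel)
	return tileMin + inset, tileMax - inset
end
```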
I found it very interesting, and I also thought it was a cool idea.
I wanted to make my own terrain generator, but I ran into a problem creating collision when using the EditableMesh:CreateMeshPart() function: Roblox Studio returns an error stating that this feature is disabled. Is there any alternative way to create collisions?
I would just make an algorithm to place parts around the player every frame, to simulate collisions.
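A very rough sketch of that idea, assuming the generator can already answer “what is the terrain height at (x, z)” — getHeightAt below is a placeholder for that:

```lua
local CELL = 4           -- stud spacing of the collider grid
local RADIUS = 3         -- how many cells around the player to cover
local colliders = {}     -- reused pool of invisible anchored parts

local function updateColliders(rootPart: BasePart, getHeightAt: (number, number) -> number)
	local index = 1
	local px = math.floor(rootPart.Position.X / CELL) * CELL
	local pz = math.floor(rootPart.Position.Z / CELL) * CELL
	for dx = -RADIUS, RADIUS do
		for dz = -RADIUS, RADIUS do
			local x, z = px + dx * CELL, pz + dz * CELL
			local part = colliders[index]
			if not part then
				part = Instance.new("Part")
				part.Anchored = true
				part.Transparency = 1
				part.Size = Vector3.new(CELL, 1, CELL)
				part.Parent = workspace
				colliders[index] = part
			end
			part.CFrame = CFrame.new(x, getHeightAt(x, z), z)
			index += 1
		end
	end
end

-- Call this from RunService.Heartbeat with the player's HumanoidRootPart.
-- updateColliders(character.HumanoidRootPart, getHeightAt)
```

For sloped terrain you’d also want to orient each part to the local surface normal, but the basic idea is the same.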
I’ve been thinking about the performance implications of this: is it faster or slower than the dynamic-mesh version of bone transformation?
I’m so intrigued by those clouds! How did you make them?