Hey, thanks for trying out the example places.
Can you verify that you have turned on the Studio beta by going to File > Beta Features and checking “In-experience CSG improvements”, as shown in the image in the main post?
If you are looking for wireframe rendering within Studio to compare geometry before/after one of the operations, you can go to View > Wireframe Rendering to toggle On/Off that rendering mode at any point.
Here is a quick example I did with some simple Unions / Intersects on a block but it will really depend on what geometry you are trying to run CSG on:
Turns out my pc was just having a moment and it no longer drops frames.
In the most boring way possible, can you try turning the beta feature off, reboot studio then back on again?
This message is only possible when the flag isn’t on (looking at the code).
It was fixed by restarting Studio!
Also, is there a way to make it affect only the Metal material but not Plastic?
Nice.
That would be outside of the CSG Operation but if you’re using my examples, it’s certainly possible. Where the ray shooting is done, we check for a tag (CollectionService). You could also check the material type
if CollectionService:HasTag(raycastResult.Instance, "breakable") and someOtherCondition then
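Expanded into a fuller sketch (the `"breakable"` tag and `raycastResult` come from the example places mentioned above; the Metal check is just an illustration of the “check the material type” idea, and the function name is a placeholder):

```lua
local CollectionService = game:GetService("CollectionService")

-- Inside your raycast handler, after workspace:Raycast(...) returns a result:
local function shouldBreak(raycastResult)
	local part = raycastResult.Instance
	-- Only affect parts tagged "breakable" AND made of metal (not plastic)
	return CollectionService:HasTag(part, "breakable")
		and part.Material == Enum.Material.Metal
end
```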
The new CSG engine features work perfectly when tested in Roblox Studio with my procedural generation module script (though there are still issues with the PartOperation’s lighting). However, the script produces errors unrelated to the new CSG engine additions when I publish these changes to my game. Reverting to the old methods in-game makes the script work perfectly again. I’m wondering how this could be, as the only thing I’ve changed is the methods I use to subtract geometry from parts. If you need my module script, just let me know and I will DM it to you.
Oh, this might be a silly question, but does CSG work client-side now, without having to ping the server first and have it do the calculations before sending the result back to the client?
Oh and does it work with MeshParts?
I don’t use unions because I’m a 3D Blender artist, but I’d like to use this for things like destroying rocks or chopping trees, which I of course model in Blender as actual meshes.
But it also has to be client-side, since I only want it for visual effects and don’t want to create unnecessary input lag or use up internet bandwidth.
This is still in beta, so you can only use it in Studio. It’s not 100% stable yet, but look for an announcement in the future regarding this.
If this is the case, then they should change the name of the beta from “In-experience CSG improvements” to “In-Studio CSG improvements.”
It seems like SubstituteGeometry() has a hard time replicating changes to the client when you call it inside a RemoteEvent handler. It makes a single change per part, after which it can no longer replicate (though for some reason it works with the RemoteEvent handler in the SimpleTools example place).
Client:
Server:
repro:
SubstitudeGeometry replication bug repro.rbxl (104.6 KB)
(Click to perform SubtractAsync)
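For context, here is a minimal sketch of the server-side pattern the repro exercises (the RemoteEvent name and arguments are placeholders; this assumes the in-experience CSG beta APIs, GeometryService:SubtractAsync and BasePart:SubstituteGeometry):

```lua
-- Server script: subtract on request from a client, then substitute geometry
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local GeometryService = game:GetService("GeometryService")

local subtractEvent = ReplicatedStorage:WaitForChild("SubtractEvent")

subtractEvent.OnServerEvent:Connect(function(player, targetPart, cutterPart)
	local results = GeometryService:SubtractAsync(targetPart, { cutterPart })
	-- Swap the part's geometry for the subtraction result; this is the
	-- change that appears to stop replicating after the first call
	targetPart:SubstituteGeometry(results[1])
	results[1]:Destroy()
end)
```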
Yes, it does work client-side (without replication), but it does not work on MeshParts, as CSG starts from primitives.
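Since the new APIs run wherever they’re called, a purely client-side cut for visual effects can be sketched like this (assuming the Studio beta is enabled; the function and part names are placeholders):

```lua
-- LocalScript: visual-only subtraction, never replicated to the server
local GeometryService = game:GetService("GeometryService")

local function chipRock(rock, cutter)
	local ok, results = pcall(function()
		return GeometryService:SubtractAsync(rock, { cutter })
	end)
	if not ok then
		return
	end
	-- Replace the rock's geometry in place; other clients never see this
	rock:SubstituteGeometry(results[1])
	results[1]:Destroy()
	cutter:Destroy()
end
```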
Hey! Thanks for the bug report. I was able to reproduce it and I am trying to figure out what the issue is.
Roblox R6 siege knockoff incoming
Warning Bad Joke incoming:
Wouldn’t Roblox R15 siege be better?
How do I award reddit gold to that comment?
That makes sense.
Will we one day perhaps be able to use it on MeshParts (provided the mesh is fully closed with no holes and with all edges/etc connected)?
I think it has really great potential and many use cases for people (like myself) who exclusively work with meshes.
Another question I have: if I have a humanoid character where the unions are parented in the model, will this all render in a single draw call, or does every union still get its own draw call?
One of my biggest performance concerns with unions is draw calls: having many unique objects results in an insane amount of lag, which is another reason I often just use meshes.
@BelgianBikeGuy Is what he said true? The post says it supercharges existing experiences, but if it’s only for studio, it wouldn’t affect existing experiences then.
Blast, I think you’re misunderstanding a bit. The goal is to help improve “in experience” functionality. That being said, before new APIs and features are released projects usually go through various stages of availability (and stability). In very broad strokes:
- Internal development
- An alpha of some kind
- A closed beta (either internal or with select invited developers; sign-ups are usually asked for)
- A Studio beta, where the new features are available for testing (in-experience, when you test)
- A general open beta (you can publish with it, but there might still be some issues)
- A full release
Keep in mind these are general steps, and we might go back and forth between stages depending on issues found. Currently we are at the “Studio Beta” stage. Once the bugs and issues people (and we) find are resolved, we can then move on to the next stage.
In other words, the APIs are for “in-experience” situations, but for now they are only available for development, testing, and feedback.
Hope this clarifies it a little.
~BelgianBikeGuy