It should cache the intellisense information. It is recalculating it every single time you type, regardless of whether that library has changed at all.
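To illustrate what I mean by caching: the idea is to key each library's analysis result on a hash of its source, so typing in one script never re-analyzes unchanged dependencies. This is just a hypothetical sketch in Python (the names `AnalysisCache`, `get_or_analyze`, and `analyze` are made up for illustration, and this is not how Studio actually works internally):

```python
# Hypothetical sketch: cache per-module analysis results keyed by a hash of
# the module's source, so unchanged libraries are not re-analyzed on every
# keystroke. Not Studio's actual implementation.
import hashlib

class AnalysisCache:
    def __init__(self):
        # module name -> (source hash, analysis result)
        self._cache = {}

    def get_or_analyze(self, name, source, analyze):
        digest = hashlib.sha256(source.encode()).hexdigest()
        cached = self._cache.get(name)
        if cached and cached[0] == digest:
            return cached[1]  # source unchanged: reuse the cached result
        result = analyze(source)  # only re-analyze when the source changed
        self._cache[name] = (digest, result)
        return result

cache = AnalysisCache()
calls = []
analyze = lambda src: calls.append(src) or len(src)
cache.get_or_analyze("Lib", "return {}", analyze)
cache.get_or_analyze("Lib", "return {}", analyze)  # cache hit, analyze not called again
assert len(calls) == 1
```

Only the scripts that actually changed would pay the re-analysis cost; everything else would come back from the cache instantly.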
It makes development an absolute pain, coupled with the glitchy intellisense menus which disappear randomly and re-appear (separate bug already posted, and still not fixed).
In this video you can see it’s even a problem with game:…
You can also see how glitchy and erratic IntelliSense's behavior can be. I press Enter, but the menu just disappears, waits 2-3 seconds, and then re-appears, at which point Enter finally works.
Intellisense should be nearly instant and definitely cached where possible.
The Intellisense dialog should not mysteriously disappear right when you press Enter to select something, then magically re-appear a few seconds later and finally let the keypress work.
Yeah, I’ve been noticing this happening to me as well. I was wondering if it was because I have a large script. Personally, it doesn’t affect me that much, but it could definitely slow down programming.
Thank you for the bug report, do you have a place file that you’re willing to share to help us figure out what might be slowing down Autocomplete in this case?
Open the LocalScript “Test Confetti” and you can repeat the things I did in the video.
Did this start happening today?
We enabled some changes a few hours ago that should have improved the slowest scripts by 30% in a certain pet game, but if it caused a regression instead, we will disable it and re-evaluate.
Not sure, but in my experience this bug has been around for a long time. It even happens, though less often, when editing during Studio runtime.
It started happening a few weeks ago. We extensively use Luau and everything in our entire codebase is typed.
This is still a major problem for us and has slowed our development down significantly. It is now taking up to 30 seconds on the latest Intel hardware (i9-13900K, 128 GB RAM, etc…). Any update on this? Really appreciate all of the hard work that is being put into Luau, and I know we are pushing it - but I believe that Luau is the future and we have adopted it wholeheartedly.
To paraphrase my update in the other thread: We believe we have fixed the bug which causes Autocomplete to fail-to-accept, but the general Autocomplete slowness is harder to fully fix. Autocomplete is ultimately powered by the Luau type system, so the longer the script takes to typecheck, the longer autocomplete results will take.
We do have some ideas for improving Autocomplete’s performance, and particularly providing a more user-friendly failure mechanism - which we are actively working on right now. But, without a repro case it is difficult to know if it will improve your particular place.
Following up: it is completely broken in the project we’re working on now. No autocomplete whatsoever. All we get are improperly formatted error lines, random red/orange bars streaming across our screens, etc. Screenshot below (my code has been stuck like this in many different places).
Did this just happen today, or has it been going on for a while? What happens if you modify the script? If it’s consistent, can you send in a repro case by DM?
Started happening more recently. Intellisense / type-checking takes about 30 seconds on our project, and the type cache clears about every 15 seconds. We’ve reverted to basically just typing from memory at this point. We really need to get Intellisense working again. Thank you for all of the hard work. Having this fixed means the world to all of us.
Our team is actively working on the problem; we’re seeing promising results (reduced cache invalidations, and improved performance with a cold cache) in your repro case and will keep you updated on our progress, but we do not yet have an ETA on all the changes.