vscode-cpptools
Improve Find All References performance?
Our testing indicated that performance was on par with Visual Studio and that VS had some bugs that were causing worse performance (due to it sometimes getting stuck processing on 1 thread) -- @Colengms do you know the VS bug that you filed?
That said, there's probably more we could do to improve performance, but since the bottleneck is generally IntelliSense parsing, I'm not sure yet whether the gain would be worth the effort.
Generally, the "too long" case occurs when there are too many files to "confirm" -- users can cancel or preview the results if they don't want to wait for confirmation.
Let us know if anyone has a specific repro where you believe we're doing something incorrect (i.e. performance is slower than a similarly configured VS) that is causing bad performance or too-low CPU usage. For example, if you have a 200k-line file somewhere in your code base, it could get "stuck" lexing on 1 core.
The "searching" phase could also take slightly longer (than VS) for opened files because we don't re-use the already opened document object so we have to read it from disk.
Also, let us know if anyone encounters overly slow "canceling" or "previewing".
The VS issue is here: https://developercommunity.visualstudio.com/content/problem/672813/find-all-references-performance-issues.html
Can "searching" for files containing references be limited to workspace folders, if possible as an option? Since, IMHO, it may reduce no. of files to be "confirmed". And mostly IMHO, only references within the workspace are required.
It's 2021, but Find All References is still very slow for C++...
2022 now
@ljhm The speed isn't expected to improve by 2023 either, but you could try changing the settings C_Cpp.references.maxCachedProcesses, C_Cpp.references.maxConcurrentThreads, and C_Cpp.references.maxMemory to increase performance. Whether that helps depends on which part of the processing is the bottleneck in a particular invocation and on the amount of memory/CPU available.
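For reference, a minimal sketch of what that might look like in a workspace settings.json -- the values below are illustrative only, not recommendations, and the right numbers depend on your machine and project:

```jsonc
{
  // Number of IntelliSense processes kept cached for reuse (illustrative value).
  "C_Cpp.references.maxCachedProcesses": 4,
  // Number of threads allowed to work on reference confirmation concurrently (illustrative value).
  "C_Cpp.references.maxConcurrentThreads": 8,
  // Memory cap (in MB) for references processing (illustrative value).
  "C_Cpp.references.maxMemory": 8192
}
```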
I miss the speed of Source Insight, which was my development tool before VS Code. :cry::cry::cry: 2022.08.10 :cry::cry::cry:
It's fast if the files to confirm are already loaded. Can we add a command to pre-load all workspace files? My SSD has enough space.
@gtianw Do you mean creating an IntelliSense process for all the workspace files? You could file a feature request for that. It would use a lot of RAM, not SSD space; the disk isn't used for memory unless the OS is paging to it.
I mean creating ipch files for all the workspace files. They use disk drive space. Can't ipch be used to speed things up?
Can it be cached in a database (on the hard disk) so that cpptools gets the result from the database?
@gtianw ipch only has info on the headers for a particular TU, so it's already used to speed up TU creation.
@heartacker Yeah, we're aware of the possibility of writing references to the database -- that is being tracked internally (I suppose we could also open a new issue on GitHub). It's a major change, though, since the tag parser that writes to the database doesn't currently do a full compile (e.g. of includes/defines) or parse into function definitions.
Can we pre-process all tag info for a project and store it in a database to boost Find All References speed? The database could be generated once and distributed with the project, so devs can just download it instead of generating it locally.
@ODtian We actually already pre-process tag info for the whole project, but with a lexical parser. That approach requires additional symbol-confirmation work for Find All References, which slows it down. Semantic parsing requires a lot of processing time up front and (possibly) a larger database. This is something we're currently investigating.
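As a rough illustration of why that confirmation step exists (my own example, not taken from cpptools internals): a lexical index only matches names textually, so unrelated symbols that happen to share a name all show up as candidates until semantic analysis confirms which declaration each occurrence actually refers to.

```cpp
// Two unrelated types that both declare a member named "reset".
// A lexical tag index finds every textual occurrence of "reset",
// so both call sites below are candidates for either symbol;
// only semantic (IntelliSense) analysis can confirm that the first
// refers to Timer::reset and the second to Buffer::reset.
struct Timer  { void reset(); };
struct Buffer { void reset(); };

void tick(Timer& t)   { t.reset(); }  // confirmed as Timer::reset
void clear(Buffer& b) { b.reset(); }  // confirmed as Buffer::reset
```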