Command line options for profiling?
Hi all (my apologies if this is answered somewhere!). When running Prusti on a large crate, is there some way to get a breakdown of how long verification takes per file, module, function, or similar? Thanks in advance!
Hi! Prusti's unit of verification is the function, but not everything Prusti does can be split per function. For example, rustc's execution (type checker, borrow checker, ...) and the translation from Rust to Viper happen once per Prusti run. The generated Viper programs (one per verified Rust function) are then verified sequentially. When running prusti-rustc, the timing of these steps can be seen by enabling the info log (PRUSTI_LOG=info). Once https://github.com/viperproject/prusti-dev/pull/1374 is merged, the timings can be seen at finer granularity by enabling the debug log.
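For concreteness, a minimal invocation might look like the sketch below. The binary names and file paths are placeholders, and whether you use prusti-rustc on a single file or the cargo-prusti wrapper on a whole crate depends on your setup:

```sh
# Single file: enable info-level logging to see the timing of each step.
PRUSTI_LOG=info prusti-rustc path/to/file.rs

# Whole crate (assuming the cargo-prusti wrapper is on PATH):
PRUSTI_LOG=info cargo-prusti

# After PR #1374, finer-grained timings should appear at the debug level:
PRUSTI_LOG=debug prusti-rustc path/to/file.rs
```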
Be aware that some timings are included in other timings. For example, `[... DEBUG viper::verifier] Completed: Viper consistency checks (1.29 seconds)` and `[... DEBUG viper::verifier] Completed: Viper verification (9.77 seconds)` are both part of `[... INFO prusti_utils::stopwatch::log_level] [prusti-server] Completed: verification (11.067 seconds)`. We should clean up the logging.
When profiling Prusti it's important to know that Viper runs on the JVM, so proper profiling should first warm up the JVM to give the JIT a chance to run. This can be done by starting a prusti-server (with the Prusti verification cache disabled) and then running Prusti multiple times against the same server. This warmup logic is already implemented in the benchmark.py script.
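A rough sketch of that manual warm-up workflow is below. The environment variable names for the server address and the cache (PRUSTI_SERVER_ADDRESS, PRUSTI_ENABLE_CACHE) and the port number are assumptions on my part; benchmark.py is the authoritative reference for the exact setup:

```sh
# 1. Start a standalone Prusti server in the background and note the port
#    it reports (12345 is just a placeholder here).
prusti-server &

# 2. Run Prusti several times against that server, with the verification
#    cache disabled so every run does real work. Only measure the later
#    runs, once the JVM's JIT has warmed up.
#    (PRUSTI_SERVER_ADDRESS / PRUSTI_ENABLE_CACHE are assumed names;
#    double-check them against the Prusti configuration documentation.)
for i in $(seq 1 5); do
  PRUSTI_ENABLE_CACHE=false \
  PRUSTI_SERVER_ADDRESS=localhost:12345 \
  PRUSTI_LOG=info \
  prusti-rustc path/to/file.rs
done
```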
Thanks very much Federico! We will try this out!