Variables: interacting with a large R data.frame column hangs and then errors
With only a 1M-row data frame, trying to expand the column in the Variables pane hangs and then errors out:
df <- data.frame(b=rep(1:1000000))
Fixing this will require the mechanism described in https://github.com/posit-dev/positron/issues/1419#issuecomment-1810222809
Gotcha. I suspect the variables pane is also doing unnecessarily expensive computations that should be fixed.
Adding a screen recording for this:
https://github.com/posit-dev/positron/assets/329591/ff808aa8-6791-4e78-89da-ad3655f63305
Mentioned by a user here: https://github.com/posit-dev/positron/discussions/4424
See https://github.com/posit-dev/positron/issues/1419#issuecomment-1810222809 for some thoughts about how we could interrupt an R computation that is taking too long.
Well, it seems like part of this issue is that the variables pane logic for R is doing too much computation? Inspecting variables needs to be relatively fast no matter how large the dataset is. I went over the variables logic in Python recently to prevent many kinds of excessive computation, hangs, and runaway memory use, so we probably need to do the same type of review for R.
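To illustrate the general idea (a minimal sketch only, not Positron's actual implementation; `format_column_preview()` and the 1000-element cap are assumptions for illustration): cap how many elements of a column are ever formatted for display, so inspection cost stays bounded regardless of the column's size.

```r
# Hypothetical sketch: bound the work done to preview a column so that
# inspecting a 1M-row data frame costs roughly the same as a 1K-row one.
MAX_DISPLAYED <- 1000L

format_column_preview <- function(x, max_n = MAX_DISPLAYED) {
  n <- length(x)
  # Only format the first max_n elements, never the whole column.
  shown <- format(utils::head(x, max_n))
  if (n > max_n) {
    shown <- c(shown, sprintf("... (%d more elements not shown)", n - max_n))
  }
  shown
}

df <- data.frame(b = rep(1:1000000))
length(format_column_preview(df$b))  # 1001: 1000 values plus the "more" marker
```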
Definitely. Another part of this issue is that an r_task() must only take a very short amount of time. It's easy to trigger unexpected R computations via promise forcing or dispatch (S3, ALTREP), so we need a guardrail against unexpected situations.
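As a rough illustration of the kind of guardrail meant here (a sketch under assumptions, not Positron's r_task() machinery; the `with_time_limit()` helper is hypothetical), base R's `setTimeLimit()` can abort an evaluation that unexpectedly turns expensive, e.g. when inspecting a binding forces a promise:

```r
# Hypothetical guardrail: abort an inspection step that exceeds a small time
# budget instead of hanging the session.
with_time_limit <- function(expr, seconds = 0.5) {
  setTimeLimit(elapsed = seconds, transient = TRUE)
  on.exit(setTimeLimit(elapsed = Inf), add = TRUE)
  tryCatch(expr, error = function(e) {
    # Clear the limit before doing anything else, then report the failure
    # instead of hanging.
    setTimeLimit(elapsed = Inf)
    structure(list(message = conditionMessage(e)), class = "inspection_timeout")
  })
}

# Forcing a lazily bound value can trigger arbitrary computation; the guard
# turns a multi-second hang into a quick, reportable timeout.
delayedAssign("expensive", { Sys.sleep(5); rnorm(1e7) })
with_time_limit(format(expensive))
```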
Depending on what Daniel is planning to fix in the short term, we could open a new issue for this. Originally tracked in #1419, which was also about hangs related to reticulate usage.
Verified Fixed
Positron Version(s) : 2024.12.0-66
OS Version : OSX
Test scenario(s)
Reprex looks good. The Variables pane list is truncated at 1000.
Link(s) to TestRail test cases run or created: