Add a `vertical_overflow='crop_above'` as an option to `Live()`
Type of changes
- [x] Bug fix
- [x] New feature
- [ ] Documentation / docstrings
- [ ] Tests
- [ ] Other
Checklist
- [x] I've run the latest black with default args on new code.
- [x] I've updated CHANGELOG.md and CONTRIBUTORS.md where appropriate.
- [ ] I've added tests for new code.
- [x] I accept that @willmcgugan may be pedantic in the code review.
Description
As described in #3263, `Live(vertical_overflow="visible")` has the unfortunate behavior of duplicating content when a content update exceeds the height of the console. This is especially problematic for a few reasons:
- It seems there is no way to get rid of the duplicated content (i.e., `transient=True` doesn't help to avoid this).
- There are no other `vertical_overflow` options that allow the "newest" content to remain visible.
- In today's Gen AI world, where markdown often arrives in chunks, it's really useful to have a live display where a (possibly long) markdown string accumulates over time.
This PR proposes a new option, `vertical_overflow="crop_above"`, which does the reverse of `vertical_overflow="crop"` (it displays only the bottom portion of the content instead of the top). It has the nice behavior of always keeping the "newest" content visible, without the downside of duplicated content. Here's a demo:
```python
import requests
import time
from rich.live import Live
from rich.markdown import Markdown

# Fetch a long markdown document and split it into word-sized chunks
readme = requests.get(
    "https://raw.githubusercontent.com/posit-dev/py-shiny/refs/heads/main/README.md"
)
readme_chunks = readme.text.replace("\n", " \n ").split(" ")[:200]

# Accumulate the chunks into a growing Markdown render, as an LLM client would
content = ""
with Live(auto_refresh=False, vertical_overflow="crop_above") as live:
    for chunk in readme_chunks:
        content += chunk + " "
        time.sleep(0.01)
        live.update(Markdown(content), refresh=True)
```
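For intuition, here's a minimal sketch of the cropping semantics the option implies, written against a plain list of rendered lines rather than Rich's internal live-render machinery (the helper name and signature are illustrative, not the PR's actual diff): `crop` keeps the top `height` lines, while `crop_above` keeps the bottom `height` lines.

```python
from typing import List


def apply_vertical_overflow(lines: List[str], height: int, overflow: str) -> List[str]:
    """Illustrative helper: trim rendered lines to the console height.

    "crop" keeps the first `height` lines (the oldest content);
    "crop_above" keeps the last `height` lines (the newest content).
    """
    if len(lines) <= height:
        return lines
    if overflow == "crop":
        return lines[:height]
    if overflow == "crop_above":
        return lines[-height:]
    return lines  # "visible" leaves the lines untouched


# Example: a 3-line-tall console rendering 5 lines of content
lines = ["line 1", "line 2", "line 3", "line 4", "line 5"]
print(apply_vertical_overflow(lines, 3, "crop"))        # ['line 1', 'line 2', 'line 3']
print(apply_vertical_overflow(lines, 3, "crop_above"))  # ['line 3', 'line 4', 'line 5']
```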
https://github.com/user-attachments/assets/6425a645-2658-46a5-9b49-12c63a154063
And note that if you change `vertical_overflow="crop_above"` to `vertical_overflow="visible"`, this is the behavior:
https://github.com/user-attachments/assets/35f95d37-f1f3-4afb-a0ac-9d360fdac3d6
I'm happy to write tests or make any other changes you need if you like this overall direction.
👍
would love to see this!
@willmcgugan - any chance we can get a review?
This feature would be amazing for CLI LLM chat apps!
@willmcgugan sorry to spam you. I wanted to frame this a bit and explain why I think it's super important. LLM agents are proliferating pretty wildly at the moment, and one of the quickest ways to build really effective UIs that give users access to them is via CLIs. IMO, rich is the best tool for building beautiful CLI apps (in Python or otherwise), and I think this specific feature is the single biggest blocker to it being near-perfect for that use case.
You would be better off using Textual for LLM output. Better Markdown rendering as well.