Update documentation around `--limit` flag restriction
Describe the feature or problem you’d like to solve
A number of commands have a `-L`/`--limit` option with a default (20 or 30) and can be supplied with `--limit n`, where n is a positive integer, e.g.:
gh issue list --state=open --limit=2000 --json=number
However, if the user needs to get a list of all of the open issue numbers, they need to either guess how many there might be or supply a very high number.
Proposed solution
Either have a `--nolimit` flag, or allow `--limit=0` or `--limit=-1` to mean no limit. This would allow things like pre-populating a list of all open tickets for pre-commit (and similar) hooks to use when validating that commit messages include only open ticket numbers, without potentially making multiple calls during commit-message processing.
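As a sketch of that hook use case: assuming a file `open_issues.txt` pre-populated with one open issue number per line (the filename and the `#123`-style ticket references are assumptions for illustration), a commit-msg hook could look like:

```shell
#!/bin/sh
# commit-msg hook sketch: reject commit messages that reference ticket
# numbers not present in the pre-populated open_issues.txt.
msg_file="$1"

# Extract every "#123"-style reference from the message, strip the "#".
for ref in $(grep -oE '#[0-9]+' "$msg_file" | tr -d '#'); do
  # -x matches the whole line, so "#1" does not match "12".
  if ! grep -qx "$ref" open_issues.txt; then
    echo "commit message references non-open ticket #$ref" >&2
    exit 1
  fi
done
```

This is exactly the scenario where the hook wants the complete list up front rather than querying the API per commit.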
Alternatively, a documented mechanism to fetch more items when piping output could be added; to me this seems more complex to both implement and use.
Additional context
Hi, thank you for the feature request, but it is by design that you must always specify the limit for fetching issues, pull requests, repositories, or basically any unbounded set of data on GitHub.
Having a "no limit" setting would be convenient but would also be tricky if people started using it in their workflows or scripts without giving it a second thought. For example, fetching all issues without a limit from this repository would likely complete in under a minute, but fetching all issues from some other repositories I belong to could potentially take many minutes. For example, the VS Code repository currently has 124,827 issues. Since we can only ever fetch 100 issues per request, we would need 1,249 sequential requests to fetch all of these issues. Now, if you take into account the overhead of a single API request to GitHub, multiply that by over a thousand, and also consider rate-limit restrictions, you can start to understand why unbounded fetching of data over the network is rarely a good idea.
So, if you need to fetch all issues from a large repository, specify a large --limit number to encompass all data that you are interested in. Specifying a large number as a limit forces you to consider what would happen if there actually were enough records to fill that limit.
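The advice above can be combined with a truncation check so that hitting the limit is at least noticed; a minimal sketch (the limit value and output filename are arbitrary, and `--jq` assumes a reasonably recent `gh`):

```shell
# Fetch all open issue numbers with a limit chosen well above the
# expected count, so silently hitting the cap is unlikely.
gh issue list --state=open --limit=5000 --json=number \
  --jq '.[].number' > open_issues.txt

# If we received exactly as many rows as the limit, the list may be
# truncated and a larger limit is needed.
count=$(wc -l < open_issues.txt)
if [ "$count" -eq 5000 ]; then
  echo "warning: --limit reached; results may be incomplete" >&2
fi
```

This keeps the "forced to consider the limit" property while making the failure mode visible instead of silent.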
However, we could document this restriction better, I agree! I'm going to re-classify this as a documentation issue.
If a simple-to-use continue or start-from mechanism were documented, that would allow a workaround.
@GadgetSteve Possibly, but I'm not sure how gh would display pagination info in its UI. Exposing lower-level pagination info doesn't seem like a great fit for gh right now, and if you need a way to continue paginating based on a previously fetched set, perhaps you should drop down to the GitHub API level manually, where you have access to the page parameter in the REST API or cursor-based pagination in the GraphQL API:
query ($endCursor: String) {
  repository(owner: "cli", name: "cli") {
    issues(states: OPEN, first: 100, after: $endCursor) {
      nodes {
        number
        title
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
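For completeness: the query above already has the shape that gh's own pagination helper expects — `gh api graphql --paginate` re-runs a query that declares an `$endCursor` variable and returns `pageInfo { hasNextPage endCursor }` — so fetching every open issue number could look like:

```shell
# gh api graphql --paginate repeats the request, feeding each page's
# endCursor back into $endCursor until hasNextPage is false.
gh api graphql --paginate -f query='
  query($endCursor: String) {
    repository(owner: "cli", name: "cli") {
      issues(states: OPEN, first: 100, after: $endCursor) {
        nodes { number }
        pageInfo { hasNextPage endCursor }
      }
    }
  }
' --jq '.data.repository.issues.nodes[].number'
```

This still makes one request per 100 issues, so the rate-limit and runtime caveats above apply unchanged.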
Since the API returns a flag (`hasNextPage`) to say that there is more data, including it in the response along with the `endCursor`, and adding a `--startfrom` option, could potentially allow a last line to be added to the output text, something along the lines of:
More entries to list; if needed use --startfrom=<endCursor>
The JSON response could include a last entry with fields of:
`NextPage: endCursor`
> could potentially allow a last line to be added to the output text something along the lines of:
That sounds good in theory, but how would we output this line in machine-readable mode and expect that current scripts won't trip over it, thinking it's an issue record?
> The JSON response could include a last entry with fields
That also sounds good in theory, but the top-level data type of our `--json` response is a JSON array. We can add more records to the array, but not extra fields, without changing the JSON array to a JSON object, which would break backwards compatibility with JSON consumers.
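To illustrate the problem: a typical consumer iterates the array assuming every element is an issue record, so a trailing pagination entry would silently produce bogus rows. A small demonstration with sample data (requires jq; the `endCursor` record is hypothetical):

```shell
# Today: every element of the --json array has the requested fields.
echo '[{"number":12},{"number":34}]' | jq -r '.[].number'

# If gh appended a pagination record to the same array, the same filter
# would emit "null" for it, quietly corrupting downstream output.
echo '[{"number":12},{"number":34},{"endCursor":"abc"}]' | jq -r '.[].number'
# prints 12, 34, null
```

Existing scripts have no way to distinguish that `null` from a genuinely malformed issue record, which is why the array shape is effectively frozen.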
On 22 Dec 2021, pdemko93 @.***> wrote:
-- Peter Demko
If this is open, can I pick it up?
@deepto98 Thank you for offering! Specifically what did you have in mind to implement?
I understand the limitation on fetching all issues or PRs at once, but it would be super helpful to be able to configure the default limit to a bit higher number, with a max upper limit of 50-100. IMO 30 is a very low number, but it also stops making sense after displaying more than 100 items.
Please have --limit 0 or --limit -1 indicate no limit. Keep the default at 20. Users who actually need everything will find out about 0 or -1. Most users won't.
Given the performance of returning workflow runs, users who do not actually need all items will be motivated to use a limit.
@mislav How about the suggestion from @Liturgist? I would like to contribute to this one.
The lower-level `gh api` has `--paginate` (but no `--limit`), which feels slightly inconsistent. That said, whenever I need something machine-readable, I might default to `gh api` anyway.
Running into challenges in that the API is limited to returning 1000 results in many cases (e.g. with the label filter, among others).
There doesn't seem to be a workaround in the CLI to make a request for the next 1000, etc.
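One possible workaround today, with two caveats: it bypasses `gh issue list` entirely, and the REST list-issues endpoint also returns pull requests, which are filtered out below via their `pull_request` key. The assumption here is that the 1000-result cap being hit comes from search-based filtering, which the plain list endpoint avoids:

```shell
# gh api follows REST Link headers itself when given --paginate,
# requesting page after page of 100 items each.
gh api --paginate "repos/cli/cli/issues?state=open&per_page=100" \
  --jq '.[] | select(has("pull_request") | not) | .number'
```

Label filtering can be done server-side with a `labels=` query parameter on the same endpoint, or client-side in the jq filter.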