LLMSurvey

Comments on paper

Open ToddMorrill opened this issue 1 year ago • 3 comments

First - nicely done. This must have been a herculean effort to review all of these papers. Here are some ideas:

  1. It would be nice to include more information about Falcon when their paper is released (still "coming soon" per HF). In particular, it seems that the creators of Falcon made a decision to use multi-query attention with an eye toward inference speed. It might be nice to provide a little more detail about how different architecture choices (e.g., attention mechanisms) impact tokens generated per second, which is what engineers and the open source community are very focused on (along with quality of generation, of course). Tokens/second really impacts the user experience, and I would also love to see how people are thinking about truly enormous context sizes.
  2. This is a small point, and feel free to disregard it, but the word "besides" has a certain usage pattern among native English speakers. It's commonly used as follows: make a claim in your first sentence, then say "besides", and then make an even stronger claim that essentially says the first claim can be set aside. Here's an example: Tom would never survive life in the army; he's not tough enough. Besides, he's too old to be accepted. The point is that every time you use "besides" in the paper, you undermine the strength of the sentence before it, which is not what you're trying to do. One final note: "besides" is pretty colloquial and is seldom used in professional writing. What you're really looking for are the following three linking phrases: also, in addition, and furthermore.
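On the inference-speed point in item 1, the KV-cache savings from multi-query attention can be sketched with a back-of-the-envelope calculation: with a single shared key/value head, the cache shrinks by a factor equal to the head count, which is a big part of why MQA helps tokens/second at long context lengths. The model dimensions below are illustrative placeholders, not Falcon's official configuration:

```python
# Sketch: KV-cache size under multi-head vs. multi-query attention.
# All dimensions are illustrative placeholders, not any model's real config.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, batch, bytes_per_elem=2):
    """Bytes needed to cache keys and values for autoregressive decoding."""
    # Factor of 2 covers both the key cache and the value cache.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * batch * bytes_per_elem

layers, heads, head_dim = 60, 64, 128   # hypothetical large-model shape
seq, batch = 2048, 1

mha = kv_cache_bytes(layers, heads, head_dim, seq, batch)  # one K/V head per query head
mqa = kv_cache_bytes(layers, 1, head_dim, seq, batch)      # single K/V head shared by all query heads

print(f"MHA cache: {mha / 2**30:.2f} GiB")
print(f"MQA cache: {mqa / 2**30:.3f} GiB ({mha // mqa}x smaller)")
```

Smaller caches mean less memory bandwidth consumed per generated token, which is where the throughput gain shows up in practice.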

Again, thank you for your work here. There's so much happening in the LLM space, so up-to-date reviews like this are really helpful.

ToddMorrill avatar Jun 22 '23 00:06 ToddMorrill

Thanks for your suggestions! We'll add Falcon in the upcoming update, and we will also gradually fix the usage of "besides". We would like to include you in the acknowledgments. Could you please provide your name?

EliverQ avatar Jun 22 '23 13:06 EliverQ

Sure, it's Todd Morrill.

ToddMorrill avatar Jun 22 '23 13:06 ToddMorrill

Chaim

nmm5060 avatar Jun 26 '23 03:06 nmm5060