feat: GPTSFTChatDataset alignment with OpenAI Messages, compatibility with packed sequences
[!IMPORTANT]
The `Update branch` button must only be pressed on very rare occasions. An outdated branch is never blocking the merge of a PR. Please reach out to the automation team before pressing that button.
What does this PR do?
- Updates the GPTSFTChatDataset to use the OpenAI messages format (see the sketch below)
- Updates sequence packing to be compatible with GPTSFTChatDataset
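For context, a chat sample in the OpenAI messages format looks like the following. This is a minimal illustration of the format, not a record from this PR; the field values are made up.

```python
# Illustrative example of the OpenAI "messages" chat format that
# GPTSFTChatDataset now consumes. The keys ("role", "content") follow
# the OpenAI spec; the content below is purely for demonstration.
sample = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is sequence packing?"},
        {"role": "assistant", "content": "Packing concatenates short samples into one sequence to reduce padding."},
    ]
}
```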
Collection: llm
GitHub Actions CI
The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.
The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR. To re-run CI, remove and re-add the label. To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
Before your PR is "Ready for review"
Pre checks:
- [ ] Make sure you read and followed Contributor guidelines
- [ ] Did you write any new necessary tests?
- [ ] Did you add or update any necessary documentation?
- [ ] Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex, etc.)
- [ ] Reviewer: Does the PR have correct import guards for all optional libraries?
PR Type:
- [ ] New Feature
- [ ] Bugfix
- [ ] Documentation
If you haven't finished some of the above items, you can still open a "Draft" PR.
Who can review?
Anyone in the community is free to review the PR once the checks have passed. The Contributor guidelines list specific people who can review PRs to various areas.
Additional Information
- Related to # (issue)
I reviewed the parts that I could and left comments, but for the changes in the GPTSFTChatDataset class, nemo/collections/llm/gpt/data/packed_sequence.py, and nemo/utils/sequence_packing_utils.py, it's best to get a review from @cuichenx, who is the PIC for SFT and packed sequences in NeMo. @cuichenx, could you please review these files? Thanks!
Also, for nemo/collections/llm/gpt/model/llama_nemotron.py, @suiyoubi could you please take a look? Thanks!
Just merged #13273. When you rebase, you can set `use_hf_tokenizer_chat_template: bool = True` in ChatDataModule to enable the chat template by default for tool calling. It would also be good to update the ChatDataModule documentation to specify the new default behavior.
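A hedged sketch of what that looks like after rebasing on #13273. Only `use_hf_tokenizer_chat_template` is confirmed by the comment above; the import path and the other constructor arguments are assumptions for illustration.

```python
# Sketch: enabling the HF tokenizer chat template by default.
# Only use_hf_tokenizer_chat_template comes from the comment above;
# the import path and other arguments are assumptions.
from nemo.collections.llm.gpt.data import ChatDataModule  # path assumed

data_module = ChatDataModule(
    dataset_root="/path/to/chat/data",    # hypothetical argument
    seq_length=4096,                      # hypothetical argument
    use_hf_tokenizer_chat_template=True,  # flag introduced in #13273
)
```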
Also, I think you return `context_ids` and `answer_ids` for each sample, but these fields are not used during SFT. If you'd still like to keep them, they can be added in utils.py.
@jenchen13 rebase completed; the functionality is merged. I only changed the default value in GPTSFTChatDataset, not in ChatDataModule, so that the behavior of the outer interface doesn't change.
For reference, the ChatDataModule that we use in Customizer is here; we base it on FineTuningDataModule and make it compatible with sequence packing.
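A minimal sketch, not the actual Customizer code, of what building a chat data module on top of the fine-tuning data module can look like. The class name and everything beyond `FineTuningDataModule` and `use_hf_tokenizer_chat_template` are hypothetical.

```python
# Minimal sketch (not the Customizer implementation): a chat-format data
# module that subclasses the fine-tuning data module so packed-sequence
# support is inherited. Names below are hypothetical except those
# mentioned in the comment above.
from nemo.collections.llm.gpt.data import FineTuningDataModule  # path assumed

class CustomizerChatDataModule(FineTuningDataModule):
    """Chat-format data module that stays compatible with sequence packing."""

    def __init__(self, *args, use_hf_tokenizer_chat_template: bool = True, **kwargs):
        super().__init__(*args, **kwargs)  # packed-sequence args pass through
        self.use_hf_tokenizer_chat_template = use_hf_tokenizer_chat_template
```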
[🤖]: Hi @soluwalana 👋,
We wanted to let you know that a CICD pipeline for this PR just finished successfully.
So it might be time to merge this PR or get some approvals.
//cc @chtruong814 @ko3n1g @pablo-garay @thomasdhc