
Add PrefixSuffix multiseq mappers to prepend/append tokens

Open MaksymDel opened this issue 2 years ago • 3 comments

Hi! With the introduction of Smashed, munging datasets of long documents is going to be a lot more fun!

This draft PR is simply to explore the idea below. It does the following:

  1. It adds a new CustomTokensSequencePaddingMapper, with corresponding classes for type_ids and attention_mask, that allows wrapping sentences with custom ids or strings.
  2. It abstracts SequencePaddingMapper into a general base class that adds prefix/suffix tokens depending on the sentence number.

In (1), we might want to prepend strings because Text2Text models like T5 expect inputs to carry task prefixes. And we might want to bound sentences with custom special token_ids (e.g., tokenizer-added special tokens) to indicate the type of sentences in the dataset column.
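The wrapping in (1) can be sketched roughly like this; note this is a minimal illustration, not Smashed's actual API, and the function name and token ids are hypothetical:

```python
def wrap_sequence(input_ids, attention_mask, prefix_ids, suffix_ids):
    """Prepend/append custom token ids to a sequence, extending the
    attention mask so the added tokens are attended to."""
    wrapped_ids = list(prefix_ids) + list(input_ids) + list(suffix_ids)
    wrapped_mask = (
        [1] * len(prefix_ids) + list(attention_mask) + [1] * len(suffix_ids)
    )
    return wrapped_ids, wrapped_mask


# e.g., bound a sentence with tokenizer-added special tokens
# <doc> (id 32000) and </doc> (id 32001)
ids, mask = wrap_sequence([101, 2023, 102], [1, 1, 1], [32000], [32001])
```

The same idea extends to type_ids, which is why the draft adds parallel classes for each field.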

Because of (2), subclasses now do not have to implement the transform function and only define what prefix/suffix tokens to add.
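The inheritance in (2) might look something like the sketch below; class names, method names, and the BERT-style token ids are illustrative assumptions, not the actual code in this PR:

```python
class BaseSequencePaddingMapper:
    """Base class owns the generic transform; subclasses only declare
    which prefix/suffix tokens to add for each sentence position."""

    def get_prefix_suffix(self, seq_index):
        raise NotImplementedError

    def transform(self, sequences):
        out = []
        for i, seq in enumerate(sequences):
            prefix, suffix = self.get_prefix_suffix(i)
            out.append(prefix + seq + suffix)
        return out


class ClsSepPaddingMapper(BaseSequencePaddingMapper):
    """BERT-style example: [CLS] before the first sequence only,
    [SEP] after every sequence."""

    def get_prefix_suffix(self, seq_index):
        prefix = [101] if seq_index == 0 else []  # [CLS] on first sequence
        return prefix, [102]                      # [SEP] on every sequence


mapper = ClsSepPaddingMapper()
result = mapper.transform([[7, 8], [9]])
```

Each concrete mapper then shrinks to a few lines, which is where the code-duplication savings come from.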

On the pro side, it reduces code duplication (especially considering the new CustomPadding classes) and unifies the classes. On the con side, we now have one more level of inheritance...

If some variation of this proposal fits, I can add docs and tests.

@soldni

MaksymDel avatar Aug 02 '22 05:08 MaksymDel

Hi Maksym!

Thank you for this pull request. I fully support the reasoning behind adding these mappers, but I would prefer to avoid mapper arguments that are callables, as they can create issues with libraries that inspect function signatures (I have another PR to remove that from MakeFieldMapper, actually).

In general, I would rather have more specialized mappers that do one thing well than fewer, more generic mappers that are highly configurable. That makes for more legible code, and it is easier to maintain.

soldni avatar Aug 03 '22 00:08 soldni

Hi Luca!

I removed callable arguments and added tests.

I think this is ready for review.

MaksymDel avatar Aug 05 '22 01:08 MaksymDel

Hi @soldni!

Does the current state of this PR look good, or is it better to redirect the efforts elsewhere at the moment?

MaksymDel avatar Oct 05 '22 15:10 MaksymDel