TextAttack
Can you provide a simple attack API for single-sentence attack tests?
Is your feature request related to a problem? Please describe. No
Describe the solution you'd like Here is a simple idea for a single-text attack, rewritten from the original batch attack function:
```python
import torch
import textattack
from textattack.attack_results import SkippedAttackResult, SuccessfulAttackResult


def simple_attack(self, text, label):
    """Attack a single (text, label) pair instead of a whole dataset."""
    if torch.cuda.is_available():
        self.attack.cuda_()
    example, ground_truth_output = text, label
    try:
        # Wrap the raw string so the attack can operate on it
        example = textattack.shared.AttackedText(example)
        if self.dataset.label_names is not None:
            example.attack_attrs["label_names"] = self.dataset.label_names
        result = self.attack.attack(example, ground_truth_output)
        # Apply the same skip/success filtering as the batch attack loop
        if (isinstance(result, SkippedAttackResult) and self.attack_args.attack_n) or (
            not isinstance(result, SuccessfulAttackResult)
            and self.attack_args.num_successful_examples
        ):
            return None
        return result
    except KeyboardInterrupt:
        raise
```
This way, the attacker is initialized as usual, but the dataset loading could be refactored, e.g., to accept a default empty dataset object instead of requiring a dataset to be loaded from a file:
```python
from textattack import Attacker
from textattack.datasets import Dataset

# A single placeholder example is enough; no file or hub dataset required:
# dataset = HuggingFaceDataset("sst", split="test")
# data = pandas.read_csv('examples.csv')
dataset = Dataset([("", 0)])
self.attacker = Attacker(recipe, dataset)
```
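For completeness, here is a minimal end-to-end sketch of how the two pieces could fit together. The model checkpoint, recipe, and example sentence are only illustrative, and `simple_attack` is the method proposed above, not an existing `Attacker` API:

```python
import transformers
from textattack import Attacker
from textattack.attack_recipes import TextFoolerJin2019
from textattack.datasets import Dataset
from textattack.models.wrappers import HuggingFaceModelWrapper

# Any sequence-classification model works; this SST-2 checkpoint is one example
model = transformers.AutoModelForSequenceClassification.from_pretrained(
    "textattack/bert-base-uncased-SST-2"
)
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "textattack/bert-base-uncased-SST-2"
)
model_wrapper = HuggingFaceModelWrapper(model, tokenizer)

recipe = TextFoolerJin2019.build(model_wrapper)
attacker = Attacker(recipe, Dataset([("", 0)]))  # placeholder dataset

# simple_attack is the proposed method above, patched onto Attacker
result = attacker.simple_attack("the movie was a wonderful surprise", 1)
if result is not None:
    print(result.__str__(color_method="ansi"))
```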
Describe alternatives you've considered No
Additional context I also suggest supporting a mirror for TF Hub, which would help people in China (who are unable to access tfhub.dev), e.g., https://github.com/yangheng95/TextAttack/blob/018c561b27182164dc76fb5e43516f7bc64801b1/textattack/constraints/semantics/sentence_encoders/universal_sentence_encoder/universal_sentence_encoder.py#L21 and https://github.com/yangheng95/TextAttack/blob/018c561b27182164dc76fb5e43516f7bc64801b1/textattack/constraints/semantics/sentence_encoders/universal_sentence_encoder/multilingual_universal_sentence_encoder.py#L23 . However, this is up to you.
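As a rough sketch of what mirror support could look like (the environment variable name here is hypothetical, and whether hub.tensorflow.google.cn mirrors these modules should be verified), the hub URL could simply be made configurable:

```python
import os
import tensorflow_hub as hub

# Hypothetical: read an optional mirror base URL from the environment,
# e.g. the TF Hub China mirror https://hub.tensorflow.google.cn
TFHUB_BASE = os.environ.get("TEXTATTACK_TFHUB_MIRROR", "https://tfhub.dev")
module_url = TFHUB_BASE + "/google/universal-sentence-encoder-large/5"
encoder = hub.load(module_url)
```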
Finally, I recommend distinguishing the names of the attacked_text attribute in original_result and perturbed_result, as I confused the two in my experiments and lost some time.
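To illustrate the confusion: both halves of an AttackResult expose the identical attribute name, so it is easy to read the original where the perturbation was intended (these attribute paths follow the current API):

```python
# Easy to mix up: the attribute name is the same on both sides
original_sentence = result.original_result.attacked_text.text    # clean input
perturbed_sentence = result.perturbed_result.attacked_text.text  # adversarial output
```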
Thanks very much for this great work!