Humayun Mustafa


This code worked for me:

```{python}
import tensorflow as tf
import numpy as np
import squad_utils
from transformers import AlbertTokenizer

max_seq_length = 384

def output_fn(feature):
    pass

read_file = squad_utils.read_squad_examples("path-to-file-that-contains-SQuaD-example", False)
tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
features = squad_utils.convert_examples_to_features(read_file, tokenizer, max_seq_length, 128, 64, False, output_fn, True)
tokenized_values = tokenizer(read_file[0].question_text, ...
```
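For context, the feature-conversion step splits long passages into overlapping windows whose size and overlap are controlled by the max-length and stride arguments (384 and 128 above). A minimal sketch of that sliding-window idea, using a hypothetical helper rather than the actual `squad_utils` implementation:

```python
def sliding_windows(tokens, max_len, stride):
    """Split a token list into overlapping windows, mimicking the
    doc_stride behaviour of SQuAD feature conversion (sketch only)."""
    windows = []
    start = 0
    while start < len(tokens):
        windows.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # last window already covers the tail of the sequence
        start += stride
    return windows

# Example: 10 tokens, window of 4, stride of 2
print(sliding_windows(list(range(10)), 4, 2))
# → [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```

Each window overlaps the previous one by `max_len - stride` tokens, so an answer span near a window boundary is still fully contained in at least one window.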