
Exporting to SavedModel?

Open · artmatsak opened this issue on Nov 25, 2020 • 2 comments

We have fine-tuned Electra for question answering on a custom dataset and now would like to export it to a SavedModel to use with TensorFlow Serving. We're using TensorFlow 1.15.4. Here's our code:

import numpy as np
import tensorflow as tf

from finetune import task_builder
from configure_finetuning import FinetuningConfig
from run_finetuning import ModelRunner

def get_config():
    finetune_config = FinetuningConfig('electra_large', 'data_dir/',
                                       model_size='large', task_names=['squad'],
                                       init_checkpoint='data_dir/models/electra_large/finetuning_models/squad_model_1/',
                                       model_dir='data_dir/models/electra_large/finetuning_models/squad_model_1/',
                                       do_train=False, do_eval=True,
                                       use_tfrecords_if_existing=False,
                                       eval_batch_size=3,
                                       predict_batch_size=3,
                                       n_best_size=1,
                                       max_seq_length=512)
    return finetune_config

finetune_config = get_config()
tasks = task_builder.get_tasks(finetune_config)
model_runner = ModelRunner(finetune_config, tasks)

# Dummy tensors; build_raw_serving_input_receiver_fn only uses their dtypes
# and shapes to create the serving placeholders.
features = {
    'task_id': tf.convert_to_tensor([0]),
    'squad_eid': tf.convert_to_tensor([0]),
    'input_mask': tf.convert_to_tensor(np.zeros((1, 512), dtype='int32')),
    'input_ids': tf.convert_to_tensor(np.zeros((1, 512), dtype='int32')),
    'segment_ids': tf.convert_to_tensor(np.zeros((1, 512), dtype='int32')),
    'squad_start_positions': tf.convert_to_tensor([0]),
    'squad_end_positions': tf.convert_to_tensor([0]),
    'squad_is_impossible': tf.convert_to_tensor([False])
}

# Build a serving input receiver from the dummy features and export the
# fine-tuned estimator as a SavedModel.
input_func = tf.estimator.export.build_raw_serving_input_receiver_fn(
    features, default_batch_size=None)

model_runner._estimator.export_saved_model('.', input_func)

This fails with the following error, though:

ValueError: Variable global_step already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?
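
In case it is useful, one direction we have been sketching (completely untested, reusing the imports and finetune_config from the snippet above) is to bypass the Estimator export path entirely: rebuild a prediction graph, restore the fine-tuned checkpoint with a plain Saver, and write the SavedModel with tf.saved_model.simple_save. The build_squad_serving_graph call below is a hypothetical placeholder; we have not figured out which part of the electra code would actually build the model and expose the start/end logits:

graph = tf.Graph()
with graph.as_default():
    # Serving inputs, shaped to match max_seq_length=512.
    input_ids = tf.placeholder(tf.int32, [None, 512], name='input_ids')
    input_mask = tf.placeholder(tf.int32, [None, 512], name='input_mask')
    segment_ids = tf.placeholder(tf.int32, [None, 512], name='segment_ids')

    # Hypothetical helper (does not exist in the electra repo as far as we
    # know): rebuild the fine-tuned SQuAD model and return its start/end
    # logit tensors.
    start_logits, end_logits = build_squad_serving_graph(
        finetune_config, input_ids, input_mask, segment_ids)

    saver = tf.train.Saver()
    with tf.Session(graph=graph) as sess:
        # Restore the fine-tuned weights from the finetuning checkpoint.
        saver.restore(sess, tf.train.latest_checkpoint(
            'data_dir/models/electra_large/finetuning_models/squad_model_1/'))
        # Write a SavedModel under a versioned directory for TF Serving.
        tf.saved_model.simple_save(
            sess,
            './saved_model/1',
            inputs={'input_ids': input_ids,
                    'input_mask': input_mask,
                    'segment_ids': segment_ids},
            outputs={'start_logits': start_logits,
                     'end_logits': end_logits})

If that produced a valid SavedModel, TensorFlow Serving could load it from the versioned directory via --model_base_path as usual.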

How can we export to a SavedModel, please?

artmatsak · Nov 25 '20 13:11

@artmatsak Hi, I am trying to do the same thing. Have you figured it out? Thanks for your reply.

652994331 · Jan 07 '21 00:01

Unfortunately, we haven't been able to figure it out.

artmatsak · Jan 07 '21 09:01