rat-sql
A relation-aware semantic parsing model from English to SQL
Hi, can someone please share the CodaLab worksheets so I can see how it works and how to evaluate officially on the Spider challenge? Thanks.
Given a table name and a question, how does the model make predictions? I see the source code only provides the validation script.
when I run:

    def _debug(self, model, sliced_data, output):
        for i, item in enumerate(tqdm.tqdm(sliced_data)):
            (_, history), = model.compute_loss([item], debug=True)
            output.write(
                json.dumps({
                    'index': i,
                    'history': history,
                }) + '\n')
            output.flush()

    def compute_loss(self,...
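For anyone tracing this loop, here is a minimal self-contained sketch of what it does, with a stub model standing in for the real parser (the `DummyModel` class and the return shape of its `compute_loss` are assumptions inferred from the snippet above; `tqdm` is dropped to keep it dependency-free):

```python
import io
import json


class DummyModel:
    """Stub standing in for the real parser model (illustrative only)."""

    def compute_loss(self, items, debug=True):
        # The debug loop unpacks a one-element iterable of (loss, history)
        # pairs, so we return the same shape here.
        return [(0.0, [{'choices': len(items)}])]


def _debug(model, sliced_data, output):
    # Score each example one at a time and dump its decoding history
    # as one JSON object per line.
    for i, item in enumerate(sliced_data):
        (_, history), = model.compute_loss([item], debug=True)
        output.write(json.dumps({'index': i, 'history': history}) + '\n')
        output.flush()


buf = io.StringIO()
_debug(DummyModel(), ['item-a', 'item-b'], buf)
lines = buf.getvalue().splitlines()
```

The `(_, history), =` destructuring is the fragile point: it raises a `ValueError` whenever `compute_loss` returns anything other than exactly one `(loss, history)` pair.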
Hi, in your paper you say "For “oracle sketch”, at every grammar nonterminal the decoder is forced to choose the correct production so the final SQL sketch exactly matches that...
We are using a P100 and 25 GB of RAM to train the BERT-large model. But when we tried to run the default code with bs=6 and num_batch_accumulated=4, we got a CUDA...
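With bs=6 and num_batch_accumulated=4 the effective batch is 24; lowering bs and raising num_batch_accumulated keeps that effective batch while reducing peak GPU memory. A pure-Python sketch (no PyTorch; the one-parameter model and values are illustrative) showing that averaging accumulated micro-batch gradients reproduces the full-batch gradient:

```python
# Gradient of mean squared error for a 1-parameter linear model y_hat = w * x:
#   d/dw mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
def grad(w, xs, ys):
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
w = 0.5

# Full-batch gradient: bs=4, one optimizer step.
full = grad(w, xs, ys)

# Accumulated gradient: bs=2, num_batch_accumulated=2,
# averaged over micro-batches before the single optimizer step.
micro = [grad(w, xs[i:i + 2], ys[i:i + 2]) for i in (0, 2)]
accumulated = sum(micro) / len(micro)
```

The equality holds exactly here because the micro-batches are equal-sized; in practice each micro-batch only needs its own activations in GPU memory, which is why shrinking bs helps with CUDA out-of-memory errors.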
After I build the Docker image, when I run the preprocessing it shows:  I don't know why.
Hi, I have started experimenting with training this model. It seems it can make use of a large batch size, but even then the training times are quite...
Hi! Here are some more detailed questions about the code. When constructing the relations, there is a `match_foreign_key(cls, desc, col, table)` method (spider_enc_modules.py). At the end of the method (line 705...
Hi, how can I experiment with this work using other Transformer-based models (like RoBERTa, GPT-2, ...)? Thanks.
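One way people typically approach this (not confirmed by the RAT-SQL authors) is to make the pretrained-encoder choice a named, swappable component, since RoBERTa and GPT-2 need different checkpoints and different tokenizers than BERT. A hypothetical sketch; the registry, class names, and checkpoint strings are illustrative and not part of the RAT-SQL codebase:

```python
# Hypothetical registry for selecting a pretrained encoder by name.
ENCODER_REGISTRY = {}


def register_encoder(name):
    def deco(cls):
        ENCODER_REGISTRY[name] = cls
        return cls
    return deco


@register_encoder("bert")
class BertEncoder:
    # WordPiece tokenization, [CLS]/[SEP] special tokens.
    pretrained_name = "bert-large-uncased-whole-word-masking"


@register_encoder("roberta")
class RobertaEncoder:
    # RoBERTa uses byte-level BPE, so question/schema tokenization in the
    # encoder would have to change along with the checkpoint name.
    pretrained_name = "roberta-base"


def build_encoder(name):
    return ENCODER_REGISTRY[name]()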
Hi, I looked into the code of RAT-SQL. It seems the initial state of the decoder is zero instead of the final state of the question encoder. Am I understanding this...
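For context on what the question is contrasting: in attention-based seq2seq models a zero initial decoder state can be a deliberate choice, since attention over the encoder outputs recovers the question context at every step. A toy sketch of the two alternatives (all names, dimensions, and values are illustrative, not RAT-SQL's actual code):

```python
def init_decoder_state_zero(hidden_dim):
    # Zero vector: decoder relies entirely on attention over encoder outputs.
    return [0.0] * hidden_dim


def init_decoder_state_from_encoder(encoder_states):
    # Final encoder state: decoder starts from a summary of the question.
    return list(encoder_states[-1])


enc = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]  # toy per-token encoder states
```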