Matt Gardner

80 comments by Matt Gardner

Thanks, this is great! Yeah, I was thinking mostly for slides, to be able to collapse them, and it doesn't make as much sense for other exercises. I also several...

And for getting back to _exactly_ where you were, having individual slides have their own URL would also be awesome, but I don't know how feasible that is (looks like...

Hmm, and because the links to exercises have numbers instead of names, we'd have to modify links if the course content changes... That's unfortunate, but probably even harder to get...

Thanks, that did indeed work. Is there a place where this is or should be documented? On Mon, Feb 4, 2013 at 11:02 AM, Lane Schwartz [email protected]: > Multi-line task...

You can see how we handled this in the semantic parsing code here: https://github.com/allenai/allennlp-semparse/blob/c8bbe4e9fdf4fcb82af4e7c5360e80d51e0898eb/allennlp_semparse/state_machines/transition_functions/basic_transition_function.py#L80-L85. Making a similar change in the seq2seq code would be great, if you want to submit...

Sounds good to me, PR welcome. Though does this have to wait until we update our dependency on transformers?

Yes, having most of these inside allennlp would be great. I'm not sure that they fit the `Trainer` abstraction as you say, though, because of the assumptions the trainer makes....

If I were trying to do this, I'd implement a class very much like `TrainModel`, maybe called `CrossValidateModel`, that took whatever inputs it needed, then probably instantiated a trainer inside...
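To make the shape of that concrete, here is a minimal sketch of what a `CrossValidateModel` runner analogous to `TrainModel` could look like: it owns the fold loop and instantiates a fresh trainer per fold. All of the names and signatures below (`Trainer`, `trainer_factory`, `run`) are illustrative assumptions, not AllenNLP's actual API.

```python
# Hypothetical sketch: a CrossValidateModel that builds one trainer per fold,
# mirroring how TrainModel instantiates its trainer from config.
from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple


def k_fold_splits(data: Sequence, k: int) -> List[Tuple[list, list]]:
    """Split `data` into k (train, validation) pairs, one per fold."""
    folds = [list(data[i::k]) for i in range(k)]
    splits = []
    for i in range(k):
        validation = folds[i]
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        splits.append((train, validation))
    return splits


@dataclass
class CrossValidateModel:
    # A factory rather than a single trainer instance, so each fold gets a
    # freshly constructed trainer (and model) with no state carried over.
    trainer_factory: Callable[[list, list], "Trainer"]
    num_folds: int = 5

    def run(self, data: Sequence) -> List[float]:
        """Train on each fold and collect one validation metric per fold."""
        metrics = []
        for train, validation in k_fold_splits(data, self.num_folds):
            trainer = self.trainer_factory(train, validation)
            metrics.append(trainer.train())
        return metrics
```

The key design point from the comment above is that the fold loop lives outside the trainer, which keeps the `Trainer` abstraction's single-dataset assumptions intact.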

> For us to do that, we have to be able to save objects for which we don't have implementation code ...

I'm not sure I follow...

For vocab: `Model.save()` would have the same vocab saving logic that's currently spread throughout the trainer and the archival code. `Model.load()` would take an archive file that includes the vocabulary,...