PRATIK C
How to visualize attention weights in the output?
> You would be much better off using LIT for this task, as opposed to WIT. https://pair-code.github.io/lit/ > > LIT has built-in support for most NLP analysis tasks, including attention...
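Independently of LIT, the core of the task is plotting a tokens-by-tokens attention matrix as a heatmap. A minimal, library-agnostic sketch (the random logits below stand in for whatever one head of your model returns; the `softmax` helper and token list are illustrative, not from any specific library):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax so each row sums to 1
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

tokens = ["the", "cat", "sat", "down"]
rng = np.random.default_rng(0)
scores = rng.normal(size=(len(tokens), len(tokens)))  # stand-in attention logits
attn = softmax(scores, axis=-1)                       # rows sum to 1

# Render as a heatmap (guarded so the sketch still runs headless).
try:
    import matplotlib
    matplotlib.use("Agg")
    import matplotlib.pyplot as plt
    fig, ax = plt.subplots()
    ax.imshow(attn, cmap="viridis")
    ax.set_xticks(range(len(tokens)), labels=tokens)
    ax.set_yticks(range(len(tokens)), labels=tokens)
    fig.savefig("attention.png")
except ImportError:
    pass
```

Tools like LIT wrap exactly this kind of plot, one panel per layer and head.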
> Leverage the "inference_on" parameter; it has been updated to make multi-GPU usage more intuitive. -1 is reserved for CPU, and `0` through `998` are reserved for GPUs....
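The convention described above can be sketched as a small helper (`resolve_device` is a hypothetical name for illustration, not part of the library's API):

```python
def resolve_device(inference_on: int) -> str:
    """Map the `inference_on` convention to a device string.

    Assumed convention from the comment above:
    -1 selects the CPU; 0 through 998 select a GPU by index.
    """
    if inference_on == -1:
        return "cpu"
    if 0 <= inference_on <= 998:
        return f"cuda:{inference_on}"
    raise ValueError(
        f"inference_on must be -1 (CPU) or 0-998 (GPU index), got {inference_on}"
    )

# e.g. resolve_device(-1) -> "cpu", resolve_device(1) -> "cuda:1"
```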
What do `n_splits=2` and `order=2` mean?
@bmreiniger, any help?
> The converter for OneVsOneClassifier has not been implemented yet. It should not be too complicated to do. In the meantime, OneVsRestClassifier has a converter. But in the document OneVsOneClassifier...
Hi @xadupre, any help on this?
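In the meantime, the suggested workaround is to train with `OneVsRestClassifier`, which already has a converter. A minimal sklearn-side sketch (the data and base estimator are placeholders; the skl2onnx conversion step is noted in comments, assuming that package is installed):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Synthetic multiclass data; OneVsRestClassifier fits one binary model per class.
X, y = make_classification(n_samples=200, n_features=5, n_informative=3,
                           n_classes=3, random_state=0)
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)
pred = clf.predict(X)

# Conversion sketch (assuming skl2onnx is installed):
#   from skl2onnx import convert_sklearn
#   from skl2onnx.common.data_types import FloatTensorType
#   onx = convert_sklearn(
#       clf, initial_types=[("input", FloatTensorType([None, 5]))])
```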
> @pratikchhapolika I'm working on this converter now. Can you please share some of your data (X_train, y_train, X_test, y_test, etc.) so I can do full testing? Sure....
> > @pratikchhapolika I'm working on this converter now. Can you please share some of your data (X_train, y_train, X_test, y_test, etc.) so I can do full testing? @xiaowuhu...