
Gradio-Lite: zero-shot-classification pipeline returns only 1 score

xavierbarbier opened this issue 1 year ago · 2 comments

Describe the bug

Hello, I'm trying to use Gradio-Lite with a zero-shot-classification Transformers.js.py pipeline and provide the candidate classes up front (as with the "classic" Transformers and Transformers.js pipelines), but the pipeline outputs only one probability.

Using gr.Interface.from_pipeline(pipe) works, but then the user has to type the candidate classes manually.
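For context, this is roughly the from_pipeline version I mean (a minimal sketch using the same model); it works, but the candidate labels have to be typed by the user into the generated input field:

from transformers_js import import_transformers_js
import gradio as gr

transformers = await import_transformers_js()
pipeline = transformers.pipeline
pipe = await pipeline('zero-shot-classification', 'Xenova/mobilebert-uncased-mnli')

# from_pipeline builds the UI automatically, but the candidate labels
# must be entered manually in the generated field.
demo = gr.Interface.from_pipeline(pipe)
demo.launch()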

Am I missing some arguments here?

Have you searched existing issues? 🔎

  • [X] I have searched and found no existing issues

Reproduction

	<gradio-lite>
	<gradio-requirements>
	transformers_js_py
	</gradio-requirements>

	<gradio-file name="app.py" entrypoint>
	from transformers_js import import_transformers_js
	import gradio as gr

	labels = ['politics', 'music', 'police']
	transformers = await import_transformers_js()
	pipeline = transformers.pipeline
	model_path = 'Xenova/mobilebert-uncased-mnli'
	pipe = await pipeline('zero-shot-classification', model_path, labels)

	async def classify(text):
		pred = await pipe(text)

		return pred["scores"]

	demo = gr.Interface(classify, "textbox", "textbox")
	demo.launch()
	</gradio-file>
	</gradio-lite>

Screenshot

No response

Logs

No response

System Info

Gradio-Lite (So I guess it's up to date!)

Severity

Blocking usage of gradio

xavierbarbier · May 17 '24

This comes from Transformers.js' API spec: the labels have to be passed at prediction time, not at model initialization (see https://huggingface.co/docs/transformers.js/api/pipelines#module_pipelines.ZeroShotClassificationPipeline).

So your code should be modified like this:

from transformers_js import import_transformers_js
import gradio as gr

labels = ['politics', 'music', 'police']
transformers = await import_transformers_js()
pipeline = transformers.pipeline
model_path = 'Xenova/mobilebert-uncased-mnli'
pipe = await pipeline('zero-shot-classification', model_path) # Not here.

async def classify(text):
	pred = await pipe(text, labels) # Pass `labels` here.

	return pred["scores"]


demo = gr.Interface(classify, "textbox", "textbox")
demo.launch()

whitphx · May 20 '24

I first tried to pass the labels inline in the pipe call at inference time (as with the "classic" Transformers pipeline). That didn't work. Then I tried at model initialization, which, as expected, didn't work either. It seems you have to define them separately and pass the variable to the inference call. It's working now. Arigato!

from transformers_js import import_transformers_js
import gradio as gr

labels = ['politics', 'music', 'police']  # defining the labels here works
transformers = await import_transformers_js()
pipeline = transformers.pipeline
model_path = 'Xenova/mobilebert-uncased-mnli'
pipe = await pipeline('zero-shot-classification', model_path)  # not here
# labels = ['politics', 'music', 'police']  # defining them here also works

async def classify(text):
	pred = await pipe(text, labels)  # pass the `labels` variable here
	# pred = await pipe(text, labels=['politics', 'music', 'police'])  # doesn't work

	return pred["scores"]


demo = gr.Interface(classify, "textbox", "textbox")
demo.launch()

xavierbarbier · May 21 '24