`AttributeError: 'InternVLChatConfig' object has no attribute 'vocab_size'` when trying to obtain token probabilities using `compute_transition_scores`
I'm trying to obtain the probabilities of tokens generated by your latest excellent model, InternVL 26B, using the compute_transition_scores method exposed by the model object. However, I'm getting this AttributeError even though vocab_size is defined in config.json. This approach, derived from the compute_transition_scores docstring, works fine for simpler text-only models such as gpt2. I generate the outputs with model.generate and output_scores set to True. Apologies for not providing the full code (corporate restrictions apply here...).

What am I doing wrong? Are there other approaches to this goal that would work better in this case? In particular, what concerns me about my approach is the double generation it requires: ideally it should be possible to obtain the probabilities during a single call to model.chat (which appears to be the recommended way of getting model output, judging from the HF examples).
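In outline (the actual code is more involved and I can't share it verbatim), the generation step looks roughly like this; variable names, prompt preparation, and exact arguments are simplified:

```python
# Rough outline only, not the real code: pixel_values and input_ids are
# prepared following the HF InternVL examples.
outputs = model.generate(
    pixel_values=pixel_values,
    input_ids=input_ids,
    max_new_tokens=512,
    output_scores=True,            # keep the per-step scores
    return_dict_in_generate=True,  # so outputs exposes .sequences and .scores
)

# Depending on the model, outputs.sequences may or may not include the prompt
# tokens; strip them here if it does.
generated_tokens = outputs.sequences
```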
transition_scores = model.compute_transition_scores(
    generated_tokens,
    outputs.scores,
    normalize_logits=True,
)
*** AttributeError: 'InternVLChatConfig' object has no attribute 'vocab_size'
Hello, the vocab_size is stored in llm_config rather than on the top-level config. Before calling model.compute_transition_scores, you can assign vocab_size from config.llm_config.vocab_size to config.vocab_size.
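A minimal sketch of that workaround, assuming `model` is the loaded InternVL chat model and `outputs` / `generated_tokens` come from the generate call shown in the question:

```python
# Copy vocab_size from the inner LLM config to the top-level config so that
# compute_transition_scores can find it.
model.config.vocab_size = model.config.llm_config.vocab_size

transition_scores = model.compute_transition_scores(
    generated_tokens,
    outputs.scores,
    normalize_logits=True,  # returns log-probabilities instead of raw scores
)

# Per-token probabilities of the generated tokens.
token_probs = transition_scores.exp()
```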