diffusers
Confusion when loading a textual-inversion SDXL embedding for use in an inference prompt
Describe the bug
I trained a textual inversion model on an SDXL pretrained model (RealVisXL 3.0). At inference time I want to use the learned token in the positive prompt to strengthen certain aspects of the generation.
But when I load RealVisXL 3.0 with from_pretrained() and then load the textual-inversion embedding with load_textual_inversion(), I get an error. I looked for relevant information but could not find a good workaround.
Reproduction
My code (shared as a screenshot):
Error:
ValueError: Loaded state dictionary is incorrect: {'text_model.embeddings.position_embedding.weight': ...}
Please verify that the loaded state dictionary of the textual embedding either only has a single key or includes the `string_to_param` input key.
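For context, load_textual_inversion() accepts only the two embedding layouts mentioned in the message above. A minimal sketch of both, with random tensors standing in for real learned embeddings and the token taken from this report:

import torch

hidden_size = 768  # hidden size of SDXL's first text encoder (CLIP ViT-L)

# Layout 1: a single-key dict mapping the placeholder token to its embedding vector(s).
embedding_single_key = {"<abc-person-realism>": torch.randn(1, hidden_size)}

# Layout 2: an A1111-style dict with a `string_to_param` entry.
embedding_a1111 = {"string_to_param": {"*": torch.randn(1, hidden_size)}}

# Either dict can be passed directly as the first argument of pipe.load_textual_inversion(...).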
Logs
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[14], line 3
1 pipe = DiffusionPipeline.from_pretrained("/mnt/asian-t2i/pretrained_models/RealVisXL_V3.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
2 pipe.to("cuda:2")
----> 3 pipe.load_textual_inversion("/mnt/asian-t2i/output/textual-inversion/textual_inversion-sdxl-data1k6-realism/checkpoint-1000/model.safetensors", token="<abc-person-realism>", text_encoder=pipe.text_encoder, tokenizer=pipe.tokenizer)
5 # positive_prompt = '1girl, black hair, solo, sleeveless, denim, plate, ponytail, blurry, cup, indoors, blurry background, table, realistic, turtleneck, belt, pants, long hair, jeans, jewelry, drinking, own hands together, sleeveless shirt, depth of field, mole, shirt, food, holding'
6 positive_prompt = 'a sl_1234# woman with long hair and a white shirt. high quality, Realism, 4K, <xli-person-realism>'
File /opt/conda/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py:118, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
115 if check_use_auth_token:
116 kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 118 return fn(*args, **kwargs)
File /mnt/asian-t2i/diffusers/src/diffusers/loaders/textual_inversion.py:402, in TextualInversionLoaderMixin.load_textual_inversion(self, pretrained_model_name_or_path, token, tokenizer, text_encoder, **kwargs)
396 raise ValueError(
397 f"You have passed a state_dict contains {len(state_dicts)} embeddings, and list of tokens of length {len(tokens)} "
398 f"Make sure both have the same length."
399 )
401 # 4. Retrieve tokens and embeddings
--> 402 tokens, embeddings = self._retrieve_tokens_and_embeddings(tokens, state_dicts, tokenizer)
404 # 5. Extend tokens and embeddings for multi vector
405 tokens, embeddings = self._extend_tokens_and_embeddings(tokens, embeddings, tokenizer)
File /mnt/asian-t2i/diffusers/src/diffusers/loaders/textual_inversion.py:217, in TextualInversionLoaderMixin._retrieve_tokens_and_embeddings(tokens, state_dicts, tokenizer)
215 embedding = state_dict["string_to_param"]["*"]
216 else:
--> 217 raise ValueError(
218 f"Loaded state dictionary is incorrect: {state_dict}. \n\n"
219 "Please verify that the loaded state dictionary of the textual embedding either only has a single key or includes the `string_to_param`"
220 " input key."
221 )
223 if token is not None and loaded_token != token:
224 logger.info(f"The loaded token: {loaded_token} is overwritten by the passed token {token}.")
ValueError: Loaded state dictionary is incorrect: {'text_model.embeddings.position_embedding.weight': tensor([...]), 'text_model.embeddings.token_embedding.weight': tensor([...]), 'text_model.encoder.layers.0.layer_norm1.bias': tensor([...]), 'text_model.encoder.layers.0.layer_norm1.weight': tensor([...]), 'text_model.encoder.layers.0.layer_norm2.bias': tensor([...]), 'text_model.encoder.layers.0.layer_norm2.weight': tensor([...]), ...} (tensor values elided; the keys are those of a CLIP text-encoder state dict).
Please verify that the loaded state dictionary of the textual embedding either only has a single key or includes the `string_to_param` input key.
System Info
diffusers version: 0.27.0.dev0
Who can help?
@sayakpaul @yiyixuxu @DN6
Please share the code as code (not as a screenshot) and make it fully reproducible.
from diffusers import DiffusionPipeline
from safetensors.torch import load_file
import torch
from PIL import Image

pipe = DiffusionPipeline.from_pretrained("/mnt/asian-t2i/pretrained_models/RealVisXL_V3.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
pipe.to("cuda:2")
pipe.load_textual_inversion("/mnt/asian-t2i/output/textual-inversion/textual_inversion-sdxl-data1k6-realism/checkpoint-1000/model.safetensors", token="
We don't know where the following come from:
- /mnt/asian-t2i/pretrained_models/RealVisXL_V3.0
- /mnt/asian-t2i/output/textual-inversion/textual_inversion-sdxl-data1k6-realism/checkpoint-1000/model.safetensors
RealVisXL 3.0 is the SDXL base model; I downloaded it from here. The checkpoint-1000/model.safetensors is the model generated after I trained the textual inversion SDXL model from here.
The checkpoint-1000/model.safetensors is the model generated after I trained the textual inversion SDXL model from here.
We don't have the bandwidth to do a training run to reproduce the issue here. Please share the checkpoint too.
Also, in your screenshot you're supplying a token, but in the code you didn't supply one. Please make sure the code you share is exactly what you're running.
I trained a textual inversion model on an SDXL pretrained model (RealVisXL 3.0). At inference time I want to use the learned token in the positive prompt to strengthen certain aspects of the generation.
Try providing a code snippet for this too as this will help us have all the info we need.
First, the generated textual-inversion safetensors file is local; how can I share it with you?
Second, I use the following code to call load_textual_inversion:
pipe = DiffusionPipeline.from_pretrained("/mnt/asian-t2i/pretrained_models/RealVisXL_V3.0", torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
pipe.to("cuda:2")
pipe.load_textual_inversion("/mnt/asian-t2i/output/textual-inversion/textual_inversion-sdxl-data1k6-realism/checkpoint-1000/model.safetensors", token="abc-person-realism", text_encoder=pipe.text_encoder, tokenizer=pipe.tokenizer)
First, the generated textual-inversion safetensors file is local; how can I share it with you?
You can host it on the Hugging Face Hub temporarily and share with us the link.
Thank you very much, I will try that.
Cc: @yiyixuxu could you give this a look?
Sorry to bother you again. I have uploaded my textual-inversion SDXL safetensors here, and the base SDXL model is RealVisXL 3.0 from here. Please take a look.
And I have another question: I read the README from here.
The example embedding structure is like this:
It is so different from the generated file; I am wondering why.
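For reference, SDXL embeddings that store one learned vector per text encoder are commonly loaded along these lines; a minimal sketch assuming the file keeps the two embeddings under clip_g / clip_l keys (the file name and token below are placeholders):

from safetensors.torch import load_file

state_dict = load_file("embedding.safetensors")  # hypothetical two-key SDXL embedding file
# One call per text encoder: clip_g for the OpenCLIP ViT-bigG encoder, clip_l for the CLIP ViT-L encoder.
pipe.load_textual_inversion(state_dict["clip_g"], token="<abc-person-realism>", text_encoder=pipe.text_encoder_2, tokenizer=pipe.tokenizer_2)
pipe.load_textual_inversion(state_dict["clip_l"], token="<abc-person-realism>", text_encoder=pipe.text_encoder, tokenizer=pipe.tokenizer)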
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
@xinli2008 Could you share the command you used to train your textual inversion model please?
Also, I think you've uploaded a full text model rather than the inversion embedding here, which is why I think you're seeing this error.
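A quick way to confirm what the uploaded file actually contains is to inspect its keys; a minimal sketch using the checkpoint path from the report:

from safetensors import safe_open

path = "/mnt/asian-t2i/output/textual-inversion/textual_inversion-sdxl-data1k6-realism/checkpoint-1000/model.safetensors"
with safe_open(path, framework="pt") as f:
    keys = list(f.keys())

print(len(keys), keys[:5])
# A textual-inversion embedding should hold only one or two small tensors (keyed by the
# placeholder token or `string_to_param`), not hundreds of "text_model.encoder.layers.*"
# entries like a full text-encoder state dict.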
I'm facing the same problem. The same LoRA works fine with load_lora_weights but fails with load_textual_inversion. (The LoRA uses local files.)
Successful:
pipe.load_lora_weights(".", weight_name="./lora/hoge.safetensors")
Fails:
pipe.load_textual_inversion("./lora/hoge.safetensors", token="hoge", local_files_only=True)
Error:
Please verify that the loaded state dictionary of the textual embedding either only has a single key or includes the `string_to_param` input key.
@gulatis A LoRA isn't the same thing as a textual inversion. So it makes sense that the loading would fail.
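A rough sketch of the intended split, assuming a hypothetical embedding file saved by a textual-inversion training run alongside the LoRA file mentioned above:

# LoRA files patch model weights and go through the LoRA loader:
pipe.load_lora_weights(".", weight_name="./lora/hoge.safetensors")

# Textual-inversion files are tiny state dicts of learned token embeddings and go
# through load_textual_inversion(); "hoge_embedding.safetensors" is a placeholder name.
pipe.load_textual_inversion("./embeddings/hoge_embedding.safetensors", token="hoge")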
@DN6 I see, so it's meant for embeddings. Thank you.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.