dataclasses-json
`to_dict()` fails and ignores `encoder`
Issue
When one of the fields is a `Dict` whose key is a hashable dataclass, calling `to_dict()` fails with `TypeError: unhashable type: 'dict'`.
Example code
from dataclasses import dataclass, field
from dataclasses_json import dataclass_json, config
from typing import Dict


@dataclass_json
@dataclass(frozen=True)
class A:
    x: int


@dataclass_json
@dataclass
class B:
    attr: Dict[A, str] = field(
        metadata=config(
            encoder=lambda x: 3,
            decoder=lambda x: {'foo': 'bar'},
        )
    )


print(B({A(3): 'bar'}).to_dict())
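For reference, the encoder here returns the constant 3, so if the field-level encoder were honored I'd expect the call above to print:

{'attr': 3}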
The error:
Traceback (most recent call last):
  File "error.py", line 21, in <module>
    print(B({A(3): 'bar'}).to_dict())
  File "/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/dataclasses_json/api.py", line 86, in to_dict
    return _asdict(self, encode_json=encode_json)
  File "/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/dataclasses_json/core.py", line 323, in _asdict
    value = _asdict(getattr(obj, field.name), encode_json=encode_json)
  File "/usr/local/opt/python@3.8/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/dataclasses_json/core.py", line 331, in _asdict
    return dict((_asdict(k, encode_json=encode_json),
TypeError: unhashable type: 'dict'
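The failure makes sense once you see what happens to the key: `_asdict` converts the dataclass key `A(3)` into a plain dict, and plain dicts are not hashable, so rebuilding the mapping fails. A minimal illustration (my own sketch, not library code):

converted_key = {'x': 3}    # what _asdict(A(3), ...) produces from the key
try:
    {converted_key: 'bar'}  # rebuilding the mapping with the converted key
except TypeError as e:
    print(e)                # unhashable type: 'dict'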
Expected behavior
As you can see, I specified how to encode and decode the field using `encoder` and `decoder`. Hence, dataclasses-json shouldn't fall back to its internal serialization for that field.
As you can see in the implementation, the following is called:
elif isinstance(obj, Mapping):
    return dict((_asdict(k, encode_json=encode_json),
                 _asdict(v, encode_json=encode_json)) for k, v in
                obj.items())
Specifically, what fails is `_asdict(k, ...)`. Yet this branch shouldn't be reached at all, because an `encoder` is specified for the field.
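The kind of guard I'd expect is something like the following sketch (my own, hypothetical; not a patch against the actual core.py). It consults the field-level encoder that `config()` stores in the field's metadata under the `'dataclasses_json'` key before recursing into the value:

# Hypothetical sketch of the expected behavior in _asdict's field loop.
# config(encoder=...) stores the encoder in field.metadata under the
# 'dataclasses_json' key, so it can be looked up per field.
encoder = field.metadata.get('dataclasses_json', {}).get('encoder')
if encoder is not None:
    value = encoder(getattr(obj, field.name))  # honor the user's encoder
else:
    value = _asdict(getattr(obj, field.name), encode_json=encode_json)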
I also hit this problem, even without special encoders/decoders.
I tested this with Python 3.8 and the newest version of dataclasses-json on master, and this code runs just fine. Closing the issue, but if it reappears, please reopen with more details.