functools.lru_cache() makes Hamilton think a function is not a node
IMO this is actually just sloppiness in the implementation of lru_cache -- it isn't properly layerable. Need to verify 100% that this is the cause, but we should fix it.
Current behavior
Stack Traces
(If applicable)
Screenshots
(If applicable)
Steps to replicate behavior
```python
import functools
from typing import Any, Dict

@functools.lru_cache(maxsize=None)
def config() -> Dict[str, Any]:
    return _load_config()

def foo(config: Dict[str, Any]) -> Any:
    return config["foo"]
```
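A minimal sketch of the likely mechanism (assuming Hamilton discovers nodes via function-based introspection such as `inspect.isfunction`): in CPython, `lru_cache` returns a C-implemented wrapper object rather than a plain function, so function checks fail on it even though the original function is still reachable via `__wrapped__`.

```python
import functools
import inspect

@functools.lru_cache(maxsize=None)
def config():
    return {"foo": 1}

# The decorator returns a wrapper object, not a plain function, so
# function-based introspection no longer recognizes it as a function.
print(inspect.isfunction(config))              # False
# functools.update_wrapper stores the original function on __wrapped__.
print(inspect.isfunction(config.__wrapped__))  # True
```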
```
File "/Users/elijahbenizzy/dev/hamilton-os/hamilton/hamilton/driver.py", line 203, in visualize_execution
  self.validate_inputs(user_nodes, inputs)
File "/Users/elijahbenizzy/dev/hamilton-os/hamilton/hamilton/driver.py", line 99, in validate_inputs
  raise ValueError(error_str)
ValueError: 2 errors encountered:
  Error: Required input config not provided for nodes: ['foo'].
```
Library & System Information
E.g. python version, hamilton library version, linux, etc.
Expected behavior
`config` should be detected as a node, so `foo` can depend on it without `config` having to be passed in as an input.
Additional context
Add any other context about the problem here.
Note this is a workaround for https://github.com/stitchfix/hamilton/issues/17
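Until the framework-side fix lands, one pattern that sidesteps the problem is to keep the node itself an undecorated function and push the caching onto a private helper (underscore-prefixed names are conventionally skipped by Hamilton's node discovery). This is a hedged sketch; `_cached_config` and its return value are hypothetical stand-ins for the real `_load_config()` call.

```python
import functools
from typing import Any, Dict

@functools.lru_cache(maxsize=None)
def _cached_config() -> Dict[str, Any]:
    # Hypothetical stand-in for an expensive config load.
    return {"foo": "bar"}

def config() -> Dict[str, Any]:
    # Plain function, so function-based introspection still sees a node;
    # the caching lives entirely on the underscore-prefixed helper.
    return _cached_config()

def foo(config: Dict[str, Any]) -> Any:
    return config["foo"]
```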
I think this can be fixed in Node -- need to mess with it. Will take on.
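One possible shape for that fix, sketched under the assumption that node detection boils down to a function check (`is_node_candidate` here is a hypothetical name, not Hamilton's API): follow the `__wrapped__` chain that `functools.update_wrapper` installs before testing whether the object is a function.

```python
import functools
import inspect

def is_node_candidate(fn) -> bool:
    # Hypothetical sketch: unwrap any decorator layers that recorded
    # the original function via __wrapped__, then test the result.
    unwrapped = inspect.unwrap(fn)
    return inspect.isfunction(unwrapped)

@functools.lru_cache(maxsize=None)
def config():
    return {}

print(is_node_candidate(config))  # True
```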
We are moving repositories! Please see the new version of this issue at https://github.com/DAGWorks-Inc/hamilton/issues/43. Also, please give us a star/update any of your internal links.
Note that everything else (slack community, pypi packages, etc...) will not change at all.