sgkit
Regenie validation script is broken
When the build is invoked, an error related to numpy typing is currently thrown when running the Glow WGR functions:
2021-07-07 18:39:42,401|INFO|__main__.run:206| --------------------------------------------------
2021-07-07 18:39:42,401|INFO|__main__.run:207| Covariate info:
2021-07-07 18:39:42,403|INFO|__main__.run:208| <class 'pandas.core.frame.DataFrame'>
Index: 50 entries, S0000001 to S0000050
Data columns (total 3 columns):
X000 50 non-null float64
X001 50 non-null float64
X002 50 non-null float64
dtypes: float64(3)
memory usage: 1.6+ KB
Traceback (most recent call last):
  File "glow_wgr.py", line 380, in <module>
    fire.Fire()
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/fire/core.py", line 138, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/fire/core.py", line 468, in _Fire
    target=component.__name__)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/fire/core.py", line 672, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "glow_wgr.py", line 376, in run_from_config
    alphas=ps_config["alphas"],
  File "glow_wgr.py", line 210, in run
    stack = RidgeReducer(alphas=alphas)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/typeguard/__init__.py", line 703, in wrapper
    retval = func(*args, **kwargs)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/glow/wgr/linear_model/ridge_model.py", line 44, in __init__
    self.alphas = create_alpha_dict(alphas)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/typeguard/__init__.py", line 704, in wrapper
    check_return_type(retval, memo)
  File "/opt/conda/envs/glow/lib/python3.7/site-packages/typeguard/__init__.py", line 554, in check_return_type
    raise TypeError(exc) from None
TypeError: type of the return value['alpha_0'] must be nptyping.types._number.Number; got numpy.float64 instead
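For context, the error comes from typeguard's runtime return-type check rejecting the `numpy.float64` values that Glow's `create_alpha_dict` produces, even though `numpy.float64` subclasses Python's builtin `float`. The dependency-free sketch below illustrates the general phenomenon (it uses a stand-in subclass rather than numpy or typeguard, and `strict_check` is a hypothetical checker, not Glow's or typeguard's actual logic):

```python
class Float64(float):
    """Stand-in for numpy.float64, which also subclasses builtin float."""

def strict_check(value):
    # Mimics a runtime checker that demands the exact builtin type
    # rather than accepting subclasses via isinstance().
    return type(value) is float

x = Float64(0.1)
print(isinstance(x, float))    # True: a subclass passes isinstance
print(strict_check(x))         # False: exact-type comparison rejects it
print(strict_check(float(x)))  # True: casting to the builtin satisfies it
```

This is why the value "is" a number for ordinary purposes yet still trips the library's stricter check; the real fix was on Glow's side, as noted below.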
This issue was fixed in Glow v1.0.1: projectglow/glow#363.
Nice, thanks @gagank1! Glow 0.5.0 is currently used for generating this testing data, so hopefully the upgrade isn't too problematic.