PySR
[BUG]: PyCall.jlwrap error-- dimensional constraint
What happened?
Hi Miles, I am trying out the new feature where we can impose dimensional constraints. I just copied the example from the collection of toy examples (number ten), and I ran into the following error (full log output below).
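For reference, the setup I copied is roughly the following; the data here are placeholders rather than the actual values from the toy example, and the operator choices are just what I happened to use:

import numpy as np
from pysr import PySRRegressor

# Placeholder data with the same column layout as toy example number ten:
# X columns are (mass in solar masses, mass in kg, distance in Earth radii),
# and y is a force in kg * m / s^2.
X = np.random.rand(100, 3)
y = np.random.rand(100)

model = PySRRegressor(
    binary_operators=["+", "-", "*", "/"],
    unary_operators=["square"],
    niterations=30,
)

model.fit(
    X,
    y,
    X_units=["Constants.M_sun", "kg", "Constants.R_earth"],
    y_units="kg * m / s^2",
)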
Version
0.16.3
Operating System
Linux
Package Manager
Conda
Interface
Jupyter Notebook
Relevant log output
RuntimeError Traceback (most recent call last)
Cell In[5], line 1
----> 1 model.fit(
2 X,
3 y,
4 X_units=["Constants.M_sun", "kg", "Constants.R_earth"],
5 y_units="kg * m / s^2"
6 )
File /glade/user/opt/miniconda/envs/eq_new/lib/python3.11/site-packages/pysr/sr.py:1970, in PySRRegressor.fit(self, X, y, Xresampled, weights, variable_names, X_units, y_units)
1967 self._checkpoint()
1969 # Perform the search:
-> 1970 self._run(X, y, mutated_params, weights=weights, seed=seed)
1972 # Then, after fit, we save again, so the pickle file contains
1973 # the equations:
1974 if not self.temp_equation_file:
File /glade/user/opt/miniconda/envs/eq_new/lib/python3.11/site-packages/pysr/sr.py:1800, in PySRRegressor._run(self, X, y, mutated_params, weights, seed)
1796 y_variable_names = [f"y{_subscriptify(i)}" for i in range(y.shape[1])]
1798 # Call to Julia backend.
1799 # See https://github.com/MilesCranmer/SymbolicRegression.jl/blob/master/src/SymbolicRegression.jl
-> 1800 self.raw_julia_state_ = SymbolicRegression.equation_search(
1801 Main.X,
1802 Main.y,
1803 weights=Main.weights,
1804 niterations=int(self.niterations),
1805 variable_names=self.feature_names_in_.tolist(),
1806 display_variable_names=self.display_feature_names_in_.tolist(),
1807 y_variable_names=y_variable_names,
1808 X_units=self.X_units_,
1809 y_units=self.y_units_,
1810 options=options,
1811 numprocs=cprocs,
1812 parallelism=parallelism,
1813 saved_state=self.raw_julia_state_,
1814 return_state=True,
1815 addprocs_function=cluster_manager,
1816 progress=progress and self.verbosity > 0 and len(y.shape) == 1,
1817 verbosity=int(self.verbosity),
1818 )
1820 # Set attributes
1821 self.equations_ = self.get_hof()
RuntimeError: <PyCall.jlwrap (in a Julia function called from Python)
JULIA: MethodError: no method matching _method_instances(::Type{typeof(*)}, ::Type{Tuple{SymbolicRegression.DimensionalAnalysisModule.WildcardQuantity{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, SymbolicRegression.DimensionalAnalysisModule.WildcardQuantity{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}}})
The applicable method may be too new: running in world age 45056, while current world is 54697.
Closest candidates are:
_method_instances(::Any, ::Any) (method too new to be called from this world context.)
@ Tricks /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/Tricks/7oAyo/src/Tricks.jl:150
Stacktrace:
[1] #s1778#1
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/Tricks/7oAyo/src/Tricks.jl:16 [inlined]
[2] var"#s1778#1"(T::Any, ::Any, f::Any, t::Any)
@ Tricks ./none:0
[3] (::Core.GeneratedFunctionStub)(::Any, ::Vararg{Any})
@ Core ./boot.jl:602
[4] deg2_eval(op::typeof(*), l::SymbolicRegression.DimensionalAnalysisModule.WildcardQuantity{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, r::SymbolicRegression.DimensionalAnalysisModule.WildcardQuantity{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}})
@ SymbolicRegression.DimensionalAnalysisModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:142
[5] violates_dimensional_constraints_dispatch(tree::Node{Float32}, x_units::Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, x::SubArray{Float32, 1, Matrix{Float32}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum)
@ SymbolicRegression.DimensionalAnalysisModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:166
[6] violates_dimensional_constraints_dispatch(tree::Node{Float32}, x_units::Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, x::SubArray{Float32, 1, Matrix{Float32}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum)
@ SymbolicRegression.DimensionalAnalysisModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:164
[7] violates_dimensional_constraints_dispatch(tree::Node{Float32}, x_units::Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, x::SubArray{Float32, 1, Matrix{Float32}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum)
@ SymbolicRegression.DimensionalAnalysisModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:161
[8] violates_dimensional_constraints
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:191 [inlined]
[9] violates_dimensional_constraints
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/DimensionalAnalysis.jl:177 [inlined]
[10] dimensional_regularization
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:215 [inlined]
[11] _eval_loss(tree::Node{Float32}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, regularization::Bool, idx::Nothing)
@ SymbolicRegression.LossFunctionsModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:67
[12] eval_loss(tree::Node{Float32}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}; regularization::Bool, idx::Nothing)
@ SymbolicRegression.LossFunctionsModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:101
[13] eval_loss
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:93 [inlined]
[14] #score_func#5
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:160 [inlined]
[15] score_func
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/LossFunctions.jl:157 [inlined]
[16] PopMember(dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, t::Node{Float32}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, complexity::Nothing; ref::Int64, parent::Int64, deterministic::Bool)
@ SymbolicRegression.PopMemberModule /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/PopMember.jl:99
[17] PopMember (repeats 2 times)
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/PopMember.jl:88 [inlined]
[18] #2
@ ./none:0 [inlined]
[19] iterate
@ ./generator.jl:47 [inlined]
[20] collect(itr::Base.Generator{UnitRange{Int64}, SymbolicRegression.PopulationModule.var"#2#3"{Float32, Int64, Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Int64, Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}}})
@ Base ./array.jl:782
[21] #Population#1
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/Population.jl:51 [inlined]
[22] Population
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/Population.jl:37 [inlined]
[23] #6
@ ./none:0 [inlined]
[24] iterate
@ ./generator.jl:47 [inlined]
[25] collect(itr::Base.Generator{UnitRange{Int64}, SymbolicRegression.SearchUtilsModule.var"#6#8"{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}})
@ Base ./array.jl:782
[26] #5
@ ./none:0 [inlined]
[27] iterate
@ ./generator.jl:47 [inlined]
[28] collect(itr::Base.Generator{Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}}, SymbolicRegression.SearchUtilsModule.var"#5#7"{Int64, Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}})
@ Base ./array.jl:782
[29] init_dummy_pops
@ /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/SearchUtils.jl:51 [inlined]
[30] _equation_search(#unused#::Val{:multithreading}, #unused#::Val{1}, datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}}, niterations::Int64, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, verbosity::Int64, progress::Bool, #unused#::Val{true})
@ SymbolicRegression /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/SymbolicRegression.jl:601
[31] equation_search(datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.Dimensions{DynamicQuantities.FixedRational{Int32, 25200}}}, Vector{DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}, DynamicQuantities.Quantity{Float32, DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}}}}; niterations::Int64, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, parallelism::String, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, verbosity::Int64, progress::Bool, v_dim_out::Val{1})
@ SymbolicRegression /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/SymbolicRegression.jl:507
[32] equation_search(X::Matrix{Float32}, y::Matrix{Float32}; niterations::Int64, weights::Nothing, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::String, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, loss_type::Type{Nothing}, verbosity::Int64, progress::Bool, X_units::Vector{String}, y_units::String, v_dim_out::Val{1}, multithreaded::Nothing, varMap::Nothing)
@ SymbolicRegression /glade/user/opt/miniconda/envs/eq_new/share/pysr/depot/packages/SymbolicRegression/XKtla/src/SymbolicRegression.jl:385
Extra Info
The installed packages are:
Julia 1.9.3
PySR 0.16.3
Python 3.11.5
When I remove the dimensional constraints, the code runs with no issue.
I would very much appreciate any advice.
Hi @Sshamekh,
Thanks for the bug report. This will take a bit to fix, but I'll get on it ASAP. In the meantime, you can work around the issue by upgrading to Julia 1.10 (it's currently in beta, but it resolves this).
Cheers, Miles
Can confirm, updating to Julia 1.10 fixed it for me.
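For anyone wanting to double-check which Julia build PySR actually picks up after the upgrade, a quick sketch (this assumes the pyjulia Main module that the PyCall-based PySR 0.16.x backend uses; exact import paths may differ in your environment):

from julia import Main  # PyCall/pyjulia bridge used by PySR 0.16.x

# Should report a 1.10.x version string once the upgrade has taken effect.
print(Main.eval("string(VERSION)"))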
@MilesCranmer Hi Miles, I am hitting the same error now. I have tried Julia 1.10.3 and Julia 1.10.0, but the kernel seems to crash (my previous setup with Julia 1.9 runs normally). Could you tell me which Python and Julia versions currently run correctly with dimensional constraints? Thank you.
@zhuyi-bjut could you please file a new issue for your bug? https://github.com/MilesCranmer/PySR/issues/new/choose