Joseph Weston

66 comments by Joseph Weston

It seems to me that this should be closed and we should refer to #88 instead. @basnijholt @akhmerov, upvote if you agree.

We could extend this to be functions defined on *lattices* in R^N. This way we can do things like describe experiments where there is a minimum granularity defined by the...
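As a minimal sketch of that idea (the helper name and the per-axis `spacing` argument are assumptions for illustration, not part of any actual proposal), a requested point could simply be snapped to the nearest lattice site before evaluation:

```python
import numpy as np

def snap_to_lattice(point, spacing):
    """Round each coordinate to the nearest site of a rectangular lattice.

    `spacing` is the hypothetical minimum granularity per dimension,
    e.g. the step size of an experimental control knob.
    """
    point = np.asarray(point, dtype=float)
    spacing = np.asarray(spacing, dtype=float)
    return np.round(point / spacing) * spacing

# a 2D lattice with granularity 0.5 in x and 0.25 in y
snapped = snap_to_lattice((0.6, 0.30), (0.5, 0.25))
```

A learner built this way would then deduplicate snapped points, since several requested points can collapse onto the same lattice site.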

The following are not yet supported:

+ 2D and 3D domains
+ `BalancingLearner` (requires removing points from the interior of domains)

> Also compared to the existing implementation, global rescaling is missing. (e.g. computing y-scale of the values, and normalizing the data by it) Should this perhaps be something that the...

> Indeed, but then the learner needs two extra hooks: one at each step to update global metrics, and another one to trigger loss recomputation for all subdomains once the...
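A rough sketch of what those two hooks could look like (every name here is hypothetical; this is not the actual adaptive implementation, just an illustration of the idea under discussion):

```python
import math

class ScaledLossSketch:
    """Tracks a global y-scale at every step and triggers a full loss
    recomputation once the scale has drifted too far since the last one."""

    def __init__(self):
        self.data = {}
        self._y_min = math.inf
        self._y_max = -math.inf
        self._scale_at_last_recompute = 1.0

    @property
    def y_scale(self):
        scale = self._y_max - self._y_min
        return scale if scale > 0 else 1.0

    def _update_scale(self, y):
        # hook 1: runs at each step to update global metrics
        self._y_min = min(self._y_min, y)
        self._y_max = max(self._y_max, y)

    def tell(self, x, y):
        self.data[x] = y
        self._update_scale(y)
        # hook 2: recompute all subdomain losses once the scale doubles
        if self.y_scale / self._scale_at_last_recompute > 2:
            self._recompute_all_losses()
            self._scale_at_last_recompute = self.y_scale

    def _recompute_all_losses(self):
        pass  # a real learner would re-evaluate the loss of every subdomain
```

The "doubling" threshold is arbitrary; the point is only that recomputation is triggered lazily rather than on every new point.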

### TODO

+ [x] don't evaluate boundary points in `__init__`
+ [ ] revisit the loss function signature
+ [x] add tests

> All other learners implement `pending_points`, which is a set. Would that change anything? Now I see you set `self.data[x] = None`.

I'm using `pending_points` now.
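For context, the convention being adopted looks roughly like this (a simplified sketch; the point-suggestion logic in `ask` is a placeholder, not a real loss-driven choice):

```python
class MinimalLearner:
    """Points that have been requested but not yet evaluated live in the
    set `pending_points`, instead of being stored as `self.data[x] = None`."""

    def __init__(self, function):
        self.function = function
        self.data = {}           # evaluated points: x -> y
        self.pending_points = set()  # requested but not yet evaluated

    def ask(self, n):
        # placeholder suggestion: just hand out the next n integers
        start = len(self.data) + len(self.pending_points)
        points = list(range(start, start + n))
        self.pending_points.update(points)
        return points

    def tell(self, x, y):
        self.pending_points.discard(x)
        self.data[x] = y
```

Keeping pending points out of `data` means the loss computation never has to special-case `None` values.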

Where should we put the new `LearnerND`? Or maybe we should call it something different.

I added the new learner to all the learner tests except the following:

+ `test_uniform_sampling`: this test is marked `xfail` anyway
+ `test_point_adding_order_is_irrelevant`: this is marked `xfail` for `Learner2D` and...

Now I believe the only 3 things left to do are:

+ [ ] decide on the loss function signature (at the moment the loss function gets all data etc....
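One possible shape for a per-subdomain loss signature, purely as an illustration of the design question (the signature and names are assumptions; the thread leaves this undecided, and the current code instead passes all data to the loss function):

```python
def interval_loss(subdomain, values):
    """Hypothetical per-subdomain loss for a 1D interval: the interval's
    volume, inflated by the spread of the function values at its endpoints.

    `subdomain` is an (a, b) pair and `values` the function values on it.
    """
    a, b = subdomain
    volume = abs(b - a)
    spread = max(values) - min(values)
    return volume * (1.0 + spread)
```

The appeal of a signature like this is that the learner only needs to re-call it for subdomains whose data changed, rather than handing the full dataset to the loss on every update.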