
Consider requiring that Scalar: Default

Open · Andlon opened this issue on Dec 30, 2021 · 0 comments

In many cases, we might want to allocate a buffer that we then fill with data. One way to do that is to use an uninitialized buffer, but this severely complicates the code, since we have to deal with complex unsafe reasoning. It is therefore often preferable to fill the buffer with arbitrary values that will be overwritten anyway, since this can be done safely.
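For illustration, here is a minimal sketch of that safe "pre-fill, then overwrite" pattern. `compute_into` is a hypothetical helper, not part of nalgebra's API, and the `Default + Clone` bounds are assumptions made just for this sketch:

```rust
// Hypothetical helper (illustrative only): allocate safely by pre-filling
// with placeholder values instead of using an uninitialized buffer.
fn compute_into<T: Default + Clone>(len: usize, f: impl Fn(usize) -> T) -> Vec<T> {
    // Safe: every slot starts out holding `T::default()`...
    let mut buf = vec![T::default(); len];
    // ...and every slot is overwritten, so the placeholder is never observed.
    for (i, slot) in buf.iter_mut().enumerate() {
        *slot = f(i);
    }
    buf
}

fn main() {
    let buf: Vec<f64> = compute_into(4, |i| i as f64 * 0.5);
    assert_eq!(buf, vec![0.0, 0.5, 1.0, 1.5]);
}
```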

The canonical way to obtain such an arbitrary value is arguably T: Default. However, when writing generic code with nalgebra we often only want to require T: Scalar. In particular, if a function requires e.g. T: Scalar + Default, then we cannot call it from a function whose only bound is T: Scalar. This is a pretty big problem for composability.
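A small sketch of the composability problem, assuming a hypothetical helper `fill_workspace` (not nalgebra API); only the `Scalar` trait itself is taken from nalgebra:

```rust
use nalgebra::Scalar;

// Hypothetical helper: needs the extra `Default` bound so it can pre-fill
// the buffer with placeholder values safely.
fn fill_workspace<T: Scalar + Default>(len: usize) -> Vec<T> {
    vec![T::default(); len]
}

// Generic code written against the idiomatic `T: Scalar` bound alone cannot
// call it: uncommenting the line below fails with
// error[E0277]: the trait bound `T: Default` is not satisfied.
fn caller<T: Scalar>(_len: usize) {
    // let _buf: Vec<T> = fill_workspace(_len);
}

fn main() {
    // Works only when the caller spells out the extra bound at the call site:
    let buf: Vec<f64> = fill_workspace(3);
    assert_eq!(buf, vec![0.0; 3]);
    caller::<f64>(3);
}
```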

I therefore suggest that we modify the Scalar trait to require Default. This seems like a sane choice: all integer and floating-point types implement Default, and there is no reason that arbitrary-precision or big-integer types from external crates cannot implement Default as well.
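For concreteness, a sketch of what the change could look like; the supertraits other than Default are assumed here to mirror Scalar's existing bounds and may not match nalgebra's actual definition:

```rust
use std::fmt::Debug;

// Sketch only: `Default` added alongside the (assumed) existing bounds.
pub trait Scalar: 'static + Clone + PartialEq + Debug + Default {}

// Blanket impl so every type meeting the bounds is automatically a `Scalar`.
impl<T: 'static + Clone + PartialEq + Debug + Default> Scalar for T {}
```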

Andlon · Dec 30 '21 08:12