Re-engineer Maps with defaults
The current semantics of .withDefault and .withDefaultValue are tricky: there's no way to know whether .apply(key) will return a value or Nothing.
.get(key) will return Some(value) if the key is explicitly contained in the map, or None if it is not, even if a default value is provided. I understand that this comes from the underlying Map without defaults. .default(key) will return the default if one is defined, or throw an exception; but there's no way to know whether a default is defined.
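A minimal illustration of the trap, using sample values of my own (behavior as of recent Scala versions, 2.13/3):

```scala
val m = Map(1 -> 2).withDefaultValue(0)

m(3)     // 0: apply falls back to the default
m.get(3) // None: get ignores the default entirely

// On a Map without a default, .default is still present but throws,
// and nothing in the static type distinguishes the two cases:
// Map(1 -> 2).default(3)  // throws NoSuchElementException
```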
The defaults are also discarded when calling certain Map-returning methods, such as filterKeys, tail, or even +.
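A sketch of the data loss (the exact set of affected methods has varied across Scala versions — as noted later in this thread, filter now preserves the default — but key-transforming operations like map still return a plain Map; the values here are my own example):

```scala
val m = Map(1 -> 2, 5 -> 6).withDefaultValue(3)
m(99)  // 3: the default applies

val mapped = m.map { case (k, v) => k -> (v + 1) }
// mapped is a plain Map; the default was silently discarded:
// mapped(99)  // throws NoSuchElementException
```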
I don't know enough about the current implementation, or about CanBuildFrom etc., to know how silly this is, but really, you can't tell what's going on, and can't use them without exception handlers.
In my opinion, if the standard lib supports Maps with defaults, they should have a distinct type, not discard data, and methods like .default(key) should not be present on Maps that don't have defaults.
Thanks and please forgive any errors!
Imported From: https://issues.scala-lang.org/browse/SI-8099?orig=1 Reporter: @refried See #7632
there is some useful detail and newer discussion at https://github.com/scala/bug/issues/8665 (which I have closed as a duplicate)
if the standard lib supports Maps with defaults, they should have a distinct type
Perhaps withDefaultValue should simply return a Function1 rather than any kind of Map, thus eliminating all pitfalls. (But admittedly also removing some of the power of the current design.)
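A sketch of that idea: a hypothetical totalFunction helper (the name is mine, not a proposed API) that returns a plain Function1, so no Map method that could drop the default is even available:

```scala
// Hypothetical helper, not part of the standard library:
def totalFunction[K, V](m: Map[K, V], default: K => V): K => V =
  key => m.getOrElse(key, default(key))

val f = totalFunction(Map(1 -> 2, 5 -> 6), (_: Int) => 3)
f(1)   // 2: explicit entry
f(99)  // 3: default
// There is no f.filter or f.tail, so the default cannot be silently
// lost — but the Map operations are of course lost along with them.
```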
I would like to safely call `m.remove(k).getOrElse(m.default(null))`.
The need to safely access default is mentioned on the duplicate ticket.
However, get should be consonant with getOrElse and remove.
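Absent a distinct type, the only way to probe for a default today is to call .default and catch the exception; a hedged sketch, with a helper name (defaultOption) that is mine:

```scala
import scala.util.Try

// Hypothetical helper, not in the standard library: probe for a
// default by calling .default and catching the exception.
def defaultOption[K, V](m: Map[K, V], key: K): Option[V] =
  Try(m.default(key)).toOption

defaultOption(Map(1 -> 2).withDefaultValue(0), 99)  // Some(0)
defaultOption(Map(1 -> 2), 99)                      // None
```

Note that this cannot distinguish "no default" from a default function that itself throws for this key — one more argument for encoding the default in the type.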
https://github.com/scala/bug/issues/4145 (which I've now closed, since this broader ticket exists) points out another issue with the current design: you lose the default as soon as you e.g. map or filter.
Interesting - if we think of Map as a PartialFunction, then providing a default value (or function from key to value) turns it into a total Function which we could andThen.
This also illuminates other problems with map and filter - if we provide a default function that computes value from a key, then mapping in a way that changes the type of keys could not possibly preserve the default function.
And if we filter in a way that a default value does not pass the predicate - should we keep it or not? And if we have a function from key to value, does the filter apply to it as well or not?
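The PartialFunction view can be made concrete (a small illustration of the point above, using my own sample values):

```scala
val pf: PartialFunction[Int, Int] = Map(1 -> 2, 5 -> 6)
pf.isDefinedAt(99)  // false: a Map is partial on its key set

// Supplying a default totalizes it, so it composes safely with andThen:
val total: Int => Int = Map(1 -> 2, 5 -> 6).withDefaultValue(3)
val composed = total.andThen(_ * 10)
composed(99)  // 30: the default 3, then * 10
```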
you lose the default as soon as you e.g. map or filter
oh, not filter, actually!
scala> Map(1 -> 2, 5 -> 6).withDefaultValue(3).filter((a, b) => a > 1).apply(99)
val res5: Int = 3
This is even more surprising - I said to filter it:
scala> Map(1 -> 2, 5 -> 6).withDefaultValue(3).filter((a, b) => a < 5).apply(99)
val res0: Int = 3
@szeiger writes:
I remember the `withDefault` problem from the 2.13 redesign. AFAIR we never considered making `Map.WithDefault` a first-class collection type (`TotalMap`?). In retrospect this looks like the right solution. It would always be clear from the types whether or not the default is preserved, and the semantics could be documented more clearly. Note that you don't need any of this if all you want is a `Function1`. You can get this (modulo strictness of the default value) with `map.getOrElse(_, default)`. (In practice it seems to require a type annotation for the hole; I wonder if type inference could be improved to handle this case; it wouldn't be the first improvement we made to make the 2.13 collections more usable.)
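Concretely, the `getOrElse` eta-expansion and the type annotation it needs look like this (a small illustration with my own sample values):

```scala
val map = Map(1 -> 2, 5 -> 6)
val default = 3

// The placeholder needs an explicit type annotation for inference:
val f = map.getOrElse(_: Int, default)
f(1)   // 2: explicit entry
f(99)  // 3: fallback
```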
The problem is that `Map(1 -> 2, 5 -> 6).withDefaultValue(3).filter((a, b) => a < 5)` returns a plain `Map`, so the type information about the default value is lost.
@joroKr21 isn't that exactly what Stefan's suggestion would allow us to fix?
Maybe I misread your quote; I took it as a description of the status quo.