[Bug] `Layer.getBounds()` type is not compatible with `WebMercatorViewport.fitBounds()`
Description
If one looks at the fitBounds docs, the example usage is the following:
const [initialViewState, setInitialViewState] = useState<MapViewState>({
  longitude: -100,
  latitude: 40,
  zoom: 4
});
const [hasLoaded, setHasLoaded] = useState<boolean>(false);

const layer = new ScatterplotLayer({...});

const onAfterRender = () => {
  if (!hasLoaded && layer.isLoaded) {
    setHasLoaded(true);
    const viewport = layer.context.viewport as WebMercatorViewport;
    const {longitude, latitude, zoom} = viewport.fitBounds(layer.getBounds());
    setInitialViewState({longitude, latitude, zoom});
  }
};
However, the return type of ScatterplotLayer.getBounds() is [number[], number[]] | null, while WebMercatorViewport.fitBounds() takes a [[number, number], [number, number]]. The example code therefore fails to type check, both because layer.getBounds() can return null and because number[] is not assignable to [number, number]. The first issue is easy to fix with a null check, but the second requires either an as cast or more arduous runtime checks.
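For reference, this is a sketch of the workaround described above (null check plus an as cast); ideally the cast would not be necessary:

const onAfterRender = () => {
  if (!hasLoaded && layer.isLoaded) {
    setHasLoaded(true);
    const viewport = layer.context.viewport as WebMercatorViewport;
    const bounds = layer.getBounds();
    if (bounds) {
      // The cast is the part that should not be needed
      const {longitude, latitude, zoom} = viewport.fitBounds(
        bounds as [[number, number], [number, number]]
      );
      setInitialViewState({longitude, latitude, zoom});
    }
  }
};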
Flavors
- [ ] Script tag
- [X] React
- [ ] Python/Jupyter notebook
- [ ] MapboxOverlay
- [ ] GoogleMapsOverlay
- [ ] CartoLayer
- [ ] ArcGIS
Expected Behavior
If the output of Layer.getBounds() is truthy, its type should be accepted by WebMercatorViewport.fitBounds().
Steps to Reproduce
Follow the instructions in the fitBounds() docs to set up a deck.gl scene.
Environment
- Framework version: 8.9.6
- Browser: Chrome
- OS: macOS
Logs
No response
@chrisgervang @zbigg Do you have any suggestions how we can better handle this? Both types are technically correct:
- `layer.getBounds()` can have any number of dimensions depending on the attribute size
- `viewport.fitBounds()` will lose its proper type hint if we loosen it to `number[][]`
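For illustration, with a hypothetical loosened signature (not the actual API) the compiler would no longer catch malformed bounds:

// Hypothetical loosened signature, for illustration only
declare function fitBoundsLoose(bounds: number[][]): void;

fitBoundsLoose([[1], [2, 3, 4], []]); // type checks, but is not a valid bounds value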
Hmm, maybe use a type guard function? Does something like this work?
function isBounds(bounds: any): bounds is [[number, number], [number, number]] {
  return Array.isArray(bounds) &&
    bounds.length === 2 &&
    bounds.every(coord => Array.isArray(coord) && coord.length === 2);
}
const bounds = layer.getBounds();
if (bounds && isBounds(bounds)) {
  const { longitude, latitude, zoom } = viewport.fitBounds(bounds);
  setInitialViewState({ longitude, latitude, zoom });
}
Otherwise, I've only thought of the solutions already mentioned: a type assertion, or loosening the type.
Unfortunately I don't have any trick that can help us. We're left with an assertion or a loosened type.
Note, I've also tried to override the type of ScatterplotLayer.getBounds to return [[number, number, number], [number, number, number]] (the actual type), but it's not assignable either:
Argument of type '[XyzPosition, XyzPosition]' is not assignable to parameter of type 'LatLonBounds'. Type at position 0 in source is not compatible with type at position 0 in target. Type 'XyzPosition' is not assignable to type 'LatLon'. Source has 3 element(s) but target allows only 2.(2345)
Looks like TS is too strict for our case and I see no way to loosen those checks. Playground
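A minimal reproduction of that error, assuming types roughly like the ones in the playground (the exact definitions there may differ):

type LatLon = [number, number];
type LatLonBounds = [LatLon, LatLon];
type XyzPosition = [number, number, number];

declare function fitBounds(bounds: LatLonBounds): void;
declare const bounds3D: [XyzPosition, XyzPosition];

// TS2345: a fixed-length 3-tuple is not assignable to a fixed-length 2-tuple
fitBounds(bounds3D);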
Out of curiosity, I found one solution to the strict checking of tuple arity in this SO answer:
export type NonEmptyArray<Type> = [Type, ...Array<Type>];
That declares a tuple type that matches 1 or more elements, but not 0.
Extended to 2 or more, we can create types like:
type Vec1D<Type> = [Type, ...Array<Type>];
type Vec2D<Type> = [Type, Type, ...Array<Type>];
type Vec3D<Type> = [Type, Type, Type, ...Array<Type>];
And if we use such types in getBounds / fitBounds we can have proper type checking, as long as the ScatterplotLayer type is not erased. Accessing getBounds through the Layer interface will still return potentially 0- or 1-dimensional bounds, so we're out of luck.
Having found this, I still think it's a rather obscure technique and I wouldn't recommend complicating the typings this much.
type LatLon = [number, number, ...number[]];
type Vec3D = [number, number, number, ...number[]];

class Layer {
  getBounds(): [number[], number[]] {
    return [[1, 2], [4, 5]];
  }
}

class ScatterPlotLayer extends Layer {
  getBounds(): [Vec3D, Vec3D] {
    return super.getBounds() as [Vec3D, Vec3D];
  }
}

function fitBoundsStrict(bounds: [LatLon, LatLon]) {
  console.log(bounds);
}

const layer = new Layer();
const scatterPlotLayer = new ScatterPlotLayer();

fitBoundsStrict(scatterPlotLayer.getBounds()); // ok
fitBoundsStrict(layer.getBounds()); // error
Layer.getBounds() should only return 2D or 3D bounds, even if attributes can also be 1D or 4D. How about we have:
getBounds():
  | [[number, number], [number, number]]
  | [[number, number, number], [number, number, number]]
  | null {
and then have fitBounds accept both, but drop the 3D component internally:
fitBounds(
  bounds:
    | [[number, number], [number, number]]
    | [[number, number, number], [number, number, number]],
  ...
) {
  const bounds2D: [[number, number], [number, number]] = [
    [bounds[0][0], bounds[0][1]],
    [bounds[1][0], bounds[1][1]]
  ];
  ...
}
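With signatures like that, the docs example above would only need a null check. A sketch assuming the proposed types:

const bounds = layer.getBounds();
if (bounds) {
  // No cast needed once getBounds returns 2D/3D tuples and fitBounds accepts both
  const {longitude, latitude, zoom} = viewport.fitBounds(bounds);
  setInitialViewState({longitude, latitude, zoom});
}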
If we really want to firm this up, it makes me wonder if we should add some bounds type(s) / utils to @math.gl/types, since the bounds type is used all over the place (math, loaders, deck, luma, ...).
Bounds2D, Bounds3D, Bounds...
It would be a fair amount of work for a non-functional change.
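As a rough sketch of what such shared aliases could look like (the names and location are hypothetical, not an existing API):

// Hypothetical shared bounds aliases, e.g. in @math.gl/types
export type Bounds2D = [min: [number, number], max: [number, number]];
export type Bounds3D = [min: [number, number, number], max: [number, number, number]];
export type Bounds = Bounds2D | Bounds3D;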