There are `filter` and `map`, but no `reduce` and no `group`.
It would be good to complete this set of operations.
I think group would be:
group(Prices, {# > 10})
But I'm not sure the closure syntax can support a reduce op, since the closure would need to take two arguments, something like:
reduce(Prices, 0, {.value + .acc})
Yes, I've also been thinking about a built-in reduce for some time. Two arguments could be done with, for example, #1 and #2, or a named #acc parameter.
But what should group func do?
Group should partition an array into sub-arrays based on a key, like SQL's GROUP BY.
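To make the intended semantics concrete, here is a sketch in Python (`group_by` is a hypothetical helper for illustration, not the library's API):

```python
from collections import defaultdict

def group_by(items, key):
    """Partition items into lists, keyed by the result of key(item)."""
    groups = defaultdict(list)
    for item in items:
        groups[key(item)].append(item)
    return dict(groups)

# Group 1..9 by parity: key 1 -> odds, key 0 -> evens.
print(group_by(range(1, 10), lambda n: n % 2))
```

Returning a map keyed by the closure's result preserves the grouping key, which an array of arrays would lose.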
So it should return a map?
I'm not sure. It could be an array of arrays, but the downside would be that you don't then know what the grouping key was.
I think the ultimate litmus test would be: can you do Hadoop-style map-reduce?
https://techmytalk.com/2014/11/07/hadoop-mapreduce-group-by-operation-part1/#:~:text=The%20grouping%20of%20map%20output,the%20Map%20to%20the%20Reducer.
@antonmedv I'm not sure I'm skilled enough to write these on my own, could we get one a Zoom and you show me how please?
+1 for reduce. Perhaps more than one parameter could be done by extending closures with a JS-like "header", (x, y, z) => {...}, where the header is optional and, when absent, basically becomes (#) => ...?
Yes, I was also thinking about a (params) => {...} style. But what about {x, y => ...}?
Looks like a closure returning two values: x and a bool :)
{#1 >= #2}
@antonmedv Hi, any update on the reduce function?
Nope.
> So it should return a map?
Yes, this feature is useful, like converting a slice to a map.
Added groupBy func.
groupBy(1..9, # % 2)[1]
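For reference, a Python sketch of what that expression evaluates to (assuming `groupBy` returns a map keyed by the closure's result, as discussed above):

```python
from collections import defaultdict

# Python equivalent of groupBy(1..9, # % 2)[1]:
# group 1..9 by parity, then take the group with key 1 (the odds).
groups = defaultdict(list)
for n in range(1, 10):
    groups[n % 2].append(n)

print(groups[1])  # [1, 3, 5, 7, 9]
```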
Added reduce:
reduce(1..9, # + #acc)
reduce(1..9, # + #acc, 0)
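Both forms map onto Python's `functools.reduce` (a sketch of the semantics, assuming expr's closure uses `#` for the current element and `#acc` for the accumulator):

```python
from functools import reduce

# reduce(1..9, # + #acc): no initial value,
# so the first element seeds the accumulator.
print(reduce(lambda acc, n: n + acc, range(1, 10)))     # 45

# reduce(1..9, # + #acc, 0): explicit initial accumulator of 0.
print(reduce(lambda acc, n: n + acc, range(1, 10), 0))  # 45
```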