I'm at best an Advanced Beginner, so maybe this is something simple that I merely need to learn.
I am aggregating information from nested collections using flatMap(). I later want to generate sequential ID numbers for all the items in the resulting flattened collection. When I try to do this, my ID numbers are unique but not sequential, and I'd like them to be. I'm given to understand that I want concatMap().
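For concreteness, here is a minimal sketch of the shape of my code (the nested data and names are invented for illustration; my real structures are more involved):

```java
import io.vavr.Tuple;
import io.vavr.Tuple2;
import io.vavr.collection.Stream;

import java.util.concurrent.atomic.AtomicInteger;

public class ConcatMapSketch {
    public static void main(String[] args) {
        // Invented nested data: groups of items.
        Stream<Stream<String>> groups = Stream.of(
                Stream.of("a", "b"),
                Stream.of("c"),
                Stream.of("d", "e"));

        AtomicInteger counter = new AtomicInteger();

        // Flatten with flatMap() and number each item from a shared
        // counter. In my real code the resulting IDs come out unique
        // but not sequential with respect to the flattened order.
        Stream<Tuple2<Integer, String>> numbered =
                groups.flatMap(g -> g.map(s -> Tuple.of(counter.getAndIncrement(), s)));

        numbered.forEach(System.out::println);
    }
}
```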
So far, I have learned that I can combine Stream.concat() with map(), but then we lose the beautiful pattern of uniformly nested flatMap() calls: either we need to put Stream.concat() at every level, or we need to follow the nested map()s with an equal number of nested Stream.concat()s. I'm hoping for something nicer.
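With one level of nesting, the workaround reads like this (again a sketch, reusing groups and counter from above; I believe Vavr's Stream.concat(Iterable<? extends Iterable<? extends T>>) overload does the flattening):

```java
// Same numbering via Stream.concat() + map(): map() produces a
// Stream<Stream<Tuple2<Integer, String>>>, and one concat() call
// flattens that one level of nesting. Every additional level of
// nesting needs its own concat().
Stream<Tuple2<Integer, String>> numbered = Stream.concat(
        groups.map(g ->
                g.map(s -> Tuple.of(counter.getAndIncrement(), s))));
```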
So, can we have concatMap() in general (as in, is this reasonable and feasible)? Alternatively, is there some other nice (enough) way for me to do this with Vavr as it is today?
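To make the ask concrete, this is how I imagine the call site would read (concatMap() is hypothetical here; it does not exist in Vavr today):

```java
// Hypothetical API: one concatMap() per nesting level, keeping the
// uniform shape that nested flatMap() calls have.
Stream<Tuple2<Integer, String>> numbered =
        groups.concatMap(g ->
                g.map(s -> Tuple.of(counter.getAndIncrement(), s)));
```

(If an existing combinator already covers my use case, say zipWithIndex() on the flattened result instead of a shared counter, that would answer the second question.)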