Java basics: the Stream API for elegant data-stream operations


January 6, 2022

Preface

When we operate on collection data, we usually traverse it with a for loop or an iterator, which is not very elegant. Java provides the Stream concept, which lets us process collection data element by element and also offers a multithreaded (parallel) mode. This article covers:

  • Stream creation
  • The various intermediate operations on a stream
  • Terminal operations on a stream
  • Stream aggregation with collectors
  • Using parallel streams together with CompletableFuture

WeChat official account (search "Sneak forward") for discussion and exchange

GitHub address; a star would be appreciated

1 Ways to construct a Stream

Built-in static construction methods of Stream

public static<T> Stream<T> iterate(final T seed, final UnaryOperator<T> f)
public static <T> Stream<T> concat(Stream<? extends T> a, Stream<? extends T> b)
public static<T> Builder<T> builder()
public static<T> Stream<T> of(T t)
public static<T> Stream<T> empty()
public static<T> Stream<T> generate(Supplier<T> s)
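
For illustration, a minimal sketch showing a few of these factory methods in action (the data and variable names are only examples):

    // of: a stream of fixed elements
    Stream<String> s1 = Stream.of("a", "b", "c");
    // iterate: start from the seed 0 and repeatedly apply n -> n + 1; limit to 5 elements
    Stream<Integer> s2 = Stream.iterate(0, n -> n + 1).limit(5);
    // generate: an infinite stream backed by a Supplier; limit to 3 elements
    Stream<Double> s3 = Stream.generate(Math::random).limit(3);
    // builder: add elements one by one, then build
    Stream<String> s4 = Stream.<String>builder().add("x").add("y").build();
    // concat: join two streams into one
    Stream.concat(s1, s4).forEach(System.out::println);
    -------result--------
    a
    b
    c
    x
    y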

Collection Declarative stream function

default Stream<E> stream()
  • Collection declares stream() as a conversion method; in other words, every Collection subclass already has the Collection-to-Stream conversion implemented for it (as a default method)
  • Example: converting a List to a Stream

    public static void main(String[] args){
        List<String> demo = Arrays.asList("a", "b", "c");
        long count = demo.stream().peek(System.out::println).count();
        System.out.println(count);
    }
    -------result--------
    a
    b
    c
    3

2 Stream operations on elements (intermediate operations)

Filtering: filter

Stream<T> filter(Predicate<? super T> predicate)
  • Predicate is a functional interface, so a lambda can be passed directly; for complex filtering logic, combine predicates with the or, and, and negate methods
  • Example

    List<String> demo = Arrays.asList("a", "b", "c");
    Predicate<String> f1 = item -> item.equals("a");
    Predicate<String> f2 = item -> item.equals("b");
    demo.stream().filter(f1.or(f2)).forEach(System.out::println);
    -------result--------
    a
    b

Mapping transformation: map

<R> Stream<R> map(Function<? super T, ? extends R> mapper)
IntStream mapToInt(ToIntFunction<? super T> mapper);
LongStream mapToLong(ToLongFunction<? super T> mapper);
DoubleStream mapToDouble(ToDoubleFunction<? super T> mapper);
  • Example

    static class User{
        public User(Integer id){ this.id = id; }
        Integer id;
        public Integer getId() { return id; }
    }
    public static void main(String[] args) {
        List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
        // Map each User to its Integer id
        demo.stream().map(User::getId).forEach(System.out::println);
    }
    -------result--------
    1
    2
    3

Data processing: peek

Stream<T> peek(Consumer<? super T> action);
  • The difference from map is that the action is a Consumer with no return value; peek passes each element through unchanged (though the action may mutate it, as in the example below)
  • Example

    static class User{
        public User(Integer id){ this.id = id; }
        Integer id;
        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
    }
    public static void main(String[] args) {
        List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
        // Square each id, then map each User to its Integer id
        demo.stream().peek(user -> user.setId(user.id * user.id)).map(User::getId).forEach(System.out::println);
    }
    -------result--------
    1
    4
    9

Mapping and flattening: flatMap

<R> Stream<R> flatMap(Function<? super T, ? extends Stream<? extends R>> mapper);
IntStream flatMapToInt(Function<? super T, ? extends IntStream> mapper);
LongStream flatMapToLong(Function<? super T, ? extends LongStream> mapper);
DoubleStream flatMapToDouble(Function<? super T, ? extends DoubleStream> mapper);
  • flatMap: flattens a stream whose elements are themselves Stream<T> values into a single Stream whose element type is T
  • Example

    public static void main(String[] args) {
        List<Stream<Integer>> demo = Arrays.asList(Stream.of(5), Stream.of(2), Stream.of(1));
        demo.stream().flatMap(Function.identity()).forEach(System.out::println);
    }
    -------result--------
    5
    2
    1
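
A more typical use is flattening nested collections; a minimal sketch (the nested data is only an example):

    List<List<Integer>> nested = Arrays.asList(Arrays.asList(1, 2), Arrays.asList(3, 4));
    // Turn each inner List into a Stream, then flatten the streams into one
    nested.stream().flatMap(List::stream).forEach(System.out::println);
    -------result--------
    1
    2
    3
    4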

Duplicate removal: distinct

Stream<T> distinct();
  • Example

    List<Integer> demo = Arrays.asList(1, 1, 2);
    demo.stream().distinct().forEach(System.out::println);
    -------result--------
    1
    2

Sorting: sorted

Stream<T> sorted();
Stream<T> sorted(Comparator<? super T> comparator);
  • Example

    List<Integer> demo = Arrays.asList(5, 1, 2);
    // Default ascending order
    demo.stream().sorted().forEach(System.out::println);
    // Descending
    Comparator<Integer> comparator = Comparator.<Integer, Integer>comparing(item -> item).reversed();
    demo.stream().sorted(comparator).forEach(System.out::println);
    ------- Default ascending order result--------
    1
    2
    5
    ------- Descending result--------
    5
    2
    1

Limiting and skipping: limit and skip

// Keep only the first maxSize elements
Stream<T> limit(long maxSize);
// Skip the first n elements
Stream<T> skip(long n);
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // Skip the first two elements, then keep the next two
    demo.stream().skip(2).limit(2).forEach(System.out::println);
    -------result--------
    3
    4

New operations provided in JDK 9

  • The difference from filter: takeWhile keeps elements while the predicate holds and stops at the first element that fails it; dropWhile discards elements while the predicate holds and keeps everything from the first failure onward (see the sketch after the signatures below)

    default Stream<T> takeWhile(Predicate<? super T> predicate);
    default Stream<T> dropWhile(Predicate<? super T> predicate);
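
A minimal sketch of the difference (requires JDK 9 or later; the data is only an example):

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 1, 2);
    // takeWhile: keep elements while item < 3, stop at the first element that fails the predicate
    demo.stream().takeWhile(item -> item < 3).forEach(System.out::println); // 1 2
    // dropWhile: discard elements while item < 3, keep everything from the first failure onward
    demo.stream().dropWhile(item -> item < 3).forEach(System.out::println); // 3 4 1 2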

3 Terminal operations on a stream

Traversal and consumption

// Traverse and consume each element
void forEach(Consumer<? super T> action);
// Ordered traversal and consumption; the difference from forEach is that when run on a multithreaded parallelStream, forEachOrdered keeps the encounter order
void forEachOrdered(Consumer<? super T> action);
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    demo.parallelStream().forEach(System.out::println);
    demo.parallelStream().forEachOrdered(System.out::println);
    -------forEach result--------
    2
    3
    1
    -------forEachOrdered result--------
    1
    2
    3

Get array results

// Convert to an Object[] array
Object[] toArray();
// Convert to an A[] array of the specified type A
<A> A[] toArray(IntFunction<A[]> generator)
  • Example

    List<String> demo = Arrays.asList("1", "2", "3");
    //<A> A[] toArray(IntFunction<A[]> generator)
    String[] data = demo.stream().toArray(String[]::new);

Maximum and minimum: max and min

// Get the minimum
Optional<T> min(Comparator<? super T> comparator)
// Get the maximum
Optional<T> max(Comparator<? super T> comparator)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    Optional<Integer> min = demo.stream().min(Comparator.comparing(item->item));
    Optional<Integer> max = demo.stream().max(Comparator.comparing(item->item));
    System.out.println(min.get()+"-"+max.get());
    -------result--------
    1-3

Finding and matching

// Whether any element matches
boolean anyMatch(Predicate<? super T> predicate)
// Whether all elements match
boolean allMatch(Predicate<? super T> predicate)
// Whether no element matches
boolean noneMatch(Predicate<? super T> predicate)
// Find the first element
Optional<T> findFirst();
// Find any element
Optional<T> findAny();
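
A minimal sketch of the matching and finding operations (the data is only an example):

    List<Integer> demo = Arrays.asList(1, 2, 3);
    System.out.println(demo.stream().anyMatch(item -> item > 2));  // true: at least one element > 2
    System.out.println(demo.stream().allMatch(item -> item > 0));  // true: every element > 0
    System.out.println(demo.stream().noneMatch(item -> item > 3)); // true: no element > 3
    System.out.println(demo.stream().findFirst().get());           // 1
    // findAny usually returns the first element of a sequential stream, but gives no guarantee
    System.out.println(demo.stream().findAny().get());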

Merging: reduce

// Merge elements two at a time
Optional<T> reduce(BinaryOperator<T> accumulator)
// Merge elements two at a time, starting from an initial value
T reduce(T identity, BinaryOperator<T> accumulator)
// Transform the element type first, then merge two at a time, starting from an initial value; the combiner merges partial results in parallel execution
<U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8);
    // Convert the numbers to strings, then join them with "-" (starting from the initial value "0")
    String data = demo.stream().reduce("0", (u, t) -> u + "-" + t, (s1, s2) -> s1 + "-" + s2);
    System.out.println(data);
    -------result--------
    0-1-2-3-4-5-6-7-8
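
The two simpler overloads work the same way; a minimal sketch (the data is only an example):

    List<Integer> demo = Arrays.asList(1, 2, 3, 4);
    // Without an initial value the stream may be empty, so an Optional is returned
    Optional<Integer> sum1 = demo.stream().reduce((a, b) -> a + b);
    // With the initial value 10
    Integer sum2 = demo.stream().reduce(10, Integer::sum);
    System.out.println(sum1.get() + "-" + sum2);
    -------result--------
    10-20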

Count the number of elements

long count()
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    System.out.println(demo.stream().count());
    -------result--------
    6

Aggregating the stream: collect

/**
* supplier: producer of the result container R
* accumulator: consumes each element and adds it to R
* combiner: how to merge the R results (parallel execution produces several R values that must be merged)
*/
<R> R collect(Supplier<R> supplier, BiConsumer<R, ? super T> accumulator, BiConsumer<R, R> combiner);
/**
* A Collector is an aggregation object usually composed of a supplier, accumulator, combiner, finisher, and characteristics
* The Collectors utility class provides many built-in collectors and factory methods
*/
<R, A> R collect(Collector<? super T, A, R> collector);
  • Example: a sketch of the three-argument collect follows; the Collector-based variant is used throughout section 4
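
A minimal sketch of the three-argument collect, gathering elements into an ArrayList by hand (Collectors.toList() does the same job):

    List<Integer> demo = Arrays.asList(1, 2, 3);
    List<Integer> result = demo.stream().collect(
            ArrayList::new,      // supplier: create the result container
            ArrayList::add,      // accumulator: add each element to the container
            ArrayList::addAll);  // combiner: merge containers produced by different threads
    System.out.println(result);
    -------result--------
    [1, 2, 3]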

4 Collector (the aggregation class) and the Collectors utility class

The Collector interface and its implementation class CollectorImpl

// Producer of the result container
Supplier<A> supplier();
// Consumer that accumulates stream elements into the container
BiConsumer<A, T> accumulator();
// Combiner for the containers (parallel execution produces several containers that must be merged)
BinaryOperator<A> combiner();
// Finisher that converts the container into the final result (the last step; often an identity conversion)
Function<A, R> finisher();
// The characteristics of the collector
Set<Characteristics> characteristics();
public static<T, A, R> Collector<T, A, R> of(Supplier<A> supplier,
BiConsumer<A, T> accumulator, BinaryOperator<A> combiner,
Function<A, R> finisher, Characteristics... characteristics)
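
A minimal sketch of building a custom Collector with the of factory method; this one joins integers into a String with a StringBuilder (the built-in Collectors.joining achieves a similar result):

    List<Integer> demo = Arrays.asList(1, 2, 3);
    String joined = demo.stream().collect(Collector.of(
            StringBuilder::new,                         // supplier: the result container
            (sb, item) -> sb.append(item).append('-'),  // accumulator: consume each element
            StringBuilder::append,                      // combiner: merge two containers (parallel streams)
            StringBuilder::toString));                  // finisher: convert the container to the final String
    System.out.println(joined);
    -------result--------
    1-2-3-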

Aggregating a stream into a List or Set

// Collect the stream into a List
public static <T> Collector<T, ?, List<T>> toList()
// Collect the stream into a Set
public static <T> Collector<T, ?, Set<T>> toSet()
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    List<Integer> col = demo.stream().collect(Collectors.toList());
    Set<Integer> set = demo.stream().collect(Collectors.toSet());

Aggregating a stream into a Map

// Collect the stream into a Map
public static <T, K, U> Collector<T, ?, Map<K,U>> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper)
/**
* mergeFunction: how to merge the values of elements that share the same key
*/
public static <T, K, U> Collector<T, ?, Map<K,U>> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper,
BinaryOperator<U> mergeFunction)
/**
* mergeFunction: how to merge the values of elements that share the same key
* mapSupplier: producer of the result Map
*/
public static <T, K, U, M extends Map<K, U>> Collector<T, ?, M> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper,
BinaryOperator<U> mergeFunction,
Supplier<M> mapSupplier)
  • If two elements map to the same key and no mergeFunction is given, an IllegalStateException is thrown; supply a mergeFunction (a sketch of that variant follows the example below) or use groupingBy instead
  • Example

    List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
    Map<Integer,User> map = demo.stream().collect(Collectors.toMap(User::getId,item->item));
    System.out.println(map);
    -------result-------
    {1=User@..., 2=User@..., 3=User@...}
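
When duplicate keys are possible, the mergeFunction variant avoids the exception; a minimal sketch (the data and merge rule are only examples):

    List<String> demo = Arrays.asList("apple", "avocado", "banana");
    // Key by first letter; when two words share a key, keep the longer one
    Map<Character, String> map = demo.stream().collect(Collectors.toMap(
            word -> word.charAt(0),
            word -> word,
            (w1, w2) -> w1.length() >= w2.length() ? w1 : w2));
    System.out.println(map);
    -------result-------
    {a=avocado, b=banana}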

String stream aggregation and joining

// Join multiple strings into one string
public static Collector<CharSequence, ?, String> joining();
// Join multiple strings into one string (with a specified delimiter)
public static Collector<CharSequence, ?, String> joining(CharSequence delimiter)
  • Example

    List<String> demo = Arrays.asList("c", "s", "c","w"," Sneak forward ");
    String name = demo.stream().collect(Collectors.joining("-"));
    System.out.println(name);
    -------result-------
    c-s-c-w- Sneak forward 

Stream mapping and re-aggregation: mapping

  • Equivalent to applying map first and then collect

    /**
    * mapper: the mapping function
    * downstream: the collector applied after the mapping
    */
    public static <T, U, A, R> Collector<T, ?, R> mapping(Function<? super T, ? extends U> mapper,
    Collector<? super U, A, R> downstream);
  • Example

    List<String> demo = Arrays.asList("1", "2", "3");
    List<Integer> data = demo.stream().collect(Collectors.mapping(Integer::valueOf, Collectors.toList()));
    System.out.println(data);
    -------result-------
    [1, 2, 3]

Aggregate and then convert the result

/**
* downstream: the collector that performs the aggregation
* finisher: converts the aggregated result
*/
public static<T,A,R,RR> Collector<T,A,RR> collectingAndThen(Collector<T,A,R> downstream,
Function<R, RR> finisher); 
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // Collect into a List, then return the list's size
    Integer size = demo.stream().collect(Collectors.collectingAndThen(Collectors.toList(), List::size));
    System.out.println(size);
    ---------result----------
    6

Stream grouping (the result Map is a HashMap)

/**
* classifier: specifies a property of T to use as the grouping key
* After grouping, a List is used as the container for each group
*/
public static <T, K> Collector<T, ?, Map<K, List<T>>> groupingBy(
Function<? super T, ? extends K> classifier);
/**
* classifier: the grouping function
* downstream: the collector applied to each group
*/
public static <T, K, A, D> Collector<T, ?, Map<K, D>> groupingBy(
Function<? super T, ? extends K> classifier,
Collector<? super T, A, D> downstream)
/**
* classifier: the grouping function
* mapFactory: factory for the result map (a subclass of Map)
* downstream: the collector applied to each group
*/
public static <T, K, D, A, M extends Map<K, D>> Collector<T, ?, M> groupingBy(
Function<? super T, ? extends K> classifier,
Supplier<M> mapFactory,
Collector<? super T, A, D> downstream)
  • Example

    public static void main(String[] args) throws Exception {
        List<Integer> demo = Stream.iterate(0, item -> item + 1)
                .limit(15)
                .collect(Collectors.toList());
        // Split into three groups, and convert each group's elements to String
        Map<Integer, List<String>> map = demo.stream()
                .collect(Collectors.groupingBy(item -> item % 3,
                        HashMap::new,
                        Collectors.mapping(String::valueOf, Collectors.toList())));
        System.out.println(map);
    }
    ---------result----------
    {0=[0, 3, 6, 9, 12], 1=[1, 4, 7, 10, 13], 2=[2, 5, 8, 11, 14]} 

Stream grouping (the result Map is a ConcurrentHashMap)

/**
* classifier: the grouping function; after grouping, a List is used as the container for each group
*/
public static <T, K> Collector<T, ?, ConcurrentMap<K, List<T>>> groupingByConcurrent(
Function<? super T, ? extends K> classifier);
/**
* classifier: the grouping function
* downstream: the collector applied to each group
*/
public static <T, K, A, D> Collector<T, ?, ConcurrentMap<K, D>> groupingByConcurrent(
Function<? super T, ? extends K> classifier, Collector<? super T, A, D> downstream)
/**
* classifier: the grouping function
* mapFactory: factory for the result map (a subclass of ConcurrentMap)
* downstream: the collector applied to each group
*/
public static <T, K, A, D, M extends ConcurrentMap<K, D>> Collector<T, ?, M> groupingByConcurrent(
Function<? super T, ? extends K> classifier,
Supplier<M> mapFactory,
Collector<? super T, A, D> downstream);
  • Usage is the same as groupingBy; a sketch with a parallel stream is shown below
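
A minimal sketch combining a parallel stream with groupingByConcurrent (the data is only an example; the element order inside each group is not guaranteed on a parallel stream):

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // Group by remainder modulo 2; accumulation happens concurrently into one ConcurrentMap
    ConcurrentMap<Integer, List<Integer>> map = demo.parallelStream()
            .collect(Collectors.groupingByConcurrent(item -> item % 2));
    System.out.println(map);
    ---------result (order within groups may vary)----------
    {0=[2, 4, 6], 1=[1, 3, 5]}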

Splitting the stream in two (a special case of groupingBy)

public static <T> Collector<T, ?, Map<Boolean, List<T>>> partitioningBy(
Predicate<? super T> predicate)
/**
* predicate: the partitioning predicate
* downstream: the collector applied to each partition
*/
public static <T, D, A> Collector<T, ?, Map<Boolean, D>> partitioningBy(
Predicate<? super T> predicate, Collector<? super T, A, D> downstream)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // Partition into even and odd numbers
    Map<Boolean, List<Integer>> map = demo.stream()
    .collect(Collectors.partitioningBy(item -> item % 2 == 0));
    System.out.println(map);
    ---------result----------
    {false=[1, 3, 5], true=[2, 4, 6]}

Aggregating to an average

// Average of the elements mapped to double values (the result is always a Double)
public static <T> Collector<T, ?, Double> averagingDouble(ToDoubleFunction<? super T> mapper)
// Average of the elements mapped to long values
public static <T> Collector<T, ?, Double> averagingLong(ToLongFunction<? super T> mapper)
// Average of the elements mapped to int values
public static <T> Collector<T, ?, Double> averagingInt(ToIntFunction<? super T> mapper)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    Double data = demo.stream().collect(Collectors.averagingInt(Integer::intValue));
    System.out.println(data);
    ---------result----------
    2.6666666666666665

Stream aggregation looks for maximum and minimum values

// minimum value
public static <T> Collector<T, ?, Optional<T>> minBy(Comparator<? super T> comparator)
// Maximum
public static <T> Collector<T, ?, Optional<T>> maxBy(Comparator<? super T> comparator) 
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    Optional<Integer> min = demo.stream().collect(Collectors.minBy(Comparator.comparing(item -> item)));
    Optional<Integer> max = demo.stream().collect(Collectors.maxBy(Comparator.comparing(item -> item)));
    System.out.println(min.get()+"-"+max.get());
    ---------result----------
    1-5

Aggregating to compute statistics

  • Gives the element count, the cumulative sum, the minimum, the maximum, and the average in a single pass

    // Statistics over int values
    public static <T> Collector<T, ?, IntSummaryStatistics> summarizingInt(
    ToIntFunction<? super T> mapper)
    // Statistics over double values
    public static <T> Collector<T, ?, DoubleSummaryStatistics> summarizingDouble(
    ToDoubleFunction<? super T> mapper)
    // Statistics over long values
    public static <T> Collector<T, ?, LongSummaryStatistics> summarizingLong(
    ToLongFunction<? super T> mapper)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    IntSummaryStatistics data = demo.stream().collect(Collectors.summarizingInt(Integer::intValue));
    System.out.println(data);
    ---------result----------
    IntSummaryStatistics{count=3, sum=8, min=1, average=2.666667, max=5}

New aggregation method provided in JDK 12

// The stream is aggregated by both downstream1 and downstream2, then the two results are merged by merger
public static <T, R1, R2, R> Collector<T, ?, R> teeing(
Collector<? super T, ?, R1> downstream1,
Collector<? super T, ?, R2> downstream2,
BiFunction<? super R1, ? super R2, R> merger) 
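
A minimal sketch of teeing, computing the sum and the count in one pass and merging them into an average (requires JDK 12 or later; the data is only an example):

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5);
    // downstream1 sums the elements, downstream2 counts them, merger divides the two results
    Double average = demo.stream().collect(Collectors.teeing(
            Collectors.summingInt(Integer::intValue),
            Collectors.counting(),
            (sum, count) -> sum * 1.0 / count));
    System.out.println(average);
    ---------result----------
    3.0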

5 Using parallelStream for concurrency

  • Used in coordination with CompletableFuture and a custom thread pool
  • Example

    public static void main(String[] args) throws Exception{
        List<Integer> demo = Stream.iterate(0, item -> item + 1)
                .limit(5)
                .collect(Collectors.toList());
        // Example 1: plain sequential stream
        Stopwatch stopwatch = Stopwatch.createStarted(Ticker.systemTicker());
        demo.stream().forEach(item -> {
            try {
                Thread.sleep(500);
                System.out.println(" Example 1-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        });
        System.out.println(" Example 1-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        // Example 2: note that a ForkJoinPool is needed; parallelStream uses the threads of the pool it runs inside (the executor here), otherwise it falls back to ForkJoinPool.commonPool()
        ExecutorService executor = new ForkJoinPool(10);
        stopwatch.reset(); stopwatch.start();
        CompletableFuture.runAsync(() -> demo.parallelStream().forEach(item -> {
            try {
                Thread.sleep(1000);
                System.out.println(" Example 2-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        }), executor).join();
        System.out.println(" Example 2-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        // Example 3: parallel stream without a custom pool, runs on ForkJoinPool.commonPool()
        stopwatch.reset(); stopwatch.start();
        demo.parallelStream().forEach(item -> {
            try {
                Thread.sleep(1000);
                System.out.println(" Example 3-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        });
        System.out.println(" Example 3-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        executor.shutdown();
    }
    -------result--------
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-2501
    Example 2-ForkJoinPool-1-worker-19
    Example 2-ForkJoinPool-1-worker-9
    Example 2-ForkJoinPool-1-worker-5
    Example 2-ForkJoinPool-1-worker-27
    Example 2-ForkJoinPool-1-worker-23
    Example 2-1004
    Example 3-main
    Example 3-ForkJoinPool.commonPool-worker-5
    Example 3-ForkJoinPool.commonPool-worker-7
    Example 3-ForkJoinPool.commonPool-worker-9
    Example 3-ForkJoinPool.commonPool-worker-3
    Example 3-1001
  • parallelStream really does run on multiple threads, and you can specify the thread pool, but the custom pool must be a ForkJoinPool; otherwise it defaults to the threads of ForkJoinPool.commonPool()
