This article gives a detailed introduction, based on Java 8, to what a Stream is and how to use most of its API methods, as well as the Optional container, parallel streams, and other new Java 8 features!


1 Stream Overview

public interface Stream<T> extends BaseStream<T, Stream<T>>

Java 8 introduced the Stream API, a tool for performing complex operations on a set of data in a functional-programming style; such a sequence of data is simply called a “stream”. Streams have the following characteristics:

  1. A stream describes each operation with a dedicated method, such as filter, map, reduce, find, match, and sort, and its data processing resembles the operations of a database. We only need to write the core logic, which to a large extent frees us from loops, if judgments, and other control statements; it amounts to a higher level of abstraction over collection operations.
  2. Unlike collection operations, many stream operations themselves return a stream. We can therefore chain these operations together to express a complex data-processing pipeline, such as filtering, sorting, and collecting the output.
  3. Just as a collection must hold data, a stream needs a source to provide data, and the source can be a collection, an array, or a function! If the data source is ordered, the generated stream is ordered as well!
  4. Stream operations can run sequentially in a single thread or in parallel across multiple threads.
  5. Once a stream is obtained, elements cannot be added to or removed from it; all operations work on the data already in the stream. Like an iterator, a stream can only be used (traversed) once; if we need the data again, we must obtain a new stream from the data source.
  6. Manipulating elements of an ordinary collection requires us to write the iteration code ourselves, such as for, while, and foreach; this is called external iteration. A stream iterates over its data internally, which makes optimizations such as parallel iteration possible without us having to care how it is done!
  7. Stream elements are computed on demand. For example, to find the first number greater than 10, it is not necessary to examine all elements; evaluation stops at the first match.
  8. Stream has a BaseStream.close() method and implements AutoCloseable, but almost no stream instances actually need to be closed after use. In general, only streams backed by IO channels (such as those returned by Files.lines(Path, Charset)) require closing. Most streams are backed by collections, arrays, or generator functions and need no special resource management. (If a stream does need to be closed, declare it as a resource in a try-with-resources statement.)
  9. The Stream API usually takes functional interfaces as parameters, so its lambda support is very good. With the Stream API and lambda expressions we can write very elegant and concise chained code! It is fair to say that lambda expressions must be mastered in order to use streams well. For more on lambda expressions, see this article: Java8-10,000-word lambda expressions with detailed introduction and application cases.
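Point 7, on-demand computation, is easy to see in action. The following is a minimal sketch (class name and data invented for illustration): because streams are lazy and findFirst is a short-circuiting operation, the filter predicate stops running as soon as a match is found.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class LazyDemo {
    // Find the first number greater than 10. The predicate runs element by
    // element and stops as soon as findFirst() obtains a match.
    static Optional<Integer> firstGreaterThanTen(List<Integer> numbers) {
        return numbers.stream()
                .filter(n -> {
                    System.out.println("checking " + n); // shows how far evaluation went
                    return n > 10;
                })
                .findFirst();
    }

    public static void main(String[] args) {
        // Only 3, 8 and 12 are checked; 40 and 7 are never examined
        System.out.println(firstGreaterThanTen(Arrays.asList(3, 8, 12, 40, 7)));
    }
}
```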

To experience the fluency and conciseness of stream-style programming, compare a typical collection operation with its stream equivalent:

public class StreamFirst {
    public static void main(String[] args) {
        // Student list; Student has three attributes: age, score, name
        List<Student> students = new ArrayList<>();
        students.add(new Student(10, 55, "Flower"));
        students.add(new Student(13, 100, "Wah"));
        students.add(new Student(9, 85, "Max"));
        students.add(new Student(8, 70, "XiaoHua"));


        // We need to select the names of students whose scores are greater than or equal to 60

        // Using an ordinary collection operation
        ArrayList<String> nameList1 = new ArrayList<>();
        for (Student student : students) {
            if (student.getScore() >= 60) {
                nameList1.add(student.getName());
            }
        }

        // With a stream, the different operations are chained, which matches how we
        // think about the problem; no explicit iteration or branching is needed
        List<String> nameList2 = students.stream()
                // select students with scores greater than or equal to 60
                .filter(student -> student.getScore() >= 60)
                // collect the student names
                .map(Student::getName)
                // return the result
                .collect(toList());
    }
}

2 Stream Operations

The Stream API defines a number of Stream operations that fall into two broad categories:

  1. Intermediate operations
    1. A notable feature of an intermediate operation is that it returns another Stream, for example filter, sorted, and limit. This makes chained programming possible and lets the operations form a pipeline!
    2. Another, hidden feature of intermediate operations is deferred execution, also known as lazy evaluation! An intermediate operation merely describes what the pipeline should do; nothing is actually executed until a “trigger” arrives, and that trigger is the terminal operation!
  2. Terminal operations
    1. A notable feature of terminal operations is that they do not return another Stream, for example count and collect; they are how results are obtained from the pipeline. Intermediate operations are executed only when a terminal operation is present.
    2. Another, hidden feature of a terminal operation is that it terminates the stream; afterwards the stream cannot be reused. This is also called eager evaluation!

Case demonstration:

@Test
public void test() {
    // Student list; Student has three attributes: age, score, name
    List<Student> students = new ArrayList<>();
    students.add(new Student(10, 55, "Flower"));
    students.add(new Student(13, 100, "Wah"));
    students.add(new Student(9, 85, "Max"));
    students.add(new Student(8, 70, "XiaoHua"));


    // Intermediate operations are not executed for a stream that has no terminal operation
    students.stream()
            // select students with scores greater than or equal to 60
            .filter(student -> {
                System.out.println("Intermediate operation " + student.getScore());
                return student.getScore() >= 60;
            })
            // collect the student names
            .map(Student::getName);


    // Only a stream that has a terminal operation is actually executed
    students.stream()
            // select students with scores greater than or equal to 60
            .filter(student -> {
                System.out.println("Terminal operation " + student.getScore());
                return student.getScore() >= 60;
            })
            // collect the student names
            .map(Student::getName)
            // collect is a terminal operation
            .collect(toList());
}

3 Stream Usage

To use streams, you need three things:

  1. A data source, such as a collection, an array, or a function, to generate the stream;
  2. A chain of intermediate pipeline operations that filter, transform, and combine the data; intermediate operations are optional;
  3. A terminal operation that triggers the execution of the intermediate operations in the pipeline, consumes the stream, and retrieves the result.

We can see that using a stream resembles the builder pattern in Java: the builder pattern uses a series of calls to set properties and configuration, and the object is actually created only when a build method is called. Likewise, a stream applies a series of intermediate operations, and only the final call to a terminal operation triggers their execution and produces the result!
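The three steps can be sketched in a few lines; the class name and data below are invented for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ThreeStepsDemo {
    static List<String> longWordsUpper(List<String> words) {
        return words.stream()                      // 1. data source: a collection
                .filter(s -> s.length() > 6)       // 2. intermediate operations: filter...
                .map(String::toUpperCase)          //    ...and transform
                .collect(Collectors.toList());     // 3. terminal operation: triggers execution
    }

    public static void main(String[] args) {
        // prints [BUILDER, PIPELINE]
        System.out.println(longWordsUpper(Arrays.asList("stream", "builder", "pipeline")));
    }
}
```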

Stream provides a large number of API methods, which can be categorized according to different functions!

3.1 Obtaining a Stream

There are the following common ways to obtain a stream:

  1. From a collection
    1. Starting with Java 8, the Collection super-interface provides a default stream() method that returns a stream of the calling collection's elements, so every collection in the Collection hierarchy can call stream(); the stream elements are the individual collection elements.
    2. Note that there is no way to obtain a stream directly from a Map!
  2. From an array
    1. The Arrays.stream static method takes an array and returns a stream of all its elements; the stream elements are the individual array elements.
  3. From a file
    1. The Files class has methods that read a file and generate a stream; the most important is the lines method, which returns a stream of all the lines of the file.
  4. From a function
    1. The Stream API provides two static methods for generating streams from functions: Stream.iterate and Stream.generate. A stream generated from a function is an infinite stream, and usually we only take a slice of it!
    2. static <T> Stream<T> iterate(T seed, UnaryOperator<T> f): generates a stream by iterative application of the function f to an initial seed; f takes the previously generated value as its argument and produces the next value.
    3. static <T> Stream<T> generate(Supplier<T> s): generates a stream whose elements are produced by the supplier s. This makes generate handy when we need, for example, random numbers!
  5. From specified elements
    1. static <T> Stream<T> of(T... values): returns a sequential ordered stream whose elements are the specified values.
    2. static <T> Stream<T> of(T t): returns a sequential Stream containing a single element.
Example of getting a stream:

/**
 * @author lx
 */
public class CreateTest {

    /**
     * @author lx
     */
    class Filter {
        private int x;

        public int getX() {
            return x;
        }

        public void setX(int x) {
            this.x = x;
        }

        public Filter(int x) {
            this.x = x;
        }

        public Filter() {
        }

        @Override
        public String toString() {
            return "Filter{" +
                    "x=" + x +
                    '}';
        }
    }

    /**
     * Get a stream from a collection
     */
    @Test
    public void test() {
        List<Filter> filters = new ArrayList<>();
        filters.add(new Filter(0));
        filters.add(new Filter(3));
        filters.add(new Filter(9));
        filters.add(new Filter(8));
        // from the collection
        Stream<Filter> stream = filters.stream();
        stream.forEach(System.out::println);
    }

    /**
     * Get a stream from an array
     */
    @Test
    public void test1() {
        Filter[] filArr = new Filter[]{new Filter(1),
                new Filter(3),
                new Filter(9),
                new Filter(8)};

        // from an array
        Stream<Filter> stream = Arrays.stream(filArr);
        stream.forEach(System.out::println);
    }

    /**
     * Get a stream from a file
     */
    @Test
    public void test2() {
        // read a stream of all the lines of the file
        try (Stream<String> lines = Files.lines(Paths.get("target/classes/lines.txt"))) {
            lines.forEach(System.out::println);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * iterate: generate the first 10 even numbers
     */
    @Test
    public void iterate() {
        Stream.iterate(0, n -> n + 2)
                // take the first ten elements (as we'll see later, this is called "filtering")
                .limit(10)
                .forEach(System.out::println);
    }

    /**
     * Iterate over a collection with a Stream
     */
    @Test
    public void iterate2() {
        ArrayList<Object> objects = new ArrayList<>();
        objects.add(1);
        objects.add(3);
        objects.add(2);
        objects.add(4);
        Stream.iterate(0, i -> i + 1)
                // truncate to the length of the collection above
                .limit(objects.size())
                .forEach(i -> System.out.println(i + "-->" + objects.get(i)));
    }

    /**
     * generate: produce random numbers
     */
    @Test
    public void generate() {
        Stream.generate(Math::random)
                .limit(10)
                .forEach(System.out::println);
    }


    // Complex data generation

    /**
     * iterate: generate the first 10 Fibonacci numbers
     */
    @Test
    public void iterateFibonacci() {
        // The rule of the Fibonacci sequence: F(0)=0, F(1)=1, F(2)=1, F(n)=F(n-1)+F(n-2) (n >= 2, n ∈ N*)
        // Start from the array new int[]{0, 1}.
        // The first element of each subsequent array is the second element of the previous array,
        // and the second element is the sum of the two elements of the previous array.
        // This actually generates the following arrays:
        // new int[]{0, 1}
        // new int[]{1, 1}
        // new int[]{1, 2}
        // new int[]{2, 3}
        // new int[]{3, 5}
        // new int[]{5, 8}
        // new int[]{8, 13}
        // new int[]{13, 21}
        // new int[]{21, 34}
        // Taking the first element of each array yields the Fibonacci numbers.

        Stream.iterate(new int[]{0, 1},
                t -> new int[]{t[1], t[0] + t[1]})
                // generate 10 arrays
                .limit(10)
                // take the first element of each array (this is called "mapping", as we'll see later)
                .map(t -> t[0])
                .forEach(System.out::println);
    }


    /**
     * Stream.of
     */
    @Test
    public void of() {
        Stream.of(1, 2, 3, "11").forEach(System.out::println);
    }
}
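The examples above do not cover Map: as noted earlier, Map itself has no stream() method, but its collection views (keySet(), values(), entrySet()) are ordinary collections and therefore do. A small sketch (class name and data invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class MapStreamDemo {
    // Count the entries whose value is at least 60, by streaming the values() view
    static long countHighScores(Map<String, Integer> scores) {
        return scores.values().stream()
                .filter(score -> score >= 60)
                .count();
    }

    public static void main(String[] args) {
        Map<String, Integer> scores = new HashMap<>();
        scores.put("Flower", 55);
        scores.put("Wah", 100);
        scores.put("Max", 85);
        System.out.println(countHighScores(scores)); // prints 2
    }
}
```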

3.2 Filtering Operations

The Stream API provides filtering operations on stream elements: ordinary conditional filtering with filter, as well as special filters such as distinct deduplication and the limit and skip truncation operations!

The filter operation is an intermediate operation!

Stream<T> filter(Predicate<? super T> predicate)

filter, the most widely used filtering operation, takes a predicate and returns a stream consisting of the elements of the stream that match the given predicate.

Stream<T> distinct()

Returns a stream consisting of the distinct elements of the stream (according to the element's equals method). For ordered streams, the selection of distinct elements is stable (for duplicated elements, the element appearing first in encounter order is preserved). For unordered streams, no stability guarantees are made.

Stream<T> limit(long maxSize)

Returns a stream consisting of the first maxSize elements of the stream; that is, limit truncates the stream to at most its first maxSize elements.

Stream<T> skip(long n)

Discards the first n elements of the stream and returns a stream consisting of the remaining elements. If the stream contains fewer than n elements, an empty stream is returned.

Use case:

/**
 * @author lx
 */
public class FilterTest {


    List<Student> students = new ArrayList<>();

    @Before
    public void test() {
        students.add(new Student(10, 55, "Flower"));
        students.add(new Student(13, 100, "Wah"));
        students.add(new Student(9, 85, "Max"));
        students.add(new Student(8, 70, "XiaoHua"));
        students.add(new Student(8, 70, "XiaoHua"));
    }

    @Test
    public void filter() {
        System.out.println("filter selects students with a score of 70 or greater");
        students.stream().filter(student -> student.getScore() >= 70).forEach(System.out::println);

        System.out.println("filter + distinct selects students with scores greater than or equal to 70 and removes duplicates");
        students.stream().filter(student -> student.getScore() >= 70).distinct().forEach(System.out::println);

        System.out.println("limit truncates to at most the first 2 elements");
        students.stream().filter(student -> student.getScore() >= 70).limit(2).forEach(System.out::println);

        System.out.println("skip discards the first 2 elements");
        students.stream().filter(student -> student.getScore() >= 70).skip(2).forEach(System.out::println);

        System.out.println("skip discards the first element, limit keeps at most one element");
        students.stream().filter(student -> student.getScore() >= 70).skip(1).limit(1).forEach(System.out::println);
    }

    static class Student {
        private int age;
        private int score;
        private String name;

        public int getAge() {
            return age;
        }

        public void setAge(int age) {
            this.age = age;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int age, int score, String name) {
            this.age = age;
            this.score = score;
            this.name = name;
        }

        public Student(int age) {
            this.age = age;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "age=" + age +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getAge() != student.getAge()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getAge();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.3 Sorting Operations

Collections support sorting, and the Stream API likewise provides ways to sort the elements of a stream!

The sort operation is an intermediate operation!

Stream<T> sorted()

Returns a stream made up of the elements of this stream, sorted in natural order. If the elements of the stream are not of type Comparable, a ClassCastException is thrown when a terminal operation is performed.

Stream<T> sorted(Comparator<? super T> comparator)

Returns a stream consisting of the elements of the stream, sorted according to the provided Comparator. For ordered streams, the sort is stable. For unordered streams, no stability guarantees are made.

Use case:

/**
 * @author lx
 */
public class SortedTest {
    /**
     * Natural order
     */
    @Test
    public void sorted() {
        Stream.of(2, 0, 3, 7, 5).sorted().forEach(System.out::println);
    }


    /**
     * Specify the comparison rule
     */
    @Test
    public void sortedCom() {
        students.stream()
                // order by score first, then by id when scores are equal
                .sorted(Comparator.comparingInt(Student::getScore).thenComparing(Student::getId))
                .forEach(System.out::println);
    }


    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(2, 100, "Flower"));
        students.add(new Student(1, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
    }


    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.4 Mapping Operations

The Stream API lets you apply the same function to every element of a stream and produce a new stream from the results of that function; this is a mapping operation. It is useful in practice: for example, when the stream elements are objects and we need the ids of all of them, we can use the map and flatMap methods provided by Stream.

The mapping operation is an intermediate operation!

3.4.1 map Mapping

<R> Stream<R> map(Function<? super T, ? extends R> mapper)

Returns a stream consisting of the result of a given function applied to all elements of the stream. The element data type of the returned stream is the data type of the result returned by the function.

Use case:

/**
 * @author lx
 */
public class MapTest {
    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
    }

    @Test
    public void test() {
        students.stream()
                // map each student object to its id
                .map(Student::getId)
                // forEach is a terminal consuming operation, covered later
                .forEach(System.out::println);


        List<Integer> collect = students.stream()
                // map each student object to its id
                .map(Student::getId)
                // collect is a terminal collecting operation, covered later
                .collect(toList());
        System.out.println(collect);


        /* convert lowercase letters to uppercase */
        List<String> collected = Stream.of("a", "b", "C")
                .map(String::toUpperCase)
                .collect(toList());
        System.out.println(collected);
    }

    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.4.2 flatMap Flattening

<R> Stream<R> flatMap(Function<? super T, ? extends Stream<? extends R>> mapper)

Returns a stream consisting of the results of replacing each element of this stream with the contents of the mapped stream produced by applying the provided mapping function to that element. Each mapped stream is closed after its contents have been placed into this stream. (If a mapped stream is null, an empty stream is used instead.)

Simply put, the function passed to flatMap must return a Stream<XXX>; all of those returned streams are then merged into a single stream, so the final stream's element type is XXX. The function passed to map, on the other hand, does not have to return a stream: it usually returns a plain value of type XXX, in which case the final stream's element type is XXX; if it does return a Stream<XXX>, the final stream's element type is Stream<XXX>.

In other words, map wraps each result of the function as one element of the resulting stream, whereas flatMap requires each result to be a stream and merges all of those streams into one. That merging is the "flattening".

flatMap is therefore often used to break different batches of data into individual elements and then merge them!

Use case:

/**
 * @author lx
 */
public class FlatMapTest {

    List<String> words = new ArrayList<>();

    @Before
    public void before() {
        words.add("hello");
        words.add("word");
    }


    /**
     * Split each word into its characters, then deduplicate the characters,
     * finally yielding ["h", "e", "l", "o", "w", "r", "d"]
     */
    @Test
    public void test2() {

        // With map, the resulting stream is of type Stream<Stream<String>>,
        // so the collected elements are themselves of type Stream
        List<Stream<String>> mapList = words.stream()
                .map(word -> Arrays.stream(word.split("")))
                .distinct()
                .collect(toList());
        System.out.println(mapList);


        // With flatMap, we get the desired result
        List<String> flatMapList = words.stream()
                .flatMap(word -> Arrays.stream(word.split("")))
                .distinct()
                .collect(Collectors.toList());
        System.out.println(flatMapList);
    }

    /**
     * Why the difference
     */
    @Test
    public void test() {
        // The Arrays.stream method returns a Stream<String>
        Stream<String> stream = Arrays.stream("word".split(""));

        // With map, the returned stream is of type Stream<Stream<String>>, and each
        // element is itself a Stream, because map collects the Stream<String>
        // returned by Arrays.stream as a single stream element
        Stream<Stream<String>> streamStream = words.stream().map(word -> Arrays.stream(word.split("")));

        // With flatMap, the returned stream is a plain Stream<String>, because
        // flatMap merges the Stream<String> instances returned by Arrays.stream
        // and then returns the merged stream
        Stream<String> stringStream = words.stream().flatMap(word -> Arrays.stream(word.split("")));
    }
}

3.5 Viewing Operations

Stream<T> peek(Consumer<? super T> action)

Returns a stream consisting of the elements of this stream, additionally performing the provided consumer action on each element as elements are consumed from the resulting stream. This method can be used to inspect elements at a given point in the pipeline; it can also modify attributes of the elements, but thread safety is then your own responsibility!

The view operation is an intermediate operation!

Use case:

/**
 * @author lx
 */
public class PeekTest {
    @Test
    public void test() {
        System.out.println(Stream.of(1, 2, 3, 4, 5)
                .peek(System.out::println)
                .map(i -> i + 1)
                .collect(Collectors.toList()));

        System.out.println(Stream.of("one", "two", "three", "four")
                .filter(e -> e.length() > 3)
                .peek(e -> System.out.println("Filtered value: " + e))
                .map(String::toUpperCase)
                .peek(e -> System.out.println("Mapped value: " + e))
                .collect(Collectors.toList()));
    }
}

3.6 Matching Operations

When working with collections and arrays, we often need to determine whether they contain elements that satisfy certain rules. The Stream API provides the allMatch, anyMatch, and noneMatch methods for such matching operations.

The match operation is a terminal operation!

boolean allMatch(Predicate<? super T> predicate)

Returns true if all elements of the stream match the provided predicate, or if the stream is empty; returns false if any element does not match.

boolean noneMatch(Predicate<? super T> predicate)

Returns true if no element of the stream matches the provided predicate, or if the stream is empty; returns false if at least one element matches.

boolean anyMatch(Predicate<? super T> predicate)

Returns true if at least one element of the stream matches the provided predicate; returns false if the stream is empty or no element matches.

Note: the empty-stream case behaves differently for the three methods: on an empty stream, anyMatch always returns false, while allMatch and noneMatch always return true, regardless of the predicate. Empty collections, empty arrays, empty files, and so on all produce empty streams.

Use case:

/**
 * @author lx
 */
public class MatchTest {

    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
    }


    /**
     * Empty-stream tests
     */
    @Test
    public void testNull() {
        System.out.println(Stream.of().anyMatch(i -> true));
        System.out.println(Stream.of().anyMatch(i -> false));
        System.out.println(Stream.of().noneMatch(i -> true));
        System.out.println(Stream.of().noneMatch(i -> false));
        System.out.println(Stream.of().allMatch(i -> true));
        System.out.println(Stream.of().allMatch(i -> false));
    }

    /**
     * Match tests
     */
    @Test
    public void testMatch() {
        // is there a student named "Wah" in the collection?
        System.out.println(students.stream().anyMatch(student -> "Wah".equals(student.getName())));
        // is there a student named "Wah 1" in the collection?
        System.out.println(students.stream().anyMatch(student -> "Wah 1".equals(student.getName())));

        // do all students in the collection have scores greater than 55?
        System.out.println(students.stream().allMatch(student -> student.getScore() > 55));
        // do all students in the collection have scores greater than or equal to 55?
        System.out.println(students.stream().allMatch(student -> student.getScore() >= 55));

        // is there no student with a score below 55? (i.e. all scores are 55 or above)
        System.out.println(students.stream().noneMatch(student -> student.getScore() < 55));
        // is there no student with a score of 55 or below? (i.e. all scores are above 55)
        System.out.println(students.stream().noneMatch(student -> student.getScore() <= 55));
    }


    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId(a) {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore(a) {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName(a) {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString(a) {
            return "Student{" +
                    "age=" + id +
                    ", score=" + score +
                    ", name='" + name + '\' ' +
                    '} ';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if(! (oinstanceof Student)) return false;

            Student student = (Student) o;

            if(getId() ! = student.getId())return false;
            if(getScore() ! = student.getScore())return false;
            returngetName() ! =null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode(a) {
            int result = getId();
            result = 31 * result + getScore();
            result = 31* result + (getName() ! =null ? getName().hashCode() : 0);
            returnresult; }}}Copy the code

3.7 Search Operations

Stream provides two search methods: findFirst, which finds the first stream element, and findAny, which finds any stream element; no other lookup operations are provided!

The find operation is a terminal operation!

Optional< T > findFirst()

Returns an Optional describing the first element of this stream, or an empty Optional if the stream is empty. If the source data has no defined encounter order, any element may be returned.

Optional< T > findAny()

Returns an Optional describing some element of this stream, or an empty Optional if the stream is empty. This method exists mainly for efficiency in parallel streams; on a sequential stream it typically returns the first element, though this is not guaranteed!

Use case:

@Test
public void find() {
    Optional<String> first = Stream.of("xx", "ccc", "eee").findFirst();
    Optional<String> any = Stream.of("xx", "ccc", "eee").findAny();

    // Usually we use it this way
    // If first has a value
    if (first.isPresent()) {
        String s = first.get();
    }
    // If any has a value
    if (any.isPresent()) {
        String s = any.get();
    }

    // Or use it this way
    first.ifPresent(System.out::println);
}

The find methods don’t need much explanation, but notice that they return an Optional object. Optional is a new Java 8 feature added to help programmers avoid null pointer exceptions!

3.7.1 Optional

The Optional< T > class (java.util.Optional) is a container class that represents the presence or absence of a value.

If findAny simply returned a plain null, a careless programmer might forget the null check and cause an exception. Instead, it returns an Optional object that is itself never null and wraps the actual return value; the value is then retrieved through a series of Optional methods or other operations!

The main methods of Optional are as follows:

public boolean isPresent()

Returns true if a value is present, otherwise false.

public T get()

Returns the value if present, otherwise throws NoSuchElementException.

public void ifPresent(Consumer consumer)

If a value is present, invokes the given consumer with the value as its argument, otherwise does nothing. If a value is present and consumer is null, throws NullPointerException.

public T orElse(T other)

Returns the value if present, otherwise returns the given default value.

public T orElseGet(Supplier other)

Returns the value if present, otherwise invokes other and returns the result of that invocation.

public T orElseThrow(Supplier exceptionSupplier)

Returns the contained value if present, otherwise throws the exception produced by exceptionSupplier. If no value is present and exceptionSupplier is null, throws NullPointerException.

public static < T > Optional< T > empty()

Returns an empty Optional instance.

public static < T > Optional< T > of(T value)

Returns an Optional with the given non-null value. If value is null, throws NullPointerException.

public static < T > Optional< T > ofNullable(T value)

Returns an Optional describing the given value; value may be null, in which case an empty Optional is returned.

public Optional< T > filter(Predicate predicate)

If a value is present and matches the given predicate, returns an Optional describing the value, otherwise returns an empty Optional. If predicate is null, throws NullPointerException.

public < U > Optional< U > map(Function mapper)

If a value is present, applies the given mapping function to it; if the result is non-null, returns an Optional describing the result, otherwise returns an empty Optional. If mapper is null, throws NullPointerException.

public < U > Optional< U > flatMap(Function mapper)

If a value is present, applies the given Optional-bearing mapping function to it and returns that Optional directly, otherwise returns an empty Optional. If mapper is null or returns null, throws NullPointerException.

Use case:

@Test
public void optional() {
    System.out.println("filter");
    System.out.println(Optional.of(3).filter(integer -> integer > 2));
    System.out.println(Optional.of(3).filter(integer -> integer > 3));

    System.out.println("map");
    System.out.println(Optional.of(3).map(integer -> integer + 2));
    System.out.println(Optional.of(2).map(integer -> integer + 2));

    System.out.println("flatMap");
    System.out.println(Optional.of(3).flatMap(integer -> Optional.of(integer + 2)));
    System.out.println(Optional.of(2).flatMap(integer -> Optional.of(integer + 2)));
    // Unlike flatMap, map wraps a returned Optional in another Optional: Optional[Optional[4]]
    System.out.println(Optional.of(2).map(integer -> Optional.of(integer + 2)));
}
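The example above exercises filter, map, and flatMap; the orElse family from the method list is just as common when unwrapping a value. A minimal sketch (the class and helper method names here are illustrative, not from the original article):

```java
import java.util.Optional;

public class OrElseDemo {
    // Eager fallback: "default" is always constructed, used only when empty
    static String pick(Optional<String> opt) {
        return opt.orElse("default");
    }

    // Lazy fallback: the Supplier runs only when the Optional is empty
    static String pickLazy(Optional<String> opt) {
        return opt.orElseGet(() -> "computed-default");
    }

    // Throws the supplied exception when the Optional is empty
    static String pickOrFail(Optional<String> opt) {
        return opt.orElseThrow(IllegalStateException::new);
    }

    public static void main(String[] args) {
        System.out.println(pick(Optional.of("value")));     // value
        System.out.println(pick(Optional.empty()));         // default
        System.out.println(pickLazy(Optional.empty()));     // computed-default
        try {
            pickOrFail(Optional.empty());
        } catch (IllegalStateException e) {
            System.out.println("empty -> exception");
        }
    }
}
```

Prefer orElseGet over orElse when the fallback is expensive to build, since orElse evaluates its argument even when a value is present.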

3.8 Induction Operations

In collection and array operations, we sometimes need to reduce all of a collection’s elements to a single result, such as a sum, a maximum, or a count. These operations consume every element and return a single result; such operations are called “induction” (reduction). The Stream API provides the reduce, max, min, and other methods to perform induction!

The induction operation is a terminal operation!

Optional< T > reduce(BinaryOperator< T > accumulator)

No initial value, returns an Optional object to indicate that the result may not exist.

Accumulator: A binary operator that operates on two elements to obtain a new element. The first calculation takes the first two elements, and the value of the calculation will be the first parameter of the next calculation, and the next element will be the second parameter, and so on. Finally, the result will be returned.

T reduce(T identity, BinaryOperator< T > accumulator)

  1. Identity: an initial value; the method finally returns a value of the same type.
  2. Accumulator: a binary operator that combines two values into a new one. In serial mode: the first calculation takes the identity as its first parameter and the first element as its second; the computed value becomes the first parameter of the next calculation, with the next element as the second parameter, and so on, until the final result is returned. In parallel mode: the accumulator is applied to the identity and each element (identity as the first parameter, the stream element as the second), producing one result per element; these results are then merged by applying the accumulator to them again.

< U > U reduce(U identity,BiFunction< U,? super T,U > accumulator,BinaryOperator< U > combiner)

  1. Identity: an initial value; the method finally returns a value of the same type.
  2. Accumulator: a binary function that combines the running result with a stream element to produce a new result. In serial mode: the first calculation takes the identity as its first parameter and the first element as its second; the computed value becomes the first parameter of the next calculation, with the next element as the second parameter, and so on, until the final result is returned. In parallel mode: the accumulator is applied to the identity and each element (identity first, stream element second), producing one result per element; these results are then merged by the combiner!
  3. Combiner: a binary operator. This third parameter only takes effect in parallel mode, where it merges the partial results produced by the accumulator into the final result, which is then returned!

Use cases (for more detailed testing in the Parallel Flow section) :

/**
 * @author lx
 */
public class ReduceTest {


    /**
     * Verify the parameter order of the binary operator
     */
    @Test
    public void subtract() {
        //Optional< T > reduce(BinaryOperator< T > accumulator);
        // Subtract each subsequent element from the running result: ((((1 - 2) - 3) - 4) - 7) = -15
        Optional<Integer> subtractReduce1 = Stream.of(1, 2, 3, 4, 7)
                .reduce((i, j) -> i - j);
        subtractReduce1.ifPresent(System.out::println);

        //Optional< T > reduce(BinaryOperator< T > accumulator);
        // Subtract the running result from each subsequent element: 7 - (4 - (3 - (2 - 1))) = 5
        Optional<Integer> subtractReduce2 = Stream.of(1, 2, 3, 4, 7)
                .reduce((i, j) -> j - i);
        subtractReduce2.ifPresent(System.out::println);
    }


    @Test
    public void test() {
        //Optional< T > reduce(BinaryOperator< T > accumulator);
        // Sum
        Optional<Integer> sumReduce = Stream.of(1, 2, 3, 4, 7)
                // Use the method reference Integer::sum to express the intent to sum
                .reduce(Integer::sum);
        sumReduce.ifPresent(System.out::println);


        //Optional< T > reduce(BinaryOperator< T > accumulator);
        // Get the maximum value
        Optional<Integer> maxReduce = Stream.of(1, 2, 3, 4, 7)
                // Use the method reference Integer::max to express the intent to find the maximum
                .reduce(Integer::max);
        maxReduce.ifPresent(System.out::println);


        //T reduce(T identity, BinaryOperator< T > accumulator);
        // The initial value is 10
        System.out.println(Stream.of(1, 2, 3, 4, 7)
                // Use the method reference Integer::max to express the intent to find the maximum
                .reduce(10, Integer::max));

        //T reduce(T identity, BinaryOperator< T > accumulator);
        // Count
        System.out.println(Stream.of(1, 2, 3, 4, 7)
                .map(d -> 1)
                .reduce(0, Integer::sum));
        // The simpler count method uses a numeric stream internally; this is actually a
        // numeric-stream operation, which will be discussed later
        System.out.println(Stream.of(1, 2, 3, 4, 7).count());
    }


    /**
     * < U > U reduce(U identity, BiFunction< U, ? super T, U > accumulator, BinaryOperator< U > combiner)
     * Complex calculations
     */
    @Test
    public void test2() {

        // The combiner has no effect in serial mode
        System.out.println(Stream.of(1, 2, 3, 4, 7).reduce(2, Integer::sum, Integer::sum));
        System.out.println(Stream.of(1, 2, 3, 4, 7).reduce(2, Integer::sum, Integer::max));

        // In parallel mode
        // The initial value is 2. The accumulator is applied to each element first, giving 3, 4, 5, 6, 9;
        // the combiner then sums these values: 27
        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(2, Integer::sum, Integer::sum));
        // The initial value is 2. The accumulator gives 3, 4, 5, 6, 9; the combiner takes the maximum: 9
        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(2, Integer::sum, Integer::max));
        // The initial value is 0. The accumulator gives 1, 2, 3, 4, 7; the combiner takes the minimum: 1
        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(0, Integer::sum, Integer::min));

        // The initial value is 2. The accumulator (j - i) gives -1, 0, 1, 2, 5; the combiner sums them: 7
        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(2, (i, j) -> j - i, Integer::sum));
        // The initial value is 2. The accumulator (i - j) gives 1, 0, -1, -2, -5; the combiner sums them: -7
        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(2, (i, j) -> i - j, Integer::sum));


        System.out.println(Stream.of(1, 2, 3, 4, 7).parallel().reduce(2, (i, j) -> {
                    // Print the thread name: multiple worker threads
                    System.out.println(Thread.currentThread().getName());
                    return j - i;
                }
                , Integer::sum));

        System.out.println(Stream.of(1, 2, 3, 4, 7).reduce(2, (i, j) -> {
                    // Print the thread name: a single thread
                    System.out.println(Thread.currentThread().getName());
                    return j - i;
                }
                , Integer::sum));
    }
}

3.9 Feature Operations

A collection can only store objects, so basic types must be boxed; the induction operations above, such as summing, also involve automatic boxing and unboxing, because Stream stores its data internally as objects. In addition, Stream provides no ready-made API for simple operations such as summing, mainly because Stream is generic over all element types, and arbitrary object types cannot be summed or compared for a maximum.

Does Java 8 offer any improvement? Of course: it provides three primitive-specialized (numeric) streams: IntStream, LongStream, and DoubleStream. They accept only data of the corresponding type and operate directly on primitive values underneath, avoiding the performance overhead of boxing and unboxing, and they provide quick induction methods such as sum and min! In fact, Stream’s count method internally converts the object stream into a LongStream and then calls its sum method.

The methods for converting to a numeric stream are as follows. These methods are intermediate operations:

IntStream mapToInt(ToIntFunction<? super T> mapper)

Returns an IntStream containing the result of applying the given function to the elements of the stream.

LongStream mapToLong(ToLongFunction<? super T> mapper)

Returns a LongStream containing the result of applying the given function to the elements of the stream.

DoubleStream mapToDouble(ToDoubleFunction<? super T> mapper)

Returns a DoubleStream containing the result of applying the given function to the elements of the stream.

In addition to most of Stream’s methods, numeric streams have some common methods of their own; using IntStream as an example:

< U > Stream< U > mapToObj(IntFunction<? extends U> mapper)

Returns an object-valued Stream containing the results of applying the given function to the elements of this stream. This method converts a numeric stream into an object stream.

Stream< Integer > boxed()

Boxes each element and returns a Stream< Integer >. This method converts a numeric stream into an object stream.

OptionalDouble average()

Returns an OptionalDouble of the arithmetic mean of the elements of this stream, or an empty OptionalDouble if the stream is empty.

OptionalInt max()

Returns the OptionalInt of the largest element of this stream, or an empty OptionalInt if the stream is empty.

OptionalInt min()

Returns the OptionalInt of the smallest element of this stream, or an empty OptionalInt if the stream is empty.

long count()

Returns the number of elements in this stream.

static IntStream range(int startInclusive,int endExclusive)

Returns an ordered IntStream over the half-open range [startInclusive, endExclusive).

static IntStream rangeClosed(int startInclusive,int endInclusive)

Returns an ordered IntStream over the closed range [startInclusive, endInclusive].

OptionalInt, OptionalLong, and OptionalDouble are primitive specializations of Optional. They likewise avoid boxing; their methods are similar to Optional’s, but they hold primitive values and therefore cannot hold null.

Use case:

/**
 * @author lx
 */
public class SpecialTest {
    @Test
    public void test() {
        // Sum
        System.out.println(Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).sum());
        // Maximum value
        Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).max().ifPresent(System.out::println);
        // Minimum value
        Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).min().ifPresent(System.out::println);
        // Arithmetic mean
        Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).average().ifPresent(System.out::println);
    }
}
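The range and rangeClosed factory methods listed above are not exercised in the test; a small sketch of how they behave:

```java
import java.util.stream.IntStream;

public class RangeDemo {
    // Sum over the half-open range [1, 5): 1 + 2 + 3 + 4 = 10
    static int halfOpenSum() {
        return IntStream.range(1, 5).sum();
    }

    // Sum over the closed range [1, 5]: 1 + 2 + 3 + 4 + 5 = 15
    static int closedSum() {
        return IntStream.rangeClosed(1, 5).sum();
    }

    public static void main(String[] args) {
        System.out.println(halfOpenSum()); // 10
        System.out.println(closedSum());   // 15
        // boxed() converts the IntStream back into a Stream<Integer>
        System.out.println(IntStream.rangeClosed(1, 5).boxed().count()); // 5
    }
}
```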

3.10 Collecting operations

After a pipeline of processing, we often want to collect the stream elements into a data structure, or to obtain a summary result. For this we can use Stream’s very powerful collect operation; through collect we can obtain the data structure or result we want.

In fact, by simply passing the collect method a generic Collector that defines how to collect elements and what result to return, we can implement almost all of the terminal operations covered earlier, such as reduce, max, and count! Java 8 predefines many collectors, available through the static methods of the Collectors factory class; typically, all static methods of Collectors are imported statically. Of course, we can also create our own collector to accomplish more specialized tasks!

3.10.1 inductive

The collect method supports almost all the induction operations of the reduce method, such as counting, summing, and finding the maximum, and Collectors has predefined collectors for these induction operations that we can use directly!

3.10.1.1 count

The collector for counting is defined as a static method of Collectors!

public static < T > Collector<T, ?, Long> counting()

Returns a Collector that counts the number of input elements; if there are no elements, the result is 0. Equivalent to reducing(0L, e -> 1L, Long::sum).

Use case:

/**
 * Counting operations
 */
@Test
public void count() {
    // There are three existing ways to count
    // Stream's count method
    System.out.println(Stream.of(1, 2, 3, 4, 7).count());
    // The count method of the numeric stream
    System.out.println(Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).count());
    // map + reduce implements count
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .map(d -> 1)
            .reduce(0, Integer::sum));


    // Now the collect operation also supports counting, with a predefined collector
    // The static counting() method returns a collector for counting
    System.out.println(Stream.of(1, 2, 3, 4, 7).collect(counting()));
}

3.10.1.2 maximum and minimum

The collectors for finding the maximum and minimum are defined as static methods of Collectors!

public static < T > Collector<T, ?, Optional< T >> minBy(Comparator<? super T> comparator)

Returns a Collector that produces the minimum element according to the given Comparator, wrapped in an Optional< T >.

public static < T > Collector<T, ?, Optional< T >> maxBy(Comparator<? super T> comparator)

Returns a Collector that produces the maximum element according to the given Comparator, wrapped in an Optional< T >.

Use case:

/**
 * Maximum and minimum
 */
@Test
public void max_min() {
    // There are two existing ways to find the maximum

    // The numeric stream's max method
    Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).max().ifPresent(System.out::println);
    // The reduce method
    Optional<Integer> maxReduce = Stream.of(1, 2, 3, 4, 7)
            // Use the method reference Integer::max to express the intent to find the maximum
            .reduce(Integer::max);
    maxReduce.ifPresent(System.out::println);


    // Now the collect operation also supports finding the maximum and minimum, with predefined collectors

    // The minBy(Comparator) static method returns a collector for the minimum
    Stream.of(1, 2, 3, 4, 7).collect(minBy(Integer::compareTo)).ifPresent(System.out::println);
    // The maxBy(Comparator) static method returns a collector for the maximum
    Stream.of(1, 2, 3, 4, 7).collect(maxBy(Integer::compareTo)).ifPresent(System.out::println);
    Stream.of(1, 2, 3, 4, 7).collect(maxBy(Comparator.comparingInt(x -> x))).ifPresent(System.out::println);
}

3.10.1.3 summary

Collectors for summarizing operations, such as summing and averaging, as well as collectors that return all of these statistics at once, are predefined as static methods of Collectors!

public static < T > Collector<T, ?, Integer> summingInt(ToIntFunction<? super T> mapper)

Returns a Collector that produces the sum of the int-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, Long> summingLong(ToLongFunction<? super T> mapper)

Returns a Collector that produces the sum of the long-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, Double> summingDouble(ToDoubleFunction<? super T> mapper)

Returns a Collector that produces the sum of the double-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, Double> averagingInt(ToIntFunction<? super T> mapper)

Returns a Collector that produces the arithmetic mean of the int-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, Double> averagingLong(ToLongFunction<? super T> mapper)

Returns a Collector that produces the arithmetic mean of the long-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, Double> averagingDouble(ToDoubleFunction<? super T> mapper)

Returns a Collector that produces the arithmetic mean of the double-valued function applied to the input elements. If there are no elements, the result is 0.

public static < T > Collector<T, ?, IntSummaryStatistics> summarizingInt(ToIntFunction<? super T> mapper)

Returns a Collector that produces the count, sum, average, maximum, and minimum of the int-valued function applied to the input elements.

public static < T > Collector<T, ?, LongSummaryStatistics> summarizingLong(ToLongFunction<? super T> mapper)

Returns a Collector that produces the count, sum, average, maximum, and minimum of the long-valued function applied to the input elements.

public static < T > Collector<T, ?, DoubleSummaryStatistics> summarizingDouble(ToDoubleFunction<? super T> mapper)

Returns a Collector that produces the count, sum, average, maximum, and minimum of the double-valued function applied to the input elements.

Use case:

/**
 * Summarizing
 */
@Test
public void sum() {
    // There are two existing ways to sum

    // The numeric stream's sum method
    System.out.println(Stream.of(1, 2, 3, 4, 7).mapToInt(x -> x).sum());
    // The reduce method
    Stream.of(1, 2, 3, 4, 7)
            // Use the method reference Integer::sum to express the intent to sum
            .reduce(Integer::sum)
            .ifPresent(System.out::println);


    // Now the collect operation also supports summing, with predefined collectors

    // The summingInt(ToIntFunction) static method returns a collector that sums int values
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summingInt(x -> x)));

    // The summingLong(ToLongFunction) static method returns a collector that sums long values, returning a long
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summingLong(x -> x)));

    // The summingDouble(ToDoubleFunction) static method returns a collector that sums double values, returning a double
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summingDouble(x -> x)));

    // The collect operation also supports averaging, with predefined collectors

    // The averagingInt(ToIntFunction) static method returns a collector for the arithmetic mean of int values
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(averagingInt(x -> x)));

    // The averagingLong(ToLongFunction) static method returns a collector for the arithmetic mean of long values
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(averagingLong(x -> x)));

    // The averagingDouble(ToDoubleFunction) static method returns a collector for the arithmetic mean of double values
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(averagingDouble(x -> x)));


    // The summarizingInt(ToIntFunction) static method returns a collector for the count, sum, average,
    // maximum, and minimum of int values, all collected into an IntSummaryStatistics object
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summarizingInt(x -> x)));

    // The summarizingLong(ToLongFunction) static method returns a collector for the count, sum, average,
    // maximum, and minimum of long values, all collected into a LongSummaryStatistics object
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summarizingLong(x -> x)));

    // The summarizingDouble(ToDoubleFunction) static method returns a collector for the count, sum, average,
    // maximum, and minimum of double values, all collected into a DoubleSummaryStatistics object
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(summarizingDouble(x -> x)));
}

3.10.1.4 joining

Collectors also predefines collectors for concatenating string elements!

public static Collector<CharSequence, ?, String> joining()

Returns a Collector that concatenates the input elements into a String in encounter order.

public static Collector<CharSequence, ?, String> joining(CharSequence delimiter)

Returns a Collector that concatenates the input elements into a String in encounter order, separated by the given delimiter.

public static Collector<CharSequence, ?, String> joining(CharSequence delimiter, CharSequence prefix, CharSequence suffix)

Returns a Collector that concatenates the input elements into a String in encounter order, separated by the given delimiter, with the prefix added before the joined result and the suffix added after it.

Of course, the first two methods can also be replaced directly with the join method that Java8 added to the String API. Use case:

/**
 * Joining strings
 */
@Test
public void string() {

    //joining()
    System.out.println(Stream.of("School beauty", "Flower", "Max", "A joke").collect(joining()));
    System.out.println(String.join("", "School beauty", "Flower", "Max", "A joke"));

    //joining(CharSequence delimiter)
    System.out.println(Stream.of("School beauty", "Flower", "Max", "A joke").collect(joining("——")));
    System.out.println(String.join("——", "School beauty", "Flower", "Max", "A joke"));

    //joining(CharSequence delimiter, CharSequence prefix, CharSequence suffix)
    System.out.println(Stream.of("School beauty", "Flower", "Max", "A joke").collect(joining("——", "Begin:", ":End")));
}

3.10.1.5 custom

In fact, all of the predefined induction operations above internally call the reducing factory method, or are special cases of the reducing induction operation. When the predefined methods don’t meet our requirements, we can use reducing to define our own induction operation!

public static < T > Collector<T, ?, Optional< T >> reducing(BinaryOperator< T > op)

Op: a binary operator that operates on two elements to produce a new element. The first calculation takes the first two elements, and the value of the calculation will be the first parameter of the next calculation, and the next element will be the second parameter.

Returns a Collector. No initial value, returns an Optional object to indicate that the result may not exist.

public static < T > Collector<T, ?, T> reducing(T identity, BinaryOperator< T > op)

  1. Identity: an initial value that will eventually return a value of the same type.
  2. Op: a binary operator that operates on two elements to produce a new element. The first calculation takes the initial value of the first argument, and the second argument takes the first element. The computed value will be the first argument of the next calculation, and the following element will be the second argument.

public static <T, U> Collector<T, ?, U> reducing(U identity, Function<? super T, ? extends U> mapper, BinaryOperator< U > op)

  1. Identity: an initial value that will eventually return a value of the same type.
  2. Mapper: a unary function that operates on a stream element to produce a result.
  3. Op: a binary operator that operates on two elements to produce a new element. The first calculation takes the initial value of the first parameter, and the second parameter takes the result of applying mapper to the first element. The calculated value will be the first parameter of the next calculation, and the result of applying mapper to the next element will be the second parameter.

Seen this way, collect and reduce appear to overlap for many induction operations, but there is still a significant difference. Most importantly, reduce is an immutable reduction: each step should produce a new value from two values rather than mutate an existing one. collect, by contrast, is a mutable reduction that accumulates into a mutable container, which also lends itself well to parallel operation!

Use case:

/**
 * custom: looks similar to reduce
 */
@Test
public void custom() {

    // Customize the summation operation
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(reducing(0, Integer::sum)));

    // Custom count operation: map every element to 1, then sum
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            .collect(reducing(0, x -> 1, Integer::sum)));

    // The initial value is 10
    System.out.println(Stream.of(1, 2, 3, 4, 7)
            // Use the method reference Integer::max to indicate the intent to take the maximum
            .collect(reducing(10, Integer::max)));

    // Get the max value
    Stream.of(1, 2, 3, 4, 7)
            // Use the method reference Integer::max to indicate the intent to take the maximum
            .collect(reducing(Integer::max))
            .ifPresent(System.out::println);

}

3.10.2 collection

One of the most important uses of collect is converting stream elements into collections. The Collectors static methods also define collectors that output stream elements into collections! With these methods, we can quickly and easily perform such conversions!

3.10.2.1 Collecting into a Collection

public static <T> Collector<T,?,List<T>> toList()

Returns a Collector that collects all the stream elements in order into a List. The actual type is an ArrayList, which is not thread-safe!

public static <T> Collector<T,?,Set<T>> toSet()

Returns a Collector that collects all the stream elements into a Set (order is not guaranteed). The actual type is a HashSet, which is not thread-safe!

public static <T,C extends Collection<T>> Collector<T,?,C> toCollection(Supplier<C> collectionFactory)

collectionFactory: a supplier that specifies the actual Collection type; any type in the Collection hierarchy works!

Returns a Collector that collects all the stream elements in order into a Collection. The actual type is specified by the supplier passed in, as long as it belongs to the Collection hierarchy!
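For example, swapping the supplier changes the result type; a minimal sketch (the class name `ToCollectionDemo` is hypothetical) collects into a TreeSet, which both de-duplicates and sorts:

```java
import java.util.TreeSet;
import java.util.stream.Stream;
import static java.util.stream.Collectors.toCollection;

public class ToCollectionDemo {

    // Collect into a TreeSet: duplicates removed, elements kept sorted.
    static TreeSet<Integer> sortedScores() {
        return Stream.of(70, 55, 100, 70, 85)
                .collect(toCollection(TreeSet::new));
    }

    public static void main(String[] args) {
        System.out.println(sortedScores()); // [55, 70, 85, 100]
    }
}
```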

3.10.2.2 Collecting into a Map

public static <T,K,U> Collector<T,?,Map<K,U>> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. Returns a Collector that collects the elements into a Map whose keys and values are the results of applying the provided mapping functions to the stream elements. The actual type is a HashMap, which is not thread-safe!
  4. Note: If there are duplicate keys, an IllegalStateException will be thrown!
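The duplicate-key behavior is easy to demonstrate; a minimal sketch (the class name `ToMapDuplicateKey` is just for illustration):

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToMapDuplicateKey {

    // The two-argument toMap has no merge function, so a duplicate key
    // makes the collector throw IllegalStateException.
    static boolean duplicateKeyThrows() {
        try {
            Stream.of("apple", "avocado", "banana")
                    // "apple" and "avocado" both map to the key 'a'
                    .collect(Collectors.toMap(s -> s.charAt(0), s -> s));
            return false;
        } catch (IllegalStateException expected) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(duplicateKeyThrows()); // true
    }
}
```

Supplying a mergeFunction (the three-argument overload) avoids the exception by deciding which value wins.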

public static <T,K,U> Collector<T,?,Map<K,U>> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. mergeFunction: a binary operator that resolves key conflicts! The first argument is the value already mapped to the conflicting key, the second argument is the newly computed value; the returned result becomes the value.
  4. Returns a Collector that collects the elements into a Map whose keys and values are the results of applying the provided mapping functions to the stream elements, with the given conflict resolution policy. The actual type is a HashMap, which is not thread-safe!

public static <T,K,U,M extends Map<K,U>> Collector<T,?,M> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction, Supplier<M> mapSupplier)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. mergeFunction: a binary operator that resolves key conflicts! The first argument is the value already mapped to the conflicting key, the second argument is the newly computed value; the returned result becomes the value.
  4. mapSupplier: a supplier that specifies the actual Map type; any type in the Map hierarchy works!
  5. Returns a Collector that collects the elements into a Map whose keys and values are the results of applying the provided mapping functions to the stream elements, with the given conflict resolution policy. The actual type is specified by the supplier, as long as it belongs to the Map hierarchy!

3.10.2.3 Collecting into a ConcurrentMap

public static <T,K,U> Collector<T,?,ConcurrentMap<K,U>> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. Returns a Collector that collects the elements into a ConcurrentMap whose keys and values are the results of applying the provided mapping functions to the stream elements. The actual type is a ConcurrentHashMap, which is thread-safe!
  4. Note: If there are duplicate keys, an IllegalStateException will be thrown!

public static <T,K,U> Collector<T,?,ConcurrentMap<K,U>> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. mergeFunction: a binary operator that resolves key conflicts! The first argument is the value already mapped to the conflicting key, the second argument is the newly computed value; the returned result becomes the value.
  4. Returns a Collector that collects the elements into a ConcurrentMap whose keys and values are the results of applying the provided mapping functions to the stream elements, with the given conflict resolution policy. The actual type is a ConcurrentHashMap, which is thread-safe!

public static <T,K,U,M extends ConcurrentMap<K,U>> Collector<T,?,M> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction, Supplier<M> mapSupplier)

  1. keyMapper: a function producing the map's key, taking each stream element as an argument.
  2. valueMapper: a function producing the map's value, taking each stream element as an argument.
  3. mergeFunction: a binary operator that resolves key conflicts! The first argument is the value already mapped to the conflicting key, the second argument is the newly computed value; the returned result becomes the value.
  4. mapSupplier: a supplier that specifies the actual Map type; any type in the ConcurrentMap hierarchy works!
  5. Returns a Collector that collects the elements into a ConcurrentMap whose keys and values are the results of applying the provided mapping functions to the stream elements, with the given conflict resolution policy. The actual type is specified by the supplier, as long as it belongs to the ConcurrentMap hierarchy, and is thread-safe!

3.10.2.4 Use Case

/**
 * @author lx
 */
public class CollectCollection {


    /** collection */
    @Test
    public void collection() {
        // Collect all student scores ArrayList collection
        List<Integer> scoreArrayList = students.stream().map(Student::getScore).collect(toList());
        // Collect all student scores HashSet set
        Set<Integer> scoreHashSet = students.stream().map(Student::getScore).collect(toSet());
        // Collect all student scores LinkedHashSet set
        Set<Integer> scoreLinkedHashSet = students.stream().map(Student::getScore).collect(toCollection(LinkedHashSet::new));

        System.out.println(scoreArrayList);
        System.out.println(scoreHashSet);
        System.out.println(scoreLinkedHashSet);
    }

    /** map */
    @Test
    public void map() {

        //public static <T,K,U> Collector<T,?,Map<K,U>> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper)
        // Note: If there are duplicate keys, an IllegalStateException will be thrown

        // Collecting name -> score into a HashMap would throw an exception (duplicate names)
        //Map<String, Integer> nameStoreHashMap = students.stream().collect(toMap(Student::getName, Student::getScore));

        // Collect id -> score into a HashMap; ids are unique, so no exception is thrown
        Map<Integer, Integer> idStoreHashMap = students.stream().collect(toMap(Student::getId, Student::getScore));

        System.out.println(idStoreHashMap);


        //public static <T,K,U> Collector<T,?,Map<K,U>> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction)
        // Specify the key conflict resolution policy

        // Collect name -> score into a HashMap, keeping the earlier value on conflict
        Map<String, Integer> nameStoreHashMap1 = students.stream().collect(toMap(Student::getName, Student::getScore, (x
                , y) -> {
            System.out.println(x);
            System.out.println(y);
            return x;
        }));
        // Collect name -> score into a HashMap, keeping the later value on conflict
        Map<String, Integer> nameStoreHashMap2 = students.stream().collect(toMap(Student::getName, Student::getScore, (x
                , y) -> {
            System.out.println(x);
            System.out.println(y);
            return y;
        }));

        System.out.println(nameStoreHashMap1);
        System.out.println(nameStoreHashMap2);



        //public static <T,K,U,M extends Map<K,U>> Collector<T,?,M> toMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction, Supplier<M> mapSupplier)
        // Specify the key conflict resolution policy and the Map type, LinkedHashMap
        LinkedHashMap<Integer, String> scoreNameLinkedHashMap = students.stream()
                .collect(toMap(Student::getScore, Student::getName, (x, y) -> y, LinkedHashMap::new));
        // Specify the key conflict resolution policy and specify the Map type, TreeMap
        TreeMap<Integer, String> scoreNameTreeMap = students.stream()
                .collect(toMap(Student::getScore, Student::getName, (x, y) -> y, TreeMap::new));

        System.out.println(scoreNameLinkedHashMap);
        System.out.println(scoreNameTreeMap);
    }


    /** ConcurrentMap */
    @Test
    public void concurrentMap() {

        //public static <T,K,U> Collector<T,?,ConcurrentMap<K,U>> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper)
        // Note: If there are duplicate keys, an IllegalStateException will be thrown

        // Collecting name -> score into a ConcurrentHashMap would throw an exception (duplicate names)
        //ConcurrentMap<String, Integer> collect = students.stream().collect(toConcurrentMap(Student::getName,Student::getScore));

        // Collect id -> score into a ConcurrentHashMap; ids are unique, so no exception is thrown
        ConcurrentMap<Integer, Integer> idStoreHashMap = students.stream().collect(toConcurrentMap(Student::getId, Student::getScore));

        System.out.println(idStoreHashMap);


        //public static <T,K,U> Collector<T,?,ConcurrentMap<K,U>> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction)
        // Specify the key conflict resolution policy

        // Collect name -> score into a ConcurrentHashMap, keeping the earlier value on conflict
        ConcurrentMap<String, Integer> nameStoreHashMap1 = students.stream().collect(toConcurrentMap(Student::getName, Student::getScore, (x
                , y) -> {
            System.out.println(x);
            System.out.println(y);
            return x;
        }));
        // Collect name -> score into a ConcurrentHashMap, keeping the later value on conflict
        ConcurrentMap<String, Integer> nameStoreHashMap2 = students.stream().collect(toConcurrentMap(Student::getName, Student::getScore, (x
                , y) -> {
            System.out.println(x);
            System.out.println(y);
            return y;
        }));

        System.out.println(nameStoreHashMap1);
        System.out.println(nameStoreHashMap2);



        //public static <T,K,U,M extends ConcurrentMap<K,U>> Collector<T,?,M> toConcurrentMap(Function<? super T,? extends K> keyMapper, Function<? super T,? extends U> valueMapper, BinaryOperator<U> mergeFunction, Supplier<M> mapSupplier)
        // Specify the key conflict resolution policy and the Map type ConcurrentHashMap
        ConcurrentMap<Integer, String> scoreNameConcurrentHashMap = students.stream()
                .collect(toConcurrentMap(Student::getScore, Student::getName, (x, y) -> y, ConcurrentHashMap::new));
        // Specify the key conflict resolution policy and specify the Map type, ConcurrentSkipListMap
        ConcurrentMap<Integer, String> scoreNameConcurrentSkipListMap = students.stream()
                .collect(toConcurrentMap(Student::getScore, Student::getName, (x, y) -> y, ConcurrentSkipListMap::new));

        System.out.println(scoreNameConcurrentHashMap);
        System.out.println(scoreNameConcurrentSkipListMap);
    }


    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
        students.add(new Student(5, 70, "Small"));
        students.add(new Student(6, 66, "Small"));
        students.add(new Student(7, 60, "The promise"));
        students.add(new Student(8, 77, "Flower"));
    }


    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.10.3 conversion

3.10.3.1 Converting then collecting

public static <T,U,A,R> Collector<T,?,R> mapping(Function<? super T,? extends U> mapper, Collector<? super U,A,R> downstream)

  1. mapper: a unary function applied to each stream element; its return value becomes the input element for the downstream collector!
  2. downstream: a collector that collects the transformed elements.
  3. Returns a Collector. The mapping method first converts each stream element by applying the unary function, then uses the downstream collector to collect the results. Similar to combining Stream's map and collect methods.

This method may not seem necessary for the collection operations above, because we could simply call Stream's map method before collecting. But it is very useful in the grouping and partitioning methods discussed later!

3.10.3.2 Collecting then converting

public static <T,A,R,RR> Collector<T,A,RR> collectingAndThen(Collector<T,A,R> downstream, Function<R,RR> finisher)

  1. downstream: a collector that collects the stream elements.
  2. finisher: a unary function applied to the collected result; its return value becomes the final return value!
  3. Returns a Collector. The collectingAndThen method first collects the stream elements, then applies the unary function to the collected result, and finally returns that result.

As you can see, mapping and collectingAndThen are mirror images: the former converts first and then collects, while the latter collects first and then converts!
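A minimal side-by-side sketch of that symmetry (the class name `MappingVsCollectingAndThen` is just for illustration):

```java
import java.util.List;
import java.util.stream.Stream;
import static java.util.stream.Collectors.*;

public class MappingVsCollectingAndThen {

    // mapping: transform each element FIRST, then collect the results.
    static List<Integer> lengthsViaMapping() {
        return Stream.of("a", "bb", "ccc")
                .collect(mapping(String::length, toList()));
    }

    // collectingAndThen: collect FIRST, then transform the collected result.
    static int sizeViaCollectingAndThen() {
        return Stream.of("a", "bb", "ccc")
                .collect(collectingAndThen(toList(), List::size));
    }

    public static void main(String[] args) {
        System.out.println(lengthsViaMapping());        // [1, 2, 3]
        System.out.println(sizeViaCollectingAndThen()); // 3
    }
}
```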

3.10.3.3 Use Case

/**
 * @author lx
 */
public class MappingTest {

    /** mapping: similar to map + collect */
    @Test
    public void mappingTest() {
        // Collect all student scores ArrayList collection
        System.out.println(students.stream().collect(mapping(Student::getScore, toList())));
        // Collect all student scores HashSet set
        System.out.println(students.stream().collect(mapping(Student::getScore, toSet())));
        // Collect all student scores LinkedHashSet set
        LinkedHashSet<Integer> collect = students.stream().collect(mapping(Student::getScore,
                toCollection(LinkedHashSet::new)));
        System.out.println(collect);

        // Collect the sum of all students' scores
        System.out.println(students.stream().collect(mapping(Student::getScore, reducing(0, Integer::sum))));
    }


    /** collectingAndThen */
    @Test
    public void collectingAndThenTest() {
        // Collect the total number of students
        Integer collect = students.stream().collect(collectingAndThen(toList(), List::size));
        System.out.println(collect);

        // Collect the sum of all students' scores
        Integer collect2 = students.stream().collect(collectingAndThen(mapping(Student::getScore, reducing(Integer::sum)), Optional::get));
        System.out.println(collect2);
    }


    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
        students.add(new Student(5, 70, "Small"));
        students.add(new Student(6, 66, "Small"));
        students.add(new Student(7, 60, "The promise"));
        students.add(new Student(8, 77, "Flower"));
    }


    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.10.4 grouping

With the collection operations above, we can use methods such as toMap to map each element to a key and a value of a map. Sometimes, however, we need to group a batch of data by some attribute rather than mapping each element individually; for that we use the grouping operations!

The groupingBy static method of Collectors defines a collector that groups the elements of a stream! With these methods, we can quickly and easily group a collection! The name mirrors SQL's group by operation, and grouping in Stream has the same meaning as grouping in SQL!

3.10.4.1 Grouping to a Map

public static <T,K> Collector<T,?,Map<K,List<T>>> groupingBy(Function<? super T,? extends K> classifier)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map, and the List of all stream elements with the same return value (key) becomes the corresponding value.
  2. Returns a Collector that groups the elements into a Map. A key is the result of applying the grouping function to a stream element; the value is a List of the stream elements sharing that key. The actual Map type is a HashMap, which is not thread-safe; the actual value type is an ArrayList, which is not thread-safe.

public static <T,K,A,D> Collector<T,?,Map<K,D>> groupingBy(Function<? super T,? extends K> classifier, Collector<? super T,A,D> downstream)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map.
  2. downstream: a collector that specifies how the stream elements sharing a key are gathered, such as which collection to use as the value, or even a reduction operation! Here we can also use the mapping method to further process the grouped elements before collecting them, for more fine-grained work!
  3. Returns a Collector that groups the elements into a Map. A key is the result of applying the grouping function to a stream element; the value is the result of applying the downstream collector to the stream elements sharing that key. The actual Map type is a HashMap, which is not thread-safe; the actual value type is determined by the downstream collector.

public static <T,K,D,A,M extends Map<K,D>> Collector<T,?,M> groupingBy(Function<? super T,? extends K> classifier, Supplier<M> mapFactory, Collector<? super T,A,D> downstream)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map.
  2. mapFactory: a supplier that specifies the actual Map type; any type in the Map hierarchy works!
  3. downstream: a collector that specifies how the stream elements sharing a key are gathered, such as which collection to use as the value, or even a reduction operation!
  4. Returns a Collector that groups the elements into a Map. A key is the result of applying the grouping function to a stream element; the value is the result of applying the downstream collector to the stream elements sharing that key. The actual Map type is specified by the supplier, as long as it belongs to the Map hierarchy; the actual value type is determined by the downstream collector.

3.10.4.2 Grouping into ConcurrentMap

public static <T,K> Collector<T,?,ConcurrentMap<K,List<T>>> groupingByConcurrent(Function<? super T,? extends K> classifier)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map, and the List of all stream elements with the same return value (key) becomes the corresponding value.
  2. Returns a Collector that groups the elements into a ConcurrentMap. A key is the result of applying the grouping function to a stream element; the value is a List of the stream elements sharing that key. The actual Map type is a ConcurrentHashMap, which is thread-safe; the actual value type is an ArrayList, which is not thread-safe.

public static <T,K,A,D> Collector<T,?,ConcurrentMap<K,D>> groupingByConcurrent(Function<? super T,? extends K> classifier, Collector<? super T,A,D> downstream)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map.
  2. downstream: a collector that specifies how the stream elements sharing a key are gathered, such as which collection to use as the value, or even a reduction operation!
  3. Returns a Collector that groups the elements into a ConcurrentMap. A key is the result of applying the grouping function to a stream element; the value is the result of applying the downstream collector to the stream elements sharing that key. The actual Map type is a ConcurrentHashMap, which is thread-safe; the actual value type is determined by the downstream collector.

public static <T,K,A,D,M extends ConcurrentMap<K,D>> Collector<T,?,M> groupingByConcurrent(Function<? super T,? extends K> classifier, Supplier<M> mapFactory, Collector<? super T,A,D> downstream)

  1. classifier: a grouping function. The argument is each stream element; the return value becomes a key of the generated map.
  2. mapFactory: a supplier that specifies the actual Map type; any type in the ConcurrentMap hierarchy works!
  3. downstream: a collector that specifies how the stream elements sharing a key are gathered, such as which collection to use as the value, or even a reduction operation!
  4. Returns a Collector that groups the elements into a ConcurrentMap. A key is the result of applying the grouping function to a stream element; the value is the result of applying the downstream collector to the stream elements sharing that key. The actual Map type is specified by the supplier, as long as it belongs to the ConcurrentMap hierarchy, and is thread-safe; the actual value type is determined by the downstream collector.

3.10.4.3 Use Case

/**
 * @author lx
 */
public class CollectGrouping {

    /** groupingBy with one parameter */
    @Test
    public void groupingByOne() {

        // Group students with different levels
        Map<Integer, List<Student>> gradeMap = students.stream().collect(groupingBy(Student::getGrade));
        System.out.println(gradeMap);


        // Collect and group students with different scores, and customize grouping rules
        Map<Integer, List<Student>> collect = students.stream().collect(groupingBy(
                // Customize grouping rules through functions
                student -> {
                    int score = student.getScore();
                    if (score >= 90) {
                        return 1;
                    } else if (score >= 70) {
                        return 2;
                    } else {
                        return 3;
                    }
                }));
        System.out.println(collect);
    }

    /**
     * groupingBy with 2 parameters
     * The second Collector parameter can implement various collection logic
     */
    @Test
    public void groupingByTwo(a) {
        // Collect and group the students at different levels, using the List to collect the value element, which is actually the method called internally by the single-parameter groupingBy above
        Map<Integer, List<Student>> collectList = students.stream().collect(groupingBy(Student::getGrade,
                toList()));
        System.out.println(collectList);
        System.out.println("=============");

        // Use ArrayList to collect elements
        Map<Integer, ArrayList<Student>> collectArrayList = students.stream().collect(groupingBy(Student::getGrade,
                toCollection(ArrayList::new)));
        System.out.println(collectArrayList);
        System.out.println("=============");

        // Use HashSet to collect elements
        Map<Integer, HashSet<Student>> collectSet = students.stream().collect(groupingBy(Student::getGrade,
                toCollection(HashSet::new)));
        System.out.println(collectSet);

        System.out.println("=============");

        // Use Map to collect the id -> name pairs of students in the same group
        Map<Integer, Map<Integer, String>> collect = students.stream().collect(groupingBy(Student::getGrade,
                toMap(Student::getId, Student::getName)));
        System.out.println(collect);
        System.out.println("=============");


        // Use counting() to collect the number of students in the same group
        Map<Integer, Long> collectCounting = students.stream().collect(groupingBy(Student::getGrade,
                counting()));
        System.out.println(collectCounting);
        System.out.println("=============");
        // Use TreeMap to collect the ids of students in the same group and sort them in reverse order by ID size
        Map<Integer, TreeMap<Integer, String>> collect1 = students.stream().collect(groupingBy(Student::getGrade,
                toMap(Student::getId, Student::getName, (x, y) -> x,
                        () -> new TreeMap<>(Comparator.comparingInt(o -> (int) o).reversed()))));
        System.out.println(collect1);
    }

    /** groupingBy with 3 parameters: additionally specify the Map type */
    @Test
    public void groupingByThr() {
        // Collect and group students of different levels, using TreeMap sorting for the outside
        TreeMap<Integer, List<Student>> collectTreeMap = students.stream().collect(groupingBy(Student::getGrade,
                TreeMap::new,
                toList()));
        System.out.println(collectTreeMap);

    }

    /** groupingByConcurrent with one parameter: similar to groupingBy, except that the outer Map is thread-safe */
    @Test
    public void groupingByConcurrentByOne() {
        // Group students with different levels
        ConcurrentMap<Integer, List<Student>> gradeMap = students.stream().collect(groupingByConcurrent(Student::getGrade));
        System.out.println(gradeMap);


        // Collect and group students with different scores, and customize grouping rules
        ConcurrentMap<Integer, List<Student>> collect = students.stream().collect(groupingByConcurrent(
                // Customize grouping rules through functions
                student -> {
                    int score = student.getScore();
                    if (score >= 90) {
                        return 1;
                    } else if (score >= 70) {
                        return 2;
                    } else {
                        return 3;
                    }
                }));
        System.out.println(collect);
    }

    /** * groupingBy or groupingByConcurrent combined with mapping can achieve a variety of powerful functions, which meet almost all business needs */
    @Test
    public void groupMapping() {

        // Use ArrayList to collect the ids of students in the same group
        Map<Integer, List<Integer>> collectIds = students.stream().collect(groupingBy(Student::getGrade,
                mapping(Student::getId, toList())));
        System.out.println(collectIds);


        // Use Integer to collect the sum of the scores of students in the same group
        Map<Integer, Integer> collectSum = students.stream().collect(groupingBy(Student::getGrade,
                mapping(Student::getScore, reducing(0, Integer::sum))));
        System.out.println(collectSum);
    }


    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower", 4));
        students.add(new Student(2, 100, "Wah", 1));
        students.add(new Student(3, 85, "Max", 2));
        students.add(new Student(4, 70, "XiaoHua", 2));
        students.add(new Student(5, 70, "Small", 2));
        students.add(new Student(6, 66, "Small", 3));
        students.add(new Student(7, 60, "The promise", 3));
        students.add(new Student(8, 77, "Flower", 3));
    }
    }


    static class Student {
        private int id;
        private int score;
        private String name;
        private int grade;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public int getGrade() {
            return grade;
        }

        public void setGrade(int grade) {
            this.grade = grade;
        }

        public Student(int id, int score, String name, int grade) {
            this.id = id;
            this.score = score;
            this.name = name;
            this.grade = grade;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    ", grade=" + grade +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            if (getGrade() != student.getGrade()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            result = 31 * result + getGrade();
            return result;
        }
    }
}

3.10.5 Partitioning

Partitioning can be seen as a special case of grouping. Instead of dividing stream elements into many groups, a partitioning operation divides them into at most two groups. Partitioning is the right choice when stream elements need to be split into two parts according to whether they satisfy some condition.

After the partitioning operation, the resulting map has only two key-value pairs: the elements that satisfy the condition are stored under the key true, and the elements that do not are stored under the key false.
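To make the "always two keys" behavior concrete, here is a minimal self-contained sketch (the class name and sample values are invented for illustration, not taken from the article's code):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class PartitionDemo {

    // Partition a few numbers by a predicate that nothing satisfies
    static Map<Boolean, List<Integer>> partition() {
        return Stream.of(1, 2, 3)
                // no element is greater than 10, yet the result still contains BOTH keys
                .collect(Collectors.partitioningBy(x -> x > 10));
    }

    public static void main(String[] args) {
        // prints {false=[1, 2, 3], true=[]} : the true partition is present but empty
        System.out.println(partition());
    }
}
```

This is a practical difference from groupingBy with a boolean classifier, which would simply omit a group that received no elements.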

public static <T> Collector<T, ?, Map<Boolean, List<T>>> partitioningBy(Predicate<? super T> predicate)

  1. predicate: the partitioning predicate; elements are partitioned according to whether they satisfy it.
  2. Returns a Collector. It tests each input element against the predicate, collects the elements that do and do not satisfy it into lists, and returns data of type Map<Boolean, List<T>>. The actual Map type is a HashMap, which is not thread-safe; the actual value type is an ArrayList, which is not thread-safe either.

public static <T, D, A> Collector<T, ?, Map<Boolean, D>> partitioningBy(Predicate<? super T> predicate, Collector<? super T, A, D> downstream)

  1. predicate: the partitioning predicate; elements are partitioned according to whether they satisfy it.
  2. downstream: a collector that performs an additional collection operation on the elements of each of the two partitions.
  3. Returns a Collector. It tests each input element against the predicate and applies the second collector to the elements that do and do not satisfy it. The final value type is determined by the return type of the second collector. The actual Map type is a HashMap, which is not thread-safe; the actual value type depends on the downstream collector.

Use case:

/**
 * @author lx
 */
public class CollectPartitioning {


    @Test
    public void partitioning() {
        // Students are divided according to whether the score is greater than or equal to 80
        Map<Boolean, List<Student>> collect = students.stream()
                .collect(partitioningBy(student -> student.getScore() >= 80));
        System.out.println(collect);


        // Partition students by whether their score is greater than or equal to 80, collecting only their names
        Map<Boolean, List<String>> collect1 = students.stream()
                .collect(partitioningBy(student -> student.getScore() >= 80, mapping(Student::getName, toList())));
        System.out.println(collect1);


        // Partition students by whether their score is greater than or equal to 80, collecting an id-to-name map for each partition
        Map<Boolean, Map<Integer, String>> collect2 = students.stream()
                .collect(partitioningBy(student -> student.getScore() >= 80, toMap(Student::getId, Student::getName)));
        System.out.println(collect2);
    }


    List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
        students.add(new Student(5, 70, "Small"));
        students.add(new Student(6, 66, "Small"));
        students.add(new Student(7, 60, "The promise"));
        students.add(new Student(8, 77, "Flower"));
    }
    }


    static class Student {
        private int id;
        private int score;
        private String name;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    '}';
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

3.10.6 Custom collector

The collectors described above are all defined for us by the Collectors factory class and cover most common scenarios, so we usually just use the factory methods. But you can also implement your own collector with custom logic.

3.10.6.1 Collector interface

Implementing a custom Collector first requires implementing the Collector interface. Here’s what the Collector parameters and methods mean:

/**
 * The Collector interface; implementations can define their own collectors
 *
 * @param <T> the type of stream elements to be collected
 * @param <A> the type of the accumulator used to collect stream elements
 * @param <R> the result type returned by the collector, which can be the accumulator type or another type
 */
public interface Collector<T, A, R> {
    /**
     * @return a supplier. The supplier produces an accumulator, which is either a container or an object
     */
    Supplier<A> supplier();

    /**
     * @return a binary consumer. The first argument is the accumulator, the second is a stream element.
     * We apply the stream element to the accumulator for various custom operations, such as adding or summing.
     * The updated accumulator is then used as the first argument of the next call
     */
    BiConsumer<A, T> accumulator();

    /**
     * This method is called in parallel mode: the stream may be split into multiple sub-streams,
     * so there may be multiple accumulators. Here you define how to merge accumulator results,
     * so that in the end only one accumulator remains
     *
     * @return a binary operator that combines two accumulators into one
     */
    BinaryOperator<A> combiner();

    /**
     * @return a function that extracts the final result from the accumulator.
     * If the accumulator is itself the data to be returned, no conversion is needed
     */
    Function<A, R> finisher();

    /**
     * Each collector instance has its own characteristics, a set of objects describing the collector;
     * based on them the framework can appropriately optimize the collector's computation
     *
     * @return an immutable set of collector characteristics, chosen from the Characteristics enumeration
     */
    Set<Characteristics> characteristics();

    /**
     * The collector characteristics, an enumeration describing whether the stream can be reduced in parallel and which optimizations may be used
     */
    enum Characteristics {
        /**
         * Indicates that the collector is concurrent, that is, the accumulator supports multithreaded calls.
         * This means the accumulator function may be called simultaneously from multiple threads with the same result container.
         *
         * If the collector is not also marked UNORDERED, then it can be evaluated concurrently only for unordered data sources.
         */
        CONCURRENT,

        /**
         * Indicates that the collection operation does not promise to preserve the order of input elements.
         */
        UNORDERED,

        /**
         * Indicates that the finisher method returns an identity function that returns its input unchanged.
         * In this case, the accumulator object is used directly as the final result of the reduction.
         */
        IDENTITY_FINISH
    }

    /**
     * Returns a new collector instance with the IDENTITY_FINISH characteristic by default
     *
     * @param supplier        the supplier function for the new collector
     * @param accumulator     the accumulator function for the new collector
     * @param combiner        the combiner function for the new collector
     * @param characteristics the characteristics for the new collector
     * @param <T>             the type of stream elements to be collected
     * @param <R>             the type of the accumulator and of the result returned by the collector (the same type)
     * @return a new collector instance
     * @throws NullPointerException if any argument is null
     */
    public static <T, R> Collector<T, R, R> of(Supplier<R> supplier,
                                               BiConsumer<R, T> accumulator,
                                               BinaryOperator<R> combiner,
                                               Characteristics... characteristics) {
        Objects.requireNonNull(supplier);
        Objects.requireNonNull(accumulator);
        Objects.requireNonNull(combiner);
        Objects.requireNonNull(characteristics);
        // Build the characteristics set
        Set<Characteristics> cs = (characteristics.length == 0)
                ? Collectors.CH_ID
                : Collections.unmodifiableSet(EnumSet.of(Collector.Characteristics.IDENTITY_FINISH,
                        characteristics));
        // Return an instance of the CollectorImpl collector implemented inside Collectors
        return new Collectors.CollectorImpl<>(supplier, accumulator, combiner, cs);
    }

    /**
     * Returns a new collector instance
     *
     * @param supplier        the supplier function for the new collector
     * @param accumulator     the accumulator function for the new collector
     * @param combiner        the combiner function for the new collector
     * @param finisher        the finisher function for the new collector
     * @param characteristics the characteristics for the new collector
     * @param <T>             the type of stream elements to be collected
     * @param <A>             the type of the accumulator
     * @param <R>             the type of the result the collector finally returns
     * @return the new {@code Collector}
     * @throws NullPointerException if any argument is null
     */
    public static <T, A, R> Collector<T, A, R> of(Supplier<A> supplier,
                                                  BiConsumer<A, T> accumulator,
                                                  BinaryOperator<A> combiner,
                                                  Function<A, R> finisher,
                                                  Characteristics... characteristics) {
        // Null checks
        Objects.requireNonNull(supplier);
        Objects.requireNonNull(accumulator);
        Objects.requireNonNull(combiner);
        Objects.requireNonNull(finisher);
        Objects.requireNonNull(characteristics);
        // Build the characteristics set
        Set<Characteristics> cs = Collectors.CH_NOID;
        if (characteristics.length > 0) {
            cs = EnumSet.noneOf(Characteristics.class);
            Collections.addAll(cs, characteristics);
            cs = Collections.unmodifiableSet(cs);
        }
        // Return an instance of the CollectorImpl collector implemented inside Collectors
        return new Collectors.CollectorImpl<>(supplier, accumulator, combiner, finisher, cs);
    }
}

/**
 * A simple implementation class of the collector, defined as an inner class of Collectors
 *
 * @param <T> the type of stream elements to be collected
 * @param <A> the type of the accumulator used to collect stream elements
 * @param <R> the result type returned by the collector, which can be the accumulator type or another type
 */
static class CollectorImpl<T, A, R> implements java.util.stream.Collector<T, A, R> {
    // Save the functions passed in
    private final Supplier<A> supplier;
    private final BiConsumer<A, T> accumulator;
    private final BinaryOperator<A> combiner;
    private final Function<A, R> finisher;
    private final Set<Characteristics> characteristics;

    // Two constructors that receive the various functions
    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Function<A, R> finisher,
                  Set<Characteristics> characteristics) {
        this.supplier = supplier;
        this.accumulator = accumulator;
        this.combiner = combiner;
        this.finisher = finisher;
        this.characteristics = characteristics;
    }

    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Set<Characteristics> characteristics) {
        this(supplier, accumulator, combiner, castingIdentity(), characteristics);
    }

    // When these methods are called, return our custom functions
    @Override
    public BiConsumer<A, T> accumulator() {
        return accumulator;
    }

    @Override
    public Supplier<A> supplier() {
        return supplier;
    }

    @Override
    public BinaryOperator<A> combiner() {
        return combiner;
    }

    @Override
    public Function<A, R> finisher() {
        return finisher;
    }

    @Override
    public Set<Characteristics> characteristics() {
        return characteristics;
    }
}

/**
 * The finisher function used by the "of" method with fewer arguments: it simply casts the accumulator to the result type
 *
 * @return a function that returns its input
 */
@SuppressWarnings("unchecked")
private static <I, R> Function<I, R> castingIdentity() {
    return i -> (R) i;
}

As you can see, a collector actually collects through these five internal methods. The interface also provides static of methods for building a custom collector, so we do not have to write our own implementation class: by passing the required functions to of we obtain a custom collector instance. That instance is again provided by Collectors internally; its type is CollectorImpl.

3.10.6.2 Use Case

/**
 * @author lx
 */
public class CollectCustom {
    @Test
    public void test() {
        // Using toList
        List<Student> collect1 = students.stream().collect(Collectors.toList());
        System.out.println(collect1);

        // A custom collector performs toList functions
        List<Student> collect = students.stream().collect(Collector.of((Supplier<List<Student>>) ArrayList::new, List::add, (x, y) -> {
            x.addAll(y);
            return x;
        }));
        System.out.println(collect);

        // In fact, the source code for the toList method is similar to this custom collector:
        //new CollectorImpl<>((Supplier<List<T>>) ArrayList::new, List::add, (left, right) -> { left.addAll(right); return left; }, CH_ID);


        // The predefined collectors provided by Collectors usually satisfy ordinary business needs
        // A custom collector is typically used for special business logic, such as grading students while they are collected
        Function<Student, Integer> function = o -> {
            int score = o.getScore();
            if (score >= 90) {
                return 1;
            } else if (score >= 70) {
                return 2;
            } else {
                return 3;
            }
        };
        List<Student> collect2 = students.stream().collect(Collector.of((Supplier<List<Student>>) ArrayList::new, (x, y) -> {
            y.setGrade(function.apply(y));
            x.add(y);
        }, (x, y) -> {
            x.addAll(y);
            return x;
        }));
        System.out.println(collect2);


    }

    static List<Student> students = new ArrayList<>();

    @Before
    public void before() {
        students.add(new Student(1, 55, "Flower"));
        students.add(new Student(2, 100, "Wah"));
        students.add(new Student(3, 85, "Max"));
        students.add(new Student(4, 70, "XiaoHua"));
        students.add(new Student(5, 70, "Small"));
        students.add(new Student(6, 66, "Small"));
        students.add(new Student(7, 60, "The promise"));
        students.add(new Student(8, 77, "Flower"));
    }
    }


    static class Student {
        private int id;
        private int score;
        private String name;
        private Integer grade;

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public int getScore() {
            return score;
        }

        public void setScore(int score) {
            this.score = score;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public int getGrade() {
            return grade;
        }

        public void setGrade(int grade) {
            this.grade = grade;
        }

        public Student(int id, int score, String name) {
            this.id = id;
            this.score = score;
            this.name = name;
        }

        public Student(int id) {
            this.id = id;
        }

        @Override
        public String toString() {
            return "Student{" +
                    "id=" + id +
                    ", score=" + score +
                    ", name='" + name + '\'' +
                    ", grade=" + grade +
                    '}';
        }
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Student)) return false;

            Student student = (Student) o;

            if (getId() != student.getId()) return false;
            if (getScore() != student.getScore()) return false;
            return getName() != null ? getName().equals(student.getName()) : student.getName() == null;
        }
        }

        @Override
        public int hashCode() {
            int result = getId();
            result = 31 * result + getScore();
            result = 31 * result + (getName() != null ? getName().hashCode() : 0);
            return result;
        }
    }
}

4 Parallel streams

4.1 Overview

With the development of computing hardware, many servers and personal computers on the market have multi-core, multi-threaded CPUs, on which programs can execute truly in parallel. If we write multithreaded parallel code, the performance gain for programs that process large amounts of data can be considerable.

Before Java 7, to write parallel data-processing code we generally had to work out by hand how the data could be divided into parts, then use a thread to compute each part, and finally merge the results. All the while we had to guarantee thread safety ourselves, so such code was very complicated to implement.

Java 7 introduced the ForkJoinPool, part of the Fork/Join framework, an extension of the thread pool framework that makes it easier to write parallel data-processing code.

In Java 8, the new Stream supports executing tasks in parallel and is much easier to use than ForkJoinPool. Under the hood, a parallel stream still calls into ForkJoinPool, but task splitting, result gathering, and so on are already done for you; a single method call turns it on. The principles of ForkJoinPool, which resemble the divide-and-conquer algorithm, were explained in previous articles and are not repeated here.

Parallelizing a stream operation requires changing only one method call. If you already have a Stream object, calling its parallel method gives it the ability to operate in parallel. If you create the stream from a collection, calling parallelStream immediately yields a stream with parallel capability. The sequential method specifies a serial stream. Only one mode can apply to a stream: if both parallel and sequential are called, the last call wins.
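The "last call wins" rule can be checked with isParallel(); a minimal sketch (class name invented for illustration):

```java
import java.util.stream.IntStream;

public class ParallelToggle {
    public static void main(String[] args) {
        // parallel() then sequential(): the last call decides the mode of the WHOLE pipeline
        boolean afterSequential = IntStream.range(0, 10).parallel().sequential().isParallel();
        // sequential() then parallel(): the pipeline executes in parallel
        boolean afterParallel = IntStream.range(0, 10).sequential().parallel().isParallel();
        System.out.println(afterSequential + " " + afterParallel); // prints: false true
    }
}
```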

We tested the Reduce method using both serial and parallel streams:

/**
 * @author lx
 */
public class ParallelTest {


    @Test
    public void test1() {
        System.out.println("Serial stream");
        // Serial stream
        System.out.println(Stream.of("s", "qq", ";", ".", ".", ".", ".", ".").reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));

    }

    @Test
    public void test2() {
        System.out.println("Parallel stream");
        // Add the parallel method to make the stream parallel
        System.out.println(Stream.of("s", "qq", ";", ".", ".", ".", ".", ".").parallel().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));

    }


    @Test
    public void test7() {
        System.out.println("Serial stream");
        Optional<Integer> reduce = Arrays.asList(1, 2, 3, 4, 5, 6, 7).stream().reduce((x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        });
        reduce.ifPresent(System.out::println);
    }


    @Test
    public void test8() {
        System.out.println("Parallel stream");
        Optional<Integer> reduce = Arrays.asList(1, 2, 3, 4, 5, 6, 7).parallelStream().reduce((x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        });
        reduce.ifPresent(System.out::println);
    }


    @Test
    public void test3() {
        System.out.println("Serial stream");

        // Serial stream
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").stream().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));
    }


    @Test
    public void test4() {
        System.out.println("Parallel stream");
        // Parallel stream
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").parallelStream().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));
    }


    @Test
    public void test30() {
        System.out.println("Serial stream");

        // Serial stream
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").stream().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }, (x, y) -> x + y));
    }

    @Test
    public void test31() {
        System.out.println("Parallel stream");

        // Parallel stream
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").parallelStream().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }, (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y + x;
        }));
    }


    @Test
    public void test5() {
        System.out.println("Sequential serial stream");
        // Serial stream: sequential() is called last
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").parallelStream().sequential().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));
    }

    @Test
    public void test6() {
        System.out.println("Sequential serial stream");

        // Serial stream: sequential() is called last
        System.out.println(Arrays.asList("s", "qq", ";", ".", ".", ".", ".", ".").stream().parallel().sequential().reduce("-", (x, y) -> {
            System.out.println(Thread.currentThread().getName());
            return x + y;
        }));
    }
}

4.2 Correct Use

Parallelism is good, but it must be used correctly, especially with the functions we supply ourselves. A parallel stream does not make our functions thread-safe; they must be thread-safe by themselves.

If our algorithm mutates the state of shared objects and we do not guarantee parallel safety ourselves, the data can end up corrupted. In addition, in the reduce method, parallel execution requires that combining the initial value with any element leaves that element unchanged (a true identity), and the operation must be associative.
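A small sketch of the identity requirement (class name and numbers invented for illustration): 0 really is an identity for addition, so serial and parallel reduce agree; reducing with the non-identity value 10 can inflate the parallel result, because each chunk of the split starts from 10:

```java
import java.util.Arrays;
import java.util.List;

public class ReduceRules {
    static final List<Integer> NUMS = Arrays.asList(1, 2, 3, 4, 5);

    public static void main(String[] args) {
        // 0 satisfies combiner(0, x) == x, so serial and parallel results are both 15
        int serial = NUMS.stream().reduce(0, Integer::sum);
        int parallel = NUMS.parallelStream().reduce(0, Integer::sum);
        System.out.println(serial + " " + parallel);

        // 10 is NOT an identity: serially the result is 25, but in parallel
        // 10 may be folded in once per chunk, giving 15 + 10 * chunkCount
        int inflated = NUMS.parallelStream().reduce(10, Integer::sum);
        System.out.println(inflated);
    }
}
```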

Here are two complex cases:

/**
 * @author lx
 */
public class ParallelErr {

    /** * Summing into an ordinary variable is correct in serial mode */
    @Test
    public void test() {
        Accumulator accumulator = new Accumulator();

        IntStream.rangeClosed(1, 10000).forEach(accumulator::add);
        System.out.println(accumulator);
    }

    /** * Summing into an ordinary variable loses data in parallel mode */
    @Test
    public void test1() {
        Accumulator accumulator = new Accumulator();
        IntStream.rangeClosed(1, 10000).parallel().forEach(accumulator::add);
        System.out.println(accumulator);
    }

    /** * sum, using the LongAdder accumulator, which is a thread-safe accumulator */
    @Test
    public void test2() {
        LongAdder longAdder = new LongAdder();
        IntStream.rangeClosed(1, 10000).parallel().forEach(longAdder::add);
        System.out.println(longAdder);
    }


    public static class Accumulator {
        public static int total = 0;

        public void add(int value) {
            total += value;
        }

        @Override
        public String toString() {
            return "" + total;
        }
    }

    /** * Requirement: compute n factorial, multiply the result by 5, and return it */
    @Test
    public void test4() {
        System.out.println("Safe serial");

        // Compute 5 factorial, multiply it by 5, and return the result
        // Using reduce in serial mode is fine here
        System.out.println(IntStream.rangeClosed(1, 5).reduce(5, (x, y) -> x * y));


        System.out.println("Wrong parallelism");

        // If we only switch to parallel execution, there is a problem: the final result is 375000,
        // because in parallel each number in [1,5] is first combined with the initial value 5, giving 5, 10, 15, 20, 25,
        // which are then combined, e.g. 5*10=50, 20*25=500, 15*500=7500, 50*7500=375000
        System.out.println(IntStream.rangeClosed(1, 5).parallel().reduce(5, (x, y) -> {
            System.out.println("x: " + x + " y: " + y + " -> " + (x * y));
            return x * y;
        }));

        System.out.println("CollectingAndThen improvement");

        // In parallel, the initial value must be unchanged by the operation, so it can only be 1; splitting then yields: 1, 2, 3, 4, 5,
        // which are combined, e.g. 4*5=20, 3*20=60, 2*60=120, 1*120=120
        // So we first compute the factorial of 5, and only then multiply by 5; that last step must not run in parallel.
        // The collectingAndThen method performs the multiplication by 5 at the end; collectingAndThen guarantees the last step is serial
        Integer collect = IntStream.rangeClosed(1, 5).parallel().boxed().collect(Collectors.collectingAndThen(Collectors.reducing(1, (x, y) -> x * y), x -> x * 5));
        System.out.println(collect);


        System.out.println("Custom collector improvements");
        // We can also customize a collector to meet the requirement, with better performance

        // The full anonymous-class version is as follows
        System.out.println(IntStream.rangeClosed(1.5).parallel().boxed().collect(Collector.of(new Supplier<List<Integer>>() {

            /** returns the accumulator function */
            @Override
            public List<Integer> get(a) {
                return newArrayList<>(); }},new BiConsumer<List<Integer>, Integer>() {
            /** * Handle element functions, add to the collection */
            @Override
            public void accept(List<Integer> integers, Integer integer) { integers.add(integer); }},new BinaryOperator<List<Integer>>() {
            We multiply the first element of the two sets, and then replace the first element. The final result is 1*2*3*4*5=120 */
            @Override
            public List<Integer> apply(List<Integer> integers, List<Integer> integers2) {
                Integer integer = integers.get(0);
                Integer integer1 = integers2.get(0);
                integers.add(0, integer * integer1);
                returnintegers; }},new Function<List<Integer>, Integer>() {
            /** * returns the final result
            @Override
            public Integer apply(List<Integer> integers) {
                return integers.get(0) * 5; }})));// This is where the power of defining a collector comes in
        System.out.println(IntStream.rangeClosed(1, 5).parallel().boxed().collect(Collector.of(ArrayList::new, List::add, (integers, integers2) -> {
            integers.add(0, integers.get(0) * integers2.get(0));
            return integers;
        }, (Function<List<Integer>, Integer>) integers -> integers.get(0) * 5)));
    }
}
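As a variation on the same idea, the result container does not have to be a list at all. The sketch below (class name and container choice are my own, not from the article) keeps the running product in a one-element `int` array, so both the accumulator and the combiner are a plain multiplication and parallel splitting stays correct regardless of how the range is partitioned.

```java
import java.util.stream.Collector;
import java.util.stream.IntStream;

public class ProductCollectorDemo {
    public static void main(String[] args) {
        // Container: a single-element int array holding the running product
        Integer result = IntStream.rangeClosed(1, 5).parallel().boxed()
                .collect(Collector.of(
                        () -> new int[]{1},                    // supplier: product starts at the identity 1
                        (acc, x) -> acc[0] *= x,               // accumulator: multiply each element in
                        (a, b) -> { a[0] *= b[0]; return a; }, // combiner: merge two partial products
                        acc -> acc[0] * 5));                   // finisher: final serial step, multiply by 5
        System.out.println(result); // 600, i.e. 5! * 5
    }
}
```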

Besides thread-safety concerns like the ones above, parallel streams are not necessarily faster for every operation, so we usually need to run performance tests before using parallel streams in production.

In general, there are several factors that affect parallelization performance:

  1. Amount of source data: Parallelization pays off only when there is enough data, because starting new threads, splitting tasks, and so on takes extra time.
  2. Source data structure: The structure of the source has a significant impact on how well the data can be split. Structures that support random access, such as ArrayList, arrays, or IntStream.range, are easy to split evenly. HashSet and TreeSet are backed by more complex tree or hash structures that are harder to split, though they can still be decomposed reasonably well. Sources such as LinkedList, Stream.iterate, and BufferedReader.lines() are hard to split at all, because good split points cannot be found without traversing them.
  3. Boxing and unboxing: If you are dealing with primitive values, prefer the primitive specialized streams (IntStream, LongStream, DoubleStream), because they avoid intermediate boxing and unboxing, which can save a lot of time on large data sets.
  4. Per-element processing time: The longer each element takes to process in the pipeline, the higher the payoff of parallelization.
  5. Merge operation time: The last step of parallelization merges the partial results, and if the merge operation (such as the combiner function of a Collector) is very time consuming, performance may end up worse than the serial stream.
  6. Number of processor cores: On a single-core, single-threaded machine there is no point in parallelizing, because the work can only run concurrently, never truly in parallel, and the benefit is usually small; the more processor cores and threads are available, the higher the payoff of parallelization!
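The advice above ("always measure") can be sketched as a rough timing harness; the class name and data size are arbitrary, actual numbers are machine-dependent, and a serious benchmark should use a tool such as JMH with proper warm-up instead of this naive approach.

```java
import java.util.stream.LongStream;

public class ParallelBenchmarkSketch {
    // Naive timing helper: no warm-up, no JIT control; good enough for a first impression only
    static long timeMs(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long n = 50_000_000L;
        // LongStream.rangeClosed: splits cheaply (random access) and avoids boxing entirely,
        // so it is a best case for parallelization
        System.out.println("sequential: " + timeMs(() -> LongStream.rangeClosed(1, n).sum()) + " ms");
        System.out.println("parallel:   " + timeMs(() -> LongStream.rangeClosed(1, n).parallel().sum()) + " ms");
        // By contrast, an iterate-based or boxed source would often be SLOWER in parallel;
        // whichever variant you use, measure on your own hardware before shipping it
    }
}
```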

5 Conclusion

Java8’s new Stream treats a batch of data to be processed as the source, and a series of intermediate processing steps as a pipeline. The data flows sequentially through each node of the pipeline, where each node is an intermediate operation such as filtering, sorting, or aggregation; after passing every node, the elements finally go through a terminal operation to produce the result! The whole process looks like a moving production line, or water flowing through a pipe, hence the name “stream”!
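The source-pipeline-terminal shape described above can be condensed into one small example (the class name and sample numbers are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class PipelineDemo {
    public static void main(String[] args) {
        List<Integer> source = Arrays.asList(5, 3, 8, 1, 9, 2);
        // source -> intermediate operations (filter, sorted) -> terminal operation (collect)
        List<Integer> result = source.stream()
                .filter(x -> x > 2)            // keep only elements greater than 2
                .sorted()                      // natural ordering
                .collect(Collectors.toList()); // terminal operation triggers the whole pipeline
        System.out.println(result); // [3, 5, 8, 9]
    }
}
```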

The Stream API provides a high-level abstraction over Java collection operations and program structure, expressed in an intuitive way similar to querying data with SQL statements: groupingBy for grouping, maxBy for maximums, summingInt for sums, forEach for looping, and so on. This declarative style of programming completely changes the traditional structure of Java code: external loops become internal iteration, and new features such as the Optional container make our code more efficient, concise and robust, greatly improving the productivity of Java programmers! In short, the Java8 Stream API is well worth learning!
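The SQL-like flavor mentioned above is easiest to see with `groupingBy`; the sketch below (class name and word list are made up for illustration) is roughly the Stream equivalent of `GROUP BY first_letter, COUNT(*)`:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingDemo {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("apple", "avocado", "banana", "blueberry", "cherry");
        // Classify each word by its first letter, counting the members of each group
        Map<Character, Long> countByInitial = words.stream()
                .collect(Collectors.groupingBy(w -> w.charAt(0), Collectors.counting()));
        System.out.println(countByInitial.get('a')); // 2
        System.out.println(countByInitial.get('b')); // 2
        System.out.println(countByInitial.get('c')); // 1
    }
}
```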

Related articles:

Lambda: Java8-10,000 words of lambda expressions detailed introduction and application cases

If you want to discuss anything, or find mistakes in the article, please leave a comment. I also hope you will like, bookmark and follow; I will keep publishing all kinds of Java learning blog posts!