Asynchrony in Java: Future, CompletableFuture, and Structured Concurrency

Java was originally designed for multithreading and parallel computing. Over time, various methods for working with the results of asynchronous tasks have emerged—from the classic Future to modern Structured Concurrency. Let's look at the main mechanisms, their pros and cons, and when to use them.

1. Future<T> (Java 5)

Description: Future is an interface representing the result of an asynchronous task that will be ready in the future.

Main methods:

  • get() — blocks the thread until the task completes and returns the result.
  • cancel() — attempts to cancel the task.
  • isDone() — checks whether the task has completed.
  • isCancelled() — checks whether the task was canceled.

Pros: a simple way to get the result of an asynchronous task; integrates with ExecutorService.

Cons: blocking get() method, no built-in support for chaining or combining tasks, manual thread and exception management.

ExecutorService executor = Executors.newFixedThreadPool(2);
Future<String> future = executor.submit(() -> "Hello Future");
System.out.println(future.get()); // blocks until the task completes
executor.shutdown();
Scheme:

[Main Thread] --calls--> [Future.get()]
                              |
                              v
                       [Worker Thread]
                              |
                              v
                           Result

The Main Thread blocks while the Worker Thread executes the task.
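Blocking can at least be bounded with a timeout, after which the task can be canceled. A minimal sketch (the class name and durations are illustrative, not from the original article):

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class FutureTimeoutDemo {
    // Returns true if the slow task was canceled after the timeout expired.
    static boolean cancelSlowTask() throws InterruptedException {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        Future<String> slow = executor.submit(() -> {
            Thread.sleep(5_000);                  // simulate a long-running task
            return "done";
        });
        try {
            slow.get(100, TimeUnit.MILLISECONDS); // bounded wait instead of a bare get()
        } catch (TimeoutException e) {
            slow.cancel(true);                    // interrupt the worker thread
        } catch (ExecutionException e) {
            // the task failed before the timeout; nothing to cancel
        }
        executor.shutdownNow();
        return slow.isCancelled();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(cancelSlowTask()); // true
    }
}
```

Note that even the timed get() still blocks the calling thread for up to the given duration; Future offers no non-blocking alternative.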

2. CompletableFuture<T> (Java 8)

Description: CompletableFuture extends Future and adds non-blocking chains, error handling, and combining multiple tasks.

Pros: Asynchronous chains (thenApply, thenCompose, thenAccept), easy combining of multiple tasks (allOf, anyOf), non-blocking execution.

Cons: Requires understanding of chains and callbacks, harder to debug with long task chains.

CompletableFuture.supplyAsync(() -> "Hello")
        .thenApply(s -> s + " World")     // transform the result
        .thenAccept(System.out::println); // consume it without blocking
Flowchart:

[supplyAsync] --> [thenApply] --> [thenAccept]
      |               |               |
 Worker Thread   Worker Thread   Worker Thread

The Main Thread does not block.
Each stage can execute on a different thread.
Multiple chains can be combined (thenCombine, allOf, anyOf).
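For example, two independent stages can be merged with thenCombine, with exceptionally as a fallback. A minimal sketch (the class name and values are illustrative):

```java
import java.util.concurrent.CompletableFuture;

public class CombineDemo {
    // Computes price + tax from two independent asynchronous stages.
    static int totalPrice() {
        CompletableFuture<Integer> price = CompletableFuture.supplyAsync(() -> 100);
        CompletableFuture<Integer> tax   = CompletableFuture.supplyAsync(() -> 20);

        // thenCombine merges two independent futures once both complete
        CompletableFuture<Integer> total = price
                .thenCombine(tax, Integer::sum)
                .exceptionally(ex -> -1);   // fallback value if either stage fails

        return total.join();                // join() blocks only at the very end
    }

    public static void main(String[] args) {
        System.out.println(totalPrice());   // 120
    }
}
```

CompletableFuture.allOf and anyOf work similarly for whole collections of futures: allOf completes when every future completes, anyOf when the first one does.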

3. Structured Concurrency (Java 19+)

Description: The Structured Concurrency API allows you to combine multiple asynchronous tasks into a logical block, manage their lifecycle, and handle errors centrally.

Pros: all tasks in a scope share one lifecycle, so no "scattered Futures" are left behind; if one task fails, the others are automatically canceled; waiting for results is centralized (scope.join()); and it pairs well with virtual threads (Project Loom) for scalability.

Cons: still a non-final API (incubator in Java 19–20, a preview feature since Java 21), and it does not replace data synchronization.

try (var scope = new StructuredTaskScope.ShutdownOnFailure()) {
    var f1 = scope.fork(() -> "Task 1");
    var f2 = scope.fork(() -> "Task 2");

    scope.join();           // wait for all subtasks to finish
    scope.throwIfFailed();  // propagate the first failure, if any

    System.out.println(f1.resultNow());
    System.out.println(f2.resultNow());
}
Scheme:

+---------------------------+
|    StructuredTaskScope    |
|                           |
|  [Task1] [Task2] [Task3]  |
+-------------+-------------+
              |
        scope.join()
              |
    All results / errors

All tasks are combined into a single scope.
An error in one task → the rest are automatically canceled.
Easily collect results and handle exceptions centrally.
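On JDKs where the preview API is unavailable, the cancel-on-failure behavior can be roughly approximated with a plain ExecutorService. This is a sketch of the idea, not the real StructuredTaskScope API; the class and method names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ScopeLike {
    // Runs all tasks; if any fails, cancels the rest and rethrows (like ShutdownOnFailure).
    static List<String> runAll(List<Callable<String>> tasks) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(tasks.size());
        try {
            List<Future<String>> futures = tasks.stream().map(pool::submit).toList();
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                try {
                    results.add(f.get());
                } catch (ExecutionException e) {
                    futures.forEach(other -> other.cancel(true)); // cancel the siblings
                    throw e;
                }
            }
            return results;
        } finally {
            pool.shutdownNow(); // the "scope" always cleans up its threads
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runAll(List.of(() -> "Task 1", () -> "Task 2")));
    }
}
```

The real API adds what this sketch cannot: the scope is tied to a lexical block, so tasks cannot outlive it, and joining, cancellation, and error propagation are enforced by the library rather than by hand-written bookkeeping.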

4. Reactive Libraries

RxJava, Project Reactor, and others offer powerful abstractions for event streams and asynchronous data, with support for backpressure, multiple events, and stream combination. They are used in high-load systems and microservices.

Schema:
[Source Observable] --> [map/filter] --> [flatMap/merge] --> [Subscriber]
Event flow from source to subscriber.
Backpressure support.
Multiple events, parallel processing, stream combinations.
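The same publisher/subscriber flow, including backpressure via request(n), can be illustrated with the JDK's built-in java.util.concurrent.Flow API (the standard Reactive Streams interfaces). A minimal sketch, not RxJava or Reactor; the class and method names are illustrative:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {
    // Publishes 1..count and collects them in a subscriber that pulls one item at a time.
    static List<Integer> collect(int count) throws InterruptedException {
        List<Integer> received = Collections.synchronizedList(new ArrayList<>());
        CountDownLatch done = new CountDownLatch(1);

        try (SubmissionPublisher<Integer> publisher = new SubmissionPublisher<>()) {
            publisher.subscribe(new Flow.Subscriber<Integer>() {
                private Flow.Subscription subscription;
                public void onSubscribe(Flow.Subscription s) {
                    subscription = s;
                    s.request(1);               // backpressure: demand one item at a time
                }
                public void onNext(Integer item) {
                    received.add(item);
                    subscription.request(1);    // request the next item only when ready
                }
                public void onError(Throwable t) { done.countDown(); }
                public void onComplete()         { done.countDown(); }
            });
            for (int i = 1; i <= count; i++) publisher.submit(i);
        } // close() signals onComplete after pending items are delivered

        done.await();
        return List.copyOf(received);
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(collect(3)); // [1, 2, 3]
    }
}
```

RxJava and Reactor build far richer operator sets (map, filter, flatMap, merge, windowing, schedulers) on top of this same publisher/subscriber contract.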

5. General Comparison Table

Mechanism                  | Java Version | Type         | Main Pros                                           | Main Cons
Future                     | 5            | Blocking     | Simple, ExecutorService integration                 | Blocking, no chaining, manual control
CompletableFuture          | 8            | Non-blocking | Chains, combining, error handling                   | Harder to debug long chains
Structured Concurrency API | 19+          | High-level   | Lifecycle management, automatic cancellation, scope | Non-final (preview) API, requires Java 19+
RxJava / Reactor           | 2+           | Reactive     | Event streams, backpressure, combining              | Steeper learning curve

Performance and Scalability Comparison of Asynchrony Mechanisms in Java

Below is a comparison table of the main approaches to asynchrony in Java: Future, CompletableFuture, Structured Concurrency, and reactive libraries (RxJava / Reactor). The table focuses on performance, scalability, locking, and threading considerations.

Mechanism                                | Thread Type                     | Blocking | Scalability                                     | Overhead / Performance                | Notes
Future                                   | Platform (system) threads       | Yes      | Low (limited by thread count)                   | High (context switches)               | Suitable for a small number of blocking tasks
CompletableFuture                        | Platform threads (ForkJoinPool) | No (asynchronous chains) | Medium (thousands of short tasks) | Medium (callbacks, chains)            | Asynchronous chains, task combinations
Structured Concurrency + Virtual Threads | Virtual threads                 | No       | Very high (tens/hundreds of thousands of tasks) | Low (virtual threads are cheap)       | Modern choice for production, high-load, long-running tasks
RxJava / Reactor                         | Platform or virtual threads     | No       | Very high (event-driven)                        | Low (backpressure, minimal switching) | Ideal for event streams, microservices, I/O

Summary

  • Future — for simple tasks with blocking waits.
  • CompletableFuture — a universal tool for asynchronous chains and task combinations.
  • Structured Concurrency — a modern approach for production: it combines tasks into a logical block and automatically manages errors and cancellation.
  • Reactive libraries — for event streams and high-load systems.

For production on Java 25, the optimal combination today is Structured Concurrency + Virtual Threads (Loom): safe, scalable, and structured task management.

