New JEPs: Computed Constants, a new approach to Ahead-of-Time and stabilisation of the FFM API - JVM Weekly vol. 52
Hasn't it been a while since we talked about JEPs? I've felt their absence as well, so today I have three of them! Additionally, there's the new IntelliJ IDEA release that comes with a built-in AI Assistant.
1. The new JEPs: Computed Constants, a new approach to Ahead-of-Time & stabilisation of the FFM API
JEP draft: Foreign Function & Memory API
We will start with something rather obvious, but still interesting.
JDK 21 marks the stabilization of numerous eagerly anticipated APIs, with one significant exception - the Foreign Function & Memory API. This vital feature from Project Panama will only be released as a Preview, for the third time. But all signs suggest this will also be the last. A JEP Draft that outlines the specification for the API's final version has recently been published and updated. This indicates that one of the upcoming Java versions, possibly JDK 22, will let us use the feature comfortably. The JEP itself concentrates on finalizing the FFM API while adding several enhancements, including the new `Enable-Native-Access` manifest attribute, which allows executable JAR files to call restricted methods without the need for special command-line flags. Other improvements include the automatic creation of C function descriptors, better support for native variable-sized arrays, and compatibility with various character sets in native strings.
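To give a feel for the API, here is a minimal sketch of calling the C standard library's strlen from Java, roughly as it looks in the JDK 21 preview (it needs --enable-preview, and some method names may still shift before finalization; the class name is mine):

import java.lang.foreign.*;
import java.lang.invoke.MethodHandle;

// Minimal sketch of a downcall to the C library's strlen (JDK 21 preview flavour).
public class StrlenDemo {

    public static void main(String[] args) throws Throwable {
        Linker linker = Linker.nativeLinker();

        // Locate strlen in the standard library and describe its signature: size_t strlen(const char*).
        MethodHandle strlen = linker.downcallHandle(
                linker.defaultLookup().find("strlen").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));

        // A confined arena scopes the lifetime of the off-heap string.
        try (Arena arena = Arena.ofConfined()) {
            MemorySegment cString = arena.allocateUtf8String("Hello, Panama!");
            long length = (long) strlen.invokeExact(cString);
            System.out.println(length); // prints 14
        }
    }
}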
JEP draft: Computed Constants
Java developers often encounter a conflict between immutability and initialization flexibility, particularly when handling final fields. This stems from Java's requirement that such fields be assigned no later than the end of the constructor:
class DeepThought {
    final int theAnswer;

    public DeepThought() {
        theAnswer = ultimateQuestion();
    }

    private int ultimateQuestion() {
        // Expensive computation here
        return 42;
    }
}
Of course, there are patterns such as the class-holder idiom:
class DeepThought {
    private static class AnswerHolder {
        static final int theAnswer = ultimateQuestion();

        private static int ultimateQuestion() {
            // Expensive computation here
            return 42;
        }
    }

    public DeepThought() {
    }

    public int getTheAnswer() {
        return AnswerHolder.theAnswer;
    }
}
or (in slightly more complicated cases involving concurrency) the nasty double-checked locking pattern known, for example, from Effective Java:
class DeepThought {
    private static class AnswerHolder {
        static final int theAnswer = ultimateQuestion();

        private static int ultimateQuestion() {
            // Expensive computation here, takes 7.5 million years
            return 42;
        }
    }

    public int getTheAnswer() {
        return AnswerHolder.theAnswer;
    }
}

class LazySingleton {
    private static volatile LazySingleton instance;
    private final DeepThought deepThought;

    private LazySingleton() {
        deepThought = new DeepThought();
    }

    public static LazySingleton getInstance() {
        if (instance == null) {
            synchronized (LazySingleton.class) {
                if (instance == null) {
                    instance = new LazySingleton();
                }
            }
        }
        return instance;
    }

    public int getTheAnswer() {
        return deepThought.getTheAnswer();
    }
}
While these patterns get the job done, they add complexity and are harder for the runtime to optimize. This stands in contrast to languages like C#, which offers the Lazy<T> type:
class DeepThought {
    private static readonly Lazy<int> theAnswer = new Lazy<int>(() => UltimateQuestion());

    private static int UltimateQuestion() {
        // Expensive computation here
        return 42;
    }

    static int GetTheAnswer() {
        return theAnswer.Value;
    }
}
or Kotlin (which uses the lazy delegate for the same purpose):
class DeepThought {
    val theAnswer: Int by lazy { ultimateQuestion() }

    private fun ultimateQuestion(): Int {
        // Expensive computation here
        return 42
    }
}
Wrapping up, Java's rigorous rules for final fields make it difficult to postpone the initialization of constants, which forces developers to work around the constraints of the language.
JEP draft: Computed Constants aims to bridge this gap by providing a feature similar to the constructs found in languages such as C# and Kotlin. Computed constants are immutable and guaranteed to be assigned at most once, offering the benefits of final fields but with more flexibility as to when they are initialized.
class DeepThought {
    private static final ComputedConstant<Integer> theAnswer =
            ComputedConstant.of(() -> ultimateQuestion());

    static int ultimateQuestion() {
        // Expensive computation here
        // (evaluated lazily, at most once)
        return 42;
    }

    static int getTheAnswer() {
        return theAnswer.get();
    }
}
As you can see, the solution is quite elegant and fits nicely alongside recent additions such as Scoped Values. The original JEP, already mentioned, also includes an example for collections.
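For illustration, here is a rough sketch of what that collection variant could look like. Note that the list-returning factory and its exact shape (a size plus a per-index provider) are my reading of the draft, not a finalized API, and the class here is made up:

import java.util.List;

class SquareTable {
    // Assumed factory shape from the draft: a fixed-size list whose elements
    // are each computed lazily and at most once.
    private static final List<ComputedConstant<Integer>> SQUARES =
            ComputedConstant.of(100, i -> i * i);

    static int square(int i) {
        // Element i is computed on first access and then cached forever.
        return SQUARES.get(i).get();
    }
}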
And now for “the nerd test” - how much "Expensive computation" was going on in the above examples? Hint in the thumbnail of this post 😄
JEP draft: Ahead of Time Compilation for the Java Virtual Machine
And finally, perhaps my favorite piece of meat, as we return to the much-loved topic of faster JVM startup.
Java applications and libraries run through a three-stage model that begins with interpretation, followed by compilation with the C1 compiler and then with the more aggressive C2 compiler. This dynamic process entails numerous cycles of optimization and de-optimization over the code's lifetime. Consequently, warming up the code (compiling it to its highly optimized form at runtime) may take a significant amount of time.
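If you want to watch this tiering with your own eyes, HotSpot's -XX:+PrintCompilation flag logs each method as it moves through the tiers. Below is a tiny, self-contained warm-up loop you can run with it (the class and method names are mine, purely for illustration):

// Run with: java -XX:+PrintCompilation Warmup.java
// and watch the `work` method get compiled first by C1 (tiers 1-3), then by C2 (tier 4).
public class Warmup {

    static long work(long n) {
        long acc = 0;
        for (long i = 0; i < n; i++) {
            acc += i * 31 + (acc >>> 7);
        }
        return acc;
    }

    public static void main(String[] args) {
        long sink = 0;
        for (int i = 0; i < 100_000; i++) {
            sink += work(10_000);
        }
        System.out.println(sink); // keep the result alive so the loop isn't eliminated
    }
}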
The Graal compiler provides a perfect illustration. This compiler, which few might recall, once served as an experimental alternative to C2 before being jettisoned in Java 17. The rationale was fairly logical: since Graal is itself written in Java, it first had to be compiled at runtime before it could run at a satisfactory speed, and that warm-up cost was borne by the whole application, offsetting the potential performance gains. Furthermore, the de-optimization that happens when C2-compiled code runs into a broken assumption can be expensive: the method falls back to the profiling C1 variant, which must be recompiled, only to be discarded again once C2 compiles the same method anew. Existing alternatives such as GraalVM Native Image, conversely, work under the 'closed-world assumption', which does not permit additional classes to be loaded dynamically at runtime and therefore requires significant changes compared to how the JVM normally executes Java applications.
A viable approach is to persist the Ahead-of-Time, C1-compiled version of a method, so that both the interpreter and the C1 compiler can be skipped at startup: execution begins with the precompiled C1 code and quickly moves on to C2 compilation. The proposal laid out in the JEP draft: Ahead of Time Compilation for the Java Virtual Machine aims to give the JVM the capacity to load Java applications and libraries that have been compiled into native code before startup. Java code, the standard libraries, and any JVM Ahead-of-Time components can be turned into native code, so the interpreter is skipped entirely, the compiled code runs immediately and progresses directly to C2 compilation, and the compiled C1 version is retained, which streamlines de-optimization.
It appears to be somewhat of a fusion between the CRaC approach and the Leyden project, wouldn't you agree? The latter is even referenced in the original JEP, and I must seize this moment to tease a little spoiler: We will be revisiting the Leyden subject soon, as fresh updates on the matter were just released yesterday. However, I'll need to take time to sift through and comprehend them, as the situation is slightly more intricate than it appears. Rest assured, I don't wish to offer you anything hasty and undercooked.
2. The launch of the new IntelliJ IDEA with AI Assistant
A new version of IntelliJ IDEA was released, and, naturally, like every tool in 2023, it had to include AI/LLM support.
Generative AI, in the form of large language models (LLMs), has been integrated into the JetBrains IDE with the (naturally) stated objective of enhancing developer productivity. The new feature includes capabilities such as querying your code through chat, automatic generation of documentation, naming suggestions, and the creation of commit messages. These AI functionalities are delivered via the JetBrains AI service, currently backed by OpenAI, though the developers assure us that support for other vendors and local models is planned. Unfortunately, the feature is only available to a select group of users at this time. Everything I've mentioned here is derived from the release notes, as I'm still awaiting access myself 🤷‍♂️.
The improvements in IntelliJ IDEA 2023.2 don't stop there. Java has benefited from enhanced code inspection and highlighting, along with improved support for the @snippet tag that JDK 18 introduced for Javadoc comments (a quick refresher below). Additionally, SQL strings in both Java and Kotlin get more sophisticated analysis to identify unsafe queries and potential SQL injection vulnerabilities.
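For anyone who hasn't met it yet, this is roughly what the @snippet tag from JEP 413 looks like in a Javadoc comment (the class itself is just a made-up example):

/**
 * Demonstrates the {@code @snippet} tag added to Javadoc in JDK 18 (JEP 413).
 *
 * {@snippet :
 * List<String> names = List.of("Arthur", "Ford", "Trillian");
 * names.forEach(System.out::println);
 * }
 */
public final class SnippetDemo {
    private SnippetDemo() {
    }
}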
The new release enhances support for Scala 3, incorporating improvements like optimized handling of enums, a TASTy decompiler, editor support for IArray, and streamlined syntax support with fewer parentheses. Additionally, the IDE now offers improved directory assistance and better rendering of ScalaDoc.
Along with the AI Assistant and the other features discussed above, the IntelliJ IDEA 2023.2 release marks the retirement of several plugins, such as Struts2, Play, and Cloud Foundry - these will no longer receive updates. I felt a pang of nostalgia, particularly for the Play Framework, once hailed as a fresh innovation in the JVM world but now considered obsolete and still lacking full Java 17 compatibility. After being handed over to the community by Lightbend, it is kept alive by the efforts of Matthias Kurz, with funding through Open Collective. Interestingly, the new IntelliJ release also drops support for Windows 7, necessitating an upgrade to Windows 10 or a more recent operating system.
3. What do you think about meeting up at KotlinConf '24?
And finally, since we've been discussing JetBrains products, I'll add a minor announcement.
The fifth edition of KotlinConf has been announced, and it is scheduled to be held from May 22 to 24, 2024, at the Bella Center in Copenhagen, Denmark.
KotlinConf'24 is set to begin with a day dedicated to workshops, followed by two intense days filled with sessions, networking opportunities, and a host of other engaging activities. I've shared my thoughts on this year's keynote in a post titled TLDW: Opinionated Wrap-up of KotlinConf 2023 Keynote, so you can gauge how appealing this type of event might be for you. I'm mentioning this because I recommend acting quickly if you want to attend - both the super early bird and early bird promotion tickets have already sold out, and it's been less than a week since the announcement.
Oh, and the mention is completely unsolicited - I have no affiliation with the event - but if anyone reading this could bump me up in the AI Assistant queue, I would genuinely appreciate it 😄.
While we're on the topic of Kotlin, I have one more small tidbit to share: To eliminate confusion and inconsistency in naming, the product name "Kotlin Multiplatform Mobile" (KMM) is being retired. From now on, "Kotlin Multiplatform" (KMP) will be the official name for Kotlin code-sharing solutions across various platforms, no matter the specific combination of target platforms. You can find more details and the context for this change in the article Update on the Name of Kotlin Multiplatform.
PS: Answer to the question raised in the text: Expensive computation took 7.5 million years 🐁