A Light Java Runtime to Bundle with Applications

Steve Jobs once said that Java is a big heavyweight ball and chain. Good news: the ball is now optional! In this article, I share the results we achieved after implementing a component deployment model, also known as JRE modularity, for the core of J2SE 5.0 and Java SE 6. The technology has been in production use for more than two years and has proved effective.

This is not a mere “Java gets smaller” message. Given that Project Jigsaw is emerging in JDK 7, I also offer some insights on the challenges that any implementation of modularity for the Java SE core may face, all derived from our practical experience.

“Divide and Conquer” Has Worked Out
We did not pioneer the idea that the monolithic Java SE platform needs to be split into components; it was in the air. We merely found a way to do it without breaking Java compatibility and implemented it in a compliant Java SE VM, Excelsior JET, back in 2007.

The goal was to let Java programmers bundle a light version of the Java Runtime with their applications, leaving the unused components out so as to reduce the size of the installation package. Easier said than done, but we got it done in a Java spec-compliant manner and called the technology Java Runtime Slim-Down (after Project Jigsaw appeared, it finally occurred to me that we should have called it "Project Rock Breaker" or the like).

It has proved effective for many Java applications. For GUI applications in particular, the size of a complete installation package with a bundled Java Runtime starts at 5MB. In support of this assertion, I refer you to SWTPaint, a sample program taken from the latest Eclipse SDK.

The use of Java Runtime Slim-Down yields results you won’t get with any other Java deployment tool:

  • The size of the SWTPaint installer is 5.5MB (here is a direct download link).
  • The installed application does not need a pre-installed JRE to run and does not download any components from the Internet (so it won't disturb your firewall).

Note that the GUI application in question is written in Java and uses Java SE 6. Oh, and I should mention that download sizes for Swing applications start at 8MB. Stirring up the Swing vs. SWT flame war was by no means my intention. We prepared installation packages for a few sample applications, both Swing- and SWT-based, and you may download them from this page. If you are still in doubt, try the deployment technique yourself. This Flash demo will help you get started.

There are good reasons for end users to love "all-inclusive" installation packages with a reduced footprint, and that is where a lightweight Java Runtime is of much help. On a larger scale, however, the lack of JRE modularity has impeded the evolution and adoption of the Java SE platform.

Between a Rock and a Hard Place
A bit of history. Remember JSR-83, a proposal for a "multiarray" package originated by IBM? Its implementation could have had a great impact on number-crunching performance in Java. Nevertheless, it was approved only as a Java Standard Extension, never appeared among the core packages, and was eventually withdrawn. In the final ballot, Sun made a notable comment: "...The proposal requires at least 82 new classes, and this seems inappropriate for the J2SE core...". Though I personally was disappointed with the outcome, the need to damp down the inflation of the Java core sounded reasonable.

Other JSRs were luckier, and the Java Community Process kept Java SE moving forward over the years. In 2006, I attended the Java Licensee Day event. During the Q&A part of the session devoted to the then-new Java SE 6, one of the licensees sharply asked: "With each release the JRE gets bigger and bigger. Our customers do not need all those new APIs. When will it stop bloating?" I found myself agreeing with him, though I would rather have said, "Not all our customers need all those new APIs...".

The question is, how many "useful" APIs have been rejected just to keep the JRE from getting bigger? One may also ask how many "useless" APIs have been approved and did make the JRE bigger. Clearly, the requirements of different projects vary and there is no single answer to these questions, but splitting the JRE into components could resolve them gracefully.

Here’s a practical example from our support records. We have customers who previously got stuck with J2SE 1.4.2 simply because the footprint of later Java versions was unacceptable for their deployment requirements. Now, after switching to the component model, they are happy users of Java SE 6.

However, I would not like to discuss Sun's policy on modularizing Java SE here. There were many pros and cons to consider, both technical and legal, and I fully realized some of them only when working on the Java Runtime Slim-Down technology.

No Shooting (in the Foot)
As often happens, once we had started the design, the scope of work suddenly increased. Truth be told, we could not reach the goal by simply splitting the Java SE API into components and letting the user drop some of them. The big question was how to make the technology usable and reliable. After all, we did not want to create a thing that stops working just because the programmer removed some components too aggressively. We decided to explore the limits of this approach by interviewing the enthusiasts who had been pressing us on this matter. The results confirmed our suspicion: programmers are not always aware of which parts of the Java SE API are actually used in their applications. A good illustration is the following transcript from that time:

Client: I do not use that “Baggage-To-Trim” API and no longer want to carry it with my app. As you are a JVM vendor, make me happy, please.

Support Engineer: We understand your concern. Are you sure you don't use the "Baggage-To-Trim" API?

Client: Absolutely.

Support Engineer: We kindly ask you to double-check. Please run your app with java -verbose:class and inspect the log.

Client: Oops... You surprised me! It turns out it is used. Frankly speaking, I did not write the code where it's used. Let me think about it.

Needless to say, we also had to think about it. In addition, Java SE components may depend on each other implicitly, via the implementing classes, and most programmers, not being familiar with the internals, could not play it safe when removing components.
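
For those who want to repeat the support engineer's check on their own code, java -verbose:class is the simplest route. As a purely illustrative complement (the class name, manifest setup, and watched packages below are hypothetical, not part of any product), a tiny -javaagent can report at shutdown which classes from "supposedly unused" packages were actually loaded during a run:

```java
import java.lang.instrument.Instrumentation;

// Illustrative agent: package it in a jar with the manifest entry
//   Premain-Class: LoadedClassReporter
// and run your application as
//   java -javaagent:reporter.jar -jar MyApp.jar
public class LoadedClassReporter {

    // Packages the developer believes are "Baggage-To-Trim" (examples only)
    private static final String[] WATCHED = { "javax.sound.", "java.awt.", "org.omg." };

    public static void premain(String agentArgs, final Instrumentation inst) {
        Runtime.getRuntime().addShutdownHook(new Thread() {
            public void run() {
                for (Class<?> loaded : inst.getAllLoadedClasses()) {
                    String name = loaded.getName();
                    for (String prefix : WATCHED) {
                        if (name.startsWith(prefix)) {
                            System.err.println("Actually loaded: " + name);
                        }
                    }
                }
            }
        });
    }
}
```

If the report stays empty across representative runs, the package is a much safer candidate for removal; if it does not, the -verbose:class log will show where the surprise came from.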

We conducted some R&D and figured out that such a technology should come with tools that help the user avoid shooting himself in the foot, and rules that, just in case, provide the fastest possible recovery.

Tools and Rules
The final solution included a dependency analyzer and a "safety net". The analyzer takes the application's classes, infers which Java SE components are likely in use, and advises the user accordingly.

Under the covers, it does not simply check import dependencies, as that would be too imprecise. For instance, such a simplistic analysis would not have revealed that the SWT-AWT bridge, which is part of the SWT package, is not used by the SWTPaint application mentioned above. As a result, the AWT component would be pulled in and the installation size would grow. Needless to say, analyzer design and testing kept us busy for quite some time.
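
To make the distinction concrete, here is a minimal sketch (illustrative only, not our analyzer) of the naive approach: listing every class a compiled class file references in its constant pool. Taking the transitive closure of such references is roughly what a simplistic analyzer would do, and it is exactly how the SWT-AWT bridge would drag the whole AWT component in; a precise analyzer additionally has to figure out which of those references can ever be reached at run time. The sketch handles the constant pool tags defined up to Java SE 6:

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Illustrative only: prints every class referenced from a .class file's
// constant pool (including the class itself and its superclass).
public class ReferencedClasses {

    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        if (in.readInt() != 0xCAFEBABE) throw new IOException("Not a class file");
        in.readUnsignedShort(); // minor version
        in.readUnsignedShort(); // major version

        int count = in.readUnsignedShort();
        Map<Integer, String> utf8 = new HashMap<Integer, String>();
        Map<Integer, Integer> classEntries = new HashMap<Integer, Integer>();

        for (int i = 1; i < count; i++) {
            int tag = in.readUnsignedByte();
            switch (tag) {
                case 1:  utf8.put(i, in.readUTF()); break;                   // CONSTANT_Utf8
                case 7:  classEntries.put(i, in.readUnsignedShort()); break; // CONSTANT_Class
                case 8:  in.readUnsignedShort(); break;                      // String
                case 3: case 4: in.readInt(); break;                         // Integer, Float
                case 5: case 6: in.readLong(); i++; break;                   // Long, Double take two slots
                case 9: case 10: case 11: case 12: in.readInt(); break;      // field/method refs, NameAndType
                default: throw new IOException("Unexpected constant pool tag: " + tag);
            }
        }
        in.close();

        for (int nameIndex : classEntries.values()) {
            System.out.println(utf8.get(nameIndex).replace('/', '.'));
        }
    }
}
```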

Does it guarantee that a deployed application will never miss the removed components? I would not bet money on it. After all, a programmer could detach some components by mistake, or an application may load a plug-in that uses the Java SE API more extensively than the application itself. Here the following rule comes into play: all removed components are put into a detached package, which the developer has to place on a Web server at the URL assigned when creating the installation. The Web server serves as a "safety net": should the deployed application attempt to use any of the removed components, the Java Runtime pulls the package down from the server and loads the requested Java classes.
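
To illustrate the idea only (the real mechanism is built into the Java Runtime and requires no code from the developer), a "last resort" class loader that falls back to a detached package hosted on a Web server could look roughly like this; the URL in the comment is hypothetical:

```java
import java.net.URL;
import java.net.URLClassLoader;

// Conceptual sketch of the "safety net": classes missing from the slimmed-down
// runtime are fetched on demand from a detached package on a Web server.
public class SafetyNetLoader extends ClassLoader {

    private final URLClassLoader remote;

    public SafetyNetLoader(ClassLoader parent, URL detachedPackageUrl) {
        super(parent);
        // e.g. new URL("http://www.example.com/myapp/detached.jar") -- hypothetical
        this.remote = new URLClassLoader(new URL[] { detachedPackageUrl });
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        // Called only after the parent loader (the local runtime and the
        // application class path) has failed to find the class.
        return remote.loadClass(name);
    }
}
```

In Java Runtime Slim-Down itself the fallback is transparent and, as noted below, rarely if ever triggers in practice.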

On the formal side of things, we run the Java Compatibility Kit (JCK) in this deployment mode with some, and even all, Java SE components detached. Notably, all the tests pass.

However, it is unlikely that a download of a detached package will ever occur in practice, provided the developer heeded the words of wisdom from the analyzer. For example, the sample applications mentioned above have been downloaded over a thousand times since we published them in May 2007, and there has not been a single download of a detached package so far.

The last note is about the splitting itself. We managed to carve out a kernel part of Java SE, about 4MB, that has to be bundled with any application; we could not get it any smaller. We wanted to break the whole thing down into more components, each of a smaller size, but were unable to do so. In general, we could have obtained better download size figures if the Java SE API implementation classes were not so tightly coupled, full of cross-references, strongly connected, melted and fused together.

The Truth About Sun Java Kernel
Initially, Sun's Java Kernel was supposed to include a deployment technology for reducing the download size. But what appeared in Java 6 Update 10 under the name Java Kernel is still far from that solution. The Java Kernel contains the VM and some core classes such as java.lang.* and java.io.*, and meets the needs of little more than HelloWorld. Upon application launch, the Java Kernel inevitably starts downloading the remaining packages from a Sun web site, and no means are provided to package the required bundles with the application. In essence, end users "download a downloader", and all you can do is specify which missing bundles must be downloaded first. The key difference from Java Runtime Slim-Down is that the latter lets you bundle all the required components with the application, so nothing has to be downloaded at run time (short of the safety-net case described above).

You may find more details in the Java Kernel FAQ. Probably, the Java Kernel is just a preliminary step toward a solution that may appear in the future.

Historical Notes
The first mention of JRE modularity to be found in the annals dates back to the times of JDK 1.2(!). The discussions lasted for years; the first implementation in the Sun JRE was planned for Java 1.5, then moved to Java 6. The Java Kernel that finally appeared in Java 6 Update 10 proved to be far from a complete solution.

In parallel with the Java Kernel, the Java Module System (JSR-277) was in the works, which, among other things, could have addressed the JRE modularity problem, but the deadlock between Sun and the OSGi lobby got it buried.

Finally, at the end of 2008, Project Jigsaw was announced and "Episode IV: A New Hope" commenced.

Four Challenges for Jigsaw
I'm not with the Java SE core group at Sun, but I've been working with the Java SE core for more than ten years. Below, I share my opinion on the main challenges to the future adoption of Jigsaw.

World Peace
The corporate battle around OSGi needs a break. No one has won, and the Java community has lost because, time after time, JRE modularity solutions appear only to disappear. Whether or not to use OSGi is now less of an issue compared with the next challenge.

Backward Compatibility
In a modular approach, each component should be sufficiently isolated, and import relations should be declared statically. Besides that, the import graph should be (close to) acyclic to minimize the number of indirectly used components.
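
Jigsaw's syntax had not been settled when this was written, so purely as a sketch of what "statically declared, acyclic imports" means in practice (the module names and the syntax are illustrative, not a specification), a component description might look like this:

```java
// Hypothetical module descriptor, for illustration only.
// The GUI component states its imports once, statically; nothing it imports
// ever imports it back, so the graph stays acyclic and a tool can compute
// the exact set of components an application needs.
module myapp.gui {
    requires java.base;     // the always-present kernel
    requires java.desktop;  // the AWT/Swing component
    exports com.example.myapp.gui;
}
```

Whatever the final notation, the point is that the dependencies are visible to tools before anything runs.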

The problem is that the reference implementation of the Java SE API was coded without modularity in mind. To some extent, that was a side effect of Java's lazy class loading, which created the illusion that referencing any class in Java code costs nothing until it is actually used at run time. It has proved to be a technical debt, and now is the time to pay the interest.
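
A quick way to see that illusion at work (the example is contrived, and the CORBA reference is just a convenient, rarely used core API):

```java
// Contrived demo of lazy class loading: the CORBA classes are referenced in
// the dead branch below, so the compiled class file depends on them, yet on a
// typical HotSpot JVM they are never loaded because the branch never executes.
public class LazyLoadingDemo {
    public static void main(String[] args) {
        System.out.println("Started without touching org.omg.*");
        if (args.length > 99) { // never true in practice
            org.omg.CORBA.ORB orb = org.omg.CORBA.ORB.init(args, null);
            System.out.println(orb);
        }
    }
}
// Run with java -verbose:class: no org.omg.* entries appear in the log,
// even though the dependency is permanently baked into the constant pool.
```

References like this accumulated freely inside the JDK itself because they seemed to cost nothing, and they are precisely the ties a modularization effort now has to untangle.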

In practice, this means that spaghetti-like dependencies between the implementing classes are omnipresent, and breaking those ties without breaking backward compatibility with previous Java versions is doubly tough. The danger is ending up with something like Apache Harmony: everything is implemented with an elegant internal architecture, the samples work, but existing Java apps have issues.

Profiles All Over Again?
Some outlines of Jigsaw mention so-called Java SE Profiles, e.g., a profile for headless apps, one for basic RIAs, one for rich desktop apps, and so on. A potential threat here is a repetition of the same old monolithic-JRE story unless user-defined profiles are allowed. One size does not fit all, even if the presets are defined carefully.

Need for Total Modularization
There is a risk of getting no benefit from JRE modularity alone. Most real-world applications use third-party components (Java APIs), which are yet to be modularized. Typically, applications use only a part of the functionality provided by an API. Without total modularization, chances are good that unnecessary parts of the (modularized) JRE will be pulled in due to the use of a (monolithic) third-party component. And it does not matter whether the import declarations are written in OSGi bundle manifests or somewhere else.

Conclusion
In closing, I'd like to say that we implemented Java Runtime Slim-Down due to high demand from our clients. I strongly believe, however, that a modularized JRE has to come with Java SE out of the box, not just as a vendor solution. I wrote this article in the modest hope that our past experience will be of some help for the future development of the Java platform.

P.S. One man said that Java is a big heavyweight ball and chain. Good news: the ball is now optional! You may detach it and use module Chain only.

More Stories By Vitaly Mikheev

Vitaly Mikheev is the chief technology officer of Excelsior, LLC, a company focusing on the design and development of optimizing compilers. Vitaly has been involved in software development since 1987 and has focused on compiler construction technologies for the last decade. He started working with Java in 1998 as the architect of the Excelsior Java Virtual Machine. Before that, he worked on proprietary optimizing compilers for Nortel Networks. Vitaly is a member of ACM and a co-author of the patent on the garbage collector algorithm implemented in Samsung's J2ME CDC virtual machine. He holds an MS in computer science from Novosibirsk State University, Russia.
