What's Wrong with Web Applications

And What to Do About It

Criticizing something as wildly successful as the World Wide Web seems a bit radical and potentially unpopular. There is no doubt that Tim Berners-Lee's elegantly simple invention enabled an unprecedented revolution in the way computers are used and by whom.

The amount of information currently accessible to anyone with an Internet connection is truly mind-boggling, and I do not think it is an exaggeration to say that the Web is the most revolutionary invention since the automobile in terms of the effect that it has had on how we live our lives.

So what brings me to gore this sacred cow? How can I be negative about something that everyone acknowledges as a technological marvel?

To be honest, most Web applications are distinctly user-unfriendly. The Web's undisputed power to access and present information has been mistaken for a universal entry point to computing and information resources. The view that every interaction between user and computer is a nail to be banged in by the hammer of the Web browser and server has been a step backward for both users and developers. Users have to live with horribly restrictive modal form-based solutions stripped of powerful UI paradigms common in conventionally delivered software. For instance, drag and drop and the ability to save application state in a locally stored document are not available. Developers suffer because Web-based architectures divide an application along illogical boundaries, which complicate the creation of a seamless user experience, and require the use of multiple divergent programming languages and frameworks.

These problems are primarily the result of taking a great idea and stretching it beyond the scope for which it was intended. The Web was initially about access to static content. It really wasn't much different from existing hypertext systems, except that the content was transparently distributed on the Internet - a small difference in implementation, a huge difference in results. Suddenly, authors could easily incorporate into their own works other material from a vast virtual library.

However, the rapid adoption of the Web and the lure of its commercial potential made this technology newcomer the apparent solution to all the world's problems. It didn't happen immediately, but it wasn't long before the Web browser became not just a viewer but a generic application deployment platform. Slowly but surely, dedicated client programs in traditional three-tier architectures were replaced by interfaces implemented as Web pages. There are some valid reasons for making this transition, but in the rush to migrate seemingly every application to a browser-based model, decision-makers threw the baby out with the bathwater. The fact of the matter is that HTML makes a very poor user-interface layer. Even when tarted up - as it must be to even be functional - with JavaScript or other client-side scripting solutions, browser-based interfaces fall far short of creating a rewarding experience for the user.

In addition to shortchanging the end user, webifying an application puts a heavy load on the developer, who suddenly must contend with a multitude of thorny issues in the areas of portability, state-management, multiple implementation languages, client-side vs. server-side operations, and security. This article examines these problems for users and developers in detail, with examples of the pros and (mostly) cons of Web-based applications. I'll conclude with suggestions for alternatives that include the positive aspects of Web applications, but without the downsides.

I am not arguing that all Web-based applications are clumsy and evil and should be eliminated. Particularly when you wish to make your application available to the entire Internet world and there are no existing standards or protocols for that application, choosing to deliver on the Web may be the only choice. I am primarily concerned with the webification of applications provided by corporate IT departments, such as those for submitting travel expense reports, managing benefit packages, or reporting defects. First though, let's examine an application that falls outside that category.

Before the World Wide Web, there was Usenet, an Internet-based system for discussion groups. Information was exchanged between server systems via the Network News Transport Protocol (NNTP). The beauty of Usenet was that it did not specify a particular client interface. Users could choose among a variety of "news readers" according to their particular preferences. Depending on the client program, posts could be made with the editor of the user's choice. Emacs users could even read news and post responses from within the editor. It was the best of both worlds: a standard that allowed the open exchange of information with people all over the world, but that also allowed users the freedom to interact with that system in the manner they found most productive.

Usenet has not gone away; there are now even Web-based news readers such as the one provided by Google. But now, forums that once would have been well served by Usenet have been moved to discussion systems that are accessible only via a particular Web site. If you want to participate, you have to use the interface provided by that Web site. You will have to create a new user account if you want to contribute to discussions and you will have to edit your posts with a somewhat-less-than-full-featured standard browser text box. So much for progress.

There are political reasons service providers might prefer the closed Web-based solution. It allows them to identify their users and to control and monitor access to the site. But these advantages to the provider of the service come at the expense of a poorer experience for the users of the service. Similar trade-offs can be seen in the evolution of travel expense report applications. At one company, paper travel expense forms were replaced by a standardized spreadsheet that the user would complete, print, sign, and submit.

It was a simple solution that gave the user power and flexibility. She had the full interface of a standard spreadsheet program at her disposal, the ability to save partially completed forms as well as save local copies of previously submitted forms. Spreadsheet fanatics, if so inclined, could graph their daily expenses, while the tabular presentation of data made it easy for mortals to notice if they forgot to include the charge for Wednesday's lunch. This document-centric scheme has since been replaced by a Web-based application that automates the logistical aspects of expense reporting. Report submission and approval are now done electronically via a Web-based application that routes reports through the designated approval chain. From a bureaucratic perspective, things are much neater. Managers no longer have to shuffle paper, bean counters get a centralized view of the process at all stages, and even the lowly employee submitting the report gets an easy way to track it through to reimbursement.

Unfortunately, the initial creation of the report is cumbersome, forms-based tedium, where each expense has to be entered one at a time. And after the expenses are entered one by one in isolation, instead of a nice calendar-based display, the user sees only a linear list of expenses to be included in the report. The interface itself provides practically no support to the user for organizing expenses, which again leads us to this irony of supposed progress. Yes, the Web-based application automates and centralizes the process. But again, it does so at the expense of crippling the end-user with an unhelpful, inflexible, and arcane user interface.

Why does this seem to be the inevitable result of webifying an application? Because the Web is a horribly inadequate user-interface platform, crafting a full-featured Web application that functions correctly is an exercise in cunning and frustration for the developer, while the end result typically tests a user's patience. Let's look at the restrictions that make it so.

There are two primary obstacles. The first one is that the Web is based on a stateless protocol. The predominant model of application design for the last two decades has centered on a document (i.e., a collection of state information) that the application operates on. In a conventionally programmed application, the full strength of the particular programming language employed can be brought to bear on the problem of representing the data being operated on. In Web-based applications, the programmer has a choice of several less-than-optimal techniques for maintaining state: fields (some hidden) in forms stored on the Web page itself, or server-side state maintained by various tricks for each user session or request. Each has particular drawbacks.

Values stored in form fields might not be propagated between pages and can be viewed - and possibly altered - by clever users, posing security risks. Server-side storage may not scale well, or the techniques used to correlate a particular user with a particular set of program state, like cookies, may be disabled by the user's browser. Further, accessing the state information is not entirely transparent to the programmer, and available storage resources on the client computer will probably go unused because the computations requiring the state occur on the server system. Finally, depending on the cleverness of the Web site developer, the document metaphor familiar to the user goes entirely out the window. For instance, there is typically no way to save multiple shopping carts for the same online vendor, the way a conventional word processor could save two versions of my annual letter to Santa.
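To make the server-side option concrete, here is a minimal sketch of session state keyed by an opaque token that would travel in a cookie. The class and method names are invented for illustration; a real servlet container provides the same mechanism via HttpSession, with the same failure mode when the browser refuses the cookie:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Sketch of server-side session state: the server mints an opaque token,
// sends it back as a cookie, and correlates later requests with their
// state through this map. If the browser rejects or loses the cookie,
// the state is orphaned -- one of the drawbacks described above.
public class SessionStore {
    private final Map<String, Map<String, String>> sessions = new HashMap<>();

    // Called on the first request: mint a token to be set as a cookie.
    public String createSession() {
        String token = UUID.randomUUID().toString();
        sessions.put(token, new HashMap<>());
        return token;
    }

    // Called on later requests: look up state by the token from the
    // cookie. Returns null when the token is unknown.
    public Map<String, String> lookup(String token) {
        return sessions.get(token);
    }

    public static void main(String[] args) {
        SessionStore store = new SessionStore();
        String token = store.createSession();
        store.lookup(token).put("cartItem", "ISBN 0-13-110362-8");
        System.out.println(store.lookup(token).get("cartItem"));
        // A client with cookies disabled presents no usable token:
        System.out.println(store.lookup("no-such-token"));
    }
}
```

Note that nothing here is visible to, or controlled by, the user's own machine - the state lives entirely on the server, which is exactly why the client's storage resources go unused.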

The second hurdle is the fact that HTML and HTTP are technologies oriented entirely around the concept of the page. Rather than directly manipulate the user-interface components displayed to the user, the Web server has to render a page based on the last user request. Hence, even the simplest operation - like scrolling to the next page of a list of items - entails a round-trip to the server. The URL page request paradigm just does not map well to an interactive application. This is perhaps an oversimplification, but imagine telling an application developer of the late 80s or early 90s that any possible state or screen of the application had to be addressable by a string (i.e., URL).

Now it is true that a vast amount of work has been done to provide frameworks and utilities that abstract that task away from the direct responsibility of the programmer. For instance, WebLogic Workshop's Struts-based Java Page Flow architecture simplifies and reduces the work required to implement the series of pages a user traverses to achieve a particular task. It's far easier than implementing the form and state management logic anew for every single application. However, toolkits and frameworks, while simplifying life for programmers, don't make the inherent problems go away. The problems still show up when browser bookmarks are added in the middle of a multi-page, state-dependent sequence, in accidental multiple presses of a submit button, and in refreshes of a page that resulted from a POST rather than a GET. I am certain any experienced Web developer can add to the list. Think of how much more productivity could be on tap if framework developers didn't need to compensate for such a crippled application model.

A further consequence of using the Web as an application platform is the ever-present browser/server dichotomy. Can a particular functionality (such as form validation) be provided by the browser or must the logic reside on the server? Again, lots of programming effort has been expended to attempt to add user interface functionality to the browser. Most contemporary Web applications would be nonstarters without the availability of client-side scripting tools like JavaScript that were not even part of the original conception of the World Wide Web. Just as with server-side page generation frameworks, there is a wide range of solutions available for this problem: ActiveX components, VBScript, Java applets, Flash, and DHTML.

While undoubtedly powerful, these tools complicate matters for the developer. To begin with, they entail possible compatibility issues. If you decide to use VBScript, you've suddenly excluded any user not running Internet Explorer. Java applets are great, giving you a "real" language in which to program your user interface, but what happens when the user's browser doesn't run a compatible JVM or she doesn't have the proper Java plug-in installed?

Beyond compatibility issues, these client-side technologies complicate development project roles. While graphic designers might have responsibility for the purely HTML aspects of a Web page, we can't count on them to have the programming chops to embed the JavaScript required to populate a SELECT item with the desired options. Truth be told, we can't expect even trained programmers to always understand the incantations necessary to achieve the desired effect via client-side scripting, particularly when the project requires the interface to be functional whether or not JavaScript is enabled on the user's browser. Judging from the number of forums and "tip-sheets" dedicated to the intricate tricks of getting even the simplest UI functionality to operate in the Web environment, it is a rare programmer indeed who is fully competent in all of the tricks of the trade. JSPs and tag libraries were supposed to help separate the roles of graphic designer and programmer, but the necessity of resorting to some form of client-side wizardry to create even the most rudimentary interface defeats that solution.

Even if one uses browser plug-in technologies that provide a full-featured interface, such as Flash and Java applets, these applications will operate in isolation from other similar Web applications, quite apart from the browser compatibility problems mentioned above. For instance, there's no dropping an object from a Java applet onto a Flash application and having something meaningful happen. Such functionality might be possible with ActiveX plug-ins, but those are proprietary solutions that only work on Windows platforms, and configuring them to allow such interoperability could quite easily create security issues for the client computer.

All right. Enough complaining already. Even if you don't necessarily agree with all of my arguments, I hope you at least grant that the average user of Web applications won't ever rave about the Web user interface the way someone might that of a conventionally programmed application like iTunes or Photoshop. Does it have to be that way? Is there not a path that can give us the connectedness of the Web along with a powerful, intuitive user interface?

We want technology to enable the creation of client programs with full-featured user interfaces that can access server resources over the Internet. This is nothing new; client/server architectures were the hot technology prior to the advent of the Web, although client/server applications were primarily deployed on intranets rather than the Internet. So how might we incorporate Web concepts to advance the technology? Some of the benefits of Web applications that would be desirable to retain are:

  1. Zero administration. Web users don't have to install anything to run their applications. Updates are transparent.
  2. Flexibly networked. For instance, an application for planning travel should be able to access flights, hotels, and car rentals from multiple vendors.
  3. The ability to render and display HTML. While I have complained vociferously about the restrictions of HTML and HTTP, they are both firmly entrenched and there are application areas, such as online help, that would benefit from a modularized way of displaying HTML content.

There are no technical reasons that prevent the development of solutions that address these requirements. In fact, technologies exist today that address these needs; they're just underutilized in this context. I will discuss two possible approaches: first, thick clients written in Java and delivered over the network, and second, AJAX (Asynchronous JavaScript And XML), a relatively new technique that transfers processing power to the client side (though still within a browser) and also mitigates the need to refresh an entire Web page for each transfer of data from the server. For the dedicated-client path, Requirement 1 is fulfilled by Sun's Java Web Start and its underlying Java Network Launch Protocol (JNLP). For some reason, this simple and effective solution has never really caught on. It's been available for over four years and is now a standard part of the Java runtime.

You can read all about it at the Java Web Start home page (http://java.sun.com/products/javawebstart/index.jsp), but in a nutshell: you package your application into a JAR file or files, write a JNLP descriptor file and an HTML file with a link to that descriptor file, and then when the user of a properly configured browser (MIME types, you know) clicks on that link, Java Web Start fires up and invokes the application. The JAR files are downloaded and executed on the local client machine. Henceforth, running the application does not require re-downloading the JAR files unless the application has been updated and the user requests the new version.
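For illustration, a minimal JNLP descriptor might look like the following; the codebase URL, JAR name, and main class are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical descriptor: codebase, hrefs, and main-class are examples -->
<jnlp spec="1.0+" codebase="http://www.example.com/apps" href="expenses.jnlp">
  <information>
    <title>Expense Reporter</title>
    <vendor>Example Corp.</vendor>
    <!-- allow launching from the local cache without a network connection -->
    <offline-allowed/>
  </information>
  <resources>
    <j2se version="1.4+"/>
    <jar href="expenses.jar"/>
  </resources>
  <application-desc main-class="com.example.expenses.Main"/>
</jnlp>
```

The "properly configured" part is that the Web server must serve this file with the MIME type application/x-java-jnlp-file so that the browser hands it off to Java Web Start.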

By default, Java Web Start applications run in a secure environment similar to the Java applet sandbox, but there are JNLP APIs for use by applications that need to access the local clipboard or disk. Applications that are implemented by signed JAR files can run in an unrestricted environment (once the user approves).

It is a mystery to me why Java Web Start has not gained more traction with the developer community. Java is a powerful, easy language for creating full-featured applications that are highly portable, and Java Web Start is an easy-to-use tool for deploying such applications. So, no excuses for not having Requirement 1 fulfilled.

What about Requirement 2? You'd have to be dead or deaf not to be aware of all the work that's been done over the past few years in the area popularly called "Web services" or "service-oriented architectures." The XML-based technologies of UDDI, WSDL, and SOAP have made possible a sort of uber-RPC for network clients, where RPC servers with various functionalities can be discovered and those functionalities accessed. While in my opinion the reality in this arena has not matched the hype, there is no doubt that the potential is there. Google has WSDL and SOAP-based APIs available as a beta (so it is somewhat disappointing that their desktop search tool is based on HTML pages rather than a thick client).
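To make the "uber-RPC" idea concrete, here is a sketch of the kind of SOAP 1.1 envelope such a call boils down to on the wire. The operation name and namespace are invented; in practice a toolkit generated from the provider's WSDL would construct and send this for you:

```java
// Hypothetical example: builds the SOAP 1.1 envelope for an invented
// "doSearch" operation. Real stubs generated from WSDL hide this
// plumbing; it is spelled out here only to show what the RPC carries.
public class SoapSketch {
    public static String searchEnvelope(String query) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<doSearch xmlns=\"urn:example-search\">"   // invented operation
            + "<query>" + query + "</query>"
            + "</doSearch>"
            + "</soap:Body>"
            + "</soap:Envelope>";
    }

    public static void main(String[] args) {
        // The client would POST this body to the endpoint named in the WSDL.
        System.out.println(searchEnvelope("thick clients"));
    }
}
```

The point of the standards is that the client neither knows nor cares what language or platform sits behind the endpoint - exactly the decoupling a flexibly networked thick client needs.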

The big obstacle here is the standardization of APIs for given functionalities. UDDI and WSDL describe functionality and how to access it, but if two different travel vendors (for instance) use different models for their Web services, it is a lot of work for the client developer to account for the differences in each and every provider they want to access. These issues are being addressed, but how quickly industries will define and adopt standard interfaces is unknown.

So, it would seem that Requirement 2 is partially fulfilled with more promise for the future. Incidentally, for an interesting report on combining Java Web Start and Web service technologies, see Allan Poda's devx.com paper entitled "Leverage JNLP and SOAP for Java Thick-client Development" (www.devx.com/Java/Article/22537), where he describes his experiments migrating a previous Web-based interface to a Java thick-client interface.

Requirement 3 (for some sort of HTML rendering capability within the thick client) is, as they say, "a simple matter of software." There are commercial products available, such as ICEBrowser and ICEReader from ICEsoft. There are also a number of freely available implementations, including the barebones Swing HTMLEditorKit built into the Java runtime. The capabilities of these components obviously vary. You can find a good rundown of several freely licensable implementations in the article "Java Sketchbook: The HTML Renderer Shootout, Part 1" at http://today.java.net/pub/a/today/2004/05/24/html-pt1.html.
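As a small taste of the built-in option, the following sketch uses the Swing HTMLEditorKit to parse an HTML fragment into a document model; the class name and sample markup are invented for illustration. A JEditorPane could render the same model on screen - here we just extract the plain text to show the parse worked:

```java
import java.io.StringReader;
import javax.swing.text.html.HTMLDocument;
import javax.swing.text.html.HTMLEditorKit;

// Barebones use of the Swing HTMLEditorKit mentioned above: parse an
// HTML string into a styled document model. This runs fine without a
// display, which is also why the kit sees some server-side use.
public class HtmlKitDemo {
    public static String extractText(String html) throws Exception {
        HTMLEditorKit kit = new HTMLEditorKit();
        HTMLDocument doc = (HTMLDocument) kit.createDefaultDocument();
        kit.read(new StringReader(html), doc, 0);   // parse into the model
        return doc.getText(0, doc.getLength()).trim();
    }

    public static void main(String[] args) throws Exception {
        // Invented online-help fragment of the sort Requirement 3 envisions.
        String help = "<html><body><h1>Help</h1>"
            + "<p>Press <b>Save</b> to submit the report.</p></body></html>";
        System.out.println(extractText(help));
    }
}
```

Hooking the resulting document up to a JEditorPane inside a Java Web Start application gives the thick client its modular HTML display without dragging in a whole browser.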

We haven't touched on requirements for security. Obviously this is vital, particularly for apps that are to be delivered over the greater Internet. Component code-signing techniques provide some measure of safety for downloading and running code, but users need to be educated to make good decisions when faced with the dialog that asks "Do you want to install and run: Bizmumble Signed and distributed by: So and So corporation?"

A full discussion of the security requirements and infrastructure for Web services would take an entire book (or two) of its own. The architects of Web services technologies understand the importance of security, and so it has not been neglected in the design and implementation of the technologies that make up Web services. Understanding the implications and necessities of a truly secure Internet application might be daunting, but it is not impossible.

If you want to stick with a pure browser-based solution, AJAX is a clever conglomeration of existing technologies that, while not perfect, smooths over many of the UI bumps. While the underlying technology has been around for several years, it is getting new attention due to its use in several Google Web applications, including Gmail, Google Maps, and Google Suggest. A complete description is beyond the scope of this article, but essentially it is sophisticated JavaScript programming that uses XMLHttpRequest objects to make asynchronous server requests and update user-interface components based on the results of those queries. It eliminates the need to refresh the entire page simply to update one form element. For more info, try Googling "ajax Web interface" or see Jesse James Garrett's excellent overview at http://adaptivepath.com/publications/essays/archives/000385.php.

So, from the examples presented, it is clear that the technologies required to replace a clunky HTML-based Web application with an application having a more full-featured user interface exist today. There may be other options in addition to the solution technology examples I provided. I believe that the goal of Microsoft's .NET is to realize a sort of "programmable Web" that would include thick client applications.

I grant that not all of the pieces that enable networked application interfaces superior to those of browser-based applications are fully formed and developed. But the groundwork is being laid, and I hope that, unlike Java Web Start with its relatively weak adoption, the story for Web services and thick clients will be a different one. Users deserve modern user interfaces for the programs they rely on, not forms-based tedium more appropriate to the days of block terminals. Developers and service providers need to start thinking outside the browser box and consider the benefits of thick network clients that use Web services, or at least think about enriching the user experience via innovative technologies like AJAX. They could get a much-needed boost in that direction if suppliers of the supporting infrastructure, like BEA, would move to support these technologies in their development tools and product paradigms. In this way, everyone wins: users get more usable applications, developers get the credit, and tool vendors create pull for new versions of their products by including innovative features.

More Stories By Channing Benson

Channing Benson is a senior technical consultant at Hewlett-Packard. He works with HP's software partners in the areas of Java performance, J2EE architectures, and Itanium migration. A 20-year industry veteran, his previous areas of expertise include Lisp and X11/Motif. His recreational interests are music, snowboarding, backcountry skiing, and Ultimate Frisbee.



Most Recent Comments
Alex Rudkevich 04/08/05 04:30:46 PM EDT

I believe, the website moderator's role is to filter out such "comments" as the one submitted by D.Diggerman.
I am absolutely agree with the author of the article, especially in the part concerning the nightmare of client-side development for different browsers with unpredictable behaviour. From the more general point of view, software development is always restricted by a paradigm, which very soon becomes a dogma. The material about new alternativ approaches is really useful.

D Diggerman 04/07/05 10:12:59 PM EDT

Channing Benson's rave is based on the adjective filled junk written in the paragraph "To be honest, most...". Yes the web is full of rubbish of which the article is part of. "multiple divergent programming languages and frameworks... " what the frig is that! "Developers suffer because Web-based architectures divide an application along illogical boundaries which complicate the creation of a seamless user experience..". Whose suffering the user or developer? "Seamless"?

Please be a good cyber citizen and refrain from clogging up disk storage with rubbish.

D Diggerman

wwwWWW 04/05/05 12:36:03 PM EDT

|| Criticizing something as wildly successful as the World Wide Web seems a bit radical and potentially unpopular. ||

Understatement of the century!

Scott 03/30/05 12:15:26 PM EST

> I hope you at least grant that the average user of Web
> applications won't ever rave about the Web user interface
> the way someone might that of a conventionally programmed
> application like iTunes or Photoshop.

The author is assuming that the usability and state persistence issues of web applications will not be fixed and improved upon. The growing popularity of Ajax and standardized portlets are proving this to be a false assumption.

DevOpsSummit New York 2018, colocated with CloudEXPO | DXWorldEXPO New York 2018 will be held November 11-13, 2018, in New York City. Digital Transformation (DX) is a major focus with the introduction of DXWorldEXPO within the program. Successful transformation requires a laser focus on being data-driven and on using all the tools available that enable transformation if they plan to survive over the long term. A total of 88% of Fortune 500 companies from a generation ago are now out of bus...
The Jevons Paradox suggests that when technological advances increase efficiency of a resource, it results in an overall increase in consumption. Writing on the increased use of coal as a result of technological improvements, 19th-century economist William Stanley Jevons found that these improvements led to the development of new ways to utilize coal. In his session at 19th Cloud Expo, Mark Thiele, Chief Strategy Officer for Apcera, compared the Jevons Paradox to modern-day enterprise IT, examin...
IoT solutions exploit operational data generated by Internet-connected smart “things” for the purpose of gaining operational insight and producing “better outcomes” (for example, create new business models, eliminate unscheduled maintenance, etc.). The explosive proliferation of IoT solutions will result in an exponential growth in the volume of IoT data, precipitating significant Information Governance issues: who owns the IoT data, what are the rights/duties of IoT solutions adopters towards t...
Amazon started as an online bookseller 20 years ago. Since then, it has evolved into a technology juggernaut that has disrupted multiple markets and industries and touches many aspects of our lives. It is a relentless technology and business model innovator driving disruption throughout numerous ecosystems. Amazon’s AWS revenues alone are approaching $16B a year making it one of the largest IT companies in the world. With dominant offerings in Cloud, IoT, eCommerce, Big Data, AI, Digital Assista...