What's Wrong with Web Applications

And What to Do About It

Criticizing something as wildly successful as the World Wide Web seems a bit radical and potentially unpopular. There is no doubt that Tim Berners-Lee's elegantly simple invention enabled an unprecedented revolution in the way computers are used and by whom.

The amount of information currently accessible to anyone with an Internet connection is truly mind-boggling, and I do not think it is an exaggeration to say that the Web is the most revolutionary invention since the automobile in terms of the effect that it has had on how we live our lives.

So what brings me to gore this sacred cow? How can I be negative about something that everyone acknowledges as a technological marvel?

To be honest, most Web applications are distinctly user-unfriendly. The Web's undisputed power to access and present information has been mistaken for a universal entry point to computing and information resources. The view that every interaction between user and computer is a nail to be banged in by the hammer of the Web browser and server has been a step backward for both users and developers. Users have to live with horribly restrictive modal form-based solutions stripped of powerful UI paradigms common in conventionally delivered software. For instance, drag and drop and the ability to save application state in a locally stored document are not available. Developers suffer because Web-based architectures divide an application along illogical boundaries, which complicate the creation of a seamless user experience, and require the use of multiple divergent programming languages and frameworks.

These problems are primarily the result of taking a great idea and stretching it beyond the scope for which it was intended. The Web was initially about access to static content. It really wasn't much different from existing hypertext systems, except that the content was transparently distributed on the Internet - a small difference in implementation, a huge difference in results. Suddenly, authors could easily incorporate into their own works other material from a vast virtual library.

However, the rapid adoption of the Web and the lure of its commercial potential made this technology newcomer the apparent solution to all the world's problems. It didn't happen immediately, but it wasn't long before the Web browser became more than just a viewer: it became a generic application deployment platform. Slowly but surely, dedicated client programs in traditional three-tier architectures were replaced by interfaces implemented as Web pages. There are some valid reasons for making this transition, but in the rush to migrate seemingly every application to a browser-based model, decision-makers threw the baby out with the bathwater. The fact of the matter is that HTML makes a very poor user interface layer. Even when tarted up - as it must be to even be functional - with JavaScript or other client-side scripting solutions, browser-based interfaces fall far short of creating a rewarding experience for the user.

In addition to shortchanging the end user, webifying an application puts a heavy load on the developer, who suddenly must contend with a multitude of thorny issues in the areas of portability, state-management, multiple implementation languages, client-side vs. server-side operations, and security. This article examines these problems for users and developers in detail, with examples of the pros and (mostly) cons of Web-based applications. I'll conclude with suggestions for alternatives that include the positive aspects of Web applications, but without the downsides.

I am not arguing that all Web-based applications are clumsy and evil and should be eliminated. Particularly when you wish to make your application available to the entire Internet world and there are no existing standards or protocols for that application, delivering on the Web may be the only viable option. I am primarily concerned with the webification of applications provided by corporate IT departments, such as those for submitting travel expense reports, managing benefit packages, or reporting defects. First, though, let's examine an application that falls outside that category.

Before the World Wide Web, there was Usenet, an Internet-based system for discussion groups. Information was exchanged between server systems via the Network News Transfer Protocol (NNTP). The beauty of Usenet was that it did not specify a particular client interface. Users could choose among a variety of "news readers" according to their particular preferences. Depending on the client program, posts could be made with the editor of the user's choice. Emacs users could even read news and post responses from within the editor. It was the best of both worlds: a standard that allowed the open exchange of information with people all over the world, but that also allowed users the freedom to interact with that system in the manner they found most productive.

Usenet has not gone away; there are now even Web-based news readers such as the one provided by Google. But forums that once would have been well served by Usenet have been moved to discussion systems that are accessible only via a particular Web site. If you want to participate, you have to use the interface provided by that Web site. You have to create a new user account if you want to contribute to discussions, and you have to edit your posts in a somewhat-less-than-full-featured standard browser text box. So much for progress.

There are political reasons service providers might prefer the closed Web-based solution. It allows them to identify their users and to control and monitor access to the site. But these advantages to the provider of the service come at the expense of a poorer experience for the users of the service. Similar trade-offs can be seen in the evolution of travel expense report applications. At one company, paper travel expense forms were replaced by a standardized spreadsheet that the user would complete, print, sign, and submit.

It was a simple solution that gave the user power and flexibility. She had the full interface of a standard spreadsheet program at her disposal, along with the ability to save partially completed forms and to keep local copies of previously submitted forms. Spreadsheet fanatics, if so inclined, could graph their daily expenses, while the tabular presentation of data made it easy for mortals to notice if they forgot to include the charge for Wednesday's lunch. This document-centric scheme has since been replaced by a Web-based application that automates the logistical aspects of expense reporting: submission and approval are now done electronically, with reports routed through the designated approval chain. From a bureaucratic perspective, things are much neater. Managers no longer have to shuffle paper, bean counters get a centralized view of the process at all stages, and even the lowly employee submitting the report gets an easy way to track it through to reimbursement.

Unfortunately, the initial creation of the report is cumbersome, forms-based tedium, where each expense has to be entered one at a time. And after the expenses are entered one by one in isolation, instead of a nice calendar-based display, the user sees only a linear list of expenses to be included in the report. The interface itself provides practically no support to the user for organizing expenses, which again leads us to this irony of supposed progress. Yes, the Web-based application automates and centralizes the process. But again, it does so at the expense of crippling the end-user with an unhelpful, inflexible, and arcane user interface.

Why does this seem to be the inevitable result of webifying an application? Because the Web is a horribly inadequate user-interface platform. Crafting a full-featured Web application that functions correctly is an exercise in cunning and frustration for the developer, and the end result typically tests the user's patience. Let's look at the restrictions that make it so.

There are two primary obstacles. The first one is that the Web is based on a stateless protocol. The predominant model of application design for the last two decades has centered on a document (i.e., a collection of state information) that the application operates on. In a conventionally programmed application, the full strength of the particular programming language employed can be brought to bear on the problem of representing the data being operated on. In Web-based applications, the programmer has a choice of several less-than-optimal techniques for maintaining state: fields (some hidden) in forms stored on the Web page itself, or server-side state maintained by various tricks for each user session or request. Each has particular drawbacks.

Values stored in form fields might not be propagated between pages and can be viewed and possibly altered by clever users, posing security risks. Server-side storage may not scale well, or the techniques used to correlate a particular user with a particular set of program state, like cookies, may be disabled by the user's browser. Further, accessing the state information is not entirely transparent to the programmer, and available storage resources on the client computer will probably go unused because the computations requiring the state occur on the server system. Finally, unless the Web site developer is unusually clever, the document metaphor familiar to the user goes entirely out the window. For instance, there is typically no way to save multiple shopping carts for the same online vendor, the way a conventional word processor could save two versions of my annual letter to Santa.
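
To make the first option concrete, here is a minimal sketch of state carried in hidden form fields; the field names and values are hypothetical. Everything the application needs to remember between pages rides along inside the page itself, in plain view of anyone who selects "view source":

    <!-- Hypothetical checkout page: application state travels in hidden
         form fields and is resubmitted with every request. -->
    <form action="/checkout" method="post">
      <input type="hidden" name="sessionId" value="A1B2C3D4"/>
      <input type="hidden" name="cartTotal" value="59.95"/>
      <input type="submit" value="Continue"/>
    </form>

A user who saves this page, edits the cartTotal value, and resubmits the form has just demonstrated the security risk; a server that trusts such values without revalidating them is asking for trouble.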

The second hurdle is the fact that HTML and HTTP are technologies oriented entirely around the concept of the page. Rather than directly manipulate the user-interface components displayed to the user, the Web server has to render a page based on the last user request. Hence, even the simplest operation - like scrolling to the next page of a list of items - entails a round-trip to the server. The URL page request paradigm just does not map well to an interactive application. This is perhaps an oversimplification, but imagine telling an application developer of the late 80s or early 90s that any possible state or screen of the application had to be addressable by a string (i.e., URL).

Now it is true that a vast amount of work has been done to provide frameworks and utilities that abstract that task away from the direct responsibility of the programmer. For instance, WebLogic Workshop's Struts-based Java Page Flow architecture simplifies and reduces the work required to implement the series of pages a user traverses to achieve a particular task. It's way easier than implementing the form and state management logic anew for every single application. However, toolkits and frameworks, while simplifying life for programmers, don't make the inherent problems go away. The problems still show up when a browser bookmark is added in the middle of a multi-page, state-dependent sequence, when a submit button is accidentally pressed twice, and when the user refreshes a page that resulted from a POST rather than a GET. I am certain any experienced Web developer can add to the list. Think of how much more productivity could be on tap if framework developers didn't need to compensate for such a crippled application model.

A further consequence of using the Web as an application platform is the ever-present browser/server dichotomy. Can a particular functionality (such as form validation) be provided by the browser or must the logic reside on the server? Again, lots of programming effort has been expended to attempt to add user interface functionality to the browser. Most contemporary Web applications would be nonstarters without the availability of client-side scripting tools like JavaScript that were not even part of the original conception of the World Wide Web. Just as with server-side page generation frameworks, there is a wide range of solutions available for this problem: ActiveX components, VBScript, Java applets, Flash, and DHTML.

While undoubtedly powerful, these tools complicate matters for the developer. To begin with, they entail possible compatibility issues. If you decide to use VBScript, you've suddenly excluded any user not running Internet Explorer. Java applets are great, giving you a "real" language in which to program your user interface, but what happens when the user's browser doesn't run a compatible JVM or she doesn't have the proper Java plug-in installed?

Beyond compatibility issues, these client-side technologies complicate development project roles. While graphic designers might have responsibility for the purely HTML aspects of a Web page, we can't count on them to have the programming chops to embed the JavaScript required to populate a SELECT item with the desired options. Truth be told, we can't expect even trained programmers to always understand the incantations necessary to achieve the desired effect via client-side scripting, particularly when the project requires the interface to be functional whether or not JavaScript is enabled on the user's browser. Judging from the number of forums and "tip-sheets" dedicated to the intricate tricks of getting even the simplest UI functionality to operate in the Web environment, it is a rare programmer indeed who is fully competent in all the tricks of the trade. JSPs and tag libraries were supposed to help separate the roles of graphic designer and programmer, but the necessity of resorting to some form of client-side wizardry to create even the most rudimentary interface defeats that solution.
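
As a flavor of the incantations in question, here is a sketch of the SELECT-population chore mentioned above; the element ID and option values are hypothetical:

    // Fill a SELECT element with options on the client side.
    // Assumes the page contains <select id="state"></select>.
    function populateStates(states) {
        var select = document.getElementById("state");
        select.options.length = 0;        // clear any existing options
        for (var i = 0; i < states.length; i++) {
            select.options[i] = new Option(states[i], states[i]);
        }
    }
    populateStates(["CA", "CO", "NH", "TX"]);

Simple enough in isolation, but if the interface must also work with JavaScript disabled, the server has to be able to render the same options itself, and keeping the two versions in sync is exactly the kind of duplication that trips up mixed designer/programmer teams.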

Even if one uses browser plug-in technologies that do provide a full-featured interface, such as Flash and Java applets, beyond the above-mentioned browser compatibility problems these applications operate in isolation from other, similar Web applications. For instance, there's no dropping an object from a Java applet onto a Flash application and having something meaningful happen. Such functionality might be possible with ActiveX plug-ins, but they are proprietary solutions that only work on Windows platforms, and configuring them to allow such interoperability could quite easily create security issues for the client computer.

All right. Enough complaining already. Even if you don't necessarily agree with all of my arguments, I hope you at least grant that the average user of Web applications won't ever rave about the Web user interface the way someone might that of a conventionally programmed application like iTunes or Photoshop. Does it have to be that way? Is there not a path that can give us the connectedness of the Web along with a powerful, intuitive user interface?

We want technology to enable the creation of client programs with full-featured user interfaces that can access server resources over the Internet. This is nothing new; client/server architectures were the hot technology prior to the advent of the Web, although client/server applications were primarily deployed on intranets rather than the Internet. So how might we incorporate Web concepts to advance the technology? Some of the benefits of Web applications that would be desirable to retain are:

  1. Zero administration. Web users don't have to install anything to run their applications. Updates are transparent.
  2. Flexibly networked. For instance, an application for planning travel should be able to access flights, hotels, and car rentals from multiple vendors.
  3. The ability to render and display HTML. While I have complained vociferously about the restrictions of HTML and HTTP, they are both firmly entrenched and there are application areas, such as online help, that would benefit from a modularized way of displaying HTML content.

There are no technical reasons preventing solutions that address these requirements; in fact, the necessary technologies exist today - they're just underutilized in this context. I will discuss two possible approaches: first, thick clients written in Java and delivered over the network; and second, AJAX (Asynchronous JavaScript And XML), a relatively new technique that shifts processing to the client side (though still within a browser) and obviates the need to refresh an entire Web page for each transfer of data from the server.

For the dedicated client path, Requirement 1 is fulfilled by Sun's Java Web Start and its underlying Java Network Launch Protocol (JNLP). For some reason, this simple and effective solution has never really caught on. It's been available for over four years and is now a standard part of the Java runtime.

You can read all about it at the Java Web Start home page (http://java.sun.com/products/javawebstart/index.jsp), but in a nutshell: you package your application into one or more JAR files, write a JNLP descriptor file and an HTML file with a link to that descriptor, and when the user of a properly configured browser (MIME types, you know) clicks on that link, Java Web Start fires up and invokes the application. The JAR files are downloaded and executed on the local client machine. Henceforth, running the application does not require re-downloading the JAR files unless the application has been updated and the user requests the new version.
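
For the curious, a JNLP descriptor is a small XML file along these lines; the codebase, JAR name, and main class shown here are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <jnlp spec="1.0+" codebase="http://www.example.com/app" href="expenses.jnlp">
      <information>
        <title>Expense Reporter</title>
        <vendor>Example Corp.</vendor>
        <!-- permit launching without a network connection -->
        <offline-allowed/>
      </information>
      <resources>
        <j2se version="1.4+"/>
        <jar href="expenses.jar"/>
      </resources>
      <application-desc main-class="com.example.expenses.Main"/>
    </jnlp>

The HTML side is then nothing more than an ordinary link to expenses.jnlp; a properly configured browser hands the file to Java Web Start based on its MIME type (application/x-java-jnlp-file), and Web Start takes it from there.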

By default, Java Web Start applications run in a secure environment similar to the Java applet sandbox, but there are JNLP APIs for use by applications that need to access the local clipboard or disk. Applications that are implemented by signed JAR files can run in an unrestricted environment (once the user approves).
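
As a sketch of what those JNLP APIs look like in practice (the class below is illustrative, though the javax.jnlp services themselves are standard), a sandboxed application can still read a local file because the user, not the program, selects it:

    import javax.jnlp.FileContents;
    import javax.jnlp.FileOpenService;
    import javax.jnlp.ServiceManager;

    public class OpenLocalFile {
        public static void main(String[] args) throws Exception {
            // Look up the sandbox-safe file-open service provided by Web Start.
            FileOpenService fos = (FileOpenService)
                    ServiceManager.lookup("javax.jnlp.FileOpenService");
            // Web Start presents the file dialog; the user's explicit choice
            // is what makes this safe without signed JARs.
            FileContents contents = fos.openFileDialog(null, null);
            if (contents != null) {
                System.out.println("User opened: " + contents.getName());
            }
        }
    }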

It is a mystery to me why Java Web Start has not gained more traction with the developer community. Java is a powerful, easy language for creating full-featured applications that are highly portable, and Java Web Start is an easy-to-use tool for deploying such applications. So, no excuses for not having Requirement 1 fulfilled.

What about Requirement 2? You'd have to be dead or deaf to not be aware of all the work that's been done over the past few years in the area popularly called "Web services" or "service-oriented architectures." The XML-based technologies of UDDI, WSDL, and SOAP have made possible a sort of uber-RPC for network clients where RPC servers with various functionalities can be discovered and those functionalities accessed. While in my opinion the reality in this arena has not matched the hype, there is no doubt that the potential is there. Google has WSDL and SOAP-based APIs available as a beta (so it is somewhat disappointing that their desktop search tool is based on HTML pages rather than a thick client).
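
To give a flavor of the plumbing, a SOAP request is just a small XML document carried over HTTP. This hypothetical flight-search call (the namespace and operation names are invented for illustration) shows what such an uber-RPC looks like on the wire:

    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <searchFlights xmlns="http://example.com/travel">
          <from>BOS</from>
          <to>SFO</to>
          <departureDate>2005-06-01</departureDate>
        </searchFlights>
      </soap:Body>
    </soap:Envelope>

The corresponding WSDL document describes an operation like searchFlights mechanically enough that client stubs can be generated from it, which is what makes the discovery-and-invocation story possible.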

The big obstacle here is the standardization of APIs for given functionalities. UDDI and WSDL describe functionality and how to access it, but if two different travel vendors (for instance) use different models for their Web services, it is a lot of work for the client developer to account for the differences in each and every provider they want to access. These issues are being addressed, but how quickly industries will define and adopt standard interfaces is unknown.

So, it would seem that Requirement 2 is partially fulfilled with more promise for the future. Incidentally, for an interesting report on combining Java Web Start and Web service technologies, see Allan Poda's devx.com paper entitled "Leverage JNLP and SOAP for Java Thick-client Development" (www.devx.com/Java/Article/22537), where he describes his experiments migrating a previous Web-based interface to a Java thick-client interface.

Requirement 3 (for some sort of HTML rendering capability within the thick client) is, as they say, "a simple matter of software." There are commercial products available, such as ICEBrowser and ICEReader from ICESOFT. There are also a number of freely available implementations, including the bare-bones Swing HTMLEditorKit built into the Java runtime. The capabilities of these components obviously vary. You can find a good rundown of several freely licensable implementations in the article "Java Sketchbook: The HTML Renderer Shootout, Part 1" at http://today.java.net/pub/a/today/2004/05/24/html-pt1.html.
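
As a taste of how little code the bare-bones route requires, here is a minimal sketch using the Swing component built into the Java runtime; the help URL is hypothetical, and a real application would likely want one of the more capable renderers mentioned above:

    import javax.swing.JEditorPane;
    import javax.swing.JFrame;
    import javax.swing.JScrollPane;

    public class HelpViewer {
        public static void main(String[] args) throws Exception {
            // JEditorPane's basic HTML support suffices for simple help pages.
            JEditorPane pane = new JEditorPane("http://www.example.com/help/index.html");
            pane.setEditable(false); // display only

            JFrame frame = new JFrame("Online Help");
            frame.getContentPane().add(new JScrollPane(pane));
            frame.setSize(600, 400);
            frame.setVisible(true);
        }
    }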

We haven't touched on requirements for security. Obviously this is vital, particularly for apps that are to be delivered over the greater Internet. Component code-signing techniques provide some measure of safety for downloading and running code, but users need to be educated to make good decisions when faced with the dialog that asks "Do you want to install and run: Bizmumble Signed and distributed by: So and So corporation?"

A full discussion of the security requirements and infrastructure for Web services would take an entire book (or two) of its own. The architects of Web services technologies understand the importance of security, and it has not been neglected in the design and implementation of the technologies that make up Web services. Understanding the implications and necessities of a truly secure Internet application might be daunting, but it is not impossible.

If you want to stick with a pure browser-based solution, AJAX is a clever conglomeration of existing technologies that, while not perfect, smoothes over many of the UI bumps. While the technology has been around for several years, it is getting new attention due to its use in several Google Web applications, including Gmail, Google Maps, and Google Suggest. A complete description is beyond the scope of this article, but essentially it is sophisticated JavaScript programming that uses XMLHttpRequest objects to make asynchronous server requests and update user interface components based on the results of those queries. It eliminates the need to refresh the entire page simply to update one form element. For more info, try Googling "ajax Web interface" or see Jesse James Garrett's excellent overview at http://adaptivepath.com/publications/essays/archives/000385.php.
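
The heart of the technique is only a few lines of JavaScript. In this sketch the server URL and element ID are hypothetical, and the ActiveXObject fallback covers Internet Explorer, which does not expose XMLHttpRequest natively:

    // Fetch suggestions for a partially typed query and update a single
    // element in place - no full-page refresh required.
    function updateSuggestions(query) {
        var xhr = window.XMLHttpRequest
            ? new XMLHttpRequest()                    // most browsers
            : new ActiveXObject("Microsoft.XMLHTTP"); // Internet Explorer
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4 && xhr.status == 200) {
                document.getElementById("suggestions").innerHTML = xhr.responseText;
            }
        };
        xhr.open("GET", "/suggest?q=" + encodeURIComponent(query), true); // asynchronous
        xhr.send(null);
    }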

So, from the examples presented, it is clear that the technologies required to replace a clunky HTML-based Web application with one having a far more full-featured user interface exist today. There may be other options beyond the examples I've provided. I believe that the goal of Microsoft's .NET is to realize a sort of "programmable Web" that would include thick client applications.

I grant that not all of the pieces that enable networked application interfaces superior to those of browser-based applications are fully formed and developed. But the groundwork is being laid. I hope that, unlike Java Web Start with its relatively weak adoption, Web services and thick clients will tell a different story. Users deserve modern user interfaces for the programs they rely on, not forms-based tedium more appropriate for the days of block terminals. Developers and service providers need to start thinking outside the browser box and consider the benefits of thick network clients that use Web services, or at least think about enriching the user experience via innovative technologies like AJAX. They could get a much-needed boost in that direction if suppliers of the supporting infrastructure, like BEA, would provide support for these technologies in their development tools and product paradigms. In this way, everyone wins: users get more usable applications, developers get the credit, and tool vendors create pull for new versions of their products by including innovative new features.

About the Author

Channing Benson is a senior technical consultant at Hewlett-Packard. He works with HP's software partners in the areas of Java performance, J2EE architectures, and Itanium migration. A 20-year industry veteran, his previous areas of expertise include Lisp and X11/Motif. His recreational interests are music, snowboarding, backcountry skiing, and Ultimate Frisbee.



Most Recent Comments
Alex Rudkevich 04/08/05 04:30:46 PM EDT

I believe the Web site moderator's role is to filter out such "comments" as the one submitted by D. Diggerman.
I absolutely agree with the author of the article, especially the part concerning the nightmare of client-side development for different browsers with unpredictable behaviour. From a more general point of view, software development is always restricted by a paradigm, which very soon becomes a dogma. The material about new alternative approaches is really useful.

D Diggerman 04/07/05 10:12:59 PM EDT

Channing Benson's rave is based on the adjective-filled junk written in the paragraph "To be honest, most...". Yes, the Web is full of rubbish, of which this article is part. "multiple divergent programming languages and frameworks..." - what the frig is that! "Developers suffer because Web-based architectures divide an application along illogical boundaries which complicate the creation of a seamless user experience..". Who's suffering, the user or the developer? "Seamless"?

Please be a good cyber citizen and refrain from clogging up disk storage with rubbish.

D Diggerman

wwwWWW 04/05/05 12:36:03 PM EDT

|| Criticizing something as wildly successful as the World Wide Web seems a bit radical and potentially unpopular. ||

Understatement of the century!

Scott 03/30/05 12:15:26 PM EST

> I hope you at least grant that the average user of Web
> applications won't ever rave about the Web user interface
> the way someone might that of a conventionally programmed
> application like iTunes or Photoshop.

The author is assuming that the usability and state persistence issues of Web applications will not be fixed and improved upon. The growing popularity of Ajax and standardized portlets is proving this to be a false assumption.
