
What's Wrong with Web Applications

And What to Do About It

Criticizing something as wildly successful as the World Wide Web seems a bit radical and potentially unpopular. There is no doubt that Tim Berners-Lee's elegantly simple invention enabled an unprecedented revolution in the way computers are used and by whom.

The amount of information currently accessible to anyone with an Internet connection is truly mind-boggling, and I do not think it is an exaggeration to say that the Web is the most revolutionary invention since the automobile in terms of the effect that it has had on how we live our lives.

So what brings me to gore this sacred cow? How can I be negative about something that everyone acknowledges as a technological marvel?

To be honest, most Web applications are distinctly user-unfriendly. The Web's undisputed power to access and present information has been mistaken for a universal entry point to computing and information resources. The view that every interaction between user and computer is a nail to be banged in by the hammer of the Web browser and server has been a step backward for both users and developers. Users have to live with horribly restrictive modal form-based solutions stripped of powerful UI paradigms common in conventionally delivered software. For instance, drag and drop and the ability to save application state in a locally stored document are not available. Developers suffer because Web-based architectures divide an application along illogical boundaries, which complicate the creation of a seamless user experience, and require the use of multiple divergent programming languages and frameworks.

These problems are primarily the result of taking a great idea and stretching it beyond the scope for which it was intended. The Web was initially about access to static content. It really wasn't much different from existing hypertext systems, except that the content was transparently distributed on the Internet - a small difference in implementation, a huge difference in results. Suddenly, authors could easily incorporate into their own works other material from a vast virtual library.

However, the rapid adoption of the Web and the lure of its commercial potential made this technology newcomer the apparent solution to all the problems in the world. It didn't happen immediately, but it wasn't long before the Web browser became not just a viewer but a generic application deployment platform. Slowly but surely, dedicated client programs in traditional three-tier architectures were replaced by interfaces implemented as Web pages. There are some valid reasons for making this transition, but in the rush to migrate seemingly every application to a browser-based model, decision-makers threw the baby out with the bathwater. The fact of the matter is that HTML makes a very poor user interface layer. Even when tarted up - as it must be to be even minimally functional - with JavaScript or other client-side scripting solutions, browser-based interfaces fall far short of creating a rewarding experience for the user.

In addition to shortchanging the end user, webifying an application puts a heavy load on the developer, who suddenly must contend with a multitude of thorny issues in the areas of portability, state-management, multiple implementation languages, client-side vs. server-side operations, and security. This article examines these problems for users and developers in detail, with examples of the pros and (mostly) cons of Web-based applications. I'll conclude with suggestions for alternatives that include the positive aspects of Web applications, but without the downsides.

I am not arguing that all Web-based applications are clumsy and evil and should be eliminated. Particularly when you wish to make your application available to the entire Internet world and there are no existing standards or protocols for that application, choosing to deliver on the Web may be the only choice. I am primarily concerned with the webification of applications provided by corporate IT departments, such as those for submitting travel expense reports, managing benefit packages, or reporting defects. First though, let's examine an application that falls outside that category.

Before the World Wide Web, there was Usenet, an Internet-based system for discussion groups. Information was exchanged between server systems via the Network News Transport Protocol (NNTP). The beauty of Usenet was that it did not specify a particular client interface. Users could choose among a variety of "news readers" according to their particular preferences. Depending on the client program, posts could be made with the editor of the user's choice. Emacs users could even read news and post responses from within the editor. It was the best of both worlds: a standard that allowed the open exchange of information with people all over the world, but that also allowed users the freedom to interact with that system in the manner they found most productive.

Usenet has not gone away; there are now even Web-based news readers such as the one provided by Google. But now, forums that once would have been well served by Usenet have been moved to discussion systems that are accessible only via a particular Web site. If you want to participate, you have to use the interface provided by that Web site. You will have to create a new user account if you want to contribute to discussions and you will have to edit your posts with a somewhat-less-than-full-featured standard browser text box. So much for progress.

There are political reasons service providers might prefer the closed Web-based solution. It allows them to identify their users and to control and monitor access to the site. But these advantages to the provider of the service come at the expense of a poorer experience for the users of the service. Similar trade-offs can be seen in the evolution of travel expense report applications. At one company, paper travel expense forms were replaced by a standardized spreadsheet that the user would complete, print, sign, and submit.

It was a simple solution that gave the user power and flexibility. She had the full interface of a standard spreadsheet program at her disposal, the ability to save partially completed forms as well as save local copies of previously submitted forms. Spreadsheet fanatics, if so inclined, could graph their daily expenses, while the tabular presentation of data made it easy for mortals to notice if they forgot to include the charge for Wednesday's lunch. This document-centric scheme has since been replaced by a Web-based application that automates the logistical aspects of expense reporting. Report submission and approval are now done electronically via a Web-based application that routes reports through the designated approval chain. From a bureaucratic perspective, things are much neater. Managers no longer have to shuffle paper, bean counters get a centralized view of the process at all stages, and even the lowly employee submitting the report gets an easy way to track it through to reimbursement.

Unfortunately, the initial creation of the report is cumbersome, forms-based tedium, where each expense has to be entered one at a time. And after the expenses are entered one by one in isolation, instead of a nice calendar-based display, the user sees only a linear list of expenses to be included in the report. The interface itself provides practically no support to the user for organizing expenses, which again leads us to this irony of supposed progress. Yes, the Web-based application automates and centralizes the process. But again, it does so at the expense of crippling the end-user with an unhelpful, inflexible, and arcane user interface.

Why does this seem to be the inevitable result of webifying an application? Because the Web is a horribly inadequate user-interface platform. Crafting a full-featured Web application that functions correctly is an exercise in cunning and frustration for the developer, and the end result typically tests the user's patience. Let's look at the restrictions that make it so.

There are two primary obstacles. The first one is that the Web is based on a stateless protocol. The predominant model of application design for the last two decades has centered on a document (i.e., a collection of state information) that the application operates on. In a conventionally programmed application, the full strength of the particular programming language employed can be brought to bear on the problem of representing the data being operated on. In Web-based applications, the programmer has a choice of several less-than-optimal techniques for maintaining state: fields (some hidden) in forms stored on the Web page itself, or server-side state maintained by various tricks for each user session or request. Each has particular drawbacks.

Values stored in forms-based storage might not be propagated between pages and can be viewed and possibly altered by clever users, hence posing security risks. Server-side storage may not scale well, or the techniques used to correlate a particular user with a particular set of program state, like cookies, may be disabled by the user's browser. Further, accessing the state information is not entirely transparent to the programmer and available storage resources on the client computer will probably go unused because the computations requiring the state occur on the server system. Finally, depending on the cleverness of the Web site developer, the document metaphor familiar to the user goes entirely out the window. For instance, there is typically no way to save multiple shopping carts for the same online vendor, the way a conventional word processor could save two versions of my annual letter to Santa.
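To make the hidden-field drawback concrete, here is a minimal sketch (the form action and field names are hypothetical) of application state carried in the page itself. Everything in it is visible to, and editable by, the user before submission:

```html
<!-- State for a multi-page checkout carried in hidden form fields.
     Any value here can be viewed and altered by the user with trivial effort. -->
<form action="/checkout/step2" method="post">
  <input type="hidden" name="cartId" value="A1B2C3">
  <!-- Trusting the client with a price is a classic security mistake. -->
  <input type="hidden" name="unitPrice" value="19.95">
  <input type="text" name="quantity" value="1">
  <input type="submit" value="Continue">
</form>
```

This is why a careful server must re-validate every hidden value on each request, duplicating work that a conventional application's in-memory document model would get for free.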

The second hurdle is the fact that HTML and HTTP are technologies oriented entirely around the concept of the page. Rather than directly manipulate the user-interface components displayed to the user, the Web server has to render a page based on the last user request. Hence, even the simplest operation - like scrolling to the next page of a list of items - entails a round-trip to the server. The URL page request paradigm just does not map well to an interactive application. This is perhaps an oversimplification, but imagine telling an application developer of the late 80s or early 90s that any possible state or screen of the application had to be addressable by a string (i.e., URL).

Now it is true that a vast amount of work has been done to provide frameworks and utilities that abstract that task away from the direct responsibility of the programmer. For instance, WebLogic Workshop's Struts-based Java Page Flow architecture simplifies and reduces the work required to implement the series of pages a user traverses to achieve a particular task. It's way easier than implementing the form and state management logic anew for every single application. However, toolkits and frameworks, while simplifying life for programmers, don't make the inherent problems go away. They still show up when a browser bookmark is added in the middle of a multi-page, state-dependent sequence, when a submit button is accidentally pressed twice, or when the user refreshes a page that resulted from a POST rather than a GET. I am certain any experienced Web developer can add to the list. Think of how much more productivity could be on tap if framework developers didn't need to compensate for such a crippled application model.

A further consequence of using the Web as an application platform is the ever-present browser/server dichotomy. Can a particular functionality (such as form validation) be provided by the browser or must the logic reside on the server? Again, lots of programming effort has been expended to attempt to add user interface functionality to the browser. Most contemporary Web applications would be nonstarters without the availability of client-side scripting tools like JavaScript that were not even part of the original conception of the World Wide Web. Just as with server-side page generation frameworks, there is a wide range of solutions available for this problem: ActiveX components, VBScript, Java applets, Flash, and DHTML.

While undoubtedly powerful, these tools complicate matters for the developer. To begin with, they entail possible compatibility issues. If you decide to use VBScript, you've suddenly excluded any user not running Internet Explorer. Java applets are great, giving you a "real" language in which to program your user interface, but what happens when the user's browser doesn't run a compatible JVM or she doesn't have the proper Java plug-in installed?

Beyond compatibility issues, these client-side technologies complicate development project roles. While graphic designers might have responsibility for the purely HTML aspects of a Web page, we can't count on them to have the programming chops to embed the JavaScript required to populate a SELECT item with the desired options. Truth be told, we can't expect even trained programmers to always understand the incantations necessary to achieve the desired effect via client-side scripting, particularly when the project requires the interface to be functional whether or not JavaScript is enabled in the user's browser. Judging from the number of forums and "tip sheets" dedicated to the intricate tricks of getting even the simplest UI functionality to operate in the Web environment, it is a rare programmer indeed who is fully competent in all the tricks of the trade. JSPs and tag libraries were supposed to help separate the roles of graphic designer and programmer, but the necessity of resorting to some form of client-side wizardry to create even the most rudimentary interface defeats that separation.
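The SELECT-population chore mentioned above is a good example of the glue code a page ends up needing. Here is a minimal sketch (function and element names are hypothetical); note that a real page must also handle the no-JavaScript case, typically by rendering the options server-side:

```javascript
// Pure helper: turn an array of {value, label} pairs into <option> markup.
function buildOptions(items) {
  return items.map(function (item) {
    return '<option value="' + item.value + '">' + item.label + '</option>';
  }).join('');
}

// Browser-only wiring: inject the generated markup into an existing SELECT.
// (Only meaningful inside a page; shown here to illustrate the pattern.)
function populateSelect(selectId, items) {
  document.getElementById(selectId).innerHTML = buildOptions(items);
}
```

Even this trivial snippet illustrates the division of labor problem: the markup generation is programming work, but it lives inside a page that is nominally the designer's responsibility.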

Even plug-in technologies that provide a full-featured interface, such as Flash and Java applets, suffer from the potential browser compatibility problems mentioned above, and applications built on them operate in isolation from other, similar Web applications. For instance, there's no dropping an object from a Java applet onto a Flash application and having something meaningful happen. Such functionality might be possible with ActiveX plug-ins, but those are proprietary solutions that work only on Windows platforms, and configuring them to allow such interoperability could quite easily create security issues for the client computer.

All right. Enough complaining already. Even if you don't necessarily agree with all of my arguments, I hope you at least grant that the average user of Web applications won't ever rave about the Web user interface the way someone might that of a conventionally programmed application like iTunes or Photoshop. Does it have to be that way? Is there not a path that can give us the connectedness of the Web along with a powerful, intuitive user interface?

We want technology that enables the creation of client programs with full-featured user interfaces that can access server resources over the Internet. This is nothing new; client/server architectures were the hot technology prior to the advent of the Web, although client/server applications were primarily deployed on intranets rather than the Internet. So how might we incorporate Web concepts to advance the technology? Some of the benefits of Web applications that would be desirable to retain are:

  1. Zero administration. Web users don't have to install anything to run their applications. Updates are transparent.
  2. Flexibly networked. For instance, an application for planning travel should be able to access flights, hotels, and car rentals from multiple vendors.
  3. The ability to render and display HTML. While I have complained vociferously about the restrictions of HTML and HTTP, they are both firmly entrenched and there are application areas, such as online help, that would benefit from a modularized way of displaying HTML content.

There are no technical reasons that prevent the development of solutions that address these requirements. In fact, technologies exist today that address these needs; they're just underutilized in this context. I will discuss two possible approaches: first, thick clients written in Java and delivered over the network, and second, AJAX (Asynchronous JavaScript And XML), a relatively new technique that moves processing to the client side (though still within a browser) and obviates the need to refresh an entire Web page for each transfer of data from the server. For the dedicated-client path, requirement number 1 is fulfilled by Sun's Java Web Start and its underlying Java Network Launch Protocol (JNLP). For some reason, this simple and effective solution has never really caught on. It's been available for over four years and is now a standard part of the Java runtime.

You can read all about it at the Java Web Start home page (http://java.sun.com/products/javawebstart/index.jsp), but in a nutshell: you package your application into one or more JAR files, write a JNLP descriptor file and an HTML file with a link to that descriptor, and when the user of a properly configured browser (one that knows the JNLP MIME type) clicks on that link, Java Web Start fires up and invokes the application. The JAR files are downloaded and executed on the local client machine. Henceforth, running the application does not require re-downloading the JAR files unless the application has been updated and the user requests the new version.
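For readers who haven't seen one, a JNLP descriptor might look something like the following sketch. The element names follow the JNLP schema; the codebase, JAR name, and main class are placeholders for your own application:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical JNLP descriptor; codebase, jar, and main-class are placeholders. -->
<jnlp spec="1.0+" codebase="http://www.example.com/app" href="expenses.jnlp">
  <information>
    <title>Expense Reporter</title>
    <vendor>Example Corp.</vendor>
    <!-- Permit launching from the local cache without a network connection. -->
    <offline-allowed/>
  </information>
  <security>
    <!-- Omit this element to stay in the restricted sandbox;
         all-permissions requires the JARs to be signed. -->
    <all-permissions/>
  </security>
  <resources>
    <j2se version="1.4+"/>
    <jar href="expenses.jar"/>
  </resources>
  <application-desc main-class="com.example.expenses.Main"/>
</jnlp>
```

The Web server serves this file with the JNLP MIME type (application/x-java-jnlp-file), and the HTML side of the deployment is nothing more than an ordinary link to it.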

By default, Java Web Start applications run in a secure environment similar to the Java applet sandbox, but there are JNLP APIs for use by applications that need to access the local clipboard or disk. Applications that are implemented by signed JAR files can run in an unrestricted environment (once the user approves).

It is a mystery to me why Java Web Start has not gained more traction with the developer community. Java is a powerful, approachable language for creating full-featured, highly portable applications, and Java Web Start is an easy-to-use tool for deploying them. So, no excuses for not having Requirement 1 fulfilled.

What about Requirement 2? You'd have to be dead or deaf not to be aware of all the work that's been done over the past few years in the area popularly called "Web services" or "service-oriented architectures." The XML-based technologies of UDDI, WSDL, and SOAP have made possible a sort of uber-RPC for network clients, where RPC servers with various functionalities can be discovered and those functionalities accessed. While in my opinion the reality in this arena has not matched the hype, there is no doubt that the potential is there. Google has WSDL- and SOAP-based APIs available as a beta (so it is somewhat disappointing that its desktop search tool is based on HTML pages rather than a thick client).

The big obstacle here is the standardization of APIs for given functionalities. UDDI and WSDL describe functionality and how to access it, but if two different travel vendors (for instance) use different models for their Web services, it is a lot of work for the client developer to account for the differences in each and every provider they want to access. These issues are being addressed, but how quickly industries will define and adopt standard interfaces is unknown.

So, it would seem that Requirement 2 is partially fulfilled with more promise for the future. Incidentally, for an interesting report on combining Java Web Start and Web service technologies, see Allan Poda's devx.com paper entitled "Leverage JNLP and SOAP for Java Thick-client Development" (www.devx.com/Java/Article/22537), where he describes his experiments migrating a previous Web-based interface to a Java thick-client interface.

Requirement 3 (for some sort of HTML rendering capability within the thick client) is, as they say, "a simple matter of software." There are commercial products available, such as ICEBrowser and ICEReader from ICESOFT. There are also a number of freely available implementations, including the bare-bones Swing HTMLEditorKit built into the Java runtime. The capabilities of these components obviously vary. You can find a good rundown of several freely licensable implementations in the article "Java Sketchbook: The HTML Renderer Shootout, Part 1" at http://today.java.net/pub/a/today/2004/05/24/html-pt1.html.

We haven't touched on requirements for security. Obviously this is vital, particularly for apps that are to be delivered over the greater Internet. Component code-signing techniques provide some measure of safety for downloading and running code, but users need to be educated to make good decisions when faced with the dialog that asks "Do you want to install and run: Bizmumble Signed and distributed by: So and So corporation?"

A full discussion of the security requirements and infrastructure for Web services would take an entire book (or two) of its own. The architects of Web services technologies understand the importance of security, so it has not been neglected in the design and implementation of the technologies that make up Web services. Understanding the implications and necessities of a truly secure Internet application might be daunting, but it is not impossible.

If you want to stick with a pure browser-based solution, AJAX is a clever conglomeration of existing technologies that, while not perfect, smoothes over many of the UI bumps. While the underlying techniques have been around for several years, they are getting new attention due to their use in several Google Web applications, including Gmail, Google Maps, and Google Suggest. A complete description is beyond the scope of this article, but essentially AJAX is sophisticated JavaScript programming that uses XMLHttpRequest objects to make asynchronous server requests and update user interface components based on the results of those queries. It eliminates the need to refresh the entire page simply to update one form element. For more info, try Googling "ajax Web interface" or see Jesse James Garrett's excellent overview at http://adaptivepath.com/publications/essays/archives/000385.php.
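The asynchronous-update pattern described above can be sketched as follows. The URL, parameters, and element id are hypothetical, and the ActiveX fallback reflects the browsers of the day, where Internet Explorer exposed XMLHttpRequest only through an ActiveX object:

```javascript
// Pure helper: encode a parameter object as a URL query string.
function toQueryString(params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  return parts.join('&');
}

// Browser-only wiring: fetch data asynchronously and update one element
// in place, with no full-page refresh.
function fetchAndUpdate(url, params, elementId) {
  var xhr = window.XMLHttpRequest
    ? new XMLHttpRequest()
    : new ActiveXObject('Microsoft.XMLHTTP'); // pre-IE7 fallback
  xhr.onreadystatechange = function () {
    // readyState 4 = request complete; status 200 = success.
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(elementId).innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', url + '?' + toQueryString(params), true); // true = asynchronous
  xhr.send(null);
}
```

A page might call, say, fetchAndUpdate('/expenses/summary', {month: 'march'}, 'summaryPane') in response to a click, replacing just that pane rather than round-tripping the whole page.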

So, from the examples presented, it is clear that the technologies required to replace a clunky HTML-based Web application with an application having a more full-featured user interface exist today. There may be other options in addition to the solution technology examples I provided. I believe that the goal of Microsoft's .NET is to realize a sort of "programmable Web" that would include thick client applications.

I grant that not all of the pieces that enable networked application interfaces superior to those of browser-based applications are fully formed and developed. But the groundwork is being laid. I hope that, unlike Java Web Start with its relatively weak adoption, the story for Web services and thick clients will be a different one. Users deserve modern user interfaces for the programs they rely on, not forms-based tedium more appropriate to the days of block terminals. Developers and service providers need to start thinking outside the browser box and consider the benefits of thick network clients that use Web services, or at least think about enriching the user experience via innovative technologies like AJAX. They could get a much-needed boost in that direction if suppliers of the supporting infrastructure, like BEA, would move to support these technologies in their development tools and product paradigms. In this way, everyone wins: users get more usable applications, developers get the credit, and tool vendors create pull for new versions of their products by including innovative new features.

More Stories By Channing Benson

Channing Benson is a senior technical consultant at Hewlett-Packard. He works with HP's software partners in the areas of Java performance, J2EE architectures, and Itanium migration. A 20-year industry veteran, his previous areas of expertise include Lisp and X11/Motif. His recreational interests are music, snowboarding, backcountry skiing, and Ultimate Frisbee.
