XAJAX Perfect Choice for Cloud Computing Environments

XAJAX Great for Building Scalable Web Applications

An interesting thing happens when you combine toolkits like XAJAX and SAJAX with the ability to perform content-based routing: you can actually achieve function-level load balancing in both cloud-based and traditional architectures.

As you might have discovered from previous posts mentioning it, I still do web application development to support hobby interests in my (very little) spare time. I'm currently in love with the XAJAX library, which has made development of what is supposed to be a very interactive application nearly effortless.

I'm also very much enamored of load balancing/application delivery and cloud computing, specifically how to get the most out of the latter using the former. XAJAX is a perfect example of how the choice of development environment can impact – positively – the ability of an intelligent intermediary to drill down into the application workload on a functional basis and more efficiently distribute requests to get the most out of an architecture.

Before someone argues that SAJAX is a better choice, I'll include it as well as a "this is a great option, too" for cloud computing environments. XAJAX is strictly for PHP (which is fine for me but not for everyone), while SAJAX supports a broader range of languages and data formats – XAJAX supports only XML, which for purposes of integrating with my BIG-IP via iControl is just fine but, again, may not suit everyone's needs. Both XAJAX and SAJAX pass, as part of the data exchange, the name of the server-side function being invoked, which is really what's necessary to accomplish load balancing at the functional layer and what makes both very well-suited to building highly scalable web applications for cloud computing deployment.
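
To make that concrete, here's a minimal Python sketch of how an intermediary might pull the invoked function name out of either toolkit's POST body. The xjxfun and rs parameter names are the ones the toolkits actually use (more on that below); the argument parameters (xjxargs[] and rsargs[]) are assumptions based on the versions I've worked with.

```python
from urllib.parse import parse_qs

def invoked_function(body):
    """Extract the server-side function name from an XAJAX or SAJAX POST body."""
    params = parse_qs(body)
    # XAJAX names the invoked function in xjxfun; SAJAX uses rs.
    for key in ("xjxfun", "rs"):
        if key in params:
            return params[key][0]
    return None

# Representative bodies (the argument parameter names are assumptions):
print(invoked_function("xjxfun=getThing&xjxargs[]=42"))  # -> getThing
print(invoked_function("rs=getThing&rsargs[]=42"))       # -> getThing
```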


LOAD BALANCING AT THE FUNCTIONAL LAYER

One of the things we rarely talk about is that what we're really distributing (and thus load balancing) in a cloud computing environment is complete applications. The ability to distribute individual workloads is not part and parcel of cloud computing environments today, even though it is closely associated with the purist definition of cloud computing. This means that if one particular function in an application is more compute-intensive than another, we simply scale the entire application, regardless of whether scaling the individual function would be a more efficient use of resources. In many cases this is because we don't have the means to distribute workload across multiple instances in any environment, cloud or traditional. We simply don't have the information necessary to accomplish it.

SOA, specifically when utilizing SOAP, affords us that level of granularity by including in the SOAP envelope the specific operation (function) being invoked. But SOA and SOAP are rarely used in Web 2.0 applications, with developers, for whatever reason, preferring AJAX-based techniques and POX (Plain Old XML) or JSON over the more complex SOAP-encoded data.
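
For comparison, here's a minimal, purely illustrative SOAP request (the namespace and element names are made up); the child element of the Body is what identifies the operation being invoked:

```xml
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <!-- The child element of Body names the operation being invoked -->
    <getThing xmlns="http://example.com/things">
      <thingId>42</thingId>
    </getThing>
  </soap:Body>
</soap:Envelope>
```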

XAJAX and SAJAX (and I'm sure other AJAX-based libraries as well) afford us the same granularity and opportunity to scale applications at the functional layer. The same abstraction of server-side function invocation that makes it so blessedly easy for developers to build web applications also makes it blessedly easy for a network-side-scripting-capable application delivery controller (load balancer) to scale a web application based on individual functions. This also, by the way, enables the ability to distribute part of the application in the local data center and another part "out in the cloud" with equal alacrity.


WHY XAJAX IS PERFECTLY SUITED TO CLOUD COMPUTING

[Screenshot: Firebug capture of an XAJAX request, showing the xjxfun parameter in the POST body]

If you take a look at the screen shot you'll see a Firebug capture of the interaction between my browser and my web application. You'll note that the request uses the HTTP POST method, and that there are three variables sent by XAJAX, one of which is xjxfun – the XAJAX function on the server that I am invoking. In this case it's the amazingly self-explanatory "getThing" function. In SAJAX this parameter is called rs, but it carries the same level of information: the function name and its parameters.

What's important here is not what the function does but how it's called from the client: by name, and clearly tagged with a descriptive key, namely xjxfun.

Assuming that when I profiled my web application (you are profiling your web applications, right?) I discovered (gasp!) that the getThing function requires a disproportionate amount of resources, I might decide to scale my application by distributing the getThing function to its own instance of the application rather than leaving it lumped in with all the other functions where, as utilization climbs, it will negatively impact the performance of the entire application.

A network-side-scripting-capable intermediary can easily handle this by (1) inspecting the request, (2) extracting the value of xjxfun from the POST body, and (3) routing the request to the appropriate instance(s) of the application, the ones designated specifically to handle this more compute-intensive function.

Interestingly enough, because the arguments are also passed in the clear in both XAJAX and SAJAX, you could route requests based on the values of the parameters. This requires some pretty in-depth knowledge of the impact of certain queries and parameters on the execution of your application: you have to know ahead of time that a parameter of X necessarily results in a huge result set from a query, takes more time to execute, and uses more memory than the same function invoked with parameter Y. Imagine Twitter compiled an hourly list of users with the most followers and the most active update frequency. It could push that list to the intermediary, which could use it as part of the conditions for routing requests: if the parameter passed to a specific function were on that list, the intermediary would know the request was going to take longer to fulfill or require more resources, and route it accordingly. Infrastructure 2.0 capable intermediaries are more than capable of combining application-specific data with environmental and operational status to determine where best to route a request. It's kind of what they're designed to do.
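
Here's a sketch of that decision logic, written in Python as a generic stand-in for whatever network-side scripting language your intermediary actually speaks. The pool addresses, the heavy-function mapping, and the hot-parameter list are all hypothetical:

```python
from urllib.parse import parse_qs

# Hypothetical pools: functions profiling flagged as compute-intensive get
# dedicated instances; everything else lands in the default pool.
POOLS = {
    "getThing": ["10.0.1.10:80", "10.0.1.11:80"],
    "default":  ["10.0.0.10:80", "10.0.0.11:80"],
}
HEAVY_POOL = ["10.0.2.10:80"]                # extra capacity for known-expensive requests
HOT_PARAMETERS = {"user_with_2M_followers"}  # e.g. an hourly pushed "hot" list

def route(post_body):
    params = parse_qs(post_body)
    function = params.get("xjxfun", [""])[0]      # (2) extract xjxfun from the body
    args = params.get("xjxargs[]", [])            # argument parameter name is an assumption
    if any(arg in HOT_PARAMETERS for arg in args):
        return HEAVY_POOL                         # parameter-based routing
    return POOLS.get(function, POOLS["default"])  # (3) function-based routing

print(route("xjxfun=getThing&xjxargs[]=42"))      # -> getThing's dedicated pool
```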

Now, if this application is deployed in the local data center you could engage in some cloudbalancing-like behavior and direct requests for that function to a cloud-deployed instance instead. This would be particularly useful when the function is storage-related and you want to take advantage of cloud storage as a means to reduce the overhead of building out a similar infrastructure locally. This works because when a load balancer/application delivery controller is introduced into the architecture, it mediates every request. It has to broker between the client and the application because it makes the determination where to route each request, to which instance of the application, local or cloud-based, and so on. Thus it sees every request, can inspect every request, and can act on every request. Given that XAJAX and SAJAX so nicely present function-level information to us, we can easily use that information to decide where to route a given request.
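
Extending the earlier sketch, bursting the storage-related functions out to the cloud is just one more entry in the routing table (the function names and the cloud endpoint are, again, hypothetical):

```python
from urllib.parse import parse_qs

# Hypothetical: storage-related functions are served from a cloud-deployed
# instance; everything else stays in the local data center.
CLOUD_FUNCTIONS = {"storeThing", "fetchArchive"}
CLOUD_POOL = ["storage.example-cloud.net:80"]
LOCAL_POOL = ["10.0.0.10:80", "10.0.0.11:80"]

def route_with_cloud(post_body):
    function = parse_qs(post_body).get("xjxfun", [""])[0]
    return CLOUD_POOL if function in CLOUD_FUNCTIONS else LOCAL_POOL

print(route_with_cloud("xjxfun=storeThing&xjxargs[]=42"))  # -> the cloud pool
```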


FINER CONTROL, GREATER FLEXIBILITY

That offers some extraordinary opportunities to architect innovative solutions to traditional scalability and performance issues. It gives us the ability to actually migrate individual “workloads” between clouds and local data centers. We can better leverage existing resources in the data center based on an understanding of specific compute resource needs of individual functions. We can replace, augment, or secure individual functions on an on-demand basis, in real-time, near-time, anytime.

It may be, in the long term, that one of the effects of cloud computing will be that choices made by developers impact decisions made regarding infrastructure, but the opposite is also true. It's quite possible that the ease with which XAJAX and SAJAX can be leveraged by intelligent intermediaries, because of the way they naturally communicate with the server, may give them an advantage in a debate against another library that does not offer similar ease of intermediary integration.

Regardless, if you're using XAJAX or SAJAX, or trying to decide which AJAX library to use, or even building your own, keep in mind the ability to integrate with intermediaries, and how that integration (or lack thereof) will impact your overall application performance and scalability strategy in the long run.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
