The Cloud Effect – Enterprises Adopting Internet Strategies


1 – The Changing Nature Of Enterprise IT

1.1 – Tear Down This Wall

The last time a major wall came down, it was Berlin in 1989; the wall was known simply as the ‘Berlin Wall’; and it had divided a country, its resident families, and friends for 28 years. Just prior to the destruction of the wall there was euphoria all around, a whiff of freedom in the air, and a tear in just about everyone’s eye. The impact of the wall’s takedown was massive, affecting some 75 million people living all over the country. The result of this historic moment was unprecedented, as Germany rose from the ashes to become a major world power.

It’s been nearly 22 years since that monumental event, and now it’s time for another. This time, however, the wall is the ‘enterprise firewall’, and it has separated enterprise computing systems from the outside world of the web for far too long. This time too, in anticipation of the wall’s teardown, the atmosphere is electric, the possibilities seem endless, and the potential feels completely untapped. The impact of this wall’s takedown will be enormous, affecting billions of people living all over the world. As for the result of this upcoming phenomenon – brace yourselves, because we are about to find out!

1.2 – A Giant Leap For Mankind

Lately, the blogosphere has been hijacked by an era of ‘Acronym Anarchy’, and with good reason. SOA/WOA/ROA… SAAS/PAAS/IAAS… all point to but one thing – the simple fact that two great worlds, the enterprise and the internet, are about to collide in what shall be the biggest bang yet.

For years, the common enterprise architect had no choice but to direct his troops to live within the confines of the enterprise firewall, to build isolated information silos, and then to somehow connect them all together; while at the same time keeping costs at a minimum, performance at a maximum, and achieving ROIs set by CIOs at scarcely believable figures. Then arrived the age of architecture nomenclature, which introduced the enterprise systems to a whole lot of design patterns, which led to a whole lot of expenditure, without a whole lot of return. Shockingly, however, this virus of architecture overexposure did not infect the thousands of web based startups springing up every year, and they seemed to do just fine without it, as rags to riches became a common Silicon Valley story. Clearly, the enterprises had a lot to learn, and now it seems the class is finally in session.

With the advent of web 2.0, a number of highly promising new concepts, ideologies and technologies have emerged which shall forever reshape the landscape of enterprise IT, while at the same time creating a seamless world of integration between two widely different computing platforms. For years it was believed that the dynamic, light-weight, volatile web patterns could not possibly be applied to the rigid, heavy, rule-based enterprise systems. It was believed that the flaky web constructs could never perform the workhorse role of enterprise applications. It was believed that the implementation, sustenance and distribution mechanisms of the web had no place in an enterprise architecture governed by the iron fist of its ivory tower bound architect. However, the recent success of web based SAAS/PAAS/IAAS in the world of the enterprise tells a very different story. The wise folks who overcame their fear, let down their barriers, and let web based services in through their enterprise firewall were instantly rewarded with cost savings, state of the art functionality and frequently updated, specialized software. This may have been a small step for a high-rise CIO, but it was most definitely a giant leap for mankind.

1.3 – On The Origin Of Species

The invasion of the enterprise world by the ever conquering web warlords has, however, set in motion a new chain of events, one whose history dates back to the origins of life, and one whose reach far surpasses human imagination. This enigmatic natural phenomenon was first articulated by Charles Darwin, who proceeded to enlighten the rest of mankind about its existence in his publication ‘On the Origin of Species’, which established the theory of evolutionary adaptation through natural selection. The two distinguished computing worlds fighting for existence in the same space shall see certain technologies taking a rudimentary form, certain ideologies eliminated through the survival of the fittest, and certain concepts diminished through disuse. However, the most exciting prospect of this invasion is the gradual change of various existing systems to adapt to their modified environments, ultimately accumulating over time to form a new species. It is through this process that we shall soon find ourselves talking about the Enternet, a new breed of IT spawned through the successful evolution of the enterprise and the web through natural selection, striding as one combined unit out to dominate all of machine kind.

This progression, however, requires adaptation. This junction is thus an important one, as it splits into two roads: one leading to evolution, the other to extinction. The enterprise systems would be forced to shed weight, to become more flexible, and to embrace unified open standards. This journey is dismissed as impossible by many, especially given the enormous investments made by CIOs in existing enterprise architectures, primarily SOA – which has gripped the attention of enterprise architects worldwide with its numerous long-term promises. However, what shall make this job a lot easier is the fact that SOA must be viewed not as a hindrance, but as a key enabler of migration to more advanced systems, as it is widely believed that the web itself is the world’s largest SOA in existence.

Keeping the upcoming challenge in mind, the mindset of IT architects everywhere must take a new form when constructing a modern application, and their vision of the enterprise system must be updated with a dose of reality. The internet has grown over time into the IT world’s greatest ever success story, and has done so by accumulating a set of personality traits which have helped it develop into the beast it is today. Many of the prominent internet traits can in fact also be leveraged in the enterprise world to create a new breed of IT constructions capable of surviving in their new habitat. The enterprises have been learning from their past experiences and have been developing increasingly better systems; however, as the following comparisons of their various aspects with those of the web shall show, the future lies in adopting many of the successful internet attributes, to some degree, within the enterprise systems, in order to help them keep up with the technological advancements in the ever growing IT sector.

2 – Software Development In The New Era – Comparisons And Conclusions

2.1 – Ready, Steady, Fight!

2.1.1 – Integration vs Consumption

Design Consideration:


The SOA philosophy highly promotes the concept of service reuse. However, the scope of this concept is limited to reuse by other systems within the enterprise IT ecosystem. Due to this limited scope, the architectures usually employ a highly complex set of specifications, which in turn enforce numerous pre-conditions that must be met by all interfacing systems in order to use a service. Over time this leads to multiple layers of unnecessary abstraction, which ends up restricting the very integration it was originally meant to offer.


The web overcomes this obstacle by favoring consumption over integration. By following highly open standards, using widely adopted protocols and providing easily accessible web APIs/services, the netizens are able to outdo their enterprise counterparts by following one of the computing world’s oldest rules – KISS. As a result, all web based services are easily available for consumption by a wide variety of audiences, all of which can be catered to simply by not forcing clients to employ a complex set of specifications. Moreover, a highly consumable service is easily absorbed by the masses, which may lead to further development through crowdsourcing. Crowdsourcing, now viewed as a viable business strategy by numerous web based companies, has the potential to significantly increase the worth of a service by means of community based value addition.

Technical Implementation:


Although SOA is a mere design philosophy which does not specify a set of technologies for its implementation, over the years enterprise communities have standardized its technical components and generated a list of unofficial best practices which are now considered to be the law of the land. Primary among these is the SOAP–WSDL–UDDI collaboration, which is meant to serve as a global standard for exposing application services. This standard, however, enforces constraints on all interacting systems by forcing them to adopt dozens of heavyweight WS-* specifications, which makes SOAP services harder to consume and therefore less attractive to prospective clients.


The web overcomes this obstacle by favoring REST over SOAP as a means of exposing application services. REST offers various advantages over SOAP simply by being based on the most basic of internet protocols – HTTP – which leads to rapid adoption due to its ‘ease of use’ appeal, and eventually results in widespread consumption. Moreover, the contract for a restful service may be implicit, and thus it is easily handled by thin clients (primarily the browser, which has a harder time dealing with the explicitly defined WSDLs mandated by a SOAP based service), which further boosts its reputation as a widely applicable service exposure technique. The easy integration of REST with popular client side scripting languages such as Javascript serves only to enhance its charm. The success of REST in the World Wide Web becomes obvious when you consider that every single web page on the internet can in fact be viewed as a read-only REST service.
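To make REST’s implicit contract concrete, here is a minimal, hypothetical sketch of a restful dispatcher: resources are just URIs, and the HTTP verbs map directly onto operations. The `orders` store, the paths and the handler logic are all invented for illustration, not taken from any real API.

```python
# Minimal sketch of REST's implicit contract: a resource is a URI,
# and HTTP verbs map directly onto operations on it.
# All names here (orders, paths, payloads) are hypothetical.

orders = {"1001": {"item": "widget", "qty": 2}}

def handle(method, path):
    """Dispatch an HTTP-style request against the in-memory store."""
    _, resource, *rest = path.split("/")   # e.g. "/orders/1001"
    if resource != "orders":
        return 404, None
    if method == "GET" and rest:
        order = orders.get(rest[0])
        return (200, order) if order else (404, None)
    if method == "POST" and not rest:
        new_id = str(max(map(int, orders)) + 1)
        orders[new_id] = {"item": "gadget", "qty": 1}
        return 201, {"id": new_id}
    return 405, None

status, body = handle("GET", "/orders/1001")
print(status, body)   # 200 {'item': 'widget', 'qty': 2}
```

Note that the client needs no generated stubs or WSDL: knowing the URI scheme and the verbs is the entire contract.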

2.1.2 – Discoverable vs Searchable

Design Consideration:


SOA’s service reuse principle depends heavily upon service discovery as a way to provide visibility of an enterprise’s services and of the components that make up those services. As enterprise systems continue to grow more complex, with thousands of services being offered internally by numerous applications, architects have been hard at work trying to organize these services in a systematic manner in order to comply with the SOA norm of making them discoverable. The idea at play is to allow a service from one application to be easily located by other applications based on certain criteria, which would enable loose coupling by preventing these services from being hard-wired between applications, and in turn create a highly flexible system with maximum service reuse capabilities. However, the enterprise’s implementation of the service discovery principle has been extremely flawed, as it further complicates matters by introducing new registries of service offerings, which must be built from the ground up and have their APIs integrated with every application intending to use them.


The web overcomes this obstacle of providing service visibility by using one of its most successful features to date – internet search. Even with billions of web pages available over the internet, one can keep track of all of them by using one of the many high quality search engines offered by numerous vendors, which enable their users to find even the most obscure web resources using a variety of search criteria. Moreover, the scope of these searches may vary, with websites developing their own private search mechanisms and implementing site-maps to enable discovery of resources internal to the website. With every web page on the internet having a URI, the web follows a resource centric approach to locating requested items, and simply returns the search result in the form of the resource’s unique address. What further helps the web in its quest to organize its resources for discoverability is the fact that one web page may refer to another through a hyperlink, which helps create a tightly woven net where everything on the web may be accessed simply by traversing its endless trail of links.

Technical Implementation:


The primary mechanism for enabling service discovery in SOA is building a repository in the form of a UDDI registry. However, this imposes numerous challenges for architects in terms of expense, application integration and scope of use, leading to it offering very little (if anything at all) in the way of ROI. The UDDI must be constructed from scratch, which incurs development costs. Moreover, its search services must be integrated with all existing and future systems, which leads to compatibility issues. Finally, the fact that UDDI is a registry for only SOAP based services leaves its applicability highly limited, with its operations not extending to most legacy constructs.


The above issues seem prehistoric in the world of the web, which has successfully implemented an amazing resource discovery mechanism. The same concepts may be applied to exposed application services using REST web-services. This is because each REST service can be published as a URI, which makes it indexable by the web crawlers of numerous internet search companies, and thus searchable in the same way as a web page. Further, each REST service may be a referenced resource in another REST service, which leads to a deeply linked set of services, much like the web itself; a REST based network of service offerings is thus capable of being easily traversed. Moreover, the adoption of RDF, RDF Schema and OWL as W3C standards has once again fueled the movement towards creating a machine traversable semantic web. An architecture based on REST services is therefore capable of taking advantage of these technologies by associating machine readable meta-data with the URI of each of its published services, thereby facilitating highly advanced searches based on extremely specific criteria.
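The ‘deeply linked set of services’ idea can be sketched in a few lines: if every service is a URI and services reference one another, a crawler can discover the whole graph from a single entry point. The resource map below is a hypothetical illustration, not a real API.

```python
# Sketch of crawler-style discovery over hyperlinked REST resources.
# Each resource is keyed by its URI and may link to others, so the whole
# service graph can be reached from one entry point (hypothetical data).
from collections import deque

resources = {
    "/catalog":          {"links": ["/catalog/books", "/catalog/music"]},
    "/catalog/books":    {"links": ["/catalog/books/42"]},
    "/catalog/music":    {"links": []},
    "/catalog/books/42": {"links": []},
}

def crawl(start):
    """Breadth-first traversal, mimicking how a web crawler indexes URIs."""
    seen, queue = set(), deque([start])
    while queue:
        uri = queue.popleft()
        if uri in seen:
            continue
        seen.add(uri)
        queue.extend(resources.get(uri, {}).get("links", []))
    return sorted(seen)

print(crawl("/catalog"))
# ['/catalog', '/catalog/books', '/catalog/books/42', '/catalog/music']
```

This is exactly the property a UDDI registry tries to bolt on: here, the links between resources are themselves the registry.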

2.1.3 – Heavy-Weight vs Feather-Weight

Design Consideration:


The enterprises have a long history of following a strict set of standards, basing applications on complex specifications, and creating multiple layers of abstraction, all of which leads to the creation of giant heavy-weight systems. Over time this charges a very heavy consumption tax on service composition and invocation by all IT constructions within the enterprise system. The use of these heavy technologies adds weight to the enterprise’s messaging systems, making the cost of communication between applications extremely high, and resulting in huge expenses for the business in terms of performance, hardware utilization and network load.


The web overcomes this obstacle by favoring feather-weight technologies in all its constructs. This is made possible through offerings which incorporate only the most used features of related primetime technologies and chip away the unnecessary flab. Web developers have long since realized that not all features offered by a highly packed specification are utilized by all functionalities, of all applications, all of the time; they therefore apply a ‘pay as you go’ technique to building services, where the choice of implementing technology is based on only the required aspects of that particular service. This leads to more efficient messaging systems, thanks to the reduced fat of the messages passing between them. Moreover, these systems are easier to implement and integrate, which results in a higher ROI. The internet, being the world’s largest network, channeling the world’s highest amount of traffic, and creating the world’s most cost effective solutions, is clearly the frontrunner when it comes to implementing messaging systems.

Technical Implementation:


The primary messaging model adopted by enterprises in their implementation of SOA is the SOAP based web-service, which is associated with a full fledged stack of WS-* specifications that are seldom used, and hence does not justify being treated as a ‘one size fits all’ data transmission solution in every application. SOAP is defined to be transport independent; however, this mostly results in performance degradation, as it does not take advantage of certain HTTP aspects, such as restful usage of URLs and methods, nor of conveniences the web stack already provides, such as response caching and content negotiation. Furthermore, it mandates that messages be passed as XML, which bloats the message size and hence increases the cost of serializing/deserializing each message. Also, a SOAP envelope is attached to each transmitted XML document, so any system intending to produce or consume it needs a SOAP library. It is due to these issues that an enterprise messaging system ends up extremely obese.


The web is biased towards restful technologies, which enable it to maintain quick response times, primarily due to a highly efficient inter-application communication system. The web developers hold REST web-services in high regard as it allows them to implement JSON as a data container. This format employs a ‘size zero’ approach to structuring data, which leads to minimal overhead in terms of message size, and therefore results in optimal serialization/deserialization times. Moreover, JSON shines as a programming language-independent representation of typical programming language data structures, especially with a dynamic programming language where a reasonable in-memory representation of a JSON object can be obtained simply by calling a library function, which leads to swifter parsing due to reduced external data restructuring logic. Furthermore, REST allows its GET function calls to be cached, which greatly boosts its ability to provide rapid transmission of information. Also, the use of a cache with REST services reduces the load on the backend hardware, thereby improving application performance.
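A quick comparison makes the ‘size zero’ point tangible. Below, the same payload is rendered as JSON and as a hand-written, SOAP-style envelope (the envelope is abbreviated for illustration, not a spec-complete message), and the JSON is turned back into a native data structure with a single library call.

```python
import json

# The same payload as JSON and as a simplified SOAP-style XML envelope,
# written by hand here purely for a size comparison (illustrative only).
payload = {"orderId": 1001, "item": "widget", "qty": 2}

as_json = json.dumps(payload)
as_soap = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body><order><orderId>1001</orderId><item>widget</item>"
    "<qty>2</qty></order></soap:Body></soap:Envelope>"
)

print(len(as_json), len(as_soap))   # the JSON form is a fraction of the size

# One library call turns the wire format back into in-memory data;
# no envelope unwrapping, no schema, no extra tooling.
assert json.loads(as_json)["qty"] == 2
```

The gap widens further on real payloads, since every field in the XML form pays for an opening and a closing tag.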

2.1.4 – Scalability vs Agility

Design Consideration:


The functionalities offered by an enterprise system are delivered to the end user in the form of a standard GUI, which in turn calls a modular application service. The services presented by the GUI are enhanced in batches, dependent upon the corresponding backend system’s release cycles, which are usually spaced at around 3 month intervals. Due to these restrictions there is a considerable lag in translating business requirements into technical implementation, which may cause an organization to lose its competitive edge, and is therefore far from an optimal solution. Moreover, the enterprises have historically built applications in the form of giant grounded garrisons, heavily trading off the ‘time to market’ aspect of these IT constructions in return for their long term capabilities. It’s the underlying complexity of the technical components used to build these applications which causes their development to proceed at a snail’s pace.


The web overcomes this obstacle by relying on the independent delivery of the various services offered by its applications. Web developers attach prime importance to the loose coupling of services within an application, which enables them to upgrade one operation without affecting another. Moreover, the services may be exposed to the end user in the form of small portlets, where each portlet is a GUI for one or more functionalities provided by the backend systems. Following this approach, based loosely on the portal model, makes it possible for web establishments to enhance their service offerings without requiring a full application release. This form of service delivery also enables the building of mashups, where different portlet functionalities offered by different vendors may be aggregated into one combined portal, which may add value to the native services in expected or unexpected ways. Furthermore, web based applications are built using lightweight technologies, where aspects like ‘learning curve’, ‘development speed’ and ‘time to market’ take precedence over most other considerations. It is due to these factors that the web poises itself as the more agile of the two computing worlds.

Technical Implementation:


The enterprise has over the years developed a standard way of building frontend representations of its backend services using HTML, CSS and Javascript. These frontend systems are extremely static, providing very little in the way of client interaction, and thereby limiting their functionality to simply the display of data. Moreover, the use of such technologies causes the GUI to reload the entire page each time a new operation is requested, which leads to inefficient performance, as the unaffected data must also be re-requested from the underlying database, resulting in higher network/database/processor loads. A new operation invoked on these frontend systems results in a call to the services offered by their backend counterparts, which are primarily built using workhorse languages such as C/C++/Java. Applications built using these languages, although highly scalable, take extremely long to design, develop and deploy. Therefore most enterprise applications follow an iterative development model with quarterly release cycles, which leads to slow business growth. Moreover, these applications do not offer an alternative delivery model for each of their independent services, thereby mandating a full application release in order to provide upgraded functionality to the end user.


The web has seen extreme growth over the past few years in terms of ‘rich internet applications’. These applications are built in the form of widgets, and provide a rich customer experience in terms of client interaction, leading to higher productivity as the GUIs are capable of offering a vast array of functionalities. This is accomplished using modern web technologies such as Ajax – which requires no third party libraries – or various custom frameworks such as Flash, JavaFX and Silverlight – which require the installation of third party runtimes prior to their use. Moreover, various widgets may be embedded together into a combined web page, leading to higher flexibility. Each of these embedded GUIs communicates asynchronously with its corresponding backend systems, so that an operation invoked on one widget does not affect another, thereby boosting application performance while at the same time reducing hardware costs. It is also this ability of widgets to act as independent applications by themselves which allows them to be upgraded one at a time without requiring a complete release of their corresponding backend applications. These widgets may also be distributed to the masses, as they are designed to be highly portable, allowing users to add them to any existing HTML page without any knowledge of their technical internals, which permits an organization to extend its reach well beyond its formal boundaries. The distribution of these widgets may also give rise to the creation of mashups, either by using a format such as EMML, or through the various online frameworks available to serve this purpose. Moreover, these widgets receive data from their backend systems in a variety of formats such as RSS/ATOM/JSON, which are easily consumable by various client side scripting languages. The backend applications providing these services are in turn built using dynamic scripting languages, which offer swift development times, and therefore help convert business requirements into technical implementation in short time periods.
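As a sketch of such a lightweight backend, the following spins up a tiny JSON endpoint and then consumes it, the way a widget would poll its service. The route, payload and field names are hypothetical, and only Python’s standard library is used.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal sketch of a lightweight backend feeding a widget: one endpoint
# returning JSON that any client-side script could consume asynchronously.
# The route ("/quote") and payload are hypothetical illustrations.

class QuoteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"symbol": "ACME", "price": 42.5}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), QuoteHandler)   # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A widget's async fetch, reduced to its essence: GET a URI, parse JSON.
url = f"http://127.0.0.1:{server.server_port}/quote"
with urllib.request.urlopen(url) as resp:
    quote = json.load(resp)
print(quote)
server.shutdown()
```

The entire service is one handler and one data format; contrast this with the stub generation, schema and deployment ceremony a SOAP equivalent would demand.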

2.2 – And The Winner Is

Had the above exhibition of enterprise and web philosophies been a boxing match, the former would have been knocked out in the first round, with nothing to show for its effort. Clearly, the web has learnt to “float like a butterfly, sting like a bee”, a technique which has allowed it to infiltrate the IT departments of most businesses. IT development has suddenly shifted gears and moved into a new era, a revolution led by the web at its forefront, at a pace which the enterprises are finding difficult to match. Never before has the world witnessed innovation at a rate so furious, fuelled by a community so expansive, resulting in solutions so elite. As large, mighty and powerful as the enterprise systems may be, the ‘K-T extinction event’ which led to the demise of the dinosaurs has proven that failure to adapt quickly to a rapidly evolving ecosystem can lead to the collapse of even the most brilliant of species, a fate that may well befall the enterprise systems in a few years’ time if they fail to immediately step on the gas.

3 – The Road Ahead

3.1 – The Future Looks Cloudy

We have come a long way since the introduction of computers as a way to solve core business problems, with the demands of companies moving well beyond using computers to simply automate repetitive business processes, and the expectations of their customers skyrocketing way beyond merely receiving efficient services. In this new era of computing, organizations are expected to focus their IT resources on core value addition activities; a task made much harder by the fact that the world is facing extremely hard economic times, with CIOs reducing capital budgets, and CFOs being forced to cut operational expenses. It is this time of need that has led to the rise of a new form of IT service, namely ‘cloud computing’, where SAAS, PAAS and IAAS attempt to satisfy the demands of companies and meet the expectations of their customers.

Cloud computing is seen as a gateway for organizations to focus IT on driving the business rather than on maintenance, to create new applications with minimal upfront provisioning costs, to extend the capabilities of current applications without new infrastructure, to increase system capacity dynamically, and to provide a better disaster recovery plan. The sheer amount of transparent ROI visible through these factors is enabling companies to look past the risks and shift their existing applications to external IAAS offerings, stage the development of new applications on external PAAS offerings, and abandon the development of certain in-house applications in favor of external SAAS offerings. This embrace of public cloud services marks the beginning of the web infiltrating the previously firewall guarded enterprise.

The cloud brings with it several promises, which shall usher in a new era of computing whose impact shall be far greater than the cloud itself. The web based companies offering these cloud services bring with them their expertise in various internet technologies, many of which shall be implemented in their SAAS/PAAS/IAAS offerings and thereby integrated with the enterprise systems opting to use these services. This shall help the enterprises realize the true value of the various web technologies without facing the risk of having to experiment with implementing them themselves. Moreover, the move towards the public cloud shall bring the enterprise closer to the internet, opening up new avenues for businesses to explore, and letting them take advantage of the various opportunities unique to the web culture. It is therefore in the enterprise’s best interest to align its IT technologies with the grain of the web, in order to facilitate easy integration with the various internet services in the near future, which shall enable it to expand its business at an unheard-of pace by exploiting new prospects.

3.2 – The Imminent Merger

Today, most enterprise architects don’t think of an application service’s direct consumption by the outside world as a key criterion while designing its structure; however, the move to the public cloud shall mark a paradigm shift in the thought process that goes into designing even the most backend of applications, given the endless possibilities available for organizations to grow the business by exposing their services to the end user over the web. This, coupled with the countless pre-existing services available over the internet, which the enterprise systems may consume in order to provide a range of new age value added services to their customers, shall ensure that the enterprises pave the way for web technologies to become a part of their systems. The coming years shall witness enterprise applications becoming consumable, their services – searchable, their architecture – feather-weight, their delivery – agile, and their culture – collaborative. The IT world has seen many great mergers over the years; however, this unification of the enterprise and web worlds shall forever raise the bar, and open the doors to unimaginable possibilities.

The story of the enterprise and the web is much like a movie script, with two long lost brothers reuniting near the climax to take on the bad guy, and the audience cheering their every move. The enterprise and the web were separated in childhood, raised by separate communities, and went on to develop vastly different personalities. However, after years of living in ignorance, they have now finally rediscovered each other. The end is in sight, the dream is alive, and the entire world is watching, hoping, praying for a happy ending.


Source by Sanat Vij
