- As a general observation, SOA demands significantly more computing power from a system than earlier monolithic or tightly coupled designs.
- The very notion of loosely coupled services implies message-centric application development. Developers not only have to write traditional processing logic; they also have to handle message transmission, validation, interpretation, and generation, all of which are CPU- and process-intensive.
- As more organizations use SOA, we can expect messaging volume to explode and put a tremendous load on existing IT systems. The potential for adverse effects will escalate.
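To make the message-handling overhead concrete, here is a minimal sketch of my own (the class and method names are hypothetical, not from the book): every hop between loosely coupled services repeats a generate-validate-interpret cycle on top of the actual business logic.

```java
// Hypothetical sketch: each service hop pays for message generation,
// validation, and interpretation in addition to the business logic itself.
public class MessageHopDemo {
    // Generation: wrap the payload in a wire format (a toy XML-like envelope).
    static String generate(String payload) {
        return "<msg><body>" + payload + "</body></msg>";
    }

    // Validation: check the envelope before trusting its contents.
    static boolean validate(String msg) {
        return msg.startsWith("<msg>") && msg.endsWith("</msg>")
                && msg.contains("<body>") && msg.contains("</body>");
    }

    // Interpretation: parse the payload back out on the receiving side.
    static String interpret(String msg) {
        int start = msg.indexOf("<body>") + "<body>".length();
        int end = msg.indexOf("</body>");
        return msg.substring(start, end);
    }

    public static void main(String[] args) {
        String wire = generate("order-42");          // sender side
        if (!validate(wire)) {
            throw new IllegalStateException("bad message");
        }
        String received = interpret(wire);           // receiver side
        System.out.println(received);                // prints "order-42"
        // In a monolith this would have been a direct method call;
        // in an SOA, every extra hop repeats this whole cycle.
    }
}
```

Multiply that cycle by the number of services in a call chain and by the message volume, and the CPU cost described above follows directly.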
Monday, July 26, 2010
Are web "applications" real applications?
Web applications seem to be more and more popular. But are these "applications" real applications? Given the limitations of current web technology, web "applications" clearly cannot offer the functionality that a real desktop application can. This is why the overwhelming majority of users still rely on desktop applications (such as Word) rather than rudimentary online tools (such as Google Docs).
I found some interesting insights about the limitations of SOA in "Software Pipelines and SOA: Releasing the Power of Multi-Core Processing":
"How fast can you adapt your software to meet new needs and competitive threats? The popularity and rapid adoption of service-oriented architecture (SOA) is hard evidence of the demand for more flexible software systems.
SOA is a superior technology. Compared to earlier trends in IT architecture, SOA delivers better on its promises. But it presents its own challenges. If you're using SOA for development, it's even more important to address performance and scalability, because of the following factors:
Predictions show that over the next year or two, organizations using SOA will run into performance issues. This is nothing new; historically, each time the business world adopts a new software architecture, it suffers through growing pains. In the past twenty years, the shakeout period for each new major paradigm shift in software development has lasted about one to three years for a given evolutionary phase (any early J2EE user can attest to that). During that time, businesses gradually adopt the new design, and while doing so, they face significant performance- and scalability-related problems. In many cases software developers cannot overcome the steep learning curve; many projects end in outright failure when the deployed application doesn't perform as expected.
Until recently, hardware was the saving grace for such immature architectures. Whenever the computer industry made a significant advance, mostly in CPU performance, performance bottlenecks could be fixed by plugging in a faster chip or by using some other mechanical solution. That advantage is now gone. We've hit a plateau in microprocessor technology, which comes from physical factors such as power consumption, heat generation, and quantum mechanics. The industry can no longer easily increase the clock speed of single CPUs. Therefore, for now and the foreseeable future, CPU vendors are relying on multi-core designs to increase horsepower. The catch is that if you want to take advantage of these new multi-core chips, your software must implement parallel processing, not a common capability in the majority of today's applications."
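The shift the authors describe can be sketched in a few lines. This is my own illustration (not from the book): the per-message work is split into independent tasks and handed to a thread pool sized to the machine's core count, so a multi-core chip can actually be exploited instead of leaving all but one core idle.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch of parallel message processing on a multi-core CPU.
public class ParallelSketch {
    // Stand-in for CPU-intensive per-message work (e.g. parse + validate).
    static long process(int messageId) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += (messageId + i) % 7;
        }
        return sum;
    }

    public static void main(String[] args) throws Exception {
        // Size the pool to the hardware, so each core gets work.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Submit each message as an independent task; the pool
        // spreads the tasks across the available cores.
        List<Future<Long>> results = new ArrayList<>();
        for (int id = 0; id < 8; id++) {
            final int msg = id;
            results.add(pool.submit(() -> process(msg)));
        }

        // Collect the results; Future.get() blocks until each task finishes.
        long total = 0;
        for (Future<Long> f : results) {
            total += f.get();
        }
        pool.shutdown();

        System.out.println("processed 8 messages on " + cores + " core(s)");
    }
}
```

The single-threaded version of the same loop would run all eight messages on one core; the pooled version produces the same totals while keeping every core busy, which is exactly the kind of restructuring the quoted passage says most applications have not yet done.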