Durham, N.C. (PRWEB) January 12, 2006 --
Web Performance, Inc. announces the release of two articles describing how the performance of the popular open source application server Tomcat differs on Windows and Linux.
Apache Tomcat is the official reference implementation of the Java Servlet and JavaServer Pages specifications, both developed by Sun Microsystems under the Java Community Process. Apache Tomcat powers numerous large-scale, mission-critical web applications across a diverse range of industries and organizations. Users include well-known companies such as WalMart and The Weather Channel.
Since Sun released its J2EE specifications, enthusiastic developers have crafted powerful applications that can be seen on the web in businesses of every size. The result is a very hot topic, continually discussed by developer groups yet mature enough to be well known in all corners of the industry. Among the first questions for any new project remain: what is the best starting point, and on what operating system should the application be deployed? These reports provide a guide to the trade-offs that can be expected before that decision is made.
The reports examine the performance of Apache Tomcat on the Windows and Linux operating systems, and show that Linux was able to handle about 32% more users than Windows on identical hardware under identical test conditions. In our testing, the Windows server gave users marginally shorter wait times by electing to turn away some traffic. Our Linux server, however, was able to scale to a greater number of users with reasonable responsiveness before its maximum capacity was reached.
Due to the variability of application and hardware configurations, this report cannot tell you how many users your application will be able to handle in your environment (which is why we sell a product for exactly that purpose!). The report presents numbers such as hits/sec and average page duration, but the raw figures have little value by themselves. Only when the statistics are compared between the two servers do they provide insight on which to base performance judgments.
No matter how this testing is performed, someone will cry "Foul play!" A primary goal of this effort was a test that can easily be duplicated by anyone who wants to try. No doubt the exact numbers will not be reproduced, due to differences in hardware and measurement technique, but the relative differences between the servers should be reproducible. All of the materials related to this report are available in the Supplemental Materials section.
For more details, refer to the complete report:
Web Performance, Inc. is the creator of industry-leading Web Performance Trainer™ and Web Performance Analyzer web load and stress testing software. Clients include a large portion of the Fortune 1000 as well as numerous governments. For more information visit http://www.webperformance.com
(1) 919-845-7601 x1761