Oracle MapViewer: JRockit or Hotspot?

The story begins with the installation of the latest release of Oracle Fusion Middleware MapViewer on a clean Windows 7 box. This requires a Java application server.

Over the past years I have used various releases of the Oracle Application Server. But now it is time to leave the Oracle Application Server behind and shift to Oracle's strategic application server, Oracle WebLogic Server.

Installing Oracle Fusion Middleware 11g Release 1, including Oracle WebLogic Server, was a snap. After the base installation I applied the latest patch set and created a separate MapViewer domain. I decided not to install a managed server but to use a standalone server domain instead, and deployed MapViewer 11g sp4 onto the administration server running in production mode.

During the installation of Oracle Fusion Middleware I had to select a Java runtime for the application server: Oracle (formerly Sun) Hotspot or Oracle (formerly BEA) JRockit?

This installation step forced me to answer the following question: Which Java runtime is the most appropriate for running Oracle MapViewer?

Market opinion favours Oracle JRockit, but I decided to let the numbers tell the tale. That was the starting point for assessing the performance characteristics of JRockit and Hotspot and answering the question: which Java runtime can process the most map requests?



For the performance test I used the sample data for Oracle Fusion Middleware MapViewer demos, also known as the MVDEMO data set.

The first step was to define a representative map request in XML format. The following request includes many map elements, such as a legend, themes and a basemap, and is therefore a good candidate for the test.
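The exact request used in the test is not reproduced here, but a minimal MapViewer XML map request against the MVDEMO data set looks roughly like the following sketch. The data source name, theme names, coordinates and sizes are illustrative; consult the MapViewer documentation for the exact schema.

```xml
<?xml version="1.0" standalone="yes"?>
<map_request
    datasource="mvdemo"
    width="640" height="480"
    basemap="demo_map"
    format="PNG_STREAM">
  <center size="20">
    <geoFeature>
      <geometricProperty typeName="center">
        <Point>
          <coordinates>-90.0,34.0</coordinates>
        </Point>
      </geometricProperty>
    </geoFeature>
  </center>
  <themes>
    <theme name="THEME_DEMO_STATES"/>
    <theme name="THEME_DEMO_HIGHWAYS"/>
  </themes>
</map_request>
```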




This map request results in the following map.




Baseline response time

The next step is to determine how long it takes Oracle MapViewer to generate such a map. This response time for the given XML map request serves as the baseline for the test, against which the performance difference between Oracle JRockit and Oracle Hotspot is measured.

The test performance will be expressed as a percentage of the baseline response time. For example, 100% performance means that the measured response time is the same as the baseline. 50% performance means the response time is two times slower than the baseline. 200% performance means that the response time is two times faster than the baseline.
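Expressed as a formula, performance = baseline / measured × 100%. A one-line helper (my own, for illustration) makes this concrete:

```python
def performance_pct(baseline_ms, measured_ms):
    """Performance relative to the baseline: 100% = equal, 50% = twice as slow,
    200% = twice as fast."""
    return baseline_ms / measured_ms * 100.0
```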

To establish a good baseline response time I sent the map request to Oracle MapViewer many times sequentially, one at a time, and calculated the average response time.

For my configuration I established a baseline response time of 362 milliseconds using Oracle's Hotspot. Note that for establishing the baseline it does not really matter which Java runtime is used, since all results are compared relative to it.
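A minimal sketch of such a baseline measurement, assuming the MapViewer servlet accepts the request as an xml_request POST parameter (the URL below is a placeholder for your own WebLogic host and port):

```python
import statistics
import time
import urllib.parse
import urllib.request

# Placeholder endpoint; adjust host and port to your own WebLogic domain.
MAPVIEWER_URL = "http://localhost:7001/mapviewer/omserver"

def send_map_request(url, xml_request):
    """POST one XML map request and return the elapsed time in milliseconds."""
    data = urllib.parse.urlencode({"xml_request": xml_request}).encode()
    start = time.perf_counter()
    urllib.request.urlopen(url, data).read()
    return (time.perf_counter() - start) * 1000.0

def baseline_ms(times_ms):
    """Average the measured response times, dropping the first call as warm-up."""
    return statistics.mean(times_ms[1:]) if len(times_ms) > 1 else times_ms[0]
```

Sending the request, say, a hundred times and averaging the results gives the baseline; dropping the first call avoids skew from class loading and cache warm-up.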


The test: Oracle JRockit versus Oracle Hotspot

After establishing a representative baseline, the next step was to generate an incremental load: one map request per second to Oracle MapViewer during the first minute of the test, two map requests per second during the second minute, three during the third, and so on until the saturation point is reached. The saturation point is the point at which the server's response time increases significantly, often jumping from milliseconds up to seconds.
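Such an incremental load can be generated with a small driver along these lines. This is my own sketch, not the tool I used: send_request is assumed to perform one map request and return its response time in milliseconds, and on_result records each measurement.

```python
import threading
import time

def load_schedule(max_rate, step_seconds=60):
    """Yield (rate, offset_seconds) pairs: during step m the load is m requests/s."""
    offset = 0
    for rate in range(1, max_rate + 1):
        for _ in range(step_seconds):
            yield rate, offset
            offset += 1

def run_load(send_request, on_result, max_rate, step_seconds=60):
    """Fire `rate` concurrent requests every second, raising the rate each step."""
    start = time.monotonic()
    for rate, offset in load_schedule(max_rate, step_seconds):
        time.sleep(max(0.0, start + offset - time.monotonic()))
        for _ in range(rate):
            threading.Thread(
                target=lambda r=rate: on_result(r, send_request()),
                daemon=True,
            ).start()
```

Plotting the recorded measurements over time reveals the saturation point: the rate at which response times jump from milliseconds to seconds.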

The performance test was executed twice: once running Oracle WebLogic Server on Oracle Hotspot, and once running the same test on Oracle JRockit.

The most important metrics of the two performance tests are combined in the following two graphs. The upper graph shows the map requests processed by Oracle MapViewer. The lower graph shows the corresponding performance relative to the baseline response time (see the Baseline response time section above).

The blue line shows the results when running Oracle MapViewer on Oracle Hotspot; the red line shows the results on Oracle JRockit. What strikes me is that the combination of Oracle WebLogic Server and Oracle Hotspot significantly outperforms Oracle WebLogic Server and Oracle JRockit right from the start.

The saturation point for the latter configuration is reached just after two minutes, when the load is increased from two maps per second to three. The configuration with Oracle WebLogic Server and Oracle Hotspot, by contrast, remains steady for five minutes and hits its saturation point when the load is increased from five maps per second to six.




Another test: Oracle Application Server versus Oracle WebLogic Server

The benchmark result challenged me to conduct another performance test, comparing Oracle Application Server 10g and Oracle Fusion Middleware 11g, both running on Oracle Hotspot. The result should answer the question of whether I would be better off sticking with Oracle Application Server 10g for the moment and postponing the migration to Oracle Fusion Middleware. The results are shown in the following graph and are self-explanatory.





The benchmark clearly showed that Oracle Hotspot is the most appropriate Java runtime for running Oracle Fusion Middleware MapViewer. This result was somewhat surprising to me, and at the same time yet another confirmation that a performance test is indispensable.

The extra test between Oracle WebLogic Server and Oracle Application Server also showed that there is no performance reason to stick with the Oracle Application Server for the moment.


Additional notes

Although the benchmark is based only on the XML interface of Oracle MapViewer, I have also tested the WMS and Oracle Maps interfaces. The results were in line with those of the XML interface.

I also conducted the same test for previous releases of Oracle MapViewer, i.e. Oracle MapViewer 11g R1 and Oracle MapViewer ps3. The benchmark results apply to these releases as well.

For simplicity of the performance test I used a static, non-parameterized XML map request. I am aware that the caching behaviour of the operating system, application server and database server may have an impact on the absolute results. However, the goal of the benchmark was a relative comparison, which justifies this approach.

It took me no more than an hour to conduct the benchmark. I used the following software stack: