I'm currently working on a multi-threaded Java console application. It uses several thread pools to limit the number of active threads at any given moment. These threads either parse XML documents or insert nodes into them.
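For context, the pool setup looks roughly like the sketch below (the class name, pool size, and task bodies are placeholders, not my actual code; the real tasks do the XML parsing/insertion):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class PoolSketch {
    public static void main(String[] args) throws InterruptedException {
        // A fixed-size pool caps the number of active worker threads.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        final AtomicInteger processed = new AtomicInteger();

        for (int i = 0; i < 20; i++) {
            pool.submit(new Runnable() {
                public void run() {
                    // Placeholder for the real work: parsing or
                    // inserting nodes into an XML document.
                    processed.incrementAndGet();
                }
            });
        }

        pool.shutdown();                           // stop accepting new tasks
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("processed=" + processed.get());
    }
}
```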
I'm developing this application on my workstation: Windows XP Pro, Pentium Dual Core 2.00 GHz, 2 GB of RAM. A typical run of a specific task takes at most 15 minutes, which is ideal.
Now, here's where things get weird: I moved the code (a jar file) to one of our servers (Windows Server 2003, two quad-core Intel Xeons at 1.6 GHz, 4 GB of RAM), and the same process, on the exact same data, takes about 30 minutes.
I'm using Java SE 1.5 on the workstation and have tried both Java SE 1.5 and 1.6 on the server, with little difference. I'm also passing the -server JVM flag and identical memory heap parameters on both machines.
Does anyone know why there is such a drastic difference in execution times? Is there some known issue with Windows Server 2003 and Java? I've noticed that the app on the server seems to pause intermittently, and I've since tried numerous garbage collection parameters suggested by Sun's Java performance white papers.
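For reference, the kind of invocation I've been experimenting with looks roughly like this (the heap sizes and jar name here are placeholders, not my actual values; the GC flags are among the standard Java 5/6 options I tried):

```shell
java -server \
     -Xms512m -Xmx1024m \
     -verbose:gc -XX:+PrintGCDetails \
     -XX:+UseConcMarkSweepGC \
     -jar app.jar
```

The -verbose:gc / -XX:+PrintGCDetails output is what I've been using to see whether the intermittent pauses line up with collections.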
Any help or suggestions would be more than appreciated!