In my J2EE web application I am firing a query against an Oracle data source that returns around 4 million records, and another query against a MySQL data source that returns more than a million records. I then need to apply some transformations to the data, compare the data in the two result sets, and return the records that are equal (or that match some condition).

Currently I convert the data in each ResultSet into a HashMap and use the maps for the further comparison, so I have two HashMaps (one for the Oracle results, one for the MySQL results) and a third one for intermediate results. This is getting slow as the number of records in the database increases. Is there any way I can optimise this and improve performance? If I work with the ResultSet directly, I need to keep the JDBC connection open until the end. Please suggest whether I can use a disconnected result set such as CachedRowSet, and give me some information about it. I am thinking of avoiding the ResultSet-to-HashMap conversion step by converting the ResultSet into a CachedRowSet and using the CachedRowSet for the further operations. Please suggest:
a) Is this a feasible solution?
b) Does a CachedRowSet consume more CPU and memory as the number of records in the database increases? Can it cause an OutOfMemoryError for a large set of records?
c) Are there any other options?
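What I have in mind looks roughly like this (a minimal sketch using the standard `javax.sql.rowset` API; the `toCachedRowSet` helper name is just illustrative, and the `main` method only shows configuration since there is no live database here):

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetProvider;

public class DisconnectedFetch {

    // Illustrative helper: copy an open ResultSet into a disconnected
    // CachedRowSet so the JDBC connection can be closed afterwards.
    static CachedRowSet toCachedRowSet(ResultSet rs) throws SQLException {
        CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
        crs.populate(rs); // reads all remaining rows into memory
        return crs;
    }

    public static void main(String[] args) throws SQLException {
        // Without a database connection we can only demonstrate setup:
        CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
        crs.setPageSize(1000); // populate() can then be called page by page
        System.out.println(crs.getPageSize());
    }
}
```

My worry is that `populate()` still pulls every row into heap memory, which is what question b) above is about.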
Instead of saving the result sets in HashMaps, you can have each database sort its result set on the comparison key (with an ORDER BY) and compare the two sorted streams in a single forward pass, so there is no need to save either result in memory.
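The single-pass comparison of two sorted streams can be sketched like this (lists stand in for the sorted result sets, and the key and equality condition are simplified to String comparison; adapt both to your actual row type):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SortMergeCompare {

    // Walk both sorted inputs with two cursors; advance the cursor with
    // the smaller key, and record a match when the keys are equal.
    // Only the matches are kept in memory, not the full inputs.
    static List<String> matchingKeys(List<String> oracleKeys, List<String> mysqlKeys) {
        List<String> matches = new ArrayList<>();
        int i = 0, j = 0;
        while (i < oracleKeys.size() && j < mysqlKeys.size()) {
            int cmp = oracleKeys.get(i).compareTo(mysqlKeys.get(j));
            if (cmp == 0) {
                matches.add(oracleKeys.get(i));
                i++;
                j++;
            } else if (cmp < 0) {
                i++;
            } else {
                j++;
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<String> oracle = Arrays.asList("A1", "B2", "C3", "D4");
        List<String> mysql = Arrays.asList("B2", "D4", "E5");
        System.out.println(matchingKeys(oracle, mysql)); // [B2, D4]
    }
}
```

With real ResultSets you would call `rs.next()` instead of advancing a list index, so at any moment only the current row from each source needs to be held.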
Thanks for replying. Along with the comparison I need to apply some kind of transformation to the data in the result set, and then send the transformed results from the EJB to a JSP. That is the reason I was saving the results in a HashMap.
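If I understand the sorted-stream suggestion correctly, the transformation could be applied during the same pass, so only the (much smaller) matched-and-transformed rows are materialised for the JSP. A hedged sketch of that idea, again with lists standing in for sorted result sets and a trivial transformation:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

public class TransformWhileMerging {

    // Same two-cursor merge as before, but each matched row is transformed
    // immediately; only the transformed matches are retained.
    static List<String> mergeTransform(List<String> a, List<String> b,
                                       Function<String, String> transform) {
        List<String> out = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.size() && j < b.size()) {
            int cmp = a.get(i).compareTo(b.get(j));
            if (cmp == 0) {
                out.add(transform.apply(a.get(i)));
                i++;
                j++;
            } else if (cmp < 0) {
                i++;
            } else {
                j++;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> result = mergeTransform(
                Arrays.asList("a", "b", "c"),
                Arrays.asList("b", "c", "d"),
                String::toUpperCase);
        System.out.println(result); // [B, C]
    }
}
```

The resulting list could then be put in the request scope and forwarded to the JSP without any of the intermediate HashMaps.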