In my J2EE web application I invoke an EJB from a servlet. The EJB fires a query against an Oracle data source that returns around 4 million records, and another query against a MySQL data source that returns more than a million records. I then need to perform some transformation on the data, compare the two result sets, and return the records that are equal (or that match some condition).

Currently I convert each ResultSet into a HashMap and use the maps for the comparison, so I have two HashMaps (one for the Oracle results, one for the MySQL results) plus a third to store intermediate results. This is getting slow as the number of records in the databases increases. Is there any way I can optimise this and improve performance?

If I work directly with a ResultSet, I need to keep the JDBC connection open until the end. So I am thinking of using a disconnected result set such as CachedRowSet: instead of converting the ResultSet into a HashMap, I would populate a CachedRowSet and use that for the further operations. Please advise:
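For context, my current comparison step looks roughly like the sketch below (the key and value types are simplified to Strings, and all names are made up; in reality the maps are built from the two ResultSets):

```java
import java.util.HashMap;
import java.util.Map;

public class ResultComparator {

    // Return the entries whose key exists in both maps with an equal value.
    // oracleRows / mysqlRows stand in for the maps built from the two ResultSets.
    public static Map<String, String> matchingRows(Map<String, String> oracleRows,
                                                   Map<String, String> mysqlRows) {
        Map<String, String> matches = new HashMap<>();
        // Iterate the smaller map so we do fewer lookups against the larger one.
        Map<String, String> smaller =
                oracleRows.size() <= mysqlRows.size() ? oracleRows : mysqlRows;
        Map<String, String> larger = (smaller == oracleRows) ? mysqlRows : oracleRows;
        for (Map.Entry<String, String> e : smaller.entrySet()) {
            String other = larger.get(e.getKey());
            if (other != null && other.equals(e.getValue())) {
                matches.put(e.getKey(), e.getValue());
            }
        }
        return matches;
    }
}
```

The problem is that all three maps (two inputs plus the intermediate results) have to fit in the heap at once.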
a) Is this a feasible solution?
b) Does a CachedRowSet consume more CPU and memory as the number of records grows? Could it cause an OutOfMemoryError for a large set of records?
c) Are there any other options?
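One alternative I have been considering (regarding question c) is to add an ORDER BY on the comparison key to both queries and then do a single streaming merge pass over the two forward-only ResultSets, so that neither result set is ever held fully in memory. The sketch below shows the merge logic over plain sorted iterators instead of live JDBC cursors; with JDBC, each iterator would wrap a ResultSet ordered by the key:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class MergeCompare {

    // Streaming merge over two inputs sorted ascending by key.
    // Advances whichever side has the smaller current key; emits keys
    // present on both sides. Memory use is O(1) per input plus the output.
    public static List<String> mergeMatches(Iterator<String> left, Iterator<String> right) {
        List<String> matches = new ArrayList<>();
        String l = left.hasNext() ? left.next() : null;
        String r = right.hasNext() ? right.next() : null;
        while (l != null && r != null) {
            int cmp = l.compareTo(r);
            if (cmp == 0) {
                matches.add(l);
                l = left.hasNext() ? left.next() : null;
                r = right.hasNext() ? right.next() : null;
            } else if (cmp < 0) {
                l = left.hasNext() ? left.next() : null;
            } else {
                r = right.hasNext() ? right.next() : null;
            }
        }
        return matches;
    }
}
```

The connections would still need to stay open for the duration of the pass, but only one row per side is in memory at a time, which seems like it should avoid the OutOfMemory risk. Would something like this be a better direction than CachedRowSet?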