We need to fetch data from our Terabase DB with a join of multiple tables. The query uses a subquery to fetch some data for the base query; in the end it is a huge query spanning two pages. If we execute the same query directly in the DB, it takes a fraction of a second to produce the result.
But if we run it through Hibernate, it takes much longer, around 5-10 minutes, and sometimes it raises memory crashes and session problems. We have proper ORM mappings between the objects and use JPA for mapping to the tables.
My question is: is Hibernate the right solution for executing big, complex queries?
Experts please help.
Thanks in advance.
It all depends on how you define your object relationships. If you have too many CASCADE settings or too much EAGER loading, you will get much worse results even though you are issuing the same query. BTW, are you using a native query or HQL?
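To illustrate what I mean, here is a hedged sketch with made-up `Order`/`Customer`/`OrderLine` entities (the `Customer` and `OrderLine` classes are omitted). The JPA defaults make every `@ManyToOne` EAGER, so a big join can drag whole object graphs along with each row; narrowing the cascades and making associations LAZY is the usual first tuning step:

```java
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
public class Order {                        // hypothetical entity
    @Id
    private Long id;

    // @ManyToOne defaults to EAGER: every Order fetched also pulls its
    // Customer. Making it LAZY defers that load until the field is accessed.
    @ManyToOne(fetch = FetchType.LAZY)
    private Customer customer;

    // A blanket CascadeType.ALL propagates every operation to all the
    // lines; keep cascades only where the lifecycle really is shared.
    @OneToMany(mappedBy = "order",
               fetch = FetchType.LAZY,
               cascade = CascadeType.PERSIST)
    private List<OrderLine> lines;
}
```

This is mapping configuration rather than runnable code on its own; it needs a JPA provider and the related entities to compile.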
Hi, thanks for your reply. We are using the Criteria API to execute the query. The query is generated automatically from the base class for the given conditions. We are not using a native query for the execution, but we are planning to now.
Note: if we execute the same query generated by Hibernate directly in the DB, it does not take much time.
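Since we are planning the move to a native query, a rough sketch of what that could look like, assuming an injected `EntityManager` and using made-up table and column names. Selecting scalar columns instead of mapped entities skips entity hydration entirely, which is usually where Hibernate's extra time goes on large result sets:

```java
import java.util.List;
import javax.persistence.EntityManager;

public class ReportDao {                    // hypothetical DAO
    private final EntityManager em;

    public ReportDao(EntityManager em) {
        this.em = em;
    }

    // Returns raw rows (one Object[] per row) instead of managed
    // entities, so Hibernate performs no dirty-checking and keeps
    // nothing in the persistence context.
    @SuppressWarnings("unchecked")
    public List<Object[]> openOrderTotals(int pageSize) {
        return em.createNativeQuery(
                "SELECT o.id, o.total FROM orders o WHERE o.status = 'OPEN'")
            .setMaxResults(pageSize)   // page instead of loading everything
            .getResultList();
    }
}
```

This needs a configured persistence unit to actually run; the point of the sketch is the scalar projection plus `setMaxResults` pagination.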
The generated query looks fine, but why does Hibernate take so much more time to return the data compared to a plain JDBC execution? Is it because it is dealing with lakhs (hundreds of thousands) of records in the DB and generating an object for each record? If that is the case, we cannot use Hibernate with a database that has a massive data load and complex interrelations between the tables.
I am still in doubt whether we are missing some fine-tuning or whether we simply should not use Hibernate in these situations. Please clarify.
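On the memory crashes we are seeing: Hibernate's first-level cache keeps every loaded entity in the session until it is cleared, so reading lakhs of rows through one session grows without bound. One common mitigation, sketched here against an `EntityManager` and the same hypothetical `Order` entity as above, is to read in pages and clear the persistence context between pages:

```java
import java.util.List;
import javax.persistence.EntityManager;

public class BulkReader {                   // hypothetical helper
    private static final int PAGE = 1000;

    public void process(EntityManager em) {
        int offset = 0;
        List<Order> page;
        do {
            page = em.createQuery("SELECT o FROM Order o", Order.class)
                     .setFirstResult(offset)
                     .setMaxResults(PAGE)
                     .getResultList();
            for (Order o : page) {
                handle(o);                  // application-specific work
            }
            em.clear();                     // drop the first-level cache
            offset += PAGE;
        } while (page.size() == PAGE);
    }

    private void handle(Order o) { /* ... */ }
}
```

Hibernate also offers a `StatelessSession` for this kind of bulk read, which bypasses the first-level cache altogether; which option fits depends on whether the entities need to stay managed.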
Did you solve the problem?
I am about to start an application that has a large database, and I am worried.
Sorry for my poor English.