- Posted by: jason priestly
- Posted on: June 14 2002 07:12 EDT
I have a problem where I need to connect to another server through HTTP, and the concurrent user count could be over 1000. I'd like to reuse the connections, since I think opening and closing the connections would be a waste of resources. I think the problem can be solved with HTTP connection pooling.
Or are there any other solutions? Is HttpURLConnection pooling connections behind the scenes? If I hold on to 1000 connected HttpURLConnections, will there be 1000 connections to the backend server? Or just 100?
- Pooling HTTP Connections... by Q Werty on June 14 2002 07:36 EDT
- Pooling HTTP Connections... by Kenny MacLeod on June 19 2002 10:52 EDT
Pooling HTTP connections is similar to pooling any other object:
- use a factory to get an instance of your HTTP connection
- don't forget to release it back to the pool, via the factory, when it is no longer used
- in your pool management, don't forget to check and close your connections.
I don't advise you to re-code all the pool management yourself, but rather to use existing libraries like Jakarta Commons-pool.
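The factory/acquire/release cycle described above can be sketched in plain Java. This is only an illustration of the mechanics (all class and method names here are made up, not a real library API); a real library like Jakarta Commons-pool adds validation, eviction, and sizing on top of this:

```java
import java.util.LinkedList;

// Minimal blocking object pool illustrating the factory/acquire/release
// cycle described above. Names are illustrative, not a real API.
class SimplePool {
    interface Factory {
        Object create();
        void destroy(Object o);
    }

    private final Factory factory;
    private final int maxSize;
    private final LinkedList idle = new LinkedList();
    private int created = 0;

    SimplePool(Factory factory, int maxSize) {
        this.factory = factory;
        this.maxSize = maxSize;
    }

    // Hand out an idle object, create one if under the limit,
    // otherwise block until another caller releases one.
    synchronized Object acquire() throws InterruptedException {
        while (idle.isEmpty() && created >= maxSize) {
            wait();
        }
        if (!idle.isEmpty()) {
            return idle.removeFirst();
        }
        created++;
        return factory.create();
    }

    // Return an object to the pool and wake up one waiting caller.
    synchronized void release(Object o) {
        idle.addLast(o);
        notify();
    }

    // Check-and-close step: destroy everything that is idle.
    synchronized void shutdown() {
        while (!idle.isEmpty()) {
            factory.destroy(idle.removeFirst());
        }
    }
}
```

In real use the factory would open an HTTP connection (or a Socket) in create() and close it in destroy().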
How will this work in a J2EE environment? I guess the pooling class blocks threads that try to get a connection when there is no free connection. Is this OK in a J2EE environment?
Has anyone tried this with HTTP connections?
What happens when there are no more free objects depends on your choices. Jakarta Commons-pool enables you:
- to reject a request for a new HTTP connection immediately, with a "pool exhausted" exception
- to increase the pool size beyond its limit
- to wait up to N ms for a free HTTP connection (with an exception after the delay).
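The three exhaustion policies above can be sketched in plain Java. In Commons-pool these are configuration options rather than code you write yourself; the class and constant names below are purely illustrative:

```java
import java.util.LinkedList;

// Sketch of the three "pool exhausted" policies described above.
// Illustrative only; a real pool library makes these config options.
class ExhaustionDemo {
    static final int FAIL = 0, GROW = 1, BLOCK = 2;

    private final LinkedList idle = new LinkedList();
    private final int maxSize;
    private int created = 0;

    ExhaustionDemo(int maxSize) { this.maxSize = maxSize; }

    synchronized Object borrow(int policy, long maxWaitMs) throws Exception {
        if (!idle.isEmpty()) return idle.removeFirst();
        if (created < maxSize) { created++; return new Object(); }
        switch (policy) {
            case FAIL:  // policy 1: reject the demand immediately
                throw new IllegalStateException("pool exhausted");
            case GROW:  // policy 2: increase the pool beyond its limit
                created++;
                return new Object();
            default:    // policy 3: wait up to maxWaitMs for a free object
                long deadline = System.currentTimeMillis() + maxWaitMs;
                while (idle.isEmpty()) {
                    long remaining = deadline - System.currentTimeMillis();
                    if (remaining <= 0)
                        throw new IllegalStateException("timed out waiting");
                    wait(remaining);
                }
                return idle.removeFirst();
        }
    }

    synchronized void giveBack(Object o) {
        idle.addLast(o);
        notify();
    }
}
```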
Keep in mind that Jakarta Commons-pool was designed for server-side applications, so it should be thread-safe. For example, it's used in the Tomcat 4.1.x series for its database pooling (DBCP).
(NOTE: I'm not a Jakarta member, just a user...)
I tried the Apache commons library and it looks good. But I found out that I cannot use HttpURLConnection, since calling HttpURLConnection.setRequestProperty throws an exception once an InputStream has been obtained from it. I need to set the Cookie header field for each request, since the connection can be reused by a different user.
But I guess I can use the Socket class with the commons library instead and set the cookie header every time.
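Writing the request by hand over a Socket might look something like this. It's a bare-bones HTTP/1.0-style GET where the Cookie header is set explicitly per request; the host, path, and cookie value in the usage below are made up for illustration:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;

class RawHttpClient {
    // Send a single GET over an already-open socket, setting the Cookie
    // header explicitly, and return the status line of the response.
    static String get(Socket socket, String host, String path, String cookie)
            throws Exception {
        Writer out = new OutputStreamWriter(
                socket.getOutputStream(), "ISO-8859-1");
        out.write("GET " + path + " HTTP/1.0\r\n");
        out.write("Host: " + host + "\r\n");
        out.write("Cookie: " + cookie + "\r\n");  // per-user session cookie
        out.write("\r\n");
        out.flush();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
        return in.readLine();  // e.g. "HTTP/1.0 200 OK"
    }
}
```

A pooled Socket could be handed to this method with a different cookie each time, which is exactly what HttpURLConnection won't allow after the stream is open.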
I doubt you'll have much luck pooling HTTP connections. HTTP is not a persistent protocol; it's not designed for multiple requests/responses on the same connection.
But it seems to me that pooling connections isn't really what you want. Rather, you want to limit the number of simultaneous connections. You could configure commons-pool to open and close the connections when they are requested from, and returned to, the pool. It would also allow you to limit the number of simultaneous connections, and to block if none are immediately available.
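Limiting simultaneous connections, rather than reusing them, could be as simple as the sketch below (an illustrative stand-alone class, not commons-pool itself): each caller takes a slot, opens and uses its own connection, closes it, and gives the slot back.

```java
// Minimal connection limiter: instead of reusing connections, just cap
// how many can be open at once. Callers block until a slot is free.
class ConnectionLimiter {
    private final int max;
    private int inUse = 0;

    ConnectionLimiter(int max) { this.max = max; }

    // Take a slot, blocking while all slots are occupied.
    synchronized void acquire() throws InterruptedException {
        while (inUse >= max) wait();
        inUse++;
    }

    // Give the slot back and wake up one waiting caller.
    synchronized void release() {
        inUse--;
        notify();
    }
}
```

Usage would be: acquire(), open the HTTP connection, do the request, close the connection, release(). Each request gets a fresh connection, so the stateless-protocol problem disappears.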
I agree with Kenny. HttpURLConnection is stateless. Every time you create a URL connection, it's associated with the resource (i.e. jsp/servlet/...) you are fetching. Limiting user connections to your second server is a good way to solve your problem.
Now that I have looked into the problem, I also think that pooling connections might be difficult or impossible. I'll just have to wait for a new protocol or a new version of HTTP.
HTTP 1.1 *does* support persistent connections. In other words, you *can* reuse the same TCP/IP connection to access several resources.
The protocol might support it to some extent, but HttpURLConnection doesn't. HTTP 1.1 has the ability to grab a bunch of stuff from one connection (e.g. the web page + images), but it's still not designed for multiple requests spread over a long period of time.
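At the protocol level, reusing one TCP connection for several HTTP/1.1 requests looks roughly like the sketch below. The host is hypothetical, and this assumes responses with "Content-Length: 0" so the client knows each response ends at the blank line; a real client would also have to handle bodies and chunked encoding:

```java
import java.io.BufferedReader;
import java.io.Writer;

class KeepAliveDemo {
    // Issue one GET on an already-open connection and return the status
    // line. Assumes a bodyless response ("Content-Length: 0"), so reading
    // stops at the blank line after the headers.
    static String request(Writer out, BufferedReader in, String path)
            throws Exception {
        out.write("GET " + path + " HTTP/1.1\r\n");
        out.write("Host: example.test\r\n");      // hypothetical host
        out.write("Connection: keep-alive\r\n");  // ask to keep the socket open
        out.write("\r\n");
        out.flush();
        String status = in.readLine();            // e.g. "HTTP/1.1 200 OK"
        while (true) {                            // skip the response headers
            String line = in.readLine();
            if (line == null || line.length() == 0) break;
        }
        return status;
    }
}
```

Calling request() twice with the same streams sends both GETs over the same TCP connection, which is what HTTP 1.1 persistent connections permit even though HttpURLConnection gives you no direct control over it.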