Discussions

News: 'AJAX in Action' on JavaRSS

  1. 'AJAX in Action' on JavaRSS (29 messages)

    JavaRSS, an aggregation site, has published an article called "AJAX in Action," showing how AJAX is being used on the site to display summaries of items based on mouseover, rather than including the summaries in the home page. Benefits for JavaRSS included a front page that was much smaller and, thus, much faster on initial load.

    The heart of the AJAX functionality is in this code, which talks to a custom JSP page on JavaRSS to retrieve the description:
    // Shared request object; processRequest() reads it when the
    // response arrives.
    var req;

    function getDescription(channelId, itemId) {
        var url = 'http://localhost:8080/getDescription.jsp?channelId=' + channelId + '&itemId=' + itemId;
        if (window.XMLHttpRequest) {           // Mozilla, Safari, Opera
            req = new XMLHttpRequest();
        } else if (window.ActiveXObject) {     // IE 5/6
            req = new ActiveXObject("Microsoft.XMLHTTP");
        } else {
            return;                            // no AJAX support in this browser
        }
        req.onreadystatechange = processRequest;
        req.open("GET", url, true);
        req.send(null);
    }

    function processRequest() {
        if (req.readyState == 4) {             // request complete
            if (req.status == 200) {
                parseMessages();
            } else {
                alert("Not able to retrieve description");
            }
        }
    }

    function parseMessages() {
        var response = req.responseXML.documentElement;
        var itemDescription = response.getElementsByTagName('description')[0].firstChild.data;
        alert(itemDescription);
    }
    There has been a large amount of AJAX-related content on the Internet lately. Is AJAX poised to revolutionize the web user interface? Also, JavaRSS uses a setting to enable AJAX in the user interface. How are you handling cases where the browser might not be able to handle AJAX? What about accessibility? Has anyone worked out how to handle blind users' interactions with AJAX?
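    The graceful-degradation question above can be sketched with a simple capability check. This is a hypothetical helper, not JavaRSS's actual code; the browser's `window` object is passed in as a parameter so the check can be exercised outside a real browser:

```javascript
// Sketch: decide whether to enable AJAX previews, falling back to
// plain navigation when the browser has no XMLHttpRequest equivalent.
// `win` stands in for the browser's window object.
function ajaxSupported(win) {
    return !!(win.XMLHttpRequest || win.ActiveXObject);
}

function attachPreview(link, win) {
    if (ajaxSupported(win)) {
        link.onmouseover = function () {
            // a real page would call getDescription(channelId, itemId) here
        };
        return 'ajax';
    }
    // fall back: the link simply navigates to the full item page
    return 'plain';
}
```

    A setting like JavaRSS's AJAX toggle could simply skip `attachPreview` entirely when disabled, which also sidesteps the accessibility question for users whose assistive tools cannot follow dynamic updates.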

    Threaded Messages (29)

  2. AJAX Matters

    I have found that intra-page interactions combined with server-side business rules work well with AJAX. There's a fine line between AJAX actually improving an application and merely complicating it. I think there's a tendency to use AJAX for everything, but this leads to overly complex code and a decrease in usability (yes, I speak from experience).

    I think there needs to be more focus on real-world scenarios and best usages of AJAX rather than posting the same snippet of code everyone else is posting :)

    I have put together a site, AJAX Matters, which is a collection of the XMLHTTP libraries available, articles, and sites using AJAX.
  3. AJAX Matters

    I've written a quick article about a possible usage of Ajax using a standard protocol, because I think it's better to use an existing standard than to introduce your own protocol, which will be harder to maintain.

    Have a quick read by clicking here
  4. While on the surface it may seem like this would help, I wonder if this would really improve the performance of the app.

    Bandwidth shouldn't be your only concern. Each HTTP request comes with overhead, not to mention the overhead of each DB call and servlet invocation, etc.

    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.
  5. While on the surface it may seem like this would help, I wonder if this would really improve the performance of the app.
    Bandwidth shouldn't be your only concern. Each HTTP request comes with overhead, not to mention the overhead of each DB call and servlet invocation, etc.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.

    Jason,

    It's not just about the request-response speed; it's also about the whole page getting refreshed with the traditional request-response from the server. Refreshing only a section of a page really improves the usability.

    Regards
    Surajeet
  6. Beauty of AJAX

    While on the surface it may seem like this would help, I wonder if this would really improve the performance of the app.

    Give it a try yourself; you will notice the difference.
    Bandwidth shouldn't be your only concern. Each HTTP request comes with overhead, not to mention the overhead of each DB call and servlet invocation, etc.

    That's the beauty of AJAX's asynchronous nature. You are retrieving only the description through xml and not refreshing the whole page with the headers and footers. If you want to reduce DB calls, you can use caching.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.

    For JavaRSS.com, the figures are different. It should look like this:

    50k > 25k + 6k (20 x 0.3k) ;-)

    - Jay
  7. Beauty of AJAX

    While on the surface it may seem like this would help, I wonder if this would really improve the performance of the app.
    Give it a try yourself; you will notice the difference.
    Bandwidth shouldn't be your only concern. Each HTTP request comes with overhead, not to mention the overhead of each DB call and servlet invocation, etc.
    That's the beauty of AJAX's asynchronous nature. You are retrieving only the description through xml and not refreshing the whole page with the headers and footers. If you want to reduce DB calls, you can use caching.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.
    For JavaRSS.com, the figures are different. It should look like this: 50k > 25k + 6k (20 x 0.3k) ;-)
    - Jay

    As I said, bandwidth should not be your only concern. Anyway, I was trying to get to a more general point: this kind of stuff can be misused. Sure, you could rewrite all of TheServerSide.com such that you wouldn't see the full text of a comment until you hovered over it. But should you? Again, I would suggest that one 50k request would in some circumstances consume fewer system resources than one 25k request and 20 0.3k requests. Especially if the AJAX requests read from the database (I know, caching, blah blah...this isn't always possible or practical).

    Not that there aren't plenty of legitimate uses for some XMLHttpRequest magic, I'm just not sure this is it.
  8. Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.

    No need, because others have done so before you.

    In short, 10-20 small requests is really bad. These days, bandwidth is cheap, latency expensive.
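    The tradeoff is easy to put into rough numbers. The sketch below uses illustrative assumptions (a 50 ms round trip and about 256 kbit/s of usable bandwidth), not measured figures, and a deliberately naive model where each request pays one full round trip:

```javascript
// Rough transfer-time model: each request costs one round trip of
// latency plus its payload divided by the available bandwidth.
// All numbers here are assumptions for illustration.
function transferMs(requests, bytesEach, rttMs, bytesPerMs) {
    return requests * (rttMs + bytesEach / bytesPerMs);
}

var rtt = 50;   // assumed round-trip time: 50 ms
var bw = 32;    // assumed bandwidth: 32 bytes/ms, about 256 kbit/s

var onePage = transferMs(1, 50 * 1024, rtt, bw);   // one 50k page: 1650 ms
var split = transferMs(1, 40 * 1024, rtt, bw)      // 40k page...
          + transferMs(20, 300, rtt, bw);          // ...plus 20 x 0.3k: 2517.5 ms total
```

    Under these assumptions the 20 small requests pay 20 round trips, so the split version comes out slower even though it moves fewer bytes, which is the "bandwidth is cheap, latency expensive" point. Real browsers pipeline and parallelize requests, so treat this strictly as a back-of-the-envelope sketch.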
  9. Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.
    No need, because others have done so before you.
    In short, 10-20 small requests is really bad. These days, bandwidth is cheap, latency expensive.

    That didn't make any sense.

    The debate is the same as with paging results: how much do you fetch and show? The answer has always been case by case.

    For JavaRSS.com, do most users browse all of the summaries? If so, then they aren't saving themselves anything with AJAX. If they found that most users only view a small percentage of RSS summaries, then they might save themselves bandwidth.

    In my experience, everything comes down to bandwidth. Throwing a couple extra boxes in your server room is much cheaper than adding/renting additional lines.

    -- Jacob
    JSF Facelets
  10. In my experience, everything comes down to bandwidth. Throwing a couple extra boxes in your server room is much cheaper than adding/renting additional lines.
    -- Jacob
    JSF Facelets

    That's just plain wrong. Latency is playing a steadily growing role in how you design your protocol (at whatever level, even AJAX).

    Just look at WebDAV: its failure is largely due to the common design flaw where latency is not considered. I/O bandwidth *is* becoming cheap, latency *is* almost constant and has been so for the past 20 years, and will be for the foreseeable future. It's a mathematical fact.

    This principle applies to L1-3 caching, disk I/O and networking - i.e. the entire I/O hierarchy.
  11. Latency over the internet isn't even remotely close to being "almost constant." You're talking completely out of your ass. How could you even say something so absurd? The internet isn't even designed to provide consistent latency.
  12. I think what he meant was that no matter how much your bandwidth may grow in the future, latency probably won't decrease in the same proportion.
  13. In most scenarios, we have to reload the whole page for a minor change in the web page. AJAX is way better. However, what annoys me a lot is that I could not find a generic way to call an XSLT transform from the browser, which limits AJAX applications.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.
    No need, because others have done so before you.
    In short, 10-20 small requests is really bad. These days, bandwidth is cheap, latency expensive.
  14. In most scenarios, we have to reload the whole page for a minor change in the web page. AJAX is way better. However, what annoys me a lot is that I could not find a generic way to call an XSLT transform from the browser, which limits AJAX applications.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.
    No need, because others have done so before you.
    In short, 10-20 small requests is really bad. These days, bandwidth is cheap, latency expensive.
    Agreed that it's not suitable for every application. Take a look at maps.google.com and traverse through the maps; that's the beauty of AJAX.
  15. Agreed that it's not suitable for every application. Take a look at maps.google.com and traverse through the maps; that's the beauty of AJAX.

    I have a question:
    Let's pretend for a moment that every client machine out there has at least JRE 1.4.2_08 or can easily get it.

    I am wondering in which case Google Maps would be more supportable and developer-friendly:
    - if it is done as a simple Java applet;
    - or if done as the current AJAX-based mixture of technologies.

    Personally I suspect that the Java-based version would be simpler, and if I am right, then AJAX is a duct-tape solution for a deeper problem: the necessity of a decent universal network client. It does not have to be Java; I think that a mandatory X Window server on the client would be even better.
    ...a deeper problem: the necessity of a decent universal network client. It does not have to be Java; I think that a mandatory X Window server on the client would be even better.

    That's the real crux of the problem, we're doing everything we can to add state to a stateless protocol (HTTP).

    I completely agree: had it been done as a Java applet, not only would it have been easier for the developers, but it would be a richer application.
  17. Agree and disagree

    Yes, it would be easier if HTTP weren't stateless. But instead of defaulting to Java, consider the rest of the developers in this world who don't use Java, and that some languages will fade from glory. Instead, make a stateful protocol and standards that *can* be implemented in Java, or any other language.
  18. Agree and disagree

    Yes, it would be easier if HTTP weren't stateless. But instead of defaulting to Java, consider the rest of the developers in this world who don't use Java, and that some languages will fade from glory. Instead, make a stateful protocol and standards that *can* be implemented in Java, or any other language.
    There is such a protocol and spec: it is called CORBA, and it is implemented for most languages. But actually the problem is broader than just state(full|less)ness (btw, CORBA supports both types + asynchronous).
    - Any attempt to unify the client brings glory to one technology and oblivion to others;
    - An X Window server (on the client) supports everything, but then others cry that it is too much;
    - Any attempt to use what already works and is proven - and vendors cry: there is nothing to sell.

    Personally I am OK with some technologies fading away, but their replacements should be evolutionary and really address the needs of IT consumers, rather than be duct-tape patches which benefit IT vendors only.
  19. Cross-browser XSLT

    In most scenarios, we have to reload the whole page for a minor change in the web page. AJAX is way better. However, what annoys me a lot is that I could not find a generic way to call an XSLT transform from the browser, which limits AJAX applications.

    It is possible to do this. Check out Sarissa. The challenge is the difference in processing models: Mozilla is DOM-to-DOM and IE is string-to-string. Combined with cross-browser DOM parsing, it works.

    It won't work on IE 5.0, though, whose support for XSLT is old and non-compliant. Personally, I do the transforms I need server-side, where I have better control.
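    The branching that a cross-browser wrapper has to do can be sketched roughly as below. This is a hypothetical helper, not Sarissa's API; `env` stands in for the browser's global object so the selection logic can run outside a browser (Mozilla exposes `XSLTProcessor`, IE exposes `ActiveXObject` and string-producing `transformNode`):

```javascript
// Sketch of the processing-model split described above.
// 'dom-to-dom'       -> Mozilla's XSLTProcessor
// 'string-to-string' -> IE's MSXML transformNode
// 'server-side'      -> fallback, e.g. for IE 5.0's non-compliant XSLT
function pickXsltStrategy(env) {
    if (env.XSLTProcessor) {
        return 'dom-to-dom';
    }
    if (env.ActiveXObject) {
        return 'string-to-string';
    }
    return 'server-side';
}
```

    A library like Sarissa hides this split behind one call; the point here is only that the two models produce different result types (a DOM fragment vs. a markup string), so the caller's code has to normalize one into the other.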
  20. There is no need to use XSLT; just send the HTML from the server directly as text and do:

    element.innerHTML = htmlFromServer;
  21. Better web browsers?

    In most scenarios, we have to reload the whole page for a minor change in the web page. AJAX is way better.

    I hear this argument for AJAX techniques pretty often. It seems that typically, this argument is based on the fact that when we reload a page in a browser, the entire browser (or at least the active tab/window) becomes more-or-less unusable until the entire page is returned from the server, parsed, and rendered.

    Might it be better to invest time in making Firefox handle page reloads in a more intelligent way? For example, if the browser were to download the page on a reload in the background and then diff it against the currently-displayed page, it could do a significantly better job of updating just those parts of the page that need to be redrawn.

    I've no idea if Firefox's architecture is amenable to doing this, but given its growth of late, it seems like innovating in Firefox could be a good way to drive the other browsers to improve. And clearly, it's better to fix UI problems like ugly refreshes at the browser level, rather than by layering an entire non-portable infrastructure on top.

    (For all of you about to respond to my "non-portable" comment: read this thread, and notice that the linked-to site doesn't work on Safari, and one post takes into account DOM-to-DOM behavior vs. string-to-string behavior, and then goes on to highlight that IE5 (which is probably one of the more common browsers still) doesn't really support either behavior.)

    Standard TSS disclaimer: no, I'm not against all that is AJAX. I'm really just posting about the quoted-and-discussed issue.

    -Patrick

    --
    Patrick Linskey
    http://solarmetric.com
  22. Better web browsers?

    One problem is that the web browser has to deconstruct/free memory for everything in use, load the new page, parse the document tree, start the scripts, etc.. when it loads the new page. An AJAX model does keep that from happening.. for minor replacements, there is much less overhead with the AJAX route, even with a slower connection..

    Personally, for things like forms, monitoring applications, and things like that, I think AJAX is snazzy. I don't really have a big pro or con feeling towards JavaRSS doing it for their preview/overview things.. but its certainly good to have things tested in different ways.
  23. Better web browsers?

    One problem is that the web browser has to deconstruct/free memory for everything in use, load the new page, parse the document tree, start the scripts, etc.. when it loads the new page.

    The problem isn't that the browser *has* to do that; the problem is that the browser *does* do that. It is not hard to imagine a browser that reused vast portions of the previously-displayed page to avoid fully throwing things away every time, and did so sufficiently intelligently as to avoid mem leaks, page inconsistencies, etc.

    -Patrick

    --
    Patrick Linskey
    http://solarmetric.com
  24. Better web browsers?

    One problem is that the web browser has to deconstruct/free memory for everything in use, load the new page, parse the document tree, start the scripts, etc.. when it loads the new page.
    The problem isn't that the browser *has* to do that; the problem is that the browser *does* do that. It is not hard to imagine a browser that reused vast portions of the previously-displayed page to avoid fully throwing things away every time, and did so sufficiently intelligently as to avoid mem leaks, page inconsistencies

    Agreed that this would be a sensible improvement to current browsers. But on its own it doesn't really improve the user-experience much. You'll still have to sit there twiddling your thumbs while the page downloads in the background. We need XMLHttpRequest (or something like it).

    Chris
  25. Better web browsers?

    Agreed that this would be a sensible improvement to current browsers. But on its own it doesn't really improve the user-experience much. You'll still have to sit there twiddling your thumbs while the page downloads in the background. We need XMLHttpRequest (or something like it).

    (Again, recall the disclaimer in my original post: "no, I'm not against all that is AJAX. I'm really just posting about the quoted-and-discussed issue.")

    My original post was regarding the concept that AJAX techniques provide a solution for the annoyance of needing to fully redraw a page just to see relatively minor content changes. The AJAX solution to this problem is a hack that prevents the browser from throwing away the current page and starting from scratch. If the browser itself were smart enough to do this when it detected substantively similar blocks of HTML, this hack would not be necessary.

    Let's assume that for most websites, the latency of the network roundtrip and the latency of the drawing time are the big time sinks, and that the size of the raw (compressed) HTML for the page itself (i.e., not images etc.) has a minimal impact on the time of the request. Given these assumptions, an XMLHttpRequest-based approach will take roughly the same amount of elapsed time as a "normal" HttpRequest. So, the benefit that AJAX brings to the table for these types of situations is that it provides a workaround to the fact that browsers *do* redraw everything from scratch pretty much every time. The user still has to twiddle thumbs while waiting for the XMLHttpRequest downloads in the background, and the elapsed time for that download is on the same order of the elapsed time for the full-page (sans images etc.) download. The benefit is that browsers do a somewhat-decent job of letting bits of JavaScript do some page-redrawing.

    However, this UI benefit comes at a cost -- back buttons don't always work quite as expected, development / maintenance time is typically longer, compatibility is often sacrificed, etc.

    If someone were to spend the time to optimize a browser to deal well with loading pages that are vastly similar to their predecessors, all HTML-based websites would immediately benefit (when accessed with that browser). As I had attempted to make clear in my original post, there is clearly still a place for AJAX techniques, as the HTML primitives are, well, primitive, and do not really allow for much fancy stuff (except for everyone's favorite, of course, the venerable BLINK tag).

    -Patrick

    --
    Patrick Linskey
    http://solarmetric.com
  26. While on the surface it may seem like this would help, I wonder if this would really improve the performance of the app.
    Bandwidth shouldn't be your only concern. Each HTTP request comes with overhead, not to mention the overhead of each DB call and servlet invocation, etc.
    Is one 50k request really worse than one 40k request and 10-20 small description requests? It'd probably take some good load testing to find out.

    This is an excellent point, which suggests that browser-based caching is a major decision to be made in AJAX applications.

    It's not an all-or-none thing. With AJAX, you can continue to download in the background if you want. Doing so would let you optimise the user experience at the expense of bandwidth, but you'd only end up downloading as much as the old-style page was downloading anyway (though with more connection overhead).

    Also, you can try to anticipate what the user will do next, and pre-fetch likely data. In JavaRSS, it might make sense to fetch the data for all five items in a feed, whenever the first one is selected. If you look at Google Maps carefully, it seems to be doing just that. Notice you can move a little in any direction, and the map will immediately be refreshed. Move faster and further, and it will have to be downloaded.
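    The pre-fetching idea can be sketched as a small cache wrapped around whatever does the actual request. This is a hypothetical helper (the fetcher is injected so the caching logic works without a network); in a real page the fetcher would be the XMLHttpRequest call:

```javascript
// Sketch of a prefetch cache for item descriptions.
// fetchFn(key) does the real work (e.g. an AJAX call); the cache
// guarantees it runs at most once per key.
function makePrefetchCache(fetchFn) {
    var cache = {};
    function load(key) {
        if (!(key in cache)) {
            cache[key] = fetchFn(key);
        }
        return cache[key];
    }
    return {
        // call ahead of time, e.g. for all five items of a feed
        // when the first one is hovered
        prefetch: load,
        // call on demand; hits the cache when prefetch got there first
        get: load
    };
}
```

    With this shape, hovering the first item of a feed could `prefetch` its siblings, so later hovers resolve instantly from the cache, which is roughly the behavior the post observes in Google Maps' tile loading.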
  27. Interesting..

    I would think that, with the overhead of processing the extra requests, HTTP headers, and such, this would (in this case) not be any more efficient. The main page, with compression running, would be pretty light (since the text should compress pretty well).

    That said.. I could certainly see the difference when I turned Ajax on. Previously things would simply show the title, but now I had nice JavaScript errors on some entries, but not others.. it was actually difficult to navigate out past the errors to change my settings. To be fair, I'm on Safari which may have not been tested, but perhaps the task of error handling (javascript exception catching) should be considered in these Ajax applications..
  28. Tested on

    It was tested on Firefox, Opera and IE. Sorry. I don't have access to Safari. If you can email the errors to webmaster {at} javarss {dot} com, I will try to fix them.
  29. 'AJAX in Action' on JavaRSS

    There is a trick to avoid XMLHttp:
    http://www.servletsuite.com/servlets/jscalltag.htm
  30. 'AJAX in Action' on JavaRSS

    Use Sarissa, it's dead easy to use.

    quick tutorial:
    http://www.xml.com/pub/a/2005/02/23/sarissa.html