JSON-RPC - JavaScript to Java Remote Scripting


  1. JSON-RPC - JavaScript to Java Remote Scripting (66 messages)

    JSON-RPC-Java enables a new breed of fast and highly dynamic enterprise Java web applications (using techniques similar to Gmail and Google Suggest) with a minimum of coding required.

    Some terms explained:

        * JSON (JavaScript Object Notation) is a lightweight data-interchange format with language bindings for C, C++, C#, Java, JavaScript, Perl, TCL and others.
        * JSON-RPC is a simple remote procedure call protocol similar to XML-RPC although it uses the lightweight JSON format instead of XML.
        * The XMLHttpRequest object (or the MSXML ActiveX control in the case of Internet Explorer) is used in JavaScript to call remote methods on the server via HTTP without reloading the page.
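    To make the protocol concrete, here is a minimal sketch of the JSON-RPC 1.0 envelopes exchanged over HTTP. The method name "echo" and the helper functions are illustrative, not part of any library:

```javascript
// Build a JSON-RPC 1.0 request envelope: {"method": ..., "params": [...], "id": ...}
function makeRequest(method, params, id) {
  return JSON.stringify({ method: method, params: params, id: id });
}

// Interpret a JSON-RPC 1.0 response envelope: return the result,
// or throw if the "error" member is set.
function readResponse(text) {
  const res = JSON.parse(text);
  if (res.error !== null && res.error !== undefined) {
    throw new Error(typeof res.error === "string" ? res.error : JSON.stringify(res.error));
  }
  return res.result;
}

const request = makeRequest("echo", ["Hello JSON-RPC"], 1);
// A conforming server would answer an echo call like this:
const response = '{"result": "Hello JSON-RPC", "error": null, "id": 1}';
console.log(JSON.parse(request).method);  // prints "echo"
console.log(readResponse(response));      // prints "Hello JSON-RPC"
```

    In the browser the request string would be POSTed with XMLHttpRequest and the response text fed through the same parsing step.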

    Overview

    JSON-RPC-Java is a dynamic JSON-RPC implementation in Java. It allows you to transparently call server-side Java code from JavaScript. It is designed to run in a Servlet container such as Tomcat and can be used with JBoss and other J2EE Application servers to allow calling of plain Java or EJB methods from within a JavaScript DHTML web application.

    Minimal or no changes are necessary to existing server-side Java code to allow calling from JavaScript (no hand-written marshalling or unmarshalling of special types), as JSON-RPC-Java dynamically maps JavaScript objects to and from Java objects using Java reflection. JSON-RPC-Java allows simple exporting of Java objects by reflecting on their method signatures (a single line of code is required to provide access to all public methods of a Java object).
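    The client half of that arrangement is easiest to see in a sketch. The names below (the "calculator" object, the transport function) are invented for illustration and are not the library's actual API; a stub transport stands in for XMLHttpRequest so the sketch is self-contained:

```javascript
// Stub transport standing in for an HTTP POST via XMLHttpRequest.
// It pretends the server has exported an object "calculator" with an "add" method.
function stubTransport(requestText) {
  const req = JSON.parse(requestText);
  if (req.method === "calculator.add") {
    return JSON.stringify({ result: req.params[0] + req.params[1], error: null, id: req.id });
  }
  return JSON.stringify({ result: null, error: "no such method: " + req.method, id: req.id });
}

// Build a proxy object whose methods marshal their arguments into
// JSON-RPC calls of the form "objectName.methodName".
function makeProxy(objectName, methodNames, transport) {
  const proxy = {};
  let nextId = 1;
  for (const name of methodNames) {
    proxy[name] = function (...args) {
      const reply = JSON.parse(transport(JSON.stringify({
        method: objectName + "." + name,
        params: args,
        id: nextId++
      })));
      if (reply.error) throw new Error(reply.error);
      return reply.result;
    };
  }
  return proxy;
}

// The call below reads like a local method call but travels as JSON-RPC.
const calculator = makeProxy("calculator", ["add"], stubTransport);
console.log(calculator.add(2, 3)); // prints 5
```

    The point of the design is exactly this transparency: once the server reflects over an object's public methods, client code calls them as if they were local.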

    Features

    JSON-RPC-Java currently supports transparent marshalling and unmarshalling of primitive types, Arrays, Java Beans, List, Map and Set interfaces and their concrete implementations as well as any nested combination of these.
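    As a rough illustration of what such marshalling can look like on the wire (the bean and its fields below are invented, and the exact envelope the library emits may differ by version), a type hint such as a "javaClass" property lets nested Java collections travel as plain JSON while still being unmarshallable back into concrete Java types:

```javascript
// Invented example of a marshalled Java bean carrying type hints.
// Nested List and Map values arrive in JavaScript as ordinary arrays/objects.
const wire = `{
  "javaClass": "com.example.Order",
  "items":      { "javaClass": "java.util.ArrayList",
                  "list": ["book", "pen"] },
  "attributes": { "javaClass": "java.util.HashMap",
                  "map": { "priority": "high" } }
}`;

const order = JSON.parse(wire);
// On the JavaScript side the structure reads as plain nested data:
console.log(order.items.list[1]);            // prints "pen"
console.log(order.attributes.map.priority);  // prints "high"
// The "javaClass" hints tell the server-side unmarshaller which
// concrete Java classes to reconstruct.
```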

    JSON-RPC-Java has basic ORB (Object Request Broker) functionality: it supports opaque references instead of pass-by-value, and dynamically instantiates callable proxy objects for classes returned from factories (non-value objects). This allows for very transparent Java and JavaScript interworking.

    Supports Internet Explorer 6, Mozilla 1.7, Firefox 1.0, Safari 1.2, Opera 8 Beta, Konqueror 3.3 + patch [1] (see website for details): http://oss.metaparadigm.com/jsonrpc/


  2. Interesting time relationship between this article posting and the one on /.

    ...

    Dah well, it is very interesting indeed!

    I would like to add some feedback as well as some questions.

    I have made several web applications on several platforms, with varying degrees of static and dynamic content. At this point I am an expert with JavaScript: I have done several applications that use web services called by JavaScript libs, and I have implemented single pages with well over 30,000 lines of OO JS. Most of the time these projects have been dictated by a system architect, but now I am becoming the architect most of the time, and although I am proficient at using DHTML to its fullest, I often question whether it is reasonable to do certain things that only dynamic web pages can do.

    We need some rules of thumb as browsers begin to support richer application requirements!

    (plz use examples)

    1. When is it reasonable to add functionality while losing browser compatibility?
    2. Has anyone implemented any kind of security using js frameworks or just RPC using js?
    3. How much of your web experience using DHTML is for internal corp/college networks versus the WWW?
    4. Any horror stories - in my experience - js is a huge management issue - not proficiency issue.
    5. Any examples on how to manage xslt and js?

    A friend of mine works at a place where the Java tech lead made a JavaBean that was used to dynamically create JavaScript...

    ANY THOUGHTS?

    Mostly curious to hear examples of the best ways to manage large amounts of javascript on the web!

    Thanks,
    ~tim
  3. and i have implemented single pages with well over 30,000 lines of OO js in a single page.
    And you are still sane? I've done a few projects with lots of javascript. Not fun at all.

    If you need that much logic on the client, look at a real rich client - if you are able. Sometimes you have no choice but a browser client (no plugins).

    I can see no reason why Javascript cannot be "dynamically" generated. It seems to work well in Echo and Swing-on-web (to name a few).
  4. Not exactly...

    Hi Mark,

    I would definitely say I lost a fair amount of sanity on that project!! It was almost 3 years ago, and since then I have always advised against using JavaScript as much as possible. On the last project I did, I made each version of the web app revolve around the level of JavaScript: level 1 had none, level 2 had full browser compatibility, and level 3 had browser-specific implementations.

    As far as dynamic JavaScript goes, it is certainly possible, but as far as being a conceptually good architecture, I think generating client code dynamically on the server crosses more layers of separated functionality than I would think to be a good idea. Essentially it couples a JavaBean to a client script...
  5. 1. When is it reasonable to add functionality while losing browser compatibility?

    Never, if you have issues like this coming up, then go back to the drawing board and figure out where the design flow went wrong.
    2. Has anyone implemented any kind of security using js frameworks or just RPC using js?

    JS can be disabled in a browser, and who knows what might come out in a new IE Service Pack that will block possibly unlawful use of JS RPC technologies.
    3. How much of your web experience using DHTML is for internal corp/college networks versus the WWW?

    They are the same medium.
    4. Any horror stories - in my experience - js is a huge management issue - not proficiency issue.

    Yeah, JS mixed with JSTL to set variables, etc. This is horrible to manage. Also, regression testing and automated web testing tools will barf with most JS use. Stay away from providing core functionality to your application with JS.

    5. Any examples on how to manage xslt and js?

    Stay away from XSLT, /. had an article a couple weeks ago from XML.com that stated developers should be using CSS instead of XSLT on the web.
  6. Automated web testing tools.

    Hallo Jacob,
    Nice to see you here. (I wrote you yesterday about the annotation based validation thing).
    Never, if you have issues like this coming up, then go back to the drawing board and figure out where the design flow went wrong.

    You shouldn't lump the usage of the XMLHttpRequest object with creating an unmaintainable mess of client and server code.

    This technology is useful as an addition to a traditional page-based web application. Clients which support the necessary JavaScript can use a richer user interface which is both quicker for the user and cheaper for the server to put out.

    This shouldn't be done just because you can, of course.
    Each interface should be as simple as possible for the needed functionality. But there are problems and web interfaces which can't be simplified any further. If the complexity of the problem leads to a web application in which page reload times and the amount of user action required begin to interfere with the user experience of that application, this kind of technology can be a solution.

    Httpunit does automated javascript testing via Rhino, btw.
  7. Automated web testing tools.

    Httpunit does automated javascript testing via Rhino, btw.

    Regression testing tools do have support for some JS, but trust me, web designers can generate JS that isn't executable by HttpUnit (probably because of specific browser functionality). We will be rewriting the UI for a couple of our applications because we can't automate any regression testing because of the JS event/behavior used to dictate flow. I would be interested to hear if anyone gets JS RPC working with Httpunit.

    A similar discussion on user interfaces arose with Hans Bergsten's article on designing interfaces with JSF where he found it acceptable not to be able to bookmark any content and to use frames to produce simulated desktop applications.

    All the Best!
  8. Automated web testing tools.

    I know the pains of web designers developing JavaScript all too well. What can help in this case is to implement a JavaScript library with a client-side API to make it more web-designer friendly.
    We will be rewriting the UI for a couple of our applications because we can't automate any regression testing because of the JS event/behavior used to dictate flow. I would be interested to hear if anyone gets JS RPC working with Httpunit.
    I was always successful in limiting JavaScript to an assisting role in application flow on the projects I've been on. You can't simply lock out a double-digit percentage of users.

    As my project is lacking some additional unit tests, I will take a look at the possibilities of using XMLHttpRequest under Rhino soon.
  9. 1. When is it reasonable to add functionality while losing browser compatibility?
    Never, if you have issues like this coming up, then go back to the drawing board and figure out where the design flow went wrong.

    I've found that if you are developing a web app for an intranet, where you can have a more controllable environment (one default browser and its enabled options), then it is possible to risk losing some compatibility. Usually, at least where I work, web apps are very complex and require lots of forms and constant refreshes, and such an RPC mechanism indeed helps a lot.

    Regards,
    Henrique Stekelberg
  10. 1. When is it reasonable to add functionality while losing browser compatibility?
    Never, if you have issues like this coming up, then go back to the drawing board and figure out where the design flow went wrong.
    I've found that if you are developing a web app for an intranet, where you can have a more controllable environment (one default browser and its enabled options), then it is possible to risk losing some compatibility. Usually, at least where I work, web apps are very complex and require lots of forms and constant refreshes, and such RPC mechanism indeed helps a lot.
    Regards,
    Henrique Stekelberg

    Funny how we use the web/browser so we don't have to worry about the "client" and then we go and limit it to one "client".
  11. Funny how we use the web/browser so we don't have to worry about the "client" and then we go and limit it to one "client".
    Mark,

    Where I work we are more worried about easier client deployment/maintenance than about browser "independence". We have some C++ client-server apps still lingering around, with hundreds of users spread throughout the whole country (Brazil), and they surely are a headache when it comes to client updating whenever there's a new maintenance release. Some have automatic update features, but even those sometimes give us trouble, and besides, that's more code to worry about too. We simply don't have to worry about it with web apps.

    Regards,
    Henrique Steckelberg
  12. Henrique,

    Web apps might be revolutionary compared to direct client/server applications to be delivered around your country. However, you can (and should) use application servers with rich clients as well -> implement business logic in your server application and keep the rich clients "thin" too. Nowadays WebStart works pretty well for Java client distribution.

    When targeted at a large group of end-users, a web application certainly has its advantages (the right JRE doesn't have to be downloaded, proxy settings don't need to be set for WebStart, etc.). But creating complex clients with JavaScript still has several drawbacks: the application just doesn't work with every user's browser.

    Tomi
    Henrique,
    Web apps might be revolutionary compared to direct client/server applications to be delivered around your country. However, you can (and should) use application servers with rich clients as well -> implement business logic in your server application and keep the rich clients also "thin". Nowadays Webstart works pretty well in Java client distribution. When targeted to a large group of end-users, a web application certainly has its advantages (the right JRE doesn't have to be downloaded, proxy settings doesn't need to be set for the WebStart etc). But creating complex clients with JavaScripts still has several drawbacks: the application just doesn't work with every user's browser.
    Tomi
    Tomi,

    We have problems trying to deploy JRE to the whole company, besides, there are really old pentiums still lying around, which would not run Java so well (or maybe at all!). I think in this case http://thinlet.sourceforge.net could help us.

    Regarding JavaScript not working with every browser, that is what I was talking about with a "controlled environment": you know that all users will have the _same_ browser all the time, and that is the case where I work. The company has standardized on a specific browser for its intranet, so we are confident that differences won't affect us. Of course, it wouldn't be like this in an internet environment.

    I have already made some tests with http://xmlrpc.kollhof.net/, and it worked beautifully across different browsers. It was like going back to the old client-server days, but with a thin client... full logic on the server; the browser just creates/updates the views and sends form data back. An eerie feeling, after working for so long with web apps.

    Regards,
    Henrique Steckelberg
  14. Funny how we use the web/browser so we don't have to worry about the "client" and then we go and limit it to one "client".
    Mark,
    Where I work we are more worried with easier client deployment/maintenance than with browser "independence". We have some C++ client-server apps still lingering around, with hundreds of users spread thoughout the whole country (Brazil), and they surely are a headache when it comes to client updating, whenever there's a new maintenance release. Some have automatic update features, but even that sometimes give us troubles, besides that's more code to worry about too. We simply don't have to worry about it with web apps.
    Regards,
    Henrique Steckelberg
    Henrique,
      I understand your needs and reasoning. I know the joy of distributing client-server apps. I also know the pain of developing and maintaining webapps. And deploying is not always that easy either, especially when your app becomes clustered and distributed. But with WebStart and SmartClient (.Net) the tables have turned.
      What I really meant is that the web/browser paradigm was supposed to make the client unimportant and (thanks Bill) it still is.

      I see in another post you mention older machines. It is probably a bad assumption, but I assume you are targeting IE? Current versions of IE will have a tough time running on older machines. I have run a pretty current JVM on an older Pentium laptop running Windows 9x. It ran quite well. Also, a comparable webapp (with all the JavaScript and HTML on the client) uses pretty close to the same memory as the webstarted Java app. The Java app uses a bit more, but it does more.

      Another thing to consider is the future of the IE browser. It is not in MS's best interest to continue allowing non-IE browsers to access Windows-server-generated content, or to allow next-gen Windows browsers to access apps generated by non-Windows servers. No one says they have to continue supporting the web in its current form. Looking at what they are doing with IE and Longhorn, I don't think they will. And I bet IE browser viruses will be part of the "reason". And I can't say that I blame them. (I know, pass the crystal ball back to Rolf :) )

    Mark
  15. I know pass the crystal ball back

    OK, Mark, :)

    "Looking at what they are doing with IE and Longhorn"

    What they are doing with Avalon/XAML and Longhorn is the only totally satisfactory solution to the problem (a version for Windows XP will be available when Longhorn arrives). It is one of the biggest happenings in MS history. A 1-2-3 punch.

    1) Clickonce installations
    2) Declarative programming - XAML
    3) The full power of "SVG" (MS version of it!)

    You also have the option to let your application run inside the browser. The user doesn't even need to know that he is running a "real" application.

    But by that time (Longhorn arriving) I am sure that Linux will have something similar. It is not so difficult. If a small company like Xamlon has copied 80% of XAML in one year, I am sure that Miguel de Icaza and his team can do it in half the time. The biggest challenge is actually the code security.

    Regards
    Rolf Tollerud
  16. I know pass the crystal ball back

    I am sure that Linux will have something similar. It is not so difficult.
    More than likely via Java. Most of that exists today in Java. Maybe not 100% either way but close enough.

    The point is that MS is moving in the direction that Rolf is pointing out. I just had a discussion with a .Net architect about it (before this post). Continuing to directly target Windows tools and platforms with non-Windows tools and platforms is not a good idea. And there is no need.

    For info on Java based XForms see - http://www.theserverside.com/news/thread.tss?thread_id=23491
    http://developer.novell.com/xforms/
  17. And how do you marry SVG and xForms?

    "Continuing to directly target Windows tools and platforms with non-Windows tools and platforms is not a good idea."

    On the contrary I think that is a very good idea, saving 99% of the cost.

    "And there is no need." (because of xForms)

    Tinic Uro (Principal Engineer on the Macromedia Flash):
    XAML is a SVG derivation in many points. But where SVG is incredibly complex, still lacking some basic functionality, slow and almost impossible to implement 100% correctly, XAML is simply to the point of what developers and designers need.

    Remember that Tinic is not working for MS but is in fact a competitor.

    IMO, W3C has lost touch with reality. No standard should be accepted without there being a working implementation. That was how it worked until 2-3 years ago, before they got hubris.

    Regards
    Rolf Tollerud
  18. The future ...

    On the contrary I think that is a very good idea, saving 99% of the cost
    I am talking about the future. In the future, as I gaze into my crystal ball, I see Windows not supporting non-MS-generated content. At least not to the same level (kinda like today). Again, I am not saying they are bad, but that we need to wake up and smell the coffee.
    "And there is no need." (because of xForms)
    I didn't mean for the two to go together. It was more of a reflection of what I had said earlier. Java has everything we need. Some of it needs more work. But so does the .Net world.
  19. I know pass the crystal ball back

    More than likely via Java. Most of that exists today in Java. Maybe not 100% either way but close enough.

    I want to believe you that Java could be a next-generation client platform. But every time Mozilla is ported to Java (Javagator, Jazilla), it dies quickly. Batik is way behind schedule, stagnant, and almost useless for interactivity. Even WebStart (which serves Java up on a silver platter) has almost no penetration.

    As long as 99.99% of the desktops run JavaScript and 98% of the desktops run MS-Windows, the case for Java on the desktop is awkward. JavaScript already accounts for most of the world's logic sent to clients, and as SVG and other XULs rise, the importance of JavaScript surely grows. The next few years likely focus on object-oriented server-side frameworks for encapsulating client JavaScript. I'm predicting more JavaScript and less hand-coding of it. Moore's Law is on JavaScript's side.
  20. I know pass the crystal ball back

    As long as 99.99% of the desktops run JavaScript and 98% of the desktops run MS-Windows, the case for Java on the desktop is awkward.

    I don't really think that the problem with desktop Java lies in Windows as an OS (or Windows adoption). I have yet to see Java as a desktop platform widely used on Linux. On Linux most of the UI is done in C/C++ using developer-unfriendly libraries such as GTK (in comparison to which Swing is like VB against the Windows platform SDK).
  21. I know pass the crystal ball back

    As long as 99.99% of the desktops run JavaScript and 98% of the desktops run MS-Windows, the case for Java on the desktop is awkward.
    I don't really think that the problem with desktop Java lies in windows as an OS (or windows adoption). I have yet to see Java as a desktop platform widely used on Linux. On linux most the UI is done in C/C++ using developer unfriedly libraries as GTK (in comparison to which Swing is like VB against the windows platform SDK)

    The major problem is the continuing love affair with web apps. This app proves it. So Sun and IBM and ... etc's focus has been on web apps by necessity. It is a vicious cycle.

    "We only do web apps cause they are easier to maintain/deploy." "Where are all the Java desktop apps?" "We aren't going to do Java desktop apps cause no one else is and it isn't well supported." "We aren't going to support the tools to do them cause no one is gonna do them and no one is asking for it."
  22. I now pass the crystal ball back

    "This app proves it" - should read - "This thread proves it"
  23. I know pass the crystal ball back

    Mark, let me take a look at Rolf's crystal ball for a minute, please? :)

    The day we see Windows/Linux/Mac/Unix come with the JRE preinstalled is the day people will start thinking about doing Java client apps. Until then, the web will be mainstream.

    The only RIA framework that can prove the above statement wrong, IMO, is Thinlet (http://thinlet.sourceforge.net), as it runs on almost any JRE available today (applets inside browsers, MS's, MIDP, JRE 1.1, etc.). I think this project is the only one capable of making people really step forward into real RIA in Java. The burden of getting the JRE broadly installed is too high, so a project that can use any existing JRE will be best positioned to be adopted.

    MS is just doing it with XAML and Avalon, Indigo, or whatever nickname it has. They are just copying existing Java technologies, packaging them and spoon-feeding us, as they always have. I'm just curious as to how they are going to reach the millions of Win98 users out there. I doubt those users will upgrade just so they can use the latest and greatest; they would've done it by now if that were a reason. Maybe an ActiveX plugin that will run XAML apps, or something along those lines.

    Ok Rolf, take that crystal ball back, I'm done with it! :)

    Regards,
    Henrique Steckelberg
  24. The only RIA framework that can prove above statement is wrong, IMO, is Thinlet, as it runs on almost any JRE available today (Applets inside browsers, MS', MIDP, JRE1.1, etc).

    Thinlet might be great for obsolete systems, but it's a horrible example of a modern XUL. It uses XML almost as an afterthought. There's no support for CSS, scripting, or DOM. A real XUL, such as SVG, exposes all of these.
    The burden of having JRE broadly installed is too high...

    I understand our curse, but isn't the Moore's-Law-like exponential adoption of broadband destined to make plugins mainstream? If not, then Mark's correct that MS can usurp the Web simply by monopolizing the terminals.
  25. Thinlet might be great for obsolete systems, but it's a horrible example of a modern XUL. It uses XML almost as an afterthought. There's no support for CSS, scripting, or DOM. A real XUL, such as SVG, exposes all of these.
    I wouldn't call MIDP devices obsolete, but I think that wasn't your point. My point is, if you want the broadest reach, you'll have to go with the least common denominator. Thinlet is, IMO, the closest you'll get to that today. (PS: there are skinnable and scriptable add-ons for Thinlet.)
    I understand our curse, but isn't the Moore's-Law-like exponential adoption of broadband destined to make plugins mainstream? If not, then Mark's correct that MS can usurp the Web simply by monopolizing the terminals.
    I've read an interesting book titled "Telecosm", where the author makes great use of Rolf's crystal ball and predicts that if broadband really increases exponentially, we'll eventually reach a point where we won't even have installed software or hard disks any more, since all information will be a click away through ubiquitous wireless multiterabit connections. Why store anything locally when you can get it instantaneously from anywhere around the globe? :)
    Let's not abuse Rolf's crystal ball, and get back to the present. I heard before, back when the internet was just beginning to reach the masses and Netscape vs. IE was the battle du jour, how MS would hijack the internet. It didn't happen then, and I think it won't ever happen, given the internet's democratic environment and the fact that in order for it to exist there must be standards, a thing MS is not too famous for supporting, though perhaps famous for imposing by using its desktop monopoly. But no monopoly will replace HTML, not for a long time as I see it; it is too widespread now. It's the least-common-denominator law working again, and sometimes it's better to use it for our benefit instead of trying to fight against it.

    Sorry for babbling so much.

    Regards,
    Henrique Steckelberg
  26. ...predicts that if broadband really increases exponentially, we'll eventually reach a point where we won't even have installed software nor hard disks any more, since every and all information will be a click away though ubiquitous wireless multiterabit connections.

    I don't consider growth of shared infrastructure as something that discourages client processing.

    I think compound documents are the future of GUIs. Mark mentioned Eclipse pluggability as a very crude model, but one that proves the desirability of gathering multiple plugins to achieve capabilities beyond what any one frozen distribution can offer.

    For me the question is whether W3C and Mozilla can properly craft namespace pluggability for compound document presentation before Avalon becomes universally accepted as a leading desktop middleware solution. If Avalon gets entrenched first, it likely imposes a web presentation model so broad that bodies such as W3C and Mozilla might need to reconsider themselves.

    Rolf considers serverside Java interesting enough to hang out with us here. But I have never heard him praise Java on the client nor chat at clientside JavaLobby. And now Avalon desktop middleware seems to be captivating him.
    It didn't happen then, and I think it won't ever happen, given internet's democratic environment, and the fact that in order for it to exist, there must be standards, a thing MS is not too famous for supporting...

    HTML and HTTP are the result of European government funding. During a brief few years of the Web emerging while Microsoft picked its nose, democratic standards were economically possible. I doubt the same commercial mistake will be made twice.
  27. I don't consider growth of shared infrastructure as something that discourages client processing.
    I was not talking about processing, but storage.
    I think compound documents are the future of GUIs. Mark mentioned Eclipse pluggability as a very crude model, but one that proves the desirability of gathering multiple plugins to achieve capabilities beyond what any one frozen distribution can offer.For me the question is whether W3C and Mozilla can properly craft namespace pluggability for compound document presentation before Avalon becomes universally accepted as a leading desktop middleware solution. If Avalon gets entrenched first, it likely imposes a web presentation model so broad that bodies such as W3C and Mozilla might need to reconsider themselves.
    Well, regarding Mozilla/W3C versus MS's Avalon, I would put my bets on SVG instead of XAML. IMO, this would be the democratic standard that could really replace HTML in the future, not some company's proprietary technology. MS's release of Avalon will make other companies and the internet community open their eyes to this, I suppose.
    HTML and HTTP are the result of European government funding. During a brief few years of the Web emerging while Microsoft picked its nose, democratic standards were economically possible. I doubt the same commercial mistake will be made twice.
    Yes, the situation now is very different from back then, but even so, I still think there's limited space for private companies to try to impose new technologies on the internet, much less to take control of it. The internet as it is now is too heterogeneous for something like that to be possible, IMO. Even MS with their monopoly (90% of the browser market) couldn't just take HTML and change it freely as they wanted. I doubt they will come up with something so wonderful that it will simply replace overnight what's out there now; not even an open standard would. Such a change would take years, if ever.

    Regards,
    Henrique Steckelberg
  28. elusive animal

    Luke
    IMO sometimes you just have to tell the customer that "always being right" can have a hefty premium - and if they want a zero maintenance, zero install, web delivered, server driven, rich client, browser based application then they'd better have a big bank balance and a stomach for disappointment.
    Can anyone show me a "zero maintenance, zero install, web delivered, server driven, rich client, browser based application"? Please, I want to see it!

    So come on, give it to me: where are the stunning web-browser applications? Please! And I don't mean Flash-only apps (though perhaps cases where Flash is used as a component technology rather than taking over the whole screen).

    Henrique
    Why store anything locally when you can get it instantaneously from anywhere around the globe? :)

    If that happens, it is equivalent to handing everything to Microsoft.

    Regards
    Rolf Tollerud
  29. elusive animal

    Why store anything locally when you can get it instantaneously from anywhere around the globe? :)
    If that happens is equivalent to handing everything to Microsoft.
    Regards
    Rolf Tollerud
    Well, would you trust all your data and files to MS? ;) This is a matter of trust, as much as you trust Orkut with your personal information, or some blog site with your diary, or some ISP with your web pages. If there are mechanisms providing real security and trust on the future internet, nothing will keep us from storing everything remotely instead of locally.

    Regards,
    Henrique Steckelberg
  30. it is still time to build up Java XAML with SWT

    Henrique,
    "I would put my bets on SVG instead of XAML"

    As Tinic Uro said (http://www.kaourantin.net/):
    "But where SVG is incredibly complex, still lacking some basic functionality, slow and almost impossible to implement 100% correctly, XAML is simply to the point of what developers and designers need"

    "nothing will hold us from storing everything remotely instead of locally."

    Avalon/XAML/ClickOnce allows for offline storage, so most probably you will have a local backup of your most important data. And on the server it is your company's data store; MS is not involved. Hailstorm failed, remember?

    Brian,
    "I have never heard him praise Java on the client nor chat at clientside JavaLobby"

    Actually, I would if SWT were universally accepted by the Java community and something like the Excelsior SWT compiler were part of the official Java SDK distribution. I still think it is the best and most lightweight way to develop client GUIs today. But as it is now - never preoccupy yourself with far-out, little-known technologies.

    That Rod & Juergen chose Swing to build client-side Spring I find astonishing.

    Regards
    Rolf Tollerud
  31. SWT, XML Put True Cross-platform GUIs Within Reach (2 years ago)
  32. "I would put my bets on SVG instead of XAML" As Tinic Uro said (http://www.kaourantin.net/): "But where SVG is incredibly complex, still lacking some basic functionality, slow and almost impossible to implement 100% correctly, XAML is simply to the point of what developers and designers need"
    HTML is somewhat complex (add HTML + XML + JavaScript + CSS + etc. and see what you get), lacks basic functionality, and it is almost impossible to get 100% compatibility between different browsers, yet MS could not come up with something simple and to the point of what developers and designers need that could replace it. Why would MS beat SVG with XAML?
    "nothing will hold us from storing everything remotely instead of locally." Avalon/XAML/ClickOnce allows for offline storage. So most probably you will have a local backup of your most important data. And at the server it is your company's data store; MS is not involved. Hailstorm failed, remember?
    Rolf, in a (maybe not so) distant future, everyone will have secure IPv6 terabit wireless connections everywhere, all the time. Why store locally if it will always be available instantly, remotely, anywhere you are?
    Maybe Hailstorm's timing was wrong. Where and how exactly data will be stored I can't tell, but IMO it won't be on local storage on your computer; it will be cheaper to keep it on the net than to buy big HDs. Computers will just fetch data, generate a view of it, process it, and send it back. No need for offline mode, because we will _NEVER_ be offline.
    Being always online through huge connections may bring some niceties: you'll never have to worry about software upgrades, because your computer will always fetch the latest from the net automatically. You won't have to worry about backups; that will be the storage service provider's responsibility. Your RAM will become just a short-term cache where data is worked on before being viewed/processed/sent back to remote storage, or where the operating system and application binaries being run are cached. It will be like going back to the terminal + mainframe era again, but with the following differences:
    terminal = smart data viewer + transformer + local processing
    mainframe = distributed storage / distributed processing /load balancing / clustering, not really one big computer but huge CPU + disk farms.

    What would we do with almost infinite bandwidth, processing power and storage? I hope I'll still be alive to see that.

    Enough dreaming :)

    Regards,
    Henrique Steckelberg
  33. dreaming[ Go to top ]

    "almost infinite bandwidth, processing power and storage? I hope I'll still be alive to see that."

    Me too
  34. I know pass the crystal ball back[ Go to top ]

    Mark, let me take a look at Rolf's crystal ball for a minute, please? :)
    Sure, have at it.

    I sort of feel like one of the 3 witches passing the eyeball around. :)
  35. I know pass the crystal ball back[ Go to top ]

    I want to believe you that Java could be a next generation client platform. But every time Mozilla is ported to Java (Javagator, Jazilla), it dies quickly.

    If MS has its way, the browser, as we currently know it, will go away. The web will be used like Eclipse uses it - for downloading apps, communication and delivering content. I am not talking about a Java browser that does everything that Firefox, for example, does today. Take a look at what is happening at Eclipse.org with the RCP.
  36. I know pass the crystal ball back[ Go to top ]

    JavaScript already accounts for most of the world's logic sent to clients, and as SVG and other XML UI languages rise, the importance of JavaScript surely grows. The next few years will likely focus on object-oriented server-side frameworks for encapsulating client JavaScript. I'm predicting more JavaScript and less hand-coding of it. Moore's Law is on JavaScript's side.

    Just for fun I am going to predict (don't stone me if I am wrong :) ) that MS will not include JavaScript support in their future browser (probably in Longhorn + 1) due to all the viruses, and only include support for basic HTML and their rich client technology. It will all be for our protection. :)
  37. JSON-RPC - JavaScript to Java Remote Scripting[ Go to top ]

    Meant now not know. Wish I could type. :)
  38. Veeeery Interesting.

    I can't help but be impressed with the gmail effort, and if this framework helps mere mortals achieve similar results then great. Personally I would rather see browsers used for viewing documents as originally intended and rich client platforms and/or appropriate browser plugins used for any significant client functionality. Sounds a bit reactionary now - but that sort of thinking was probably revolutionary in the past, and in the roundy-bouty way of these things probably will be again in the future.

    IMO sometimes you just have to tell the customer that "always being right" can have a hefty premium - and if they want a zero maintenance, zero install, web delivered, server driven, rich client, browser based application then they'd better have a big bank balance and a stomach for disappointment.

    One thing I noticed with the demo though - running "Basic Tests" maxed out my CPU for about 20 seconds (Firefox / XP / 2.4GHz CPU / 512MB). It would be interesting to know what sort of overall performance this mode of programming gives - weighing up page refreshes against RPC calls, CPU usage, client machine degradation, etc. If a Swing (or native) app munged your CPU for that length of time calling some tests on a remote server, you'd probably want your money back.
  39. Maxed out CPU[ Go to top ]

    I believe the CPU usage is a bug, with Firefox spinning the CPU while it is waiting for a reply (due to the use of synchronous calls). This doesn't happen, for instance, with Konqueror (at least in my version patched to fix XMLHttpRequest).

    I'd like to track down this bug in Firefox, as it makes synchronous calls (which are much easier to program with) not quite as usable. Firefox seems to disable its whole event loop and spin the CPU while XMLHttpRequest.send() is running, which seems a bit severe.

    I'm planning on adding support for asynchronous calls real soon (although these can be done now with the alternative jsolait JSON-RPC client, I believe). There are no changes necessary on the server side.

    In the next week or so I'll post up a more interesting demo making use of async calls and actually doing something more functional and interesting.
  40. Yes, I also get the same max CPU load when running the test suite. IMO, if you really want a rich client you should go for a Web Start application. Besides the rich UI, a Web Start client also gives you the possibility of local disk caches, offline mode and other nice things for a rich client.
  41. I'm on Explorer 6.1 and the demo is failing with an "[object Error]".
    Works in MSIE 6.0 and in Firefox 1.0. Great stuff!
    When is it reasonable to add functionality while losing browser compatibility?
    I think we should talk about gaining compatibility, not losing it. The zoo of obsolete and non-standard browsers must go. Now we have Mozilla, and to some extent MSIE 6.0 in strict mode, and maybe something else in Mac land ;) These browsers are more or less compliant, with MSIE still being quirky ;) even in strict mode. These browsers support DOM, HTML4/XHTML, JavaScript, CSS, XML/XSL. These browsers are the current standard, and applications should be built for them and for newer ones, not for some antiquated incompatible clients with proprietary tags.

    It is too much to require from all Windows users to install Firefox, but at least they can download MSIE 6.0, the browser was released in 2001!

    Old browsers must not be abandoned, but instead of trying to provide all the new visual and content features that are available, say with CSS or XSLT, to old browsers using proprietary hacks, the pages should just degrade. Users must either move to a newer version, or have less visual appeal and fewer features.
    Stay away from XSLT, /. had an article a couple weeks ago from XML.com that stated developers should be using CSS instead of XSLT on the web.
    Is that so? XSLT is great; one can use it with CSS, not instead of CSS. And XSLT can be used on the server if the client does not support it.
    I can't help but be impressed with the gmail effort
    Yeah, but have you tried to open a message in Gmail in a separate window? You cannot, because the damn links are JavaScript calls, not real links. I hate links like these, because I want to be able to open any link in a new window if I want to. I would say that it is OK to use JavaScript for some automation, but it should not interfere with good old clicking on a link as on a document.
    IMO sometimes you just have to tell the customer that "always being right" can have a hefty premium - and if they want a zero maintenance, zero install, web delivered, server driven, rich client, browser based application then they'd better have a big bank balance and a stomach for disappointment.
    Plain and simple: if they want all this and want to stay within a browser window, they MUST use compliant browser!
    Nowadays Webstart works pretty well in Java client distribution
    Now, who won after the Sun-MS suit, after which Windows and MSIE came without the Java plugin?
    Flash is one option; look, for instance, at http://reservations.ihotelier.com/onescreen.cfm?hotelID=2054 - one screen in place of the normal 6-7!
    The same ugly non-fluid, non-scaling Flash that I am fed up with. Can someone hand me an airsick bag?
    I see in another post you mention older machines. It is probably a bad assumption, but I assume you are targeting IE? Current versions of IE will have a tough time running on older machines. I have run a pretty current JVM on an older Pentium laptop running Windows 9x. It ran quite well.

    Agree, (X)HTML-based compliant web apps need modern browsers, so there is no difference between upgrading MSIE from 4.x to 6.x and installing the JRE. But MSIE is also a browser ;-)
    IMO, W3C has lost touch with reality. No specification should be accepted without there being a working implementation. That was how it worked 2-3 years ago, before they got hubris.
    Is that the way MSIE got its non-standard features, like its box model?
    I want to believe you that Java could be a next generation client platform.
    I used to write Windows GUI apps in plain C with the good old message loop. It was simpler than doing a Swing GUI! Check event listeners in Delphi, and then look at how they are implemented using interfaces in Java. Awkward and unusable. I do not like Swing, and I would rather be doing XHTML/CSS than Swing whenever I have a choice.
    Just for fun I am going to predict (don't stone me if I am wrong :) ) that MS will not include JavaScript support in their future browser (probably in Longhorn + 1) due to all the viruses, and only include support for basic HTML and their rich client technology. It will all be for our protection. :)
    This will happen if all users will use Microsoft Online Banking, Microsoft Money, Microsoft Stores, Microsoft Ebay, Microsoft Yahoo... which will not happen _that_ soon ;-)
    The day we see Windows/Linux/Mac/Unix come with JRE preinstalled, is the day people will start thinking about doing Java client apps.
    And even then people may choose Flash, simply because Flash works almost anytime (God, what would I give for a great Flash-ad killer!), and applets choke on a simple personal ad-filter/firewall.
  42. JavaScript as an application platform[ Go to top ]

    You shouldn't overlook the problems with JavaScript when thinking about using it to build a large client-side application. I speak from experience, having developed a very large client-side JavaScript/DHTML/XML based app (www.youbet.com) starting back in 1999.

    Since we launched, the app has handled about $1 billion in transactions, so we've had financial success. The app accomplished its technical goals as well: low bandwidth utilization, since most of the HTML rendering is done on the client; quick UI response, though still not as fast as a native app; and excellent scalability, since the clients do most of the work.

    HOWEVER, there were a LOT of hurdles to overcome:
    - Downloading of JS files to the client can be problematic. Clients don't always get all your code, it can get corrupted by bugs in the browser, and proxies on the Internet can cache out-of-date copies. We had a lot of problems with this.
    - JS on the client has limitations... for example, the IE security model prohibits accessing an SSL URL from a non-SSL page. We had to resort to ActiveX (gasp) components to provide certain required capabilities.
    - JS isn't really OO, it's "object-based". What this means is no strong typing, no true encapsulation, etc. As your codebase grows this can lead to maintainability issues. JavaScript is great for getting things done fast, but it is a "scripting" language. Trying to build a large application with it is possible, but not the best solution IMO.
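    The missing-encapsulation point can be illustrated with the classic closure pattern, which is the usual workaround. This is a hypothetical sketch (the `makeAccount` example is illustrative, not from the youbet.com codebase):

    ```javascript
    // Sketch: simulating private state with a closure, since JavaScript
    // objects have no access modifiers. The balance variable is only
    // reachable through the returned methods.
    function makeAccount(initialBalance) {
      var balance = initialBalance; // "private" via closure scope
      return {
        deposit: function (amount) { balance += amount; },
        getBalance: function () { return balance; }
      };
    }

    var account = makeAccount(100);
    account.deposit(50);
    // account.balance is undefined; the state is hidden in the closure.
    ```

    It works, but every object pays the cost of its own function instances, which is part of why large codebases in this style get hard to maintain.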

    The browser is becoming an increasingly unstable platform for building robust applications. IE started out as a fairly decent platform, but the ongoing war between the spyware/virus/privacy-invader/popup-spammer side and the anti-spyware/anti-virus/anti-cookie/anti-popup side has left the average user's browser with very unpredictable behavior. Many security products, for instance, can cause serious problems with your JavaScript. Norton Internet Security 2000, for example, treats the comment header "//////" as an "attempted attack". This situation has only intensified in the last few months with XP SP2, and will likely continue to deteriorate.

    My advice: there are now better solutions out there for client development. Consider the alternatives to client-side JavaScript.

    Cheers,

    Henry Stapp
  43. Is the demo busted? http://oss.metaparadigm.com/jsonrpc/test.jsp

    I'm on Explorer 6.1 and the demo is failing with an "[object Error]".

    Dennis
  44. Works on Firefox. Just tested with IE and it doesn't work for me.
  45. 30,000 lines in a single page???? Are you insane, or just fond of unmaintainable software? :)
  46. But my 2 cents... the less logic in the client, the better. So there must be a very good reason for this kind of complexity to exist on the client side.

    On the other hand, if it reduces network traffic big time, it can be used.
  47. Usability[ Go to top ]

    This can take your web applications to a totally different level of usability (look at GMail for a very similar approach).
  48. Google Suggest is good.
    The Gmail sign-in page: very nice.
    Gmail is overusing this, though; the back button becomes useless.

    Useful technique, just don't let it take over your normal page flow.
  49. JS != Maintainable[ Go to top ]

    Hi Other Mark,

    I could not agree with you more! The more JS, the less maintainable a web app - bottom line. Just as it is best to have systems rely on UNIX scripts as little as possible, HTML apps should not rely on JS. Agreed?

    ~tim
  50. JS != Maintainable[ Go to top ]

    IMHO, this remoting technology should only be used in little niche situations and not as a solution for your whole web application. Developers should be more conscious of their page content and design as to provide simpler interfaces that are faster to download and easier to use (this *IS* the web after all). The page request/response lifecycle shouldn't be something you battle with.
  51. JS != Maintainable[ Go to top ]

    IMHO, this remoting technology should only be used in little niche situations and not as a solution for your whole web application. Developers should be more conscious of their page content and design as to provide simpler interfaces that are faster to download and easier to use (this *IS* the web after all). The page request/response lifecycle shouldn't be something you battle with.

    If you can afford to provide simpler interfaces, by all means do it. Unfortunately, I have been in situations (the majority of the time in my line of work) where the customers want an application delivered through the web (for deployment's sake) but very rich in functionality (facilitated data entry, etc.). They also want a zero-install deployment on the client (no plug-ins, no Java, etc.).
    In these situations, advanced DHTML and some form of remoting are a necessity.
  52. JS != Maintainable[ Go to top ]

    If you can afford to provide simpler interfaces, by all means do it. Unfortunately, I have been in situations (the majority of the time in my line of work) where the customers want an application delivered through the web (for deployment's sake) but very rich in functionality (facilitated data entry, etc.). They also want a zero-install deployment on the client (no plug-ins, no Java, etc.). In these situations, advanced DHTML and some form of remoting are a necessity.

    Because of the volatile nature of JS in relation to IE, I would say go for JS to support little UI features, but never to dictate application flow (basically, your application will still function without JS).
  53. JS != Maintainable[ Go to top ]

    Because of the volatile nature of JS in relation to IE, I would say go for JS to support little UI features, but never to dictate application flow (basically, your application will still function without JS).

    I see your point. I agree that by its nature JS is, and must be, limited to enhancing the UI where necessary. I don't use it to provide business functionality. I use remoting only to fetch additional data without reconstructing the whole page.
  54. Fix brown paper bag IE bug[ Go to top ]

    A bug which stopped the JSON-RPC JavaScript client from working in Internet Explorer slipped through my QA for 0.6 (with all that testing of Konqueror, Opera, Mozilla and Safari, I missed IE). Embarrassing.

    I have uploaded a 0.7 that fixes this, so now we have both Opera 8 and Internet Explorer working again.
  55. Waiting for Godot[ Go to top ]

    Eventually Avalon, or some competitor to Avalon/XAML, will be the dominant format for smart clients. But that lies a long way ahead; what to do in the meantime? Web Start or ClickOnce is not so attractive without some form of declarative markup programming style (XAML).

    Flash is one option; look, for instance, at http://reservations.ihotelier.com/onescreen.cfm?hotelID=2054 - one screen in place of the normal 6-7! But no, a pure Flash solution is not the answer; too alien and primitive an environment IMO.

    However, a combination with some part of the screen seamlessly done in Flash, and some part with Java/C#/JSON-RPC, could be a viable option. In real life, compromise usually wins. Then the real work could be done by Java/C#. Xamlon (http://news.com.com/Xamlon+looks+to+beat+Microsoft+to+the+punch/2100-1007_3-5395963.html), for instance, has just released a solution that compiles to the Flash format.

    But asynchronous calls are an absolute must in those kinds of scenarios; do you plan to add them in the future?

    Regards
    Rolf Tollerud
  56. Async calls[ Go to top ]

    Expect to see them in about a week.

    I've been thinking about the API and how to keep it as transparent as it is now (with the proxy object interface matching the host objects).

    Due to JavaScript's flexibility I could add a method to the method (as a way to tell it to do the call async without disturbing its method signature). So I'm thinking something like this.

    For a sync call:

      server.someObject.someMethod('foo', bar, [ baz ]);

    For an async call:

      server.someObject.someMethod.async(callbackFunc);
      server.someObject.someMethod('foo', bar, [ baz ]);

    This would of course have concurrency issues, although I believe JavaScript execution is purely single threaded, no? (The only way to avoid that is to pass the callback directly into the method.)

      server.someObject.someMethod(callbackFunc, 'foo', bar, [ baz ]);

    This could also be done easily as the delegating proxy function for each method can just see if the type of the first arg is a function and go to async mode (as a function will never be passed to a Java method).
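    That first-argument dispatch can be sketched roughly as follows. This is an illustrative mock, not JSON-RPC-Java's actual implementation: `makeProxyMethod` and the echo transport are stand-ins for the real delegating proxy and the XMLHttpRequest wire call.

    ```javascript
    // Sketch: a proxy method that switches to async mode when its first
    // argument is a function. The transport here is a stub that just
    // echoes its arguments back; a real client would use XMLHttpRequest.
    function makeProxyMethod(methodName, transport) {
      return function () {
        var args = Array.prototype.slice.call(arguments);
        if (typeof args[0] === "function") {
          // Async mode: first argument is the callback.
          var callback = args.shift();
          // Simulate the call completing later, off the current stack.
          setTimeout(function () {
            callback(transport(methodName, args));
          }, 0);
          return undefined;
        }
        // Sync mode: return the result directly.
        return transport(methodName, args);
      };
    }

    // Stub transport standing in for the JSON-RPC wire call.
    function echoTransport(method, params) {
      return method + "(" + params.join(",") + ")";
    }

    var someMethod = makeProxyMethod("someObject.someMethod", echoTransport);

    // Sync call: result comes back as the return value.
    var syncResult = someMethod("foo", 42);

    // Async call: result is delivered to the callback instead.
    someMethod(function (result) {
      console.log("async result: " + result);
    }, "foo", 42);
    ```

    The dispatch works because, as noted above, a function will never be a legitimate argument to a Java method, so its presence in the first position is unambiguous.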

    Which approach do people like?

    Michael.
  57. spoiled users[ Go to top ]

    "Which approach do people like?"

    Michael.

    I only had in mind one(1) pending operation. (KISS)

    An important principle with user interaction is, as far as possible, never allow the screen to freeze with the user waiting. My users are an impatient lot! (They weren't brought up by Japanese parents...)

    For example (the underlying protocol in IE):
    var oXMLHTTP = new ActiveXObject(...);
    oXMLHTTP.open("POST", sURL, true);
    oXMLHTTP.onreadystatechange = callbackfunction;
    try {
        oXMLHTTP.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
        // send form data
        oXMLHTTP.send("param1=bbb&param2=ddd");
    } catch (e) {
        alert(...);
    }

    Regards
    Rolf Tollerud
  58. I Have A Dream...[ Go to top ]

    [WARNING: RANT COMING] (Personally, I love reading rants because even if the person is way off base, they usually have that sarcastic sense of humor that I love. So without further ado,...)

     Javascript is like crack cocaine to software developers - once you start, you can't stop. Or if you prefer a slightly less in-your-face description, javascript is like the vines that overtake your house or your fence or your beautiful trees. No big deal in the beginning, right? Just wait.

    We need to make it easier to get OFF javascript - not easier to do more stuff with it - IMHO. I am currently working on a project where javascript has run amok and the only thing I have to be thankful for is the first-hand realization that money alone truly does not buy happiness ;-)

    I know, I know,... "But we need client-side functionality," you say. Use Swing/JDNC - it's coming anyway.
    I know, I know,... "You need to do code reviews. Conscientious code reviews will keep that pesky javascript under control," you say. Code reviews? Hah! Code reviews are like the Yeti or Bigfoot - sure, some people may have caught a quick glimpse of one, but do you REALLY know what one looks like? So code reviews... out, sorry.

    And now, in honor of the late Martin Luther King Jr., I'd like to reprint his famous "I Have A Dream" speech - updated for 2005 and completely distorted to address only software developers.

    I have a dream today!

    I have a dream...
    That one day, when you say "javascript", every man, woman and child, all over the world will go "huh?"

    I have a dream today!

    I have a dream...
    That one day, I'll be able to figure out IN LESS THAN A DAY, where the hell a particular "[object Error]" error is coming from.

    I have a dream today!

    I have a dream...
    That one day my code will not just be judged by its functionality, but also by its maintainability.

    I have a dream today!

    I have a dream...
    That one day, developers who are "forced" to use javascript will rise up and find another job, and that those who blindly recommend it because "you can do a lot of cool stuff with it" will be sent to special re-education camps where they will be subjected to nothing but the Game Show Channel and the runny scrambled eggs from the cafeteria downstairs.

    I have a dream today!

    Thank you


    (The views expressed in this post do not reflect the views of this station, its management, its affiliates, or anyone without a sense of humor. The author of this post welcomes all replies, advice, flames, jokes, recipes, admonitions, job offers, astrological readings (I'm a Sag.), rides to and from work, suggestive looks from cute girls between the ages of 25 and 30, winning lottery numbers, and free beer.)
  59. JS cross browser compatibility.[ Go to top ]

    Any JavaScript (ECMAScript) programmer worth his salt will easily be able to create cross-browser functional code just by sticking to the DOM scripting model - it's time to forget about IE's DHTML.

    WWW client side scripting based applications are not feasible today because about 10% of the people out there have JavaScript disabled.
  60. I use a very similar system as part of my web application toolkit.

    I use JavaScript RPC calls to invoke user-defined server-side actions. These actions can return plain text or a nested POJO graph consisting of the same component types as JSON (JavaBeans, primitives, strings, arrays, collections, maps).

    Online Demo of the javascript RPC

    Project documentation and download
  61. Although tempting, the JSON-RPC library should probably be used remembering there is no silver bullet in general, and maybe even less so in the space of browser-server RPC.

    About JScript compatibility, my experience is that the problem does not really lie in 'Javascript' - as the ECMA standard - but in how different browsers interpret & render the W3C DOM.

    Setting aside the RPC side of things, if you exchange data between your server and the client-browser side, the client has got to render it, and to this end must generate the page.
    From there, there are two routes: one is to use the W3C DOM API, the other is to stringify HTML code (let's assume XSLT would be a variant of the latter case).
    In any case, IE, which is the most deployed browser, is the one with the oddest behaviors and a lot of difficulties with the DOM (add CSS as a topping); the workaround is usually to go the '.innerHTML' way, which requires HTML code to be generated... And besides, your page designers (fleshware & software) are not usually generating templates in JScript DOM code.
    At this point, you unfortunately already have two versions of your code executing: one for IE, another for the others (since in any case XMLHttpRequest also has differences). And one of them already requires you to generate HTML (although at this point, you do it from the client side).
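    For illustration, the "stringify HTML" route usually looks something like the sketch below: build the markup in an array, join once, and assign the result to `.innerHTML`. The `buildZebraTable` helper and its data are hypothetical, not from any framework mentioned here:

    ```javascript
    // Sketch: generating a zebra-striped table as a single HTML string.
    // Pushing fragments into an array and joining once avoids repeated
    // string concatenation, which matters for large tables.
    function buildZebraTable(rows) {
      var html = ["<table>"];
      for (var i = 0; i < rows.length; i++) {
        // Alternate row classes for the "zebra" striping.
        var cls = (i % 2 === 0) ? "even" : "odd";
        html.push("<tr class=\"" + cls + "\">");
        for (var j = 0; j < rows[i].length; j++) {
          html.push("<td>" + rows[i][j] + "</td>");
        }
        html.push("</tr>");
      }
      html.push("</table>");
      return html.join("");
    }

    var markup = buildZebraTable([["a", "b"], ["c", "d"]]);
    // In a browser: document.getElementById("grid").innerHTML = markup;
    ```

    The DOM-API route would instead call `createElement`/`appendChild` per cell, which is the more standards-friendly but, as noted, often slower path on large tables.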

    Which leads to the second problem, which is speed. Even with a Gecko-based browser, heavy manipulation through the DOM is slow (although faster than stringification). You transmit less data, less burden in general, but the end-user response time is not good (latency, not bandwidth, becomes the issue). Building a rich 1000-'TD' TABLE (or refreshing its content) requires the client end to be an 1800MHz box. 'Rich' in this case means with 'zebra' lines, selection capabilities, tree-table, etc.: the core logic plus the necessary event hooks become pretty heavy fast.
    On top, you are working in an almost hostile environment - although using Mozilla/Venkman + DOM Inspector is a convenient combination, IE does not have similar tools... And I forgot to mention memory requirements & GC issues with cycles between JScript & HTML objects...

    The crossroads: going the DOM-other-than-IE way is the less deployed but technically most elegant solution; going the generated-HTML-IE-friendly way is the most-deployed, not-as-elegant one. But neither of them performs fast enough on common hardware. And it has been so painful to develop in such a hostile world already that imagining the XSLT solution, hoping it'd solve the performance issue, becomes 'not an option'.

    Mitigation, performance-wise, is to have the server side generate the HTML (or whatever the client end may want to 'render' - XUL - by 'skinning' if you want to get fancy & have time). So, extend what HTML forms do in a JS library, with the constraint that what is refreshed on the screen might not be known at page creation time. This is where XMLHttpRequest comes in handy, and your client-side framework may stay maintainable.

    Now, you still have work to do, which is to build 'richer' widgets (both in terms of appearance & functionality) on the client end, which means twisting HTML code beyond its intent - but far less work than hand-crafting each page.

    My 2 cents:
    If your application is the client-rich type, has a lot of pages, and performance may be an issue with the volume of data to be presented, the investment might be worth it (some of us are hard pressed that way; go look at a site like Asus...). But JSON-RPC is not suitable there, since what you want to transmit is most probably XHTML (or some XML-based format).

    If your application is in the area of 'simple is beautiful', then browser-server RPC in general, and especially JSON, which hides most of the communication's gory details behind a de-facto standard across browsers, is a good solution.

    To conclude, widening the subject back to the big & mean case, my fear about rich-client apps is as follows: some will go the JSF / REST routes, others the XAML way, maybe even some JSP+XUL at the frontier of XML-icism, flanked by server-side/client-side scripters and finally good old 2-tier riders. In other words: Java-server-siders, the MSFT way, rebels/Python/Perl/PHP, and heavy-client-siders (Java/Eclipse included), all of them heading in different directions towards the same goal.
    It's really too bad no standard is picking up the XUL ideas (insert your choice of XML GUI framework here, maybe WHAT-WG...); widget skinning could be left to the client side and pages kept agnostic of the renderer - and our work made simpler. But looking at how the W3C standards are supported by the healthy companies that contribute to the specs (read HTML/XHTML/CSS), it's unlikely we'll stop having to solve these questions anytime soon.
  62. A Good Idea Overall[ Go to top ]

    Let's assume that I built a web application that allows the user to define a list of market tickers, adding tickers to the list one by one. The server must store the ticker list for the session.

    I can implement the add operation:
    1. With a POST request;
    2. Or I can add the new ticker on the server side with an RPC call.

    The result is the same, but the second solution will consume only a fraction of the server resources compared with the first (CPU, memory transfers, the amount of garbage collection and network bandwidth). This is important when you have to service hundreds of simultaneously connected clients.
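    To see why the RPC variant is so much cheaper, it helps to look at what actually crosses the wire: a JSON-RPC call is just a tiny JSON object, versus a full form POST followed by a complete page render. The method name `tickerList.addTicker` below is hypothetical, chosen only to match the example:

    ```javascript
    // Sketch: the JSON-RPC request and response for a hypothetical
    // "add ticker" call. Only these small payloads cross the wire;
    // the POST variant would re-send and re-render the whole page.
    var request = {
      method: "tickerList.addTicker", // hypothetical exported Java method
      params: ["GOOG"],
      id: 1
    };
    var response = {
      result: ["IBM", "SUNW", "GOOG"], // updated session list echoed back
      error: null,
      id: 1
    };

    // What actually travels over HTTP, a few dozen bytes:
    var wire = JSON.stringify(request);
    ```

    The `id` field lets the client match a response to its request, which is what makes asynchronous use of the protocol practical.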

    A couple of months ago I used the same RPC approach, but with an applet, LiveConnect, JavaScript and a binary protocol.
    I even found a solution to load JS and HTML templates on the fly using RPC calls, paste them into the browser document, and use them there. Very non-standard, but very, very, very fast. The main problems were the need for a properly installed Java plugin, and that LiveConnect works differently on Netscape and IE.
  63. Great Work[ Go to top ]

    We're looking forward to integrating it with wingS 2.
  64. Trivia[ Go to top ]

    The guys in Redmond had done this a while back actually using a Java applet -- figure that ;)

    http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnclinic/html/scripting041299.asp
  65. To be dependent on Java defeats the whole purpose. Then you might as well have a full-blown Java client. The beauty of XMLHttp is that it needs neither Java nor .NET.
  66. nothing should be needed on the client[ Go to top ]

    To be dependent on Java defeats the whole purpose. Then you might as well have a full-blown Java client. The beauty of XMLHttp is that it needs neither Java nor .NET.

    The downside is that you end up with duplicated code, typically in different programming languages. Just did it: VBA on the client, C# on the server. And did it 4 years ago: Java on the client, ASP/VB on the server. If you don't end up with duplicated code, you have other issues.
  67. What about DWR?[ Go to top ]

    How does it compare to DWR?
    I'm using DWR in a project. I use it in a page with an applet that has to process a lot of information, so I get a chunk of information via DWR, process it with the applet, and continue to the next chunk, avoiding downloading all the data to be processed at once (the browser could crash).
    This way I can process at the client side (for an end-to-end secure solution) without any restriction on the page size.

    Regards.
    Guillermo Meyer