Our client has several separate systems with separate databases. Information must be exchanged between the databases. The client has contracted out to have the information exchanged via a common XML format and transferred via secure FTP. Daemons read in and process the XML files when they arrive in the destination directories.
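To make the daemon side concrete, here is a hypothetical sketch of a single polling pass of the kind of daemon described above. The directory name and the `scan_incoming` function are illustrative, not the client's actual code.

```python
# Illustrative sketch: one polling pass of a daemon that picks up XML
# files dropped into a destination directory.
import os
import xml.etree.ElementTree as ET

def scan_incoming(incoming_dir):
    """Parse every .xml file found in incoming_dir and return the parsed roots.

    A real daemon would loop with a sleep, move processed files to an
    archive directory, and handle partially written files.
    """
    parsed = []
    for name in sorted(os.listdir(incoming_dir)):
        if not name.endswith(".xml"):
            continue
        path = os.path.join(incoming_dir, name)
        try:
            parsed.append(ET.parse(path).getroot())
        except ET.ParseError:
            # File may still be mid-transfer; skip it and retry next pass.
            continue
    return parsed
```

The same scan logic works whether the file arrived via secure FTP or was posted by a web service, which is why the migration path described below is plausible.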
The client also has a directive (from higher ups within the client organization) to move to a net-centric environment over the next several years.
What would be the advantages/disadvantages of changing from FTP'ing the files to sending them via web services? Initially, with web services, the files could still be dropped in the directory where the daemon picks them up. But later the daemons could be refactored to receive the files from the web service (or even to become web services themselves).
I am trying to formulate the rationale for the client switching from FTP'ing files to transferring them via web services. FTP'ing would be faster, right?
BTW, some instances of the files could be large (e.g., 30MB), while others could be small (a few KB). So I still see a need for FTP'ing the larger files, at least.
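Whichever transport wins, a 30MB file shouldn't be buffered whole in memory. A minimal sketch (the function name and 64 KiB chunk size are my own choices, not from the thread) of streaming a payload in fixed-size chunks, so a 30MB file costs no more memory than a few-KB one:

```python
# Illustrative sketch: stream a large payload in fixed-size chunks so
# memory use stays bounded regardless of file size.
def copy_in_chunks(src, dst, chunk_size=64 * 1024):
    """Copy the readable stream src to the writable stream dst.

    Returns the total number of bytes copied.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total
```

Both FTP clients and HTTP uploads can stream this way; the difference between the transports is protocol overhead, not whether chunking is possible.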
Thanks very much.
Some issues that come to mind:
For some data you may be able to call live rather than shipping the data around at night. Actually, I think you should if performance allows. Often the caller needs the data one piece at a time.
With web services you can wrap systems built with different technologies, and they will all look the same from the outside.
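A toy sketch of that wrapping idea: two very different back ends exposed behind one facade that always returns the same XML shape. The class and element names here are made up for illustration only.

```python
# Hypothetical sketch: a uniform facade over dissimilar back ends, so
# callers see one XML format regardless of the system behind it.
import xml.etree.ElementTree as ET

class LegacyMainframeSource:
    """Pretend back end that speaks fixed-width records."""
    def fetch_raw(self, key):
        return f"{key:<10}ACTIVE    "

class SqlSource:
    """Pretend back end that returns dict rows."""
    def fetch_row(self, key):
        return {"id": key, "status": "ACTIVE"}

def as_service_xml(source, key):
    """Whichever system sits behind the facade, callers get the same XML."""
    if isinstance(source, LegacyMainframeSource):
        raw = source.fetch_raw(key)
        record_id, status = raw[:10].strip(), raw[10:].strip()
    else:
        row = source.fetch_row(key)
        record_id, status = str(row["id"]), row["status"]
    root = ET.Element("record", id=record_id)
    ET.SubElement(root, "status").text = status
    return ET.tostring(root, encoding="unicode")
```

A real web service layer would add transport, WSDL, and security on top, but the uniformity argument is exactly this: both sources yield identical XML to the caller.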
It seems to me that you will have redundant data in the different systems. That should be avoided if possible. It is better to call "on demand" than to rely on nightly batch jobs. If the data changes a lot, it will almost certainly differ between systems at times and may cause errors in the business. Data that users base important decisions on should never be potentially stale.
If you stick with redundant data for performance reasons, I don't think web services help much. The important thing is to have a clean data format that is as easily read as it is written. FTP will certainly be faster than web services for this kind of transfer.
Are web services secure nowadays? Some of the wizards around here may have the answer.
A web services solution calls for a much more complex environment. Setting up a secure, web-services-enabled web server would probably cost you a couple of person-weeks. Or are you building it into the existing systems? In one of my recent projects (a year ago) we built an XML solution on an IBM mainframe that served data on demand to WebLogic web/EJB servers, using Tuxedo as middleware. The only place where performance was an issue was in WebLogic; once tuned, it worked fine too. We never considered web services in that project.
The client is pretty adamant about not having his users affected by outside systems pulling/updating data in real time, and pretty adamant that securely FTP'ing standardized XML files in a pub-sub messaging scheme is the way to go. They do have the beginnings of a standardized XML schema for moving data back and forth.
The benefit I see to web services, as you mentioned, is the real-time (or near real-time) data interaction and the elimination of redundant data. But since the external systems are separate databases supporting separate applications, with separate concerns, schedules, and budgets, I think FTP'ing the files (based on a standardized XML schema) will be what is implemented in the near term. Perhaps we can work in some web services to do some of the work in near real time (at least prototypically). We'll see.
Thanks for the response. Any others are much appreciated.