URLs in web applications are pivotal not only for giving users friendly, bookmarkable addresses, but also for improving how efficiently an application's content is crawled by major search engines. The following blog entries explore this process using Struts:
Now if you know Struts, nine times out of ten your URLs are ugly: the programmers who built the application didn't consider the impact the URLs would have on natural search, and the framework developers pretty much left you with a bunch of "do.do". Very quickly the SEO firm was recommending 70+ rewrite rules on the Apache server to resolve to the URLs in the application, plus custom work on each individual URL to rewrite it to its friendly form, so that when Googlebot crawls the site it would traverse these friendly URLs. I cringed at the thought of this suggestion: not only is it unmaintainable, but when I run a local server I can't use the rewritten URLs, as my development environment doesn't have a full-blown HTTP server with rewrite capabilities. I knew there had to be a better solution; I just wasn't sure what it was. While researching regular expressions and rewrite rules I happened upon the UrlRewriteFilter, a rules engine that lets you set up inbound and outbound rules to modify URLs using regular expressions. It handles every occurrence of the URL on both sides of the equation. The library comes as a JAR with an XML configuration file that goes in your WEB-INF directory.
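The XML configuration the excerpt mentions lives in WEB-INF/urlrewrite.xml. A minimal sketch of the inbound/outbound rule pair, assuming a hypothetical Struts action /viewProduct.do and a hypothetical friendly path /products/... (both are illustrative, not taken from the post):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE urlrewrite PUBLIC "-//tuckey.org//DTD UrlRewrite 3.2//EN"
        "http://tuckey.org/res/dtds/urlrewrite3.2.dtd">
<urlrewrite>
    <!-- Inbound: a friendly URL requested by a browser or crawler is
         dispatched internally to the existing Struts action. -->
    <rule>
        <from>^/products/([0-9]+)$</from>
        <to>/viewProduct.do?id=$1</to>
    </rule>
    <!-- Outbound: links the application renders through
         response.encodeURL() are rewritten to the friendly form. -->
    <outbound-rule>
        <from>^/viewProduct\.do\?id=([0-9]+)$</from>
        <to>/products/$1</to>
    </outbound-rule>
</urlrewrite>
```

Because both directions are covered, the same rules that let Googlebot reach /products/123 also make the application emit that form in its own links, which is what makes the approach usable on a plain local servlet container with no Apache in front.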
If, on the other hand, you are using JSF, this blog entry explores the same process.
Believe it or not, there are two generic kinds of URLs: dirty URLs and friendly URLs.

Dirty URLs:
Example: /index.php?a=123&b=456&c=3dGySMoi&x=FgSDRt4
Pros: easy to implement 1:1 on the back-end, highly portable
Cons: difficult to remember, easy to hack, poorly indexed by most search engines

Friendly URLs:
Example: /development/java/friendly-urls-in-jsf
Pros: easy to remember, hard to hack, indexed by all search engines
Cons: difficult to implement 1:1 on the back-end, less portable

To combine the best of both worlds, it's good practice to use dirty URLs strictly for the back-end functionality and add an extra mapping layer on top so that friendly URLs can be exposed. In a Java web application environment this is relatively easy to implement using a simple filter which redirects unfriendly URLs to friendly URLs and dispatches friendly URLs to unfriendly URLs. The redirect has one concern in JSF: the faces messages will be lost in the new request. Fortunately there exists a phase listener which restores those faces messages for the POST-Redirect-GET (PRG) pattern; much of its code can be reused.
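The mapping layer described above boils down to two regular-expression translations: dirty to friendly (used when redirecting the browser) and friendly back to dirty (used when dispatching internally). A minimal sketch of that translation logic, assuming hypothetical URL shapes (/article.jsf?slug=... as the dirty form and /development/java/... as the friendly form; neither is taken from the blog post):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of the two-way mapping a friendly-URL filter performs.
// The URL patterns here are illustrative assumptions only.
public class UrlMapper {
    private static final Pattern DIRTY =
            Pattern.compile("^/article\\.jsf\\?slug=([a-z0-9-]+)$");
    private static final Pattern FRIENDLY =
            Pattern.compile("^/development/java/([a-z0-9-]+)$");

    // Dirty -> friendly: the URL the filter would redirect the browser to.
    public static String toFriendly(String url) {
        Matcher m = DIRTY.matcher(url);
        return m.matches() ? "/development/java/" + m.group(1) : url;
    }

    // Friendly -> dirty: the URL the filter would dispatch to internally.
    public static String toDirty(String url) {
        Matcher m = FRIENDLY.matcher(url);
        return m.matches() ? "/article.jsf?slug=" + m.group(1) : url;
    }
}
```

In the actual filter, toDirty feeds a RequestDispatcher.forward() so the browser keeps seeing the friendly address, while toFriendly feeds an HttpServletResponse.sendRedirect() when someone hits the dirty form directly; URLs matching neither pattern pass through the filter chain untouched.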
Read the complete post on Struts friendly URLs: http://greatwebguy.com/search-engines/seo-friendly-urls-for-java-ee-frameworks/
Read the complete post on JSF friendly URLs: http://balusc.blogspot.com/2007/11/friendly-urls-in-jsf.html