- Posted by: legolas wood
- Posted on: May 16 2005 16:59 EDT
For two days I've been fighting a severe problem in the XML/Java/JDBC arena.
I have an XML file, a glossary with several thousand definitions.
Each definition is a term with two parts: one part is Hebrew and the other is English (the Hebrew is the explanation of the English term).
The XML file is Unicode. I tried to read it with XMLDocument and store it in a database, but a big problem arises here:
all the Hebrew characters are replaced with ? in the database. Does anyone know why?
I have tried almost every open-source database: MySQL (with the useUnicode parameter), PostgreSQL, Firebird, and Cloudscape,
but all I still get is nothing more than ?????.
Should I do some conversion before inserting the data into the database?
In brief:
my problem is reading an XML file with multilingual content and inserting its data into a database in such a way that my Hebrew is not converted to ???????.
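A useful first step is to prove whether the Hebrew is lost during the XML parse or during the JDBC insert. The sketch below (assumptions: a tiny inline glossary entry, the standard JAXP DOM parser, and a hypothetical MySQL URL and table shown only in comments) parses UTF-8 bytes and prints the raw code points, which tells you whether the text survives in the JVM:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Document;

public class GlossaryParseDemo {
    public static void main(String[] args) throws Exception {
        // A tiny glossary entry; the <hebrew> text is "shalom" in Hebrew.
        String xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                   + "<entry><english>peace</english>"
                   + "<hebrew>\u05E9\u05DC\u05D5\u05DD</hebrew></entry>";

        DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        // Parse from bytes so the XML declaration's encoding is honoured.
        Document doc = db.parse(new ByteArrayInputStream(
                xml.getBytes(StandardCharsets.UTF_8)));

        String hebrew = doc.getElementsByTagName("hebrew").item(0).getTextContent();
        System.out.println(hebrew.length());        // 4 characters
        System.out.println((int) hebrew.charAt(0)); // 1513 = U+05E9, so parsing kept it

        // If real code points print here, the loss happens in JDBC or in the table
        // definition, not in the parse. Hypothetical fixes for MySQL: a URL like
        //   jdbc:mysql://localhost/glossary?useUnicode=true&characterEncoding=UTF-8
        // a UTF-8 column, e.g.
        //   CREATE TABLE glossary (english VARCHAR(255),
        //       hebrew VARCHAR(255) CHARACTER SET utf8mb4);
        // and inserts done with PreparedStatement.setString, never by
        // concatenating the text into the SQL string.
    }
}
```

If the code points print correctly but the database still shows ?, the conversion is happening on the JDBC connection or in the column's character set, not in your Java code.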
- Problem in reading a multilingual XML file into a database by Rakesh Malpani on May 17 2005 16:23 EDT
You can try Hex or Base64 encoding, but of course the overhead is that you have to encode and decode it every time anything changes.
Alternatively, instead of using char/varchar data types, try storing the text in a binary format; that would work too. The question to me is the effect of differing endian formats between the machines writing and reading the data, which might cause a problem.
Since it's a message file, I would go with Hex or Base64 encoding and cache the messages in the application to increase performance (which you will have to do anyway).
Hope this helps.
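The Base64 workaround above can be sketched as follows (this uses java.util.Base64 from modern Java for brevity; in 2005 you would have used a library such as Commons Codec instead). The round trip is lossless as long as you pin the charset to UTF-8 on both sides:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Workaround {
    public static void main(String[] args) {
        String hebrew = "\u05E9\u05DC\u05D5\u05DD"; // "shalom"

        // Encode the UTF-8 bytes to ASCII-safe Base64 before the INSERT...
        String encoded = Base64.getEncoder()
                .encodeToString(hebrew.getBytes(StandardCharsets.UTF_8));

        // ...and decode after the SELECT.
        String decoded = new String(Base64.getDecoder().decode(encoded),
                StandardCharsets.UTF_8);

        System.out.println(encoded);                // "16nXnNeV150=" - pure ASCII
        System.out.println(hebrew.equals(decoded)); // true: nothing was lost
    }
}
```

Because the stored column now holds only ASCII, no database character-set setting can mangle it; the trade-off, as noted above, is the encode/decode step on every read and write, and that the column is no longer searchable as text.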