I am trying to convert Unicode to the Russian currency symbol (руб) in Teradata. Below is the query I have written, but it is not converting the Unicode code point to the symbol:
sel translate('U+0431' UNICODE_TO_LATIN)
The output I get is the literal text U+0431 that I wrote in the SQL query, not the desired symbol.
I am using the DBI package in R to connect to Teradata this way:
library(DBI)
library(teradatasql)
query <- "
SELECT sku, description
FROM sku_table
WHERE sku = '12345'
"
dbconn <- DBI::dbConnect(
  teradatasql::TeradataDriver(),
  host = teradataHostName, database = teradataDBName,
  user = teradataUserName, password = teradataPassword
)
dbFetch(dbSendQuery(dbconn, query), -1)
It returns a result as follows:
SKU DESCRIPTION
12345 18V MAXâ×¢ Collated Drywall Screwgun
Notice the bad characters â×¢ above. They are supposed to be the superscript TM (trademark) symbol.
When I use SQL Assistant to run the query and export the results manually to a CSV file, it works fine: the DESCRIPTION column has the correct encoding.
Any idea what is going on and how I can fix this problem? Obviously, I don't want a manual step of exporting to CSV and reading the results back into an R data frame and into memory.
The Teradata SQL Driver for R (teradatasql package) only supports the UTF8 session character set, and does not support using the ASCII session character set with a client-side character set for encoding and decoding.
If you have stored non-LATIN characters in a CHARACTER SET LATIN column in the database, and are using a client-side character set to encode and decode those characters for the "good" case, that will not work with the teradatasql package.
On the other hand, if you used the UTF8 or UTF16 session character set to store Unicode characters into a CHARACTER SET UNICODE column in the database, then you will be able to retrieve those characters successfully using the teradatasql package.
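As a quick illustration (a sketch only: the host, the credentials, and the assumption that description lives in a CHARACTER SET UNICODE column are placeholders), retrieving the row through the driver's UTF8 session then returns the trademark character intact:

library(DBI)
library(teradatasql)

# Sketch: connection details are placeholders; the description column is
# assumed to be CHARACTER SET UNICODE in the database.
dbconn <- DBI::dbConnect(
  teradatasql::TeradataDriver(),
  host = "tdhost", user = "tduser", password = "tdpassword"
)

# The driver always logs on with the UTF8 session character set, so UNICODE
# column values come back as correctly encoded UTF-8 strings in R.
res <- DBI::dbGetQuery(
  dbconn,
  "SELECT sku, description FROM sku_table WHERE sku = '12345'"
)
print(res$description)   # e.g. "18V MAX™ Collated Drywall Screwgun"

DBI::dbDisconnect(dbconn)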
My Oracle DB uses the AL16UTF16 character set. I need to generate a Unicode text file that will be imported into another DB with AL32UTF8 encoding. For this I use PL/SQL code that calls the DBMS_XSLPROCESSOR.CLOB2FILE procedure.
In my code process:
First, I use an NCLOB to store the Unicode lines, which contain Chinese characters.
Then I call the procedure as: DBMS_XSLPROCESSOR.CLOB2FILE(v_file, DIRE, fileName, 873)
where v_file is the NCLOB variable that contains the file contents and 873 is the Oracle character set ID for AL32UTF8.
However, when I check the text file I find ¿¿ instead of the Chinese characters. Could you help resolve this, or suggest a procedure other than DBMS_XSLPROCESSOR.CLOB2FILE that allows writing a large file with Chinese characters extracted from a non-Unicode DB?
Many Thanks
I have an ETL that creates a plain text file that will later be loaded into Teradata. In the source data there is a UTF8-encoded column that contains all kinds of characters, including non-printable ones. I write the file properly, but I get Error Code 6706: The string contains an untranslatable character.
This is the destination column:
column VARCHAR(300) CHARACTER SET LATIN NOT CASESPECIFIC
I can load characters like é but not characters like ’.
How can I properly write the file and validate the data before sending it? I don't have access to the database; I just know that they get error code 6706.
Thanks in advance
Make the column UNICODE:
column VARCHAR(300) CHARACTER SET UNICODE NOT CASESPECIFIC
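If the column has to stay LATIN, one way to validate before sending the data is a client-side check of whether each string survives conversion to Latin-1 (a rough sketch in R; it assumes Teradata's LATIN repertoire is close to ISO-8859-1, which matches é loading fine while ’ fails):

# Rough sketch: flag strings that cannot be represented in Latin-1 before
# writing the load file; these are the likely sources of error 6706.
descriptions <- c("caf\u00e9", "don\u2019t")                 # é is fine, ’ is not
latin <- iconv(descriptions, from = "UTF-8", to = "latin1")  # NA where untranslatable
descriptions[is.na(latin)]                                   # rows to clean or reject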
I am trying to use the Unicode UTF-16 character set, but I am unsure how to do this. By default, when I use the Unicode character set it uses UTF-8, which changes foreign characters (Spanish, Arabic, etc.) into ?. I am currently using Teradata 14.
https://twitter.com/intent/tweet?source=webclient&text=G%C5
produces the following error:
Invalid Unicode value in one or more parameters
By the way, %C5 is meant to be the Å character.
Twitter expects parameters to be encoded as UTF-8.
So Å is Unicode U+00C5, which represented as UTF-8 is C3 85.
With URL-escaping, this means the query should be ...&text=G%C3%85
Since I don't know how you are building that query (programming language/environment), I can't really tell you how to do it right, only that you should convert your string to UTF-8 before escaping.
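For example, in R (just one illustration, since the original environment is unknown), the convert-then-escape step looks like this:

# Illustration in R: make the string UTF-8 first, then percent-encode its bytes.
text    <- enc2utf8("G\u00C5")                 # "GÅ"
escaped <- URLencode(text, reserved = TRUE)    # "G%C3%85"
paste0("https://twitter.com/intent/tweet?source=webclient&text=", escaped)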