I'm working with Spring MVC. I've set up a web form that has two simple text inputs. In the controller, I use @ModelAttribute to let Spring build the bean from the web form.
The problem comes when the user enters special characters in those text fields, like 酒店 and that kind of thing: Spring doesn't read them as UTF-8, and they come through as the usual badly encoded string.
I've checked web.xml and the UTF-8 encoding filter is there, all pages are marked as UTF-8, and the browser is sending the right charset headers. Any idea what's going on?
You may want to check this out:
http://forum.springsource.org/showthread.php?81858-ResponseBody-and-UTF-8
The short of it is that if you are using annotated methods, then the message converter being used has a default character set. You can change this in your Spring MVC configuration by setting the supported media types on the converter.
However, if your service doesn't like that media type, you may get a 406 error. You can create your own string message converter and set the default encoding, but there is no easy way to do it with the built-in StringHttpMessageConverter.
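If your Spring version's StringHttpMessageConverter takes a Charset in its constructor, a minimal Java-config sketch of that idea (the WebConfig class name is just illustrative) would be:
import java.nio.charset.StandardCharsets;
import java.util.List;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.StringHttpMessageConverter;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
@EnableWebMvc
public class WebConfig extends WebMvcConfigurerAdapter {

    // Register a String converter that writes UTF-8 instead of the ISO-8859-1 default.
    // Note: overriding this method replaces the default converter list.
    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(new StringHttpMessageConverter(StandardCharsets.UTF_8));
    }
}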
Alternately you can re-encode a string to a different character set:
String newresponse = new String(responseString.getBytes("ISO-8859-1"), "UTF-8");
You may also want to check out the related question here: How to get UTF-8 working in Java webapps?
The solution is simple: by adding produces = "text/plain;charset=UTF-8" to the request mapping you can force Spring MVC to encode the returned text as UTF-8.
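For example, a minimal sketch (hypothetical handler, assuming Spring 3.1+ where @RequestMapping has a produces attribute):
@RequestMapping(value = "/greeting", produces = "text/plain;charset=UTF-8")
@ResponseBody
public String greeting() {
    // The produces attribute pins the Content-Type header, and therefore the
    // charset used to write the body, to UTF-8.
    return "酒店";
}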
I am trying to use the BizTalk WCF-WebHttp adapter to send to the Service Desk Plus CMDB API using Variable Mapping.
When I try it in the browser, it works fine. The Service Desk Plus CMDB API requires a URI like this (heavily shortened for readability):
http://host.com/api/cmdb/ci?OPERATION_NAME=read&TECHNICIAN_KEY=Mykey&format=XML&INPUT_DATA=<?xml version='1.0'?>
<API>
<name>email#host.com</name>
</API>
I have used the URI http://host.com/api/cmdb/ci and URL Mapping.
<BtsHttpUrlMapping>
<Operation Url="?OPERATION_NAME=read&TECHNICIAN_KEY=MyKey&format=XML&INPUT_DATA=<?xml version='1.0'?>
<API>
<name>email#host.com</name>
</API>"/>
</BtsHttpUrlMapping>
This works fine, but I need a more dynamic approach. I tried using Variable Mapping, so I replaced the hard coded email address with a variable.
<BtsHttpUrlMapping>
<Operation Url="?OPERATION_NAME=read&TECHNICIAN_KEY=MyKey&format=XML&INPUT_DATA=<?xml version='1.0'?>
<API>
<name>{email}</name>
</API>"/>
</BtsHttpUrlMapping>
When trying to save the URL Mapping with the variable, I get an error.
WCF-WebHttp Transport Properties
Error saving properties.
(System.InvalidOperationException) The UriTemplate
?OPERATION_NAME=read&TECHNICIAN_KEY=MyKey&format=XML&INPUT_DATA=<?xml version='1.0'?><API><name>{email}</name></API>
is not valid; each portion of the query string must be of the form 'name=value', when value cannot be a compound segment. See the documentation for UriTemplate for more details.
If I try a variable that is not within the escaped XML string, like with the key, then it works fine.
<BtsHttpUrlMapping>
<Operation Url="?OPERATION_NAME=read&TECHNICIAN_KEY={key}&format=XML&INPUT_DATA=<?xml version='1.0'?>
<API>
<value>email#host.com</value>
</API>"/>
</BtsHttpUrlMapping>
My intention is to be able to use a variable within the escaped XML string. If that is not possible, I will have to turn to a dynamic adapter and create the URI and URL mapping in an orchestration.
Did you understand why it said each portion of the query string must be of the form 'name=value'? There are only a few ways to make UriTemplates work.
See how a UriTemplate works here. Here is an example that is valid:
weather/{state}/{city}?forecast={day}
So in your case you should make everything after INPUT_DATA= a single variable, which means the whole escaped XML string you were talking about, as sketched below.
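A sketch of what that could look like, with a hypothetical variable name {inputData} that you would then fill with the whole URL-escaped XML document via Variable Mapping:
<BtsHttpUrlMapping>
<Operation Url="?OPERATION_NAME=read&TECHNICIAN_KEY={key}&format=XML&INPUT_DATA={inputData}"/>
</BtsHttpUrlMapping>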
I have a web application that places the user's search term in the query string, in a similar way to Google. E.g. the address might be www.example.com/mysearchpage.aspx?q=searchTerm.
Usually this works fine, but if there is a special character in the search term, such as â, the form's action attribute comes back percent-encoded and the character is replaced with %u00e2.
If I search for chât I will end up with the URL www.example.com/mysearchpage.aspx?q=chât in the browser's address bar, but the action attribute on the form that comes back from the server would be www.example.com/mysearchpage.aspx?q=ch%u00e2t, which means that a subsequent form submission fails because the URL is incorrectly formatted.
I have ensured that in IIS the encoding is set to be UTF-8 for Requests, Response Headers and Responses. I have also inspected the page being delivered from IIS in Fiddler and that already includes the incorrectly encoded action.
The encoding appears to be the non-standard %uXXXX format explained in this Wikipedia article.
Is there a way to prevent IIS from encoding the form's action in this way?
The solution was to add targetFramework="4.5.2" to the httpRuntime tag in the web.config file.
Previously this was not specified on httpRuntime, only on the compilation tag; note that specifying targetFramework="4.5.1" still caused the problem.
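For reference, a sketch of the relevant web.config fragment (the compilation line is shown only for context and may differ in your project):
<system.web>
  <compilation debug="true" targetFramework="4.5.2" />
  <httpRuntime targetFramework="4.5.2" />
</system.web>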
I'm using Spring MVC and I have a REST API.
I need some information, for example Date, Person... but I have one more parameter where the user will enter free text.
For example:
/addtimesheetjson/{idusuario}/{data}/{latitude}/{longitude}/{other}
{other} can be (for example): lorem/ipsum/dolar/ -- the user can put any text there.
When the user sends the information, my system gives an error because the text contains a lot of "/" characters.
My question is: how can I make Spring MVC understand that "/" is part of the data rather than a path separator in my REST URL?
Spring will not handle your case very well, and escaping doesn't solve it unfortunately. You will need to find another way to pass the value of other. You can add a header to the request, e.g. X-OTHER: /lorem/ipsum/dolar, as in the sketch below.
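A minimal sketch of a handler that reads the text that way (the parameter types are simplified to String, and the X-OTHER header name is just an example):
@RequestMapping("/addtimesheetjson/{idusuario}/{data}/{latitude}/{longitude}")
@ResponseBody
public String addTimesheet(@PathVariable("idusuario") String idusuario,
                           @PathVariable("data") String data,
                           @PathVariable("latitude") String latitude,
                           @PathVariable("longitude") String longitude,
                           @RequestHeader("X-OTHER") String other) {
    // "other" arrives exactly as sent, e.g. "/lorem/ipsum/dolar",
    // because it never goes through URL path matching.
    return other;
}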
Context: ASP.NET MVC running in IIS, with a UTF-8 %-encoded URL.
Using the standard project template, and a test-action in HomeController like:
public ActionResult Test(string id)
{
return Content(id, "text/plain");
}
This works fine for most %-encoded UTF-8 routes, such as:
http://mydevserver/Home/Test/%e4%ba%ac%e9%83%bd%e5%bc%81
with the expected result 京都弁
However using the route:
http://mydevserver/Home/Test/%ee%93%bb
the url is not received correctly.
Aside: %ee%93%bb is %-encoded code-point 0xE4FB; basic multilingual plane, private use area; but ultimately a valid Unicode code-point; you can verify this manually, or via:
string value = ((char) 0xE4FB).ToString();
string encoded = HttpUtility.UrlEncode(value); // %ee%93%bb
Now, what happens next depends on the web-server; on the Visual Studio Development Server (aka cassini), the correct id is received - a string of length one, containing code-point 0xE4FB.
If, however, I do this in IIS or IIS Express, I get a different id, specifically "î“»", code-points: 0xEE, 0x201C, 0xBB. You will immediately recognise the first and last as the start and end of our percent-encoded string... so what happened in the middle?
Well:
byte 0x93 is “ in Windows code page 1252
code-point 0x201C is “ (LEFT DOUBLE QUOTATION MARK)
It looks to me very much like IIS has performed some kind of quote-translation when processing my url. Now maybe this might have uses in a few scenarios (I don't know), but it is certainly a bad thing when it happens in the middle of a %-encoded UTF-8 block.
Note that HttpContext.Current.Request.RawUrl also shows this translation has occurred, so this does not look like an MVC bug; note also Darin's comment, highlighting that it works differently in the path vs the query portion of the URL.
So (two-parter):
is my analysis missing some important subtlety of unicode / url processing?
how do I fix it? (i.e. make it so that I receive the expected character)
id = Encoding.UTF8.GetString(Encoding.Default.GetBytes(id));
This will give you your original id.
IIS uses the Default (ANSI) encoding for path characters. Your URL-encoded string is decoded using that, which is why you're getting a weird result back.
To get the original id you can convert it back to bytes and get the string using UTF-8 encoding.
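Applied to the Test action from the question, a minimal sketch of that workaround (assuming the server's ANSI code page round-trips the mangled characters, e.g. Windows-1252 as in the example above):
using System.Text;
using System.Web.Mvc;

public class HomeController : Controller
{
    public ActionResult Test(string id)
    {
        // IIS decoded the %-escapes with the default (ANSI) code page, so take
        // those bytes back and reinterpret them as the UTF-8 they really are.
        string fixedId = Encoding.UTF8.GetString(Encoding.Default.GetBytes(id));
        return Content(fixedId, "text/plain");
    }
}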
See Unicode and ISAPI Filters
ISAPI Filter is an ANSI API - all values you can get/set using the API
must be ANSI. Yes, I know this is shocking; after all, it is 2006 and
everything nowadays are in Unicode... but remember that this API
originated more than a decade ago when barely anything was 32bit, much
less Unicode. Also, remember that the HTTP protocol which ISAPI
directly manipulates is in ANSI and not Unicode.
EDIT: Since you mentioned that it works with most other characters, I'm assuming that IIS has some sort of encoding detection mechanism which is failing in this case. As a workaround, though, you can prefix your id with this char and then easily detect whether the problem occurred (if this char is missing). Not an ideal solution, but it will work. You can then write a custom model binder and a wrapper class in ASP.NET MVC to make your consumption code cleaner, as in the rough sketch below.
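A rough sketch of that idea (SentinelIdModelBinder is a hypothetical name; the sentinel is the problematic character itself): the caller always prefixes the id with it, and if the prefix did not survive, IIS mis-decoded the path and we re-interpret the ANSI bytes as UTF-8.
using System.Text;
using System.Web.Mvc;

public class SentinelIdModelBinder : IModelBinder
{
    private const char Sentinel = '\uE4FB';

    public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        var result = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
        if (result == null)
            return null;

        string value = result.AttemptedValue;
        if (value.Length == 0 || value[0] != Sentinel)
            value = Encoding.UTF8.GetString(Encoding.Default.GetBytes(value)); // re-decode mangled path

        return value.TrimStart(Sentinel); // drop the sentinel prefix either way
    }
}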
Once Upon A Time, URLs themselves were not in UTF-8. They were in the ANSI code page. This fit with the fact that they are often used to select, well, pathnames in the server's file system. In ancient times, IE had an option to choose whether you wanted to send UTF-8 URLs or not.
Perhaps buried in the bowels of the IIS config there is a place to specify the URL encoding, and perhaps not.
Ultimately, to get around this, I had to use request.ServerVariables["HTTP_URL"] and some manual parsing, with a bunch of error-handling fallbacks (additionally compensating for some related glitches in Uri). Not great, but only affects a tiny minority of awkward requests.
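A rough sketch of that approach (without the error-handling fallbacks), assuming the HTTP_URL server variable still carries the raw %-encoded path:
// Read the raw request URL (assumed here to still be %-encoded), then
// decode the last path segment ourselves; UnescapeDataString treats
// the %-escapes as UTF-8.
string rawUrl = HttpContext.Current.Request.ServerVariables["HTTP_URL"];
string rawId = rawUrl.Substring(rawUrl.LastIndexOf('/') + 1);
string id = Uri.UnescapeDataString(rawId);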
To show this fundamental issue in .NET and the reason for this question, I have written a simple test web service with one method (EditString), and a consumer console app that calls it.
They are both standard web service/console applications created via File/New Project, etc., so I won't list the whole code - just the methods in question:
Web method:
[WebMethod]
public string EditString(string s, bool useSpecial)
{
return s + (useSpecial ? ((char)19).ToString() : "");
}
[You can see it simply returns the string s if useSpecial is false. If useSpecial is true, it returns s + char 19.]
Console app:
TestService.Service1 service = new SCTestConsumer.TestService.Service1();
string response1 = service.EditString("hello", false);
Console.WriteLine(response1);
string response2 = service.EditString("hello", true); // fails!
Console.WriteLine(response2);
[The second response fails, because the method returns hello + a special character (ASCII code 19 for argument's sake).]
The error is:
There is an error in XML document (1, 287)
Inner exception: "'', hexadecimal value 0x13, is an invalid character. Line 1, position 287."
A few points worth mentioning:
The web method itself WORKS FINE when browsing directly to the ASMX file (e.g. http://localhost:2065/service1.asmx) and running the method from that test page (with the same parameters as in the console application) - i.e. it displays XML with the string hello + char 19.
Checking the serialized XML in other ways shows the special character is being encoded properly (the SERVER SIDE seems to be OK, which is GOOD).
So it seems the CLIENT SIDE has the issue - i.e. the .NET-generated proxy class code doesn't handle special characters.
This is part of a bigger project where objects that contain string attributes are passed in and out of the web methods - those attributes are what need to work properly, i.e. we're de/serializing classes.
Any suggestions for a workaround and how to implement it?
Or have I completely missed something really obvious!!?
PS. I've not had much luck with getting it to use CDATA tags (does .NET support these out of the box?).
You will need to use byte[] instead of strings.
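A minimal sketch of that idea (EditStringAsBytes is a hypothetical method name); ASMX serializes byte[] as base64, so the control character never has to appear as literal XML text:
[WebMethod]
public byte[] EditStringAsBytes(string s, bool useSpecial)
{
    // Return raw UTF-8 bytes; base64 in the SOAP body can carry (char)19 safely.
    string result = s + (useSpecial ? ((char)19).ToString() : "");
    return Encoding.UTF8.GetBytes(result);
}
On the client, decode the bytes back into a string:
string response2 = Encoding.UTF8.GetString(service.EditStringAsBytes("hello", true));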
I am thinking of some options that may help you. You could take the route of using HTML entities instead of char(19), or, as you said, you may want to use CDATA.
To come up with a clean solution, you may not want to put the whole thing in CDATA. I am not sure why you think it may not be supported in .NET. Are you saying this in the context of serialization?