I have this code:
<cfdump eval=server>
And it outputs top-level keys for coldfusion, java, lucee, os, separator, servlet. Note that railo is not listed there.
However if I do this:
<cfdump eval=server.railo>
It then outputs the usual struct one might expect when running a Railo server (as opposed to a Lucee server).
What's up with that?
see: https://groups.google.com/d/msg/lucee/1asgCDwC_tE/-gtE06lkjuEJ
"server.railo" is supported as an alias for "server.lucee", we did this to make sure code like the following still work
if(server.railo.version>"4.0.0.000");
We saw this as a hidden feature for backward compatibility; because of that it is not shown by dump or structKeyList. structKeyExists should also return false, and we will change this in the next patch release...
It's best to use "server.coldfusion.productName" instead.
I'm currently using the NLog.Web package for writing my .NET logs in my application.
After reading the NLog.Web documentation, I've noticed that unlike the ${windows-identity} layout renderer, the ${aspnet-user-identity} layout renderer has no domain parameter.
For example, if I log the currently running Windows identity, it outputs domain\user, but when I specify domain=false, it logs only user.
How do I get the same behaviour with ${aspnet-user-identity}? When I configured ${aspnet-user-identity:domain=false}, it didn't work.
The WindowsIdentity.Name, used in NLog, will always give the full name, including domain.
The logon name is in the form DOMAIN\USERNAME.
https://msdn.microsoft.com/en-us/library/system.security.principal.windowsidentity.name(v=vs.110).aspx
I think you need a custom layout renderer and to split the name by hand on the backslash (\).
Something like this (with a small guard so it also works when there is no domain part):
using System.Linq;
using System.Web;
using NLog.LayoutRenderers;
....
// register ${my-aspnet-user-identity}
LayoutRenderer.Register("my-aspnet-user-identity",
    (logEvent) => HttpContext.Current?.User?.Identity?.Name?.Split('\\').LastOrDefault());
Register it as soon as possible.
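For example (a sketch, assuming a classic ASP.NET application with a Global.asax; any code that runs before the first log message is written will do):
// Global.asax.cs
protected void Application_Start()
{
    // register the custom renderer from the snippet above before any logging happens
    // (needs using System.Linq; using System.Web; using NLog.LayoutRenderers;)
    LayoutRenderer.Register("my-aspnet-user-identity",
        (logEvent) => HttpContext.Current?.User?.Identity?.Name?.Split('\\').LastOrDefault());
}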
I found a different way to solve this issue, @Julian.
In the NLog.config file, I created a variable:
<variable name="aspnetIdentity" value="${replace:searchFor=^\\w+\\\\:replaceWith=:regex=true:inner=${aspnet-user-identity}}" />
As defined in the variable, the regex matches one or more word characters at the start of the string, followed by a backslash.
The extra backslashes are there to escape special characters (hence the doubling). Whatever is matched (the domain name and the backslash) is replaced with an empty string, so I get only the username rather than Domain\Username.
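The variable can then be referenced from any target layout; a hypothetical file target (the target name, file name, and layout are just illustrations) might look like:
<target name="logfile" xsi:type="File" fileName="app.log"
        layout="${longdate}|${aspnetIdentity}|${message}" />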
Thanks for the help, @Julian.
I can think of workarounds to get this working; however, I'm interested in finding out whether there's a solution to this specific problem.
I've got a Go program which requires a JSON string argument:
go run main.go "{ \"field\" : \"value\" }"
No problems so far. However, can I run it from the command line if one of the JSON values is itself another JSON string?
go run main.go "{ \"json-string\" : \"{\"nestedfield\" : \"nestedvalue\"}\" }"
It would seem that adding escape characters incorrectly matches up the opening and closing quotes. Am I misunderstanding how this is done, or is it (and this is the side I'm coming down on) simply not possible?
To reiterate, this is a question that has piqued my curiosity - I'm aware of alternative approaches - I'm hoping for input related to this specific problem.
Why don't you just put your JSON config into a file and pass the config file name to your application using the flag package?
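A minimal sketch of that suggestion (the -config flag name and the config.json default are assumptions, not part of the original program):
package main

import (
    "encoding/json"
    "flag"
    "fmt"
    "io/ioutil"
    "log"
)

func main() {
    // take a path to a JSON config file instead of a JSON string argument
    path := flag.String("config", "config.json", "path to a JSON config file")
    flag.Parse()

    data, err := ioutil.ReadFile(*path)
    if err != nil {
        log.Fatal(err)
    }

    var cfg map[string]interface{}
    if err := json.Unmarshal(data, &cfg); err != nil {
        log.Fatal(err)
    }

    // nested objects in the file need no shell escaping at all
    fmt.Println(cfg["json-string"])
}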
Based on the feedback from wiredeye, I went down the argument route instead. I've modified the program to run as:
go run main.go field:value field2:value json-string:"{\"nestedfield\":nestedvalue}"
I can then iterate over os.Args and get the nested JSON within my program. I'm not using flags directly because I don't know the number of inputs to the program in advance, which would have required either duplicate flags (not supported) or parsing a flag into a collection (which doesn't seem to be supported either).
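A rough sketch of that parsing (the key names are just the ones from the example command line; the real program presumably does more with the values):
package main

import (
    "fmt"
    "os"
    "strings"
)

func main() {
    args := make(map[string]string)
    // split each "key:value" argument on the first colon only,
    // so colons inside the nested JSON are left intact
    for _, arg := range os.Args[1:] {
        parts := strings.SplitN(arg, ":", 2)
        if len(parts) == 2 {
            args[parts[0]] = parts[1]
        }
    }
    fmt.Println(args["json-string"])
}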
Thanks wiredeye
Context: ASP.NET MVC running in IIS, with a UTF-8 %-encoded URL.
Using the standard project template, and a test action in HomeController like:
public ActionResult Test(string id)
{
return Content(id, "text/plain");
}
This works fine for most %-encoded UTF-8 routes, such as:
http://mydevserver/Home/Test/%e4%ba%ac%e9%83%bd%e5%bc%81
with the expected result 京都弁
However, using the route:
http://mydevserver/Home/Test/%ee%93%bb
the url is not received correctly.
Aside: %ee%93%bb is %-encoded code-point 0xE4FB; basic-multilingual-plane, private-use area; but ultimately - a valid unicode code-point; you can verify this manually, or via:
string value = ((char) 0xE4FB).ToString();
string encoded = HttpUtility.UrlEncode(value); // %ee%93%bb
Now, what happens next depends on the web-server; on the Visual Studio Development Server (aka cassini), the correct id is received - a string of length one, containing code-point 0xE4FB.
If, however, I do this in IIS or IIS Express, I get a different id, specifically "î“»", code-points: 0xEE, 0x201C, 0xBB. You will immediately recognise the first and last as the start and end of our percent-encoded string... so what happened in the middle?
Well:
code-point 0x93 is “ (source)
code-point 0x201c is “ (source)
It looks to me very much like IIS has performed some kind of quote-translation when processing my url. Now maybe this might have uses in a few scenarios (I don't know), but it is certainly a bad thing when it happens in the middle of a %-encoded UTF-8 block.
Note that HttpContext.Current.Request.RawUrl also shows this translation has occurred, so this does not look like an MVC bug; note also Darin's comment, highlighting that it works differently in the path vs the query portion of the url.
So (two-parter):
is my analysis missing some important subtlety of unicode / url processing?
how do I fix it? (i.e. make it so that I receive the expected character)
id = Encoding.UTF8.GetString(Encoding.Default.GetBytes(id));
This will give you your original id.
IIS uses the Default (ANSI) encoding for path characters. Your URL-encoded string is decoded using that encoding, and that is why you're getting a weird string back.
To get the original id, you can convert it back to bytes and then get the string using the UTF-8 encoding.
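To make the round trip concrete (a sketch; it assumes the server's ANSI code page is Windows-1252, and Encoding here is System.Text.Encoding):
// what IIS hands to MVC after ANSI-decoding the path: 0xEE, 0x201C, 0xBB
string mangled = "\u00EE\u201C\u00BB"; // "î“»"

// re-encode with the ANSI code page to recover the original bytes EE 93 BB,
// then decode those bytes as UTF-8 to get back the single code-point 0xE4FB
byte[] rawBytes = Encoding.Default.GetBytes(mangled);  // { 0xEE, 0x93, 0xBB }
string original = Encoding.UTF8.GetString(rawBytes);   // "\uE4FB"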
See Unicode and ISAPI Filters
ISAPI Filter is an ANSI API - all values you can get/set using the API
must be ANSI. Yes, I know this is shocking; after all, it is 2006 and
everything nowadays are in Unicode... but remember that this API
originated more than a decade ago when barely anything was 32bit, much
less Unicode. Also, remember that the HTTP protocol which ISAPI
directly manipulates is in ANSI and not Unicode.
EDIT: Since you mentioned that it works with most other characters, I'm assuming that IIS has some sort of encoding-detection mechanism which is failing in this case. As a workaround, though, you can prefix your id with this char and then easily detect whether the problem occurred (the char will be missing). Not an ideal solution, but it will work. You can then write a custom model binder and a wrapper class in ASP.NET MVC to make your consumption code cleaner.
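A rough sketch of that model-binder idea (everything here is illustrative: the sentinel character, the class name, and the registration; it simply reuses the byte round-trip from the answer above when the sentinel did not survive):
using System.Text;
using System.Web.Mvc;

public class Utf8FixupModelBinder : DefaultModelBinder
{
    // sentinel the client prepends to the id before URL-encoding it
    private const char Sentinel = '\u00A4';

    public override object BindModel(ControllerContext controllerContext,
                                     ModelBindingContext bindingContext)
    {
        var value = base.BindModel(controllerContext, bindingContext) as string;
        if (string.IsNullOrEmpty(value))
            return value;

        // if the sentinel did not survive, IIS decoded the path with the ANSI
        // code page; undo that by round-tripping through the original bytes
        if (value[0] != Sentinel)
            value = Encoding.UTF8.GetString(Encoding.Default.GetBytes(value));

        // strip the sentinel before the action sees the id
        return value.TrimStart(Sentinel);
    }
}
// applied to the action parameter, e.g.:
// public ActionResult Test([ModelBinder(typeof(Utf8FixupModelBinder))] string id)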
Once Upon A Time, URLs themselves were not in UTF-8. They were in the ANSI code page. This facilitates the fact that they often are used to select, well, pathnames in the server's file system. In ancient times, IE had an option to tell whether you wanted to send UTF-8 URLs or not.
Perhaps buried in the bowels of the IIS config there is a place to specify the URL encoding, and perhaps not.
Ultimately, to get around this, I had to use request.ServerVariables["HTTP_URL"] and some manual parsing, with a bunch of error-handling fallbacks (additionally compensating for some related glitches in Uri). Not great, but only affects a tiny minority of awkward requests.
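For illustration, a sketch of what that manual parsing might look like (the helper class and method names are hypothetical, and the error-handling fallbacks mentioned above are omitted):
using System.Text;
using System.Web;

public static class RawUrlHelper
{
    // HTTP_URL holds the URL as IIS received it (still %-encoded in this scenario)
    public static string GetIdFromRawUrl(HttpRequestBase request)
    {
        string rawUrl = request.ServerVariables["HTTP_URL"];
        if (string.IsNullOrEmpty(rawUrl))
            return null;

        // drop any query string, then take the last path segment
        int query = rawUrl.IndexOf('?');
        string path = query >= 0 ? rawUrl.Substring(0, query) : rawUrl;
        string lastSegment = path.Substring(path.LastIndexOf('/') + 1);

        // decode the %xx sequences as UTF-8 ourselves, bypassing the ANSI decode
        return HttpUtility.UrlDecode(lastSegment, Encoding.UTF8);
    }
}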
Warning: Non-static method Zend_Controller_Request_Http::getCookie() should not be called statically in..
I am trying the following to get cookie values:
$cookieData = Zend_Controller_Request_Http::getCookie($key, $default);
Is there a better way to do this?
The getCookie() method is not static; it should be called on an object.
I believe this code is from your controller, so it should basically look like:
$request = $this->getRequest();
$cookieData = $request->getCookie('someCookie', 'default');
This is a slight side note, yet it may well help avoid long, fruitless hours. In my experience, the problems that occur when one cannot retrieve a value from $_COOKIE in ZF1 and other frameworks are mostly due to setcookie() being so easy to use that one forgets to add the path and the domain, like so:
setcookie('cookieName', 'cookieValue', $finalExpirationTime,'/','.yourdomain.com');
and instead does this:
setcookie('cookieName', 'cookieValue', $finalExpirationTime);
This gets really annoying, especially when working on Windows with IPs instead of actual domains. Another thing to look out for is the dot (.) in front of the domain. As stated in the manual: older browsers still implementing the deprecated RFC 2109 may require a leading . to match all subdomains.
Hope this helps
I pass a VB.NET query string to a page that, depending on the parameter, has a Spanish character in it, which works perfectly on the development and testing servers, but not in production.
So, inside the query string I encode the name like this:
Server.UrlEncode(name)
And even before the page gets to load, the server throws a "500 Internal Server Error"
with this url:
http://website/Dir/Page.aspx?num=CEJMEJINMFCEGICCEH&name=Jose+Bonse%c3%b1or+Del+Rosario
When I replace the %c3%b1 with any other normal letter, it works (%c3%b1 is an "ñ").
Again, it works just fine on the other servers, but not in production. I don't even know where to start looking.
The HttpServerUtility class (which "Server" is an instance of in the Page base class) uses ASCII by default to decode and encode URL parameters. ASCII, of course, is not vast enough to handle other localizations. To force Unicode, try using the UrlEncodeUnicode method of the HttpUtility class instead...
HttpUtility.UrlEncodeUnicode(name)
There are two things that come to mind, although I cannot say that I've actually tested either in this specific case (request-string encoding).
First, how is your page encoded? That is, do you indicate the character set used?
Second, how is the server computer configured re: localization? I'd compare the "Regional and Language Options" in the Control Panel on the three computers.
Finally (OK, that's three), there are localization options in Web.config. As is often true, CodeProject has a great article on the subject.
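For reference, the relevant Web.config section is <globalization>; a commonly used configuration (an illustration, not necessarily what the production server needs) looks like:
<system.web>
  <!-- force UTF-8 for incoming requests and outgoing responses -->
  <globalization requestEncoding="utf-8" responseEncoding="utf-8" />
</system.web>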
Good luck!