Helicon Ape X-Sendfile with Railo - railo

I’m trying to use Helicon Ape’s mod_xsendfile with Railo server (Windows 2012 R2). mod_xsendfile functions correctly and works fine with PHP: it delivers the file and also passes the content-length value to the browser. There is no file size limit with PHP and no significant use of server memory regardless of the file size.
With Railo, here is the obvious first attempt:
<cfcontent type="text/plain">
<cfheader name="content-disposition" value="attachment; filename=test.txt"/>
<cfheader name="X-Sendfile" value="D:\iis\hello.txt"/>
This does not work. It returns a blank file; no error log is generated by Helicon Ape, so it is safe to assume the X-Sendfile header is not passed to IIS correctly.
Second Attempt
<cfheader name="content-disposition" value="attachment; filename=test.txt"/>
<cfset Response = GetPageContext().GetResponse() />
<cfset Response.setHeader('X-Sendfile','D:\iis\hello.txt')>
<cfset Response.setContentType('plain/text')>
<cfset Response.GetOutputStream().Flush() />
<cfset Response.Reset() />
<cfset Response.Finish() />
This works, with two limitations.
Limitation 1: When the file size is more than 2GB, the browser returns the error “ERR_INVALID_CHUNKED_ENCODING”. It works fine with smaller files and there are no memory issues. (Again, PHP does not seem to have this issue, and IIS does not have a size limit either.)
Limitation 2: This does not pass the content-length to the browser, so the browser does not know the size of the file.
Third Attempt: Add content-length manually. (this is not necessary with PHP)
<cfset filePath = "D:\iis\246.zip">
<cfheader name="content-disposition" value="attachment; filename=246.zip"/>
<cfset Response = GetPageContext().GetResponse() />
<cfset Response.setContentLength( createObject("java","java.io.File").init( filePath ).length() )>
<cfset Response.setHeader('X-Sendfile', filePath )>
<cfset Response.setContentType('application/octet-stream')>
<cfset Response.GetOutputStream().Flush() />
<cfset Response.Reset() />
<cfset Response.Finish() />
The content-length is passed to the browser, but unlike with PHP, IIS tries to allocate memory for the file and soon ends up with an “Overflow or underflow in the arithmetic operation” error.
I’m sure I’m not handling GetPageContext().GetResponse() correctly. If anyone can help me out here, it would be highly appreciated.
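For reference, a hedged variant of the third attempt that avoids Response.setContentLength() entirely: the servlet setContentLength() method only accepts a 32-bit int, so passing the length of a file over 2GB can overflow it. Setting Content-Length as a plain header string sidesteps that limit. This is an untested sketch that keeps the rest of the approach unchanged; whether Helicon Ape then streams the file correctly is a separate question.
<cfset filePath = "D:\iis\246.zip">
<cfheader name="content-disposition" value="attachment; filename=246.zip"/>
<cfset Response = GetPageContext().GetResponse() />
<cfset fileObj = createObject("java","java.io.File").init( filePath ) />
<!--- Set Content-Length as a header string instead of calling
      setContentLength(), whose int argument overflows above 2GB. --->
<cfset Response.setHeader('Content-Length', fileObj.length().toString()) />
<cfset Response.setHeader('X-Sendfile', filePath) />
<cfset Response.setContentType('application/octet-stream') />
<cfset Response.GetOutputStream().Flush() />
<cfset Response.Reset() />
<cfset Response.Finish() />
If the servlet engine under Railo supports Servlet 3.1, Response.setContentLengthLong() may also be an option, but I have not verified that on this setup.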

If you use BonCode to connect to IIS, it has facilities for spooling large files without overloading the server's memory limit, thus allowing efficient streaming.
You will need to add the FlushThresholdBytes setting to your BonCode settings file (check c:\windows), e.g.:
<FlushThresholdBytes>10240</FlushThresholdBytes>
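For context, the element just goes into the BonCode connector settings XML file alongside whatever settings are already there. The sketch below is only illustrative; the root element name and surrounding content are placeholders from memory, so keep whatever structure your existing file in c:\windows already has:
<?xml version="1.0" encoding="utf-8"?>
<Settings>
  <!-- ...existing BonCode settings stay as they are... -->
  <FlushThresholdBytes>10240</FlushThresholdBytes>
</Settings>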
However, from my limited understanding of Railo, it seems to load the whole file into memory, which would create a limit on the file size you can stream.
-John

Related

How to scrape ColdFusion protected website?

It is trivial to extract the PDF url from the following webpage.
https://www.osapublishing.org/boe/abstract.cfm?uri=boe-11-5-2745
But when I wget it, it shows something like the following in the output instead of downloading a PDF file.
<p>OSA has implemented a process that requires you to enter the letters and/or numbers below before you can download this article.</p>
As the website uses the cfid cookie, it appears to be protected by ColdFusion. Does anybody know how to scrape such a webpage? Thanks.
https://cookiepedia.co.uk/cookies/CFID
EDIT: The wget solution offered by Sev Roberts does not work. I checked the Chrome devtools (in a new incognito window); many requests are sent after the first request to https://www.osapublishing.org/boe/abstract.cfm?uri=boe-11-5-2745. I guess wget does not send those requests, so the subsequent wget (with cookies) of https://www.osapublishing.org/boe/viewmedia.cfm?uri=boe-11-5-2745&seq=0 does not work. Can anybody tell which of those extra requests are essential? Thanks.
There are several methods that sites use against this sort of scraping and direct linking or embedding. The basic old methods included:
Checking the user's cookies: to at least check the user already had a session from a previous page on this site; some sites might go further and look for the presence of specific cookie or session variables that verify a genuine path through the site.
Checking the cgi.http_referer variable to see whether the user arrived from the expected source.
Checking whether the cgi.http_user_agent looks like a known human browser - or checking that the user agent does not look like a known bot browser.
Other more intelligent methods of course exist, but in my experience if you're requiring more than the above then you're reaching the territory of requiring a captcha and/or requiring a user to register and log in.
Obviously (2) and (3) are easily spoofed by setting the headers manually. For (1), if you're using cfhttp or its equivalent in another language, you need to ensure that cookies returned in the Set-Cookie header of the site's response are sent back in the headers of your subsequent request by using cfhttpparam. Various cfhttp wrappers and alternative libraries, such as Java wrappers bypassing the cfhttp layer, are available to do this. But if you want to understand a simple example of how this works then Ben Nadel has an old but good one here: https://www.bennadel.com/blog/725-maintaining-sessions-across-multiple-coldfusion-cfhttp-requests.htm
With the pdf url from the link in your question, a couple of minutes tinkering in Chrome shows that if I lose the cookies from the previous page and keep the http_referer then I see the captcha challenge, but if I keep the cookies and lose the http_referer then I get directly through to the pdf. This confirms that they care about the cookies but not the referer.
Copy of Ben's example for SO completeness:
<cffunction
name="GetResponseCookies"
access="public"
returntype="struct"
output="false"
hint="This parses the response of a CFHttp call and puts the cookies into a struct.">
<!--- Define arguments. --->
<cfargument
name="Response"
type="struct"
required="true"
hint="The response of a CFHttp call."
/>
<!---
Create the default struct in which we will hold
the response cookies. This struct will contain structs
and will be keyed on the name of the cookie to be set.
--->
<cfset LOCAL.Cookies = StructNew() />
<!---
Get a reference to the cookies that were returned
from the page request. This will give us a numerically
indexed struct of cookie strings (which we will have
to parse out for values). BUT, check to make sure
that cookies were even sent in the response. If they
were not, then there is no work to be done.
--->
<cfif NOT StructKeyExists(
ARGUMENTS.Response.ResponseHeader,
"Set-Cookie"
)>
<!---
No cookies were sent back in the response. Just
return the empty cookies structure.
--->
<cfreturn LOCAL.Cookies />
</cfif>
<!---
ASSERT: We know that cookies were returned in the page
response and that they are available at the key,
"Set-Cookie", of the response header.
--->
<!---
Now that we know that the cookies were returned, get
a reference to the struct as described above.
--->
<!---
The cookies might be coming back as a struct or they
might be coming back as a string. If there is only
ONE cookie being returned, then it comes back as a
string. If that is the case, then re-store it as a
struct.
--->
<cfif IsSimpleValue(ARGUMENTS.Response.ResponseHeader[ "Set-Cookie" ])>
<cfset LOCAL.ReturnedCookies = {} />
<cfset LOCAL.ReturnedCookies[1] = ARGUMENTS.Response.ResponseHeader[ "Set-Cookie" ] />
<cfelse>
<cfset LOCAL.ReturnedCookies = ARGUMENTS.Response.ResponseHeader[ "Set-Cookie" ] />
</cfif>
<!--- Loop over the returned cookies struct. --->
<cfloop
item="LOCAL.CookieIndex"
collection="#LOCAL.ReturnedCookies#">
<!---
As we loop through the cookie struct, get
the cookie string we want to parse.
--->
<cfset LOCAL.CookieString = LOCAL.ReturnedCookies[ LOCAL.CookieIndex ] />
<!---
For each of these cookie strings, we are going to
need to parse out the values. We can treat the
cookie string as a semi-colon delimited list.
--->
<cfloop
index="LOCAL.Index"
from="1"
to="#ListLen( LOCAL.CookieString, ';' )#"
step="1">
<!--- Get the name-value pair. --->
<cfset LOCAL.Pair = ListGetAt(
LOCAL.CookieString,
LOCAL.Index,
";"
) />
<!---
Get the name as the first part of the pair
separated by the equals sign.
--->
<cfset LOCAL.Name = ListFirst( LOCAL.Pair, "=" ) />
<!---
Check to see if we have a value part. Not all
cookies are going to send a value, which can
throw off ColdFusion.
--->
<cfif (ListLen( LOCAL.Pair, "=" ) GT 1)>
<!--- Grab the rest of the list. --->
<cfset LOCAL.Value = ListRest( LOCAL.Pair, "=" ) />
<cfelse>
<!---
Since ColdFusion did not find more than one
value in the list, just get the empty string
as the value.
--->
<cfset LOCAL.Value = "" />
</cfif>
<!---
Now that we have the name-value data values,
we have to store them in the struct. If we are
looking at the first part of the cookie string,
this is going to be the name of the cookie and
its struct index.
--->
<cfif (LOCAL.Index EQ 1)>
<!---
Create a new struct with this cookie's name
as the key in the return cookie struct.
--->
<cfset LOCAL.Cookies[ LOCAL.Name ] = StructNew() />
<!---
Now that we have the struct in place, lets
get a reference to it so that we can refer
to it in subsequent loops.
--->
<cfset LOCAL.Cookie = LOCAL.Cookies[ LOCAL.Name ] />
<!--- Store the value of this cookie. --->
<cfset LOCAL.Cookie.Value = LOCAL.Value />
<!---
Now, this cookie might have more than just
the first name-value pair. Let's create an
additional attributes struct to hold those
values.
--->
<cfset LOCAL.Cookie.Attributes = StructNew() />
<cfelse>
<!---
For all subsequent iterations, just store the
name-value pair into the established
cookie's attributes struct.
--->
<cfset LOCAL.Cookie.Attributes[ LOCAL.Name ] = LOCAL.Value />
</cfif>
</cfloop>
</cfloop>
<!--- Return the cookies. --->
<cfreturn LOCAL.Cookies />
</cffunction>
Assuming you have a cfhttp response from the first page https://www.osapublishing.org/boe/abstract.cfm?uri=boe-11-5-2745 and pass that response into the above function and hold its result in a variable named cookieStruct, then you can use this inside subsequent cfhttp requests:
<cfloop item="strCookie" collection="#cookieStruct#">
<cfhttpparam type="COOKIE" name="#strCookie#" value="#cookieStruct[strCookie].Value#" />
</cfloop>
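To tie the pieces together, here is a rough, untested sketch of the full two-request flow. The URLs are the ones from the question, GetResponseCookies is the function above, and the output path is only an example; the site may also require extra headers such as a user agent.
<!--- 1. Request the abstract page to establish a session and collect cookies. --->
<cfhttp url="https://www.osapublishing.org/boe/abstract.cfm?uri=boe-11-5-2745" method="get" result="abstractResponse" />
<!--- 2. Parse the Set-Cookie header(s) into a struct keyed by cookie name. --->
<cfset cookieStruct = GetResponseCookies( abstractResponse ) />
<!--- 3. Request the PDF, sending the session cookies back with the request. --->
<cfhttp url="https://www.osapublishing.org/boe/viewmedia.cfm?uri=boe-11-5-2745&seq=0" method="get" getAsBinary="yes" result="pdfResponse">
    <cfloop item="strCookie" collection="#cookieStruct#">
        <cfhttpparam type="COOKIE" name="#strCookie#" value="#cookieStruct[strCookie].Value#" />
    </cfloop>
</cfhttp>
<!--- 4. Write the binary response to disk. --->
<cffile action="write" file="C:\temp\boe-11-5-2745.pdf" output="#pdfResponse.FileContent#" />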
Edit: if using wget instead of cfhttp, you could try the approach from the answer to this question, but without posting a username and password since you don't actually need a login form:
How to get past the login page with Wget?
eg
# Get a session.
wget --save-cookies cookies.txt \
--keep-session-cookies \
--delete-after \
'https://www.osapublishing.org/boe/abstract.cfm?uri=boe-11-5-2745'
# Now grab the page or pages we care about. The URLs are quoted so the shell
# does not treat "?" and "&" as special characters.
# You may also need to add valid http_referer or http_user_agent headers.
wget --load-cookies cookies.txt \
'https://www.osapublishing.org/boe/viewmedia.cfm?uri=boe-11-5-2745&seq=0'
...although as others have pointed out, you may be violating the terms of service of the source, so I couldn't recommend actually doing this.

Generate Oracle results to csv

I'm using ColdFusion now to query Oracle and then save the results to a csv file. It occurred to me that I may be able to print the results directly to csv, but I can't find anything recently written about it except a way to do so using SQL Developer. Is there a way to do this in the query?
This is the way I usually do them:
<cfquery name="myQuery"
datasource="myDatasource">
<!--- Your query --->
</cfquery>
<cfheader name="Content-Disposition" value="attachment; filename=filename.csv">
<!--- Be very careful about line breaks or commas in your data because they'll break the CSV format --->
<cfsavecontent variable="myData">header1,header2,header3<cfoutput query="myQuery">#Chr(13)##Chr(10)##column1#,#column2#,#column3#</cfoutput></cfsavecontent>
<cfcontent type="text/csv" variable="#ToBinary(ToBase64(myData.Trim()))#" />
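If any of your columns can contain commas, quotes, or line breaks, a small escaping helper keeps the output valid CSV. This is a generic sketch; the function name and usage are mine, not part of the answer above:
<cffunction name="csvField" returntype="string" output="false"
    hint="Wraps a value in quotes and doubles any embedded quotes so it is safe inside a CSV row.">
    <cfargument name="value" type="string" required="true" />
    <cfreturn '"' & Replace( ARGUMENTS.value, '"', '""', "ALL" ) & '"' />
</cffunction>
<cfsavecontent variable="myData">header1,header2,header3<cfoutput query="myQuery">#Chr(13)##Chr(10)##csvField(column1)#,#csvField(column2)#,#csvField(column3)#</cfoutput></cfsavecontent>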

ASP.NET how to encode into Chinese?

I am currently helping a Chinese colleague migrating an ASP.NET app to a different server and we have now run into a character encoding problem. What is supposed to look like Chinese looks like gibberish.
The strings presented by the ASP.NET web pages have been coded in Chinese ...
Example:
<input id="BQuery" value=" 查询 " runat="server" class="BottonLoginRed" name="Button1" type="button" />
The Web.config file is configured like so ...
<globalization requestEncoding="gb2312" responseEncoding="gb2312" />
Since the source code contains Chinese encoded characters I figured I needed to set the correct culture for the response thread by adding a "culture" setting for "Simplified Chinese" in the Web.config file, like so ...
<globalization culture="zh-Hans" requestEncoding="gb2312" responseEncoding="gb2312" />
... but that produces this error message:
"Culture 'zh-Hans' is a neutral culture. It cannot be used in formatting and parsing and therefore cannot be set as the thread's current culture."
I have tried other variants for the Chinese culture, such as "zh-Hant", "zh-CHS" or just "zh", but they all yield the same problem. Apparently, there is no way to run Chinese as the response thread culture.
What would be the correct approach to resolve this issue?
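For what it's worth, the error message is about the neutral/specific distinction: zh-Hans, zh-CHS and zh are all neutral cultures, while a region-specific culture such as zh-CN can be set as a thread culture. A hedged example of what that would look like (whether it also fixes the rendering is another matter):
<globalization culture="zh-CN" uiCulture="zh-CN" requestEncoding="gb2312" responseEncoding="gb2312" />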
[EDIT]
Apparently, my Chinese colleague has "solved" this problem before by simply setting Chinese as the language for the server itself. This is no longer an option (we'll have other apps, for different cultures running on the same server) but it might provide a hint.
[EDIT 2]
When I removed the encoding hints from the Web.config file, it worked. So why do we need to use these hints at all these days? Is it just me, or is character encoding perceived as a very messy subject by everyone? :-)

Coldfusion Encryption/Decryption issue

I recently did a website for my company using ColdFusion 9. The issue I am having is with the ColdFusion encryption/decryption functions. On certain strings that I decrypt, weird special characters show up.
Example:
MK/_0 <---Encrypted String Outputted
�#5&z <---Decrypted String Outputted
I'm not sure why this is happening (and only on certain strings that get decrypted).
Here is the code:
<cfset ccNum = decrypt(getCCInfo.CUST_CARDNUMBER,myKey)>
Ok, well first, I have to point out that by not specifying an encryption algorithm you are using very POOR encryption. So you'll need to fix that. Second, you should probably be using some encoding to make your crypto storage more reliable.
So try this code.
<cfset key = generateSecretKey("AES") />
<!--- Set the ciphertext to a variable. This is the string you will store for later deciphering --->
<cfset cipherText = encrypt(plaintext, key, "AES/CBC/PKCS5Padding", "HEX") />
<cfoutput>#cipherText#</cfoutput>
<!--- Then when you decrypt --->
<cfset decipherText = decrypt(cipherText, key, "AES/CBC/PKCS5Padding", "HEX") />
<cfoutput>#decipherText#</cfoutput>
The above code will use a strong crypto algorithm and will put the ciphertext into a much easier to store format than the gibberish you showed as an example above. That way when you store it, it will be more reliable when you retrieve it again.
Here is an example of what the string will look like:
A51BBB284D6DCCDC17D26FB481584236087C3AB272918E17963BAF749438C06A484922820EDCCD25150732CC5CF8A096
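One practical caveat with the snippet above: generateSecretKey() produces a new key every time it runs, and ciphertext can only be decrypted with the key that encrypted it. In practice you generate the key once, store it somewhere safe, and reuse it for both encrypt and decrypt. A rough sketch, using the Application scope purely for illustration (a real system would keep the key outside the code and webroot):
<!--- Generate the key once and keep it; do NOT generate a new key per request. --->
<cfif NOT StructKeyExists( APPLICATION, "encryptionKey" )>
    <cfset APPLICATION.encryptionKey = generateSecretKey( "AES" ) />
</cfif>
<!--- Encrypt with the stored key... --->
<cfset cipherText = encrypt( ccNum, APPLICATION.encryptionKey, "AES/CBC/PKCS5Padding", "HEX" ) />
<!--- ...and decrypt later with the same key. --->
<cfset plainText = decrypt( cipherText, APPLICATION.encryptionKey, "AES/CBC/PKCS5Padding", "HEX" ) />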

URLCompression + Response Filter Conflict

I have IIS 7.5 with URL compression enabled for dynamic content. I wanted to add a response filter to modify the rendered HTML, and for some reason I kept getting garbage data while filtering.
The code for the response filter's write method is below:
// Decode the buffered response bytes using the response's declared encoding
Encoding encoding = HttpContext.Current.Response.ContentEncoding;
string html = encoding.GetString(buffer);
// Rewrite the matching markup via the regex and match evaluator
html = regFindFollow.Replace(html, new MatchEvaluator(AddFollowNoFollowAttribute));
byte[] outdata = encoding.GetBytes(html);
This starts to work when I remove URL compression from web config. Am I missing something here? Is there an order for response filters that can be specified?
Config I am using is
<urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" />
Changing the config with
<urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="false" />
Fixed this. I suppose that during execution the filter module received already-compressed HTML and couldn't parse it.
