I created an image uploader for an app I am working on. I first used PHP for the server-side script, and everything worked fine. I found out afterwards that I had to use .NET, so I created new server-side scripts. The problem I am having is that my Event.COMPLETE listener is never firing. I can receive data back using a DataEvent listener, but then it stops at this error:
Error #2044: Unhandled IOErrorEvent:. text=Error #2036: Load Never Completed.
Here is how I am sending my file:
var fileRefReq:URLRequest = new URLRequest(FILE_UPLOAD_TEMP);
var fileReqVars:URLVariables = new URLVariables();
fileReqVars.subdir = "Temp";
fileRefReq.data = fileReqVars;
fileRefReq.method = URLRequestMethod.POST;
fileRef.upload(fileRefReq);
The file definitely gets uploaded to the first TEMP directory, but then it breaks with the above error.
Has anyone else had a similar problem, or can anyone point me in the right direction for solving this?
This is an error produced by Flash. The most common causes are:
1. It could be a 404 error you are getting somewhere in the Flash.
2. This error can occur if you close the browser while it is loading something.
3. By default, the calling SWF file and the URL you load must be in the same domain. For example, a SWF file at www.adobe.com can load data only from sources that are also at www.adobe.com. To load data from a different domain, place a URL policy file on the server hosting the data.
Number 3 is important because a common problem users hit with Flash is security restrictions, so it is just something to rule out. It is most likely not the cause here.
I would test for these 3 causes and read over the URLRequest documentation: http://livedocs.adobe.com/flex/3/langref/flash/net/URLRequest.html
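In case cause 3 does turn out to be relevant, a minimal (and deliberately permissive) crossdomain.xml placed at the web root of the server hosting the data might look like the sketch below; you would want to restrict the allowed domains in production:
<?xml version="1.0"?>
<cross-domain-policy>
    <!-- Permissive policy for testing only: lets SWFs from any domain load data from this server -->
    <allow-access-from domain="*" />
</cross-domain-policy>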
After some additional thought, I think it is timing out, but that is just a theory. Add an event listener like so:
urlLoader.addEventListener("httpResponseStatus", function(event:HTTPStatusEvent):void
to see what is actually happening.
You have to handle the error event, for example:
// add the event listener
urlLoader.addEventListener(IOErrorEvent.IO_ERROR, onErrorHandler);

// handle the error event like this:
private function onErrorHandler(e:IOErrorEvent):void {
    trace("An io error occurred.");
}
Hope that helps
Related
A user attempted to upload a file that was too large (70 MB for a single PDF page) and the system errored out. This is correct and expected behavior; however, response.responseText (in a jQuery AJAX call), instead of just being the message, was the raw text of an entire HTML page, cut off at a certain point, which I believe coincides with the default style of IIS error pages.
I do not want to increase the file size limit to allow the file to come through, but I do wish to make it so that response.responseText just returns the message (effectively, what's between the <title></title> tags).
I attempted to set breakpoints in the upload.ashx file to see if I could find where this was happening, but it never gets that far (for a normal file, these breakpoints are hit). Which is fine, I'm okay with IIS gatekeeping (I imagine if I try to bypass IIS for handling it, the file is going to get uploaded to the server and then rejected; plus, I'd lose out on just letting the IIS configuration handle this), but I don't want to return an entire page if possible.
To my mind, the resolution is to check whether response.responseText contains DOCTYPE and, if so, scrape what is inside the title tag, but I feel like there may be a more by-the-book way of doing this?
Edit: I did see where someone recommended setting existingResponse="PassThrough" on the httpErrors section of web.config, but when I did this the responseText just became blank and it still didn't hit the breakpoints, so I don't think this is achieving what I'm after.
This probably isn't the best way to handle it, but it seems to work in this case, so I'm just running with it:
changed:
error: function (response) {
    alert(response.responseText);
}
to:
error: function (response) {
    var titleIndex = response.responseText.indexOf('<title>');
    var titleEndIndex = response.responseText.indexOf('</title>');
    var message = response.responseText.substr(titleIndex + 7, titleEndIndex - titleIndex - 7);
    alert(message);
}
which returns "IIS 10.0 Detailed Error - 413.1 - Request Entity Too Large" in this particular instance.
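A rough alternative sketch, if you'd rather not do the index arithmetic, is to let the browser parse the error page and read its title, falling back to the raw text when the response isn't HTML:
error: function (response) {
    var text = response.responseText || '';
    var message = text;
    // IIS error pages are full HTML documents; parse them and use the <title> as the message
    if (text.indexOf('<!DOCTYPE') !== -1 || text.indexOf('<html') !== -1) {
        var doc = new DOMParser().parseFromString(text, 'text/html');
        if (doc.title) {
            message = doc.title;
        }
    }
    alert(message);
}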
We are publishing an extra page using the event system, triggered by the publish event of pages.
It is working well when we publish a page from the CME. When we create a page using the Tridion UI, it is created and published as expected, but the event system throws an error when it tries to create a TargetType object.
try
{
    foreach (var t in e.Targets)
    {
        var targets = new List<TargetType>();
        // ... some logic ...
        targets.Add(new TargetType(t.Id, page.Session)); // error is in this line
        PublishInstructionBase pib = e.PublishTransactions.First().Instruction;
        PublishInstruction pi = new PublishInstruction(page.Session);
        pi.StartAt = pib.StartAt;
        pi.RollbackOnFailure = pib.RollbackOnFailure;
        PublishEngine.Publish(
            items,
            pi,
            targets,
            PublishPriority.Low
        );
    }
}
The exception we are getting is InvalidURIException.
Both t and Session are definitely not null, which we verified by logging there.
Please suggest what we can do to fix the issue.
We managed to fix it with a little hack: putting the code in a try/catch and creating the target type manually if there is an exception, since we know the Tridion UI initially publishes to Staging only.
Thanks,
Vikas Kumar
It's hard to tell without the exception and stack trace, but I assume you need to read the TargetType first rather than trying to create it, like this:
(TargetType) page.Session.GetObject(t.Id)
It might be that the URIs you use are not from the same Publication context and are therefore invalid.
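As a minimal sketch of that change inside the loop from the question (variable names taken from there, and assuming t.Id resolves in the same Publication context as page.Session):
// Read the existing TargetType from the session instead of constructing a new one
TargetType targetType = (TargetType)page.Session.GetObject(t.Id);
targets.Add(targetType);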
I get
Unhandled exception at 0x004687b4 in D3DTest.exe: 0xC0000005: Access violation reading location 0x00000000.
the error is at:
m_d3dDevice->CreateIndexBuffer(sizeof(short)*CHUNK_PRIMITIVES*3,D3DUSAGE_WRITEONLY, D3DFMT_INDEX16, D3DPOOL_MANAGED, &m_ib, NULL);
Now I checked m_d3dDevice and it's all OK; everything works properly if I don't create the buffer.
m_ib is also properly created before being used in that function:
LPDIRECT3DVERTEXBUFFER9 m_vb;
m_vb = NULL;
I don't think anything else could be causing the problem. I'm confused.
Try enabling the debug runtime from the DirectX control panel; it will show you warnings and errors if you have any. Also, always check the return codes from DX functions.
Showing the code for the whole function that creates the index buffer would help too.
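As a sketch of the "check the return codes" advice applied to the call from the question (assuming m_ib is an LPDIRECT3DINDEXBUFFER9 initialized to NULL):
HRESULT hr = m_d3dDevice->CreateIndexBuffer(
    sizeof(short) * CHUNK_PRIMITIVES * 3,  // index data size in bytes
    D3DUSAGE_WRITEONLY,
    D3DFMT_INDEX16,
    D3DPOOL_MANAGED,
    &m_ib,
    NULL);
if (FAILED(hr) || m_ib == NULL)
{
    // Log hr here; with the debug runtime enabled, the output window will say why the call failed
    return;
}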
When loading a MP3 to a flash.media.Sound object the id3 property gives an error:
SecurityError: Error #2000: No active security context.
Of course, like many errors in Flex, the Flex documentation doesn't mention a thing about this, except that it exists...
The MP3 is valid (I've checked it with MediaPlayer and iTunes), and the Sound object is in a good state (bytesTotal and bytesLoaded both reflect the correct amount of bytes).
Has anyone had this problem too? Any solutions or suggestions?
Your MP3 should be fine.
If you want to access more data about your MP3 file, rather than just play it, you will need a policy file that allows it. It is similar to loading an image: if you just add it to the display list and don't access the pixels, it's all good, but if you want to access the pixels you need permission (a crossdomain.xml).
For images, when you call load, you can pass a LoaderContext in which you explicitly say you want to check for a crossdomain.xml file and get access to the content.
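For example (the URL here is just a placeholder), the image case looks roughly like this:
var imgLoader:Loader = new Loader();
// checkPolicyFile = true asks the player to fetch crossdomain.xml before granting pixel access
var imgContext:LoaderContext = new LoaderContext(true);
imgLoader.load(new URLRequest("http://example.com/picture.jpg"), imgContext);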
Similarly, you should create a SoundLoaderContext with the second parameter set to true (to check for the policy file) and use that in the sound load call.
e.g.
var snd:Sound = new Sound();
var req:URLRequest = new URLRequest("yourSound.mp3");
var context:SoundLoaderContext = new SoundLoaderContext(0, true);
snd.load(req, context);
snd.play();
For ID3 data you should listen for the ID3 event:
sound.addEventListener(Event.ID3, onID3);
function onID3(event:Event):void {
    for (var i:String in sound.id3) {
        trace('prop: ' + i + ' value: ' + sound.id3[i]);
    }
}
For more info, you might find the mp3infoutil library handy.
HTH,
George
In Adobe AIR 1.5, I'm using URLLoader to upload a video in 1 MB chunks. It uploads 1 MB, waits for the Event.COMPLETE event, and then uploads the next chunk. The server-side code knows how to construct the video from these chunks.
Usually, it works fine. However, sometimes it just stops without throwing any errors or dispatching any events. This is an example of what is shown in a log that I create:
Uploading chunk of size: 1000000
HTTP_RESPONSE_STATUS dispatched: 200
HTTP_STATUS dispatched: 200
Completed chunk 1 of 108
Uploading chunk of size: 1000000
HTTP_RESPONSE_STATUS ...
etc...
Most of the time, it completes all of the chunks fine. However, sometimes, it just fails in the middle:
Completed chunk 2 of 108
Uploading chunk of size: 1000000
... and nothing else, and no network activity.
Through debugging, I can tell that it does successfully call urlLoader.load(). When it fails, it just seems to stall: it calls load(), then the UIComponent's callLaterDispatcher(), and then nothing.
Does anyone have any idea why this could be happening? I'm setting up my URLLoader like this:
urlLoader.dataFormat = URLLoaderDataFormat.BINARY;
urlLoader.addEventListener(Event.COMPLETE, chunkComplete);
urlLoader.addEventListener(IOErrorEvent.IO_ERROR, ioErrorHandler);
urlLoader.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
urlLoader.addEventListener(HTTPStatusEvent.HTTP_RESPONSE_STATUS, responseStatusHandler);
urlLoader.addEventListener(HTTPStatusEvent.HTTP_STATUS, statusHandler);
urlLoader.addEventListener(ProgressEvent.PROGRESS, progressHandler);
And I'm re-using it for each chunk. No events get called when it doesn't succeed, and urlLoader.load() doesn't throw any exceptions. When it succeeds, HTTP_RESPONSE_STATUS, HTTP_STATUS, and PROGRESS events are dispatched.
Thanks!
Edit: One thing that might be helpful: we have the same upload functionality implemented in .NET. In .NET, the request.GetResponse() method sometimes throws an exception, complaining that the connection was closed unexpectedly. We catch the exception if this happens and try that chunk again until it succeeds. I'm looking to implement something similar here, but there are no exceptions being thrown or error events being dispatched.
A more detailed code example is below. The URLLoader is set up as described above. The readAgain variable just makes it skip reading a new set of bytes from the file stream (i.e., it tries to send the old one again)... however, it never catches any exceptions, because none are ever thrown.
private function uploadSegment():void
{
    // ... prepare byte array, set up URL ...

    // Create a URL request
    var urlRequest:URLRequest = new URLRequest();
    urlRequest.url = _url + "?" + paramStr;
    urlRequest.method = URLRequestMethod.POST;
    urlRequest.data = byteArray;
    urlRequest.useCache = false;
    urlRequest.requestHeaders.push(new URLRequestHeader('Cache-Control', 'no-cache'));

    try
    {
        urlLoader.load(urlRequest);
    }
    catch (e:Error)
    {
        Logger.error("Failed to upload chunk. Caught exception. Trying again.");
        readAgain = true;
        uploadSegment();
        return;
    }

    readAgain = false;
}
Have you tried listening for Event.OPEN to see if the connection is opening correctly? If you're doing this per chunk, perhaps that event, or the lack of it, would help?
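Something like this (a sketch reusing the urlLoader from the question) would at least tell you whether each chunk's connection ever opens:
urlLoader.addEventListener(Event.OPEN, function(e:Event):void {
    trace("connection opened for current chunk");
});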
[Edit]
Can you also try setting useCache to false on your URLRequest?
[Edit]
I assume your urlLoader is globally referenced... If not, while you're waiting for async behavior, something evil like GC might hurt you... But, skipping that, if you check bytesTotal while you're waiting for something to happen, does it always return zero?
[More]
Also, check the URL in the cases where NOTHING happens, because I've found some mention online that no events are fired if the server is unreachable (though there is some argument around that)...
I encountered a similar problem in Flex, only with Safari.
The URLLoader sometimes returned nothing, not even the OPEN event.
I made sure that this wasn't a cache problem.
After lots of trial and error, the only remedy I found was to use the https protocol in the URL. I am not sure what this does to Safari, but now the problem is gone.