Flex event.bytesLoaded returning wrong value

I have a function that's called when a file download has reported progress:
private function progressHandler(event:ProgressEvent):void
{
    var percent:Number = Math.round((event.bytesLoaded / event.bytesTotal) * 100.0);
    Alert.show(event.bytesLoaded.toString());
    //pb.setProgress(percent, 100);
}
Now, this should work fine but unfortunately, event.bytesLoaded is returning much larger values than it should. For a test file (8555 bytes), bytesLoaded goes all the way up to 8973384.
Any ideas why this might be happening?

Amarghosh, in his comment, gave the hint that led to the solution:
"Is the file 8555 kilobytes - because the number you gave is close to 8555 * 1024"
Indeed: 8555 * 1024 = 8,760,320, which is in the same ballpark as the observed 8,973,384. The file was actually about 8555 KB, not 8555 bytes, so bytesLoaded was reporting correct values all along; the expected size had been read in kilobytes.


So, a mutant escaped. Now what?

I've just managed to get mutation testing working for the first time. My usual testing framework is Codeception, but as of writing it isn't compatible with mutation testing (although I believe work is being done on that and it isn't far off). So I'm using PHPUnit and Infection, neither of which I've found easy to learn.
My test suite generated ten mutants. Nine were killed and one escaped. However, I don't know what part of the code or the tests needs to be improved to kill the final mutant.
How do you get information about what code allowed the mutant to escape?
I found in this blog what I couldn't find in Infection's documentation: the results are saved in infection.log.
The log file looks like this:
Escaped mutants:
================

1) <full-path-to-source-file>.php:7    [M] ProtectedVisibility

--- Original
+++ New
@@ @@
 use stdClass;
 trait HiddenValue
 {
-    protected function hidden_value($name = null, $value = null)
+    private function hidden_value($name = null, $value = null)
     {
         static $data = [];
         $keys = array_map(function ($item) {

Timed Out mutants:
==================

Not Covered mutants:
====================
It says that the mutation changed the method's visibility from protected to private and that no test failed as a result. If this matters, I can now either change the code or write another test that covers the case, like the sketch below.
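For illustration, a test along these lines would kill the ProtectedVisibility mutant. The Settings host class, the SettingsChild subclass, and the wrapper method are all hypothetical stand-ins for whatever class in the project actually uses the HiddenValue trait:

<?php
use PHPUnit\Framework\TestCase;

// Hypothetical class using the trait (assumes the trait is autoloaded);
// substitute the real class from the project.
class Settings
{
    use HiddenValue;
}

// A subclass may call a protected method but not a private one, so this
// wrapper only works while hidden_value() stays protected.
class SettingsChild extends Settings
{
    public function callHiddenValue($name = null, $value = null)
    {
        return $this->hidden_value($name, $value);
    }
}

class HiddenValueVisibilityTest extends TestCase
{
    public function testHiddenValueIsCallableFromASubclass(): void
    {
        $this->expectNotToPerformAssertions();
        // If the mutant makes hidden_value() private, this call throws
        // an Error, the test fails, and the mutant is killed.
        (new SettingsChild())->callHiddenValue('name', 'value');
    }
}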
Now that I've found this, I've also searched the Infection website for infection.log and found --show-mutations (or -s), which prints escaped mutants to the console while Infection runs.

simple html dom find returns NULL

I'm seeing strange behavior using simple html dom:
$html = str_get_html($output, true, true, DEFAULT_TARGET_CHARSET, false);
Then
var_dump($html->find('title', 0));
returns an object, which is fine.
But
var_dump($html->find('body', 0));
returns NULL.
I can't understand what's wrong.
mb_detect_encoding($output);
returns UTF-8, so the string itself seems fine.
I increased MAX_FILE_SIZE to 6000000, but that didn't help.
I just set mbstring.func_overload = 0 in php.ini and everything started working perfectly.
Maybe this helps somebody else. A quick way to check for this setting is sketched below.
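For context, a minimal sketch of why this setting matters, assuming the cause is that a non-zero mbstring.func_overload silently swaps byte-oriented functions like strlen() for their mbstring equivalents, which throws off simple html dom's byte-offset parsing on multibyte input:

<?php
// Non-zero means byte-oriented string functions are being overloaded
// by their mbstring equivalents.
var_dump(ini_get('mbstring.func_overload'));

// With overloading active, strlen() counts characters rather than bytes,
// so the two calls below disagree on multibyte text.
$s = "héllo";
var_dump(strlen($s));             // 6 bytes without overloading, 5 with it
var_dump(mb_strlen($s, 'UTF-8')); // always 5 characters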

Memory error with type L"" in Win32

Here's the code for my paint method in my Win32 project:
case WM_PAINT:
    _tcscat_s(greeting, sizeof(greeting), LoadedFile);
    hdc = BeginPaint(hWnd, &ps);
    TextOut(hdc,
        5, 5,
        greeting, _tcslen(greeting));
    EndPaint(hWnd, &ps);
    break;
I am consistently getting the error that either the stack around greeting or around ps is corrupted. To be clear, greeting is initialized like:
TCHAR greeting[100] = _T("Welcome! Your file is ");
And LoadedFile is initialized like this:
TCHAR LoadedFile[100];
LoadedFile[0] = 0;
LoadedFile is not yet changed by anything, so it shouldn't be adding anything to greeting. I've tried things like
sizeof(greeting) + 1
which just shifts the error. Not sure what's wrong here.
Edit: Without the _tcscat_s() call, the window loads normally.
Well, I found the problem. I just changed
_tcscat_s(greeting, sizeof(greeting), LoadedFile);
to
_tcscat_s(greeting, 100, LoadedFile);
The reason this works is that _tcscat_s takes the destination size in characters, not bytes. In a Unicode build, TCHAR is wchar_t (two bytes), so sizeof(greeting) is 200 even though the array holds only 100 characters; the debug CRT's secure string functions then fill what they believe is the unused tail of the buffer, writing up to 100 characters past the end of greeting and corrupting the stack around it (and around ps). Passing the element count, 100 or better _countof(greeting), keeps the write inside the array.

How to check actual content length against Content-Length header?

A user can POST a document to our web service. We stream it elsewhere. But, at the end of the streaming, we need to be sure they didn't lie about their Content-Length.
I assume if headerContentLength > realContentLength, the request will just wait for them to send the rest, eventually timing out. So that's probably OK.
What about if headerContentLength < realContentLength? I.e. what if they keep sending data after they said they were done?
Is this taken care of by Node.js in any way? If not, what is a good way to check? I suppose I could just count up the bytes inside a data event listener, e.g. req.on("data", function (chunk) { totalBytes += chunk.length; }). That seems like a kludge though.
To check the actual length of the request, you have to add it up yourself. The data chunks are Buffers and they have a .length property that you can add up.
If you specify the encoding with request.setEncoding(), your data chunks will be Strings instead. In that case, call Buffer.byteLength(chunk) to get the length. (Buffer is a global object in node.)
Add up the total for each of your chunks and you'll know how much data was sent.
Here's a rough (untested) example:
const https = require('https');

// A real HTTPS server needs TLS options ({ key, cert }); omitted for brevity.
https.createServer(function (req, res) {
    // Header values are strings, so parse before comparing numerically.
    var expected_length = parseInt(req.headers['content-length'], 10);
    var actual_length = 0;
    req.on('data', function (chunk) {
        actual_length += chunk.length;
    });
    req.on('end', function () {
        console.log('expected: ' + expected_length + ', actual: ' + actual_length);
        res.end();
    });
}).listen(8443);
Note: length refers to the maximum length of the Buffer's content, not the actual length. However, it works in this case because chunk buffers are always created at the exact correct length. Just be aware of that if you're working with buffers somewhere else.

Determining HTML5 database memory usage

I'm adding sqlite support to my Google Chrome extension, to store historical data.
When creating the database, you are required to set its maximum size (I used 5 MB, as suggested in many examples).
I'd like to know how much memory I'm really using (for example after adding 1000 records), to have an idea of when the 5MB limit will be reached, and act accordingly.
The Chrome console doesn't reveal such figures.
Thanks.
You can calculate those figures if you want to. The default quota for localStorage and Web SQL storage is 5 MB, and names and values are stored as UTF-16, which uses two bytes per character, so in terms of stored characters you really only get half of that: about 2.5 million characters. For an extension, you can lift the limit by adding the "unlimitedStorage" permission to the manifest.
The same accounting applies to Web SQL, but there you have to go through all the tables and work out how many characters are stored per row.
In localStorage, you can test that with a population script:
var row = 0;
localStorage.clear();
var populator = function () {
    // Build a 102,400-character string (~200 KB as UTF-16).
    var x = '';
    for (var i = 0; i < (1024 * 100); i++) {
        x += 'A';
    }
    localStorage[row] = x;
    row++;
    console.log('Populating row: ' + row);
    // Recurse until the quota is exhausted and the write throws.
    populator();
};
populator();
The above should fail around row 25 when space runs out, which matches roughly 2.5 million stored characters. You can do the inverse, counting the characters per row, to determine how much space you have used.
Another way to do this is to always wrap the write in a try/catch and check for the exception; if one is thrown, you know you're out of space.
try {
    localStorage['foo'] = 'SOME_DATA';
} catch (e) {
    console.log('LIMIT REACHED! Do something else');
}
Internet Explorer has a remainingSpace property, but that doesn't work in Chrome/Safari:
http://msdn.microsoft.com/en-us/library/cc197016(v=VS.85).aspx
I'd like to add a suggestion.
If it is a Chrome extension, why not make use of Web SQL storage or IndexedDB?
http://html5doctor.com/introducing-web-sql-databases/
http://hacks.mozilla.org/2010/06/comparing-indexeddb-and-webdatabase/
Source: http://caniuse.com/
