SevenZip, many trailing 0s - encryption

My array is 140 bytes, but outArray is 512 bytes, which is not what I wanted. I also don't know if I am encrypting properly. Is the code below correct? How do I fix this so outArray is the real size and not padded with many trailing zeros?
var compress = new SevenZipCompressor();
compress.CompressionLevel = CompressionLevel.Ultra;
compress.CompressionMethod = CompressionMethod.Lzma;
compress.ZipEncryptionMethod = ZipEncryptionMethod.Aes256;
var sIn = new MemoryStream(inArray);
var sOut = new MemoryStream();
compress.CompressStream(sIn, sOut, "a");
byte[] outArray = sOut.GetBuffer();

You are getting the whole MemoryStream buffer; you need to use ToArray():
byte[] outArray = sOut.ToArray();
This will remove the trailing zeros, but you may still get an array bigger than the input. There is overhead with compression/encryption, which is probably bigger than 140 bytes.

Many compression algorithms (I'm unfamiliar with the specific details of 7-zip) generate output with a minimum output size. 7-zip performs best on large input data sets, and 140 bytes is not "large". You might do better with something like gzip or LZO. What other compression algorithms have you tried?
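To see that overhead concretely, here is a minimal Python sketch (assuming the input is essentially incompressible, like encrypted or random data) showing that gzip's fixed header and trailer make a small input grow rather than shrink:

```python
import gzip
import os

# 140 bytes of random data stands in for a small, incompressible payload.
payload = os.urandom(140)

compressed = gzip.compress(payload)

# The gzip container adds a ~10-byte header and an 8-byte trailer, so an
# incompressible 140-byte input comes out larger than it went in.
print(len(payload), len(compressed))
```

The same effect, usually with a larger fixed cost, applies to archive formats like 7z, which is why tiny inputs rarely benefit from compression.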

Related

Convert Image<Rgba32> to Byte[] using ImageSharp

How can I convert an image to an array of bytes using the ImageSharp library?
Can the ImageSharp library also suggest/provide RotateMode and FlipMode based on the EXIF Orientation?
If you are looking to convert the raw pixels into a byte[], you do the following:
var bytes = image.SavePixelData();
If you are looking to convert the encoded stream to a byte[] (which I suspect is what you are looking for), you do this:
using (var ms = new MemoryStream())
{
image.Save(ms, imageFormat);
return ms.ToArray();
}
For those who look after 2020:
SixLabors seems to like changing names and adding abstraction layers, so...
Now, to get the raw byte data, you do the following steps:
Get the MemoryGroup of an image using the GetPixelMemoryGroup() method.
Convert it into an array (because GetPixelMemoryGroup() returns an interface) and take the first element (if somebody tells me why they did that, I'll appreciate it).
From System.Memory<TPixel>, get a Span and then do stuff the old way.
(I prefer the solution from #Majid's comment.)
So the code looks something like this:
var _IMemoryGroup = image.GetPixelMemoryGroup();
var _MemoryGroup = _IMemoryGroup.ToArray()[0];
var PixelData = MemoryMarshal.AsBytes(_MemoryGroup.Span).ToArray();
Of course, you don't have to split this into variables and can do it in one line of code; I split it just for clarification purposes. This solution is only valid as of 06 Sep 2020.
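The MemoryMarshal.AsBytes step is just a reinterpretation of typed pixel memory as raw bytes. As an illustrative sketch of that idea (Python here, not ImageSharp; the pixel values are made up), a memoryview cast over a typed array does the same thing:

```python
import array

# Four RGBA pixels stored as unsigned integers (hypothetical pixel values).
pixels = array.array("I", [0xFF0000FF, 0xFF00FF00, 0xFFFF0000, 0xFFFFFFFF])

# Reinterpret the typed buffer as raw bytes without copying, then materialize.
pixel_data = bytes(memoryview(pixels).cast("B"))

print(len(pixel_data))  # 4 pixels x pixels.itemsize bytes each
```

No per-pixel conversion happens; the cast only changes how the same memory is viewed, which is also why the C# version is cheap.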

How to POST Chunked Encoding in R

I have a large Oracle query result and want to upload it using HTTP POST.
But with memory constraints, I cannot read all rows into memory at once.
So I read a few rows at a time, but can't find a way to do chunked uploading in R.
If it were in C#, it would go something like this:
var req = (HttpWebRequest)WebRequest.Create("http://myserver/upload");
req.SendChunked = true;
req.Method = "POST";
using (var s = req.GetRequestStream())
{
    while (queryResult.hasRow())
    {
        byte[] buffer = queryResult.readRow();
        s.Write(buffer, 0, buffer.Length);
    }
}
var response = req.GetResponse();
Is there anything equivalent in R?
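For reference, what SendChunked = true produces on the wire is HTTP/1.1 chunked transfer encoding: each chunk is sent as its size in hex, CRLF, the bytes, CRLF, and the body ends with a zero-length chunk. A minimal Python sketch of that framing (the helper name chunked_body is my own, not from any library):

```python
from typing import Iterable, Iterator

def chunked_body(chunks: Iterable[bytes]) -> Iterator[bytes]:
    """Frame an iterable of byte chunks using HTTP/1.1 chunked transfer encoding."""
    for chunk in chunks:
        if chunk:  # a zero-length chunk would prematurely terminate the body
            yield b"%X\r\n%s\r\n" % (len(chunk), chunk)
    yield b"0\r\n\r\n"  # last-chunk marker ends the body

body = b"".join(chunked_body([b"row1,", b"row2,"]))
print(body)
```

Any HTTP client that accepts a generator or connection as the request body can stream rows this way without holding the full result in memory, which is the behavior the C# snippet relies on.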

looking for the jar file for HexUtils. bytesToHex

I have to implement the below code in my Java application, but I am unable to find the jar file for HexUtils.bytesToHex(). Where do I find that?
byte[] data = cleartext.getBytes(ENCODING);
md.update(data);
byte[] digestedByteArray = md.digest();
// Convert digested bytes to 2 chars Hex Encoding
md5String = HexUtils.bytesToHex(digestedByteArray);
I'm not sure I understand what this code does, but it seems the same can be done with org.apache.commons.codec.binary.Hex.encodeHexString(final byte[] data) from commons-codec-*.jar. That one is very easy to find.
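The snippet just computes an MD5 digest and hex-encodes it (two hex characters per byte); any hex encoder will do. As a language-neutral illustration of the same two steps (Python here, not the Java HexUtils class; "abc" is a hypothetical input):

```python
import hashlib

cleartext = "abc"  # hypothetical input
data = cleartext.encode("utf-8")

digest = hashlib.md5(data).digest()  # raw 16-byte digest
md5_string = digest.hex()            # two hex chars per byte -> 32 chars

print(md5_string)
```

Whatever bytesToHex implementation you end up with, the output for a given input should match this hex string exactly, which makes it easy to verify a replacement.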

asp.net Concatenate files

In my program, I split a file into multiple files and send them to a WCF REST service, which then joins them back into one file. After concatenation, the file size is more than the size of the file sent.
Following is the code to concatenate:
string[] files = Directory.GetFiles(path, string.Concat(guid, "*"),SearchOption.TopDirectoryOnly);
StreamReader fileReader;
StreamWriter fileWriter = new StreamWriter(path + newGuid);
for (Int64 count = 0; count < files.Length; count++)
{
fileReader = new StreamReader(string.Concat(path,guid, count));
fileWriter.Write(fileReader.ReadToEnd());
}
fileWriter.Close();
Are you dealing with only text files? Both StreamWriter and StreamReader are meant to be used for text files, not binary files.
Further, the line fileWriter.Write(fileReader.); appears to be wrong. It should be something like
fileWriter.Write(fileReader.ReadToEnd());
Of course, if your file size is too large, you should be reading/writing in chunks or on a line-by-line basis.
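Since text readers can silently alter binary data (encoding detection, BOM and newline handling), the safer pattern is raw byte streams copied in fixed-size chunks. A minimal Python sketch of the same join step (the function and its parameters are my own naming, not from the question's code):

```python
import shutil

def concatenate(part_paths, out_path, chunk_size=64 * 1024):
    """Join file parts into one output file using raw byte streams."""
    with open(out_path, "wb") as out:
        for part in part_paths:
            with open(part, "rb") as src:
                # Copy in fixed-size chunks so large parts never load fully into memory.
                shutil.copyfileobj(src, out, chunk_size)
```

Because every byte is copied verbatim, the joined file is exactly the sum of the part sizes, which is the property the question's text-based version loses.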

Determining HTML5 database memory usage

I'm adding SQLite support to my Google Chrome extension, to store historical data.
When creating the database, it is required to set the maximum size (I used 5MB, as suggested in many examples)
I'd like to know how much memory I'm really using (for example after adding 1000 records), to have an idea of when the 5MB limit will be reached, and act accordingly.
The Chrome console doesn't reveal such figures.
Thanks.
You can calculate those figures if you want to. Basically, the default limit for localStorage and WebStorage is 5 MB, where the names and values are saved as UTF-16, so it is really half of that, 2.5 MB, in terms of stored characters. In an extension, you can increase that by adding the "unlimitedStorage" permission to the manifest.
The same would apply to the Web SQL database, but there you have to go through all tables and figure out how many characters there are per row.
In localStorage You can test that by doing a population script:
var row = 0;
localStorage.clear();
var populator = function () {
  localStorage[row] = '';
  var x = '';
  for (var i = 0; i < (1024 * 100); i++) {
    x += 'A';
  }
  localStorage[row] = x;
  row++;
  console.log('Populating row: ' + row);
  populator();
};
populator();
The above should crash at around row 25 when it runs out of space, which works out to about 2.5 MB of stored characters (25 rows × 100 K characters). You can do the inverse and count how many characters there are per row, which tells you how much space you have left.
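The arithmetic behind that estimate can be checked directly: each character is stored as UTF-16, i.e. 2 bytes for characters in the Basic Multilingual Plane, and both the key and the value count. A small Python sketch of the per-row accounting (the 5 MB quota value is the assumption from the answer above; the function name is my own):

```python
QUOTA_BYTES = 5 * 1024 * 1024  # assumed 5 MB localStorage quota

def row_bytes(key: str, value: str) -> int:
    """Approximate storage cost of one localStorage entry in UTF-16 bytes."""
    return len((key + value).encode("utf-16-le"))

per_row = row_bytes("0", "A" * (1024 * 100))
print(per_row, QUOTA_BYTES // per_row)  # ~200 KB per row, 25 full rows fit in 5 MB
```

Running the same accounting over your real keys and values gives the "how close am I to the limit" figure the question asks for, without any browser API.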
Another way to do this is to always add a "payload" and check for the exception; if it is thrown, then you know you're out of space.
try {
  localStorage['foo'] = 'SOME_DATA';
} catch (e) {
  console.log('LIMIT REACHED! Do something else');
}
Internet Explorer did something called "remainingSpace", but that doesn't work in Chrome/Safari:
http://msdn.microsoft.com/en-us/library/cc197016(v=VS.85).aspx
I'd like to add a suggestion.
If it is a Chrome extension, why not make use of Web SQL storage or IndexedDB?
http://html5doctor.com/introducing-web-sql-databases/
http://hacks.mozilla.org/2010/06/comparing-indexeddb-and-webdatabase/
Source: http://caniuse.com/
