I'm retrieving a file from a database server and allowing the user to download it. The problem is, I'm not getting the same byte stream out as I've read from the server.
I have confirmed (through lots of response.write) that I've received the right array of bytes, they're in the right order, etc...
Here's the code for the download (st.FileContents is a byte[]):
Response.Clear();
Response.AddHeader("Content-Disposition",
"attachment; filename=" + st.FileName);
Response.AddHeader("Content-Length", st.FileSize.ToString());
Response.ContentType = "application/octet-stream";
Response.Write(new System.Text.ASCIIEncoding()
.GetString(st.FileContents)); // Problem line
Response.End();
I've tried a few ways of converting that byte[] to a string, and none give the results I need. Instead of the expected stream of bytes:
FF D8 FF E0 1E D0 4A 46 58 58 00 10 FF D8 FF DB
(yes, that's the start of a jpeg image)
I wind up with something like:
C3 BF C3 98 C3 BF C3 A0 1E C3 90 4A 46 58 58 00
The first 6 bytes get mangled into 10 completely different bytes. What gives?
Edit
The answer is, of course, to use BinaryWrite instead of Write.
You shouldn't be treating binary data as text. ASCII encoding only supports characters in the range 0-127, so any byte outside that range is treated as invalid and gets corrupted on the way out (the C3 xx pairs in the output are high bytes that were round-tripped through a text encoding and written back out as UTF-8). Use HttpResponse.BinaryWrite instead:
Response.BinaryWrite(st.FileContents);
Related
I'm trying to set up the LPSTK-CC1352R LaunchPad with Node-RED and a Bluetooth connection.
The built-in sensor is an HDC2080.
I'm not an electronics engineer, so the datasheets are a bit confusing to me.
I got to the point where I have a connection to the MCU via Bluetooth and receive temperature values every second. Unfortunately, I get these values as four-byte hex arrays:
[04 4a d5 41]
[dc 44 d5 41]
[b4 3f d5 41]
[8c 3a d5 41]
...
Here is an example of the values.
I've tried a lot of ways to convert them into a simple temperature value, but without success.
I even found a kind of tutorial, but no luck there either.
Could anyone help me with the conversion?
Thank you :)
You have to reorder the hex values from right to left: the last byte is not changing, which means the data has to be little-endian.
https://en.wikipedia.org/wiki/Endianness#:~:text=Little%2Dendian%20representation%20of%20integers,visualized%2C%20an%20oddity%20for%20programmers.
Four hex bytes are 32 bits; converted to an IEEE-754 floating-point number:
[41 d5 4a 04] = 26.6611404419
[41 d5 44 dc] = 26.6586227417
[41 d5 3f b4] = 26.6561050415
[41 d5 3a 8c] = 26.6535873413
https://www.h-schmidt.net/FloatConverter/IEEE754.html
I am trying to write some code to communicate with an old device over serial that uses the Siemens 3964r protocol. This includes a checksum, or more accurately a BCC (block check character) at the end of the transmission. A single char after the ETX. The docs define the BCC as this:
With the 3964R transfer protocol, data security is enhanced by sending an additional block
check character (BCC = Block Check Character).
The block check character is the even longitudinal parity (EXOR logic operation of all data
bytes) of a sent or received block. Its calculation begins with the first byte of user data (first
byte of the frame) after the connection establishment, and ends after the DLE ETX character
at connection termination.
Here is some sample data in hex.
53 54 41 54 55 53 10 03 07
07 is the BCC in this one.
4d 45 41 53 4d 50 54 45 53 54 41 4e 41 50 52 47 30 30 30 55 78 30 31 31 2e 30 30 5a 30 31 31 31 30 10 03 61
61 is the BCC in this one.
I know in general how to do XOR operations, but I haven't been able to figure out any combination of things that gives me a proper BCC. I think I am interpreting the definition wrong.
My preferred language for this is JavaScript, as it's for a Node.js Electron app. I can read the buffer and get the hex values, and I can construct proper messages back, but it won't work correctly until I can include a proper BCC. So I'm just looking for someone smarter than me who knows exactly how to produce a valid BCC.
Thanks!
The document posted in the first comment had the right structure for calculating the 3964R BCC. That document is here:
https://support.industry.siemens.com/cs/attachments/1117397/s7300_cp341_manual_en_en-US.pdf?download=true
Here is a simple JavaScript function. The hex array would be passed in rather than hardcoded as in this example, but this accurately calculates the BCC for this particular protocol, in case anyone cares or needs it. This just writes the BCC to the console as a hex string, but you can return something usable instead.
var hexarr = ['4d', '45', '41', '53', '4d', '50', '30', '30', '10', '03'];

function calcBcc(arr) {
    // Even longitudinal parity: XOR every byte from the first byte of
    // user data through DLE ETX.
    var bcc = 0;
    for (var i = 0; i < arr.length; i++) {
        bcc ^= parseInt(arr[i], 16);
    }
    return bcc;
}

console.log(calcBcc(hexarr).toString(16));
I'm trying to understand the LDAP message structure, particularly the searchResEntry type in order to do some parsing. Using Wireshark as a guide, I have a general understanding but I can't find more specifics on the actual data structure. For example, it appears that each block starts with
0x30 0x84 0x0 0x0
Then from there, there is some variability on the remaining bytes before the actual data for the block. For example the first 17 bytes of a searchResEntry is
30 84 00 00 0b 8f 02 01 0c 64 84 00 00 0b 86 04 3b
30 84 00 00 - block header
0b 8f - size of entire searchResEntry remaining
02 - I believe represents a type code where the next byte (01) is a length and 0c is the messageId.
64 84 00 00 - No idea
0b 86 - size of entire searchResEntry remaining
04 - some type code
3b - length of block data
But then other blocks that begin with 30 84 00 00 are not 17 bytes long.
I've looked at RFC 4511, but it just provides an unhelpful notation that doesn't actually describe what the bytes mean.
searchResultEntry ::= [APPLICATION 4] SEQUENCE {
objectName LDAPDN,
attributes PartialAttributeList }
I've also looked at Wireshark's packet-ldap.c but it is very hard to follow. I wouldn't think it would be this hard to find a good description of the data structure layout and associated flags.
The LDAP protocol is encoded according to the ASN.1 BER (Basic Encoding Rules), a standard defined by the ITU. The full specification is here: https://www.itu.int/ITU-T/studygroups/com17/languages/X.690-0207.pdf
Today I got this reader from a local shop. Earlier I worked with Wiegand-type readers with no problem. Anyway, when I try to read an EM-type card with the ID 0009177233 (written on the card), I should get at least 9177233, with the expected start and stop characters. But instead I get 50008C0891:
ASCII 50008C0891
HEX 02 35 30 30 30 38 43 30 38 39 31 0D 0A 03
BIN 00000010 00110101 00110000 00110000 00110000 00111000 01000011 00110000
00111000 00111001 00110001 00001101 00001010 00000011
I use USB-RS232 converter and RealTerm software.
Does anyone have any ideas why? Are there two IDs?
The decimal 9177233 equals hex 8C0891, so the software gives you the serial number in hexadecimal notation. I think the full number 50008C0891 is the 5 bytes (40 bits) of the UID of the EM-type chip.
Regards
I'm trying to interpret the communication between an ISO 7816-type card and the card reader. I've connected inline between the card and the reader; when I dump the output to the console, I'm getting data that I'm not expecting, see below:
Action: Card inserted to reader, expect an ATR only
Expected output:
3B 65 00 00 B0 40 40 22 00
Actual Output:
E0 3B 65 00 B0 40 40 22 00 90 00 80 9B 80 E0 E2
The 90 00 is the standard code for OK after a reset, but why am I still logging additional data, both before the ATR (the E0) and after it?
The communication line is documented in ISO 7816-3 (electrical interface and transmission protocols); look for the respective chapters on the T=0 or T=1 protocol. T=1 is a block-oriented protocol with a prologue containing node addresses and an epilogue containing a CRC/LRC.
During the ATR, however, no protocol is running yet, since the ATR itself carries the information about which protocols the card supports, for the terminal to choose from. So this early in the exchange, 90 00 is surely not SW1/SW2.