How do I write the MAC address ff:ff:ff:ff:ff:ff as a char[] in C?
Do I just do char macaddress[6] = "%0xFF%0xFF%0xFF%0xFF%0xFF%0xFF";
I'm not sure. Thanks!
char macaddress[6] = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff };
I'd rather do it like this: char macaddress[] = "\xff\xff\xff\xff\xff\xff";
There are coding guidelines for char-array initialization: a string-literal initializer is null-terminated, so the size of this array is actually 7.
Do not initialize an array of characters using a string literal with more characters (including the '\0') than the array can hold; it is therefore necessary to specify the correct size for a string literal (char s[4] = "abc";). However, because the expected result is still obtained when the length of the string literal changes, the method of not specifying the size (char s[] = "abc";) is recommended.
ref:
http://www.caravan.net/ec2plus/guide.html
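A minimal sketch, in standard C, showing the size difference between the two initializations above:
#include <stdio.h>

int main(void) {
    unsigned char a[6] = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff };
    char b[] = "\xff\xff\xff\xff\xff\xff"; /* 7 bytes: 6 data + '\0' */
    printf("%zu %zu\n", sizeof a, sizeof b); /* prints: 6 7 */
    return 0;
}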
I'm not able to understand the output of this simple C code. What happens when we cast an int value to a char pointer?
#include <stdio.h>

int main(void) {
    int a = 320;
    char *ptr;
    ptr = (char *)&a;
    printf("%d", *ptr);
    return 0;
}
The output is 64, but I'm unable to figure out the logic. Does the size of the signed char play a role here?
320 is 0x140 in hex. A char is one byte (two hexadecimal digits), and on a little-endian machine ptr points at the low-order byte of a, which is 0x40. Printing that byte with %d therefore gives its decimal value, 64.
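To make the byte layout visible, here is a minimal sketch (assuming a 4-byte int on a little-endian machine) that dumps every byte of a:
#include <stdio.h>

int main(void) {
    int a = 320; /* 0x00000140 */
    unsigned char *p = (unsigned char *)&a;
    for (size_t i = 0; i < sizeof a; i++)
        printf("byte %zu: 0x%02x\n", i, p[i]); /* little-endian: 40 01 00 00 */
    return 0;
}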
I have the below QByteArray.
QByteArray ba;
ba[0] = 0x01;
ba[1] = 0x10;
ba[2] = 0x00;
ba[3] = 0x07;
I really have no idea how to convert this QByteArray into the resulting string "01100007", which I would then use QRegExp to pattern-match against.
First of all, the QByteArray does not contain "hex values"; it contains bytes (as its name implies). A number is "hex" only when it is printed as text.
Your code should be:
QByteArray ba(4, 0); // array length 4, filled with 0
ba[0] = 0x01;
ba[1] = 0x10;
ba[2] = 0x00;
ba[3] = 0x07;
Anyway, to convert a QByteArray to a hex string, you got lucky: just use the QByteArray::toHex() method!
QByteArray ba_as_hex_string = ba.toHex();
Note that it returns 8-bit text, but you can just assign it to a QString without worrying much about encodings, since it is pure ASCII. If you want upper case A-F in your hexadecimal numbers instead of the default a-f, you can use QByteArray::toUpper() to convert the case.
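A minimal sketch tying it together (the "^0110" pattern is only a hypothetical example of a match you might look for):
#include <QByteArray>
#include <QString>
#include <QRegExp>
#include <QDebug>

int main() {
    QByteArray ba(4, 0); // array length 4, filled with 0
    ba[0] = 0x01;
    ba[1] = 0x10;
    ba[2] = 0x00;
    ba[3] = 0x07;

    QString hex = QString(ba.toHex()); // "01100007"
    QRegExp re("^0110"); // hypothetical pattern
    qDebug() << hex << (re.indexIn(hex) == 0); // "01100007" true
}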
QString has the following constructor:
QString(const QByteArray &ba)
But note that a number literal with a leading 0 is octal in C++, so if you wrote the values without the 0x prefix (01, 10, 00, 07), some of them would be decimal, some octal, and none of them hex.
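A tiny sketch of that octal pitfall:
#include <iostream>

int main() {
    int dec = 10;   // decimal ten
    int oct = 010;  // octal literal: eight
    int hex = 0x10; // hex literal: sixteen
    std::cout << dec << " " << oct << " " << hex << "\n"; // prints: 10 8 16
}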
I am working with byte arrays and strings. I have a byte array that I modify and then use to generate a string. I have looked at lots of posts on this website that recommend using BlockCopy or System.Text.Encoding.Default.GetString(); I have tried those, but for some reason the string I am getting is all gibberish characters.
Here is the problem and what I expect. Let's say I have a hex-encoded string of bytes as follows:
string str = "f20bdba6ff29eed7b046d1df9fb70000";
Corresponding array is:
byte[] arrayStr = new byte[] { 0xf2, 0x0b, 0xdb, 0xa6, 0xff, 0x29, 0xee, 0xd7, 0xb0, 0x46, 0xd1, 0xdf, 0x9f, 0xb7, 0x00, 0x00 };
Please note that every 2 characters in the above string represent one byte.
Now, let's say I manipulate arrayStr and change the byte at index 4 from 0xff to 0xe1. I want to be able to get a string such that:
string str = "f20bdba6e129eed7b046d1df9fb70000";
Look at BitConverter:
string str = BitConverter.ToString(arrayStr).Replace("-", "");
Note that BitConverter.ToString produces upper-case hex digits; append .ToLower() if you need the lower-case form shown in the question.
I just have a quick question about what this code means. Sorry, I've been reading other posts, but I couldn't fully grasp the concept since none of them seems to resemble this piece of code that I'm currently working with in my embedded system.
int8u buf[1024];
memset(buf, 0, sizeof(buf));
*((int16u*)&buf[2]) = 0xbb01;
Can someone explain to me what these lines mean?
It basically interprets the array of bytes buf as 16-bit words and then changes the second word to 0xbb01. Alternative representation:
int8u buf[1024];
memset(buf, 0, sizeof(buf));
int16u *words = (int16u *)buf;
words[1] = 0xbb01;
&buf[2] takes the address of buf[2], the third byte in buf. The cast to (int16u *) then tells the compiler to treat that address as pointing to a 16-bit unsigned integer. Finally, the 16-bit value at that address is set to 0xbb01.
Depending on the endianness of your system, the contents of buf could then be 0x00, 0x00, 0xbb, 0x01 or 0x00, 0x00, 0x01, 0xbb (followed by more NULs due to the memset()).
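A minimal sketch to see the byte order on your own machine (the int8u/int16u typedefs are assumptions standing in for your toolchain's types):
#include <stdio.h>
#include <string.h>

typedef unsigned char int8u;   /* assumed 8-bit type */
typedef unsigned short int16u; /* assumed 16-bit type */

int main(void) {
    int8u buf[1024];
    memset(buf, 0, sizeof(buf));
    *((int16u *)&buf[2]) = 0xbb01;
    for (int i = 0; i < 4; i++)
        printf("%02x ", buf[i]); /* little-endian: 00 00 01 bb */
    printf("\n");
    return 0;
}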
Please see the comments in the code for an explanation:
int8u buf[1024]; // declare an array of 1024 8-bit values
memset(buf, 0, sizeof(buf)); // fill the buffer with zeros
(int16u*)&buf[2] is a cast that produces a pointer to an int16u; the cast is applied to &buf[2], i.e. the address of buf[2].
*((int16u*)&buf[2]) = 0xbb01; // update the two-byte integer starting at buf[2]
Why is this done?
The buf array was created as int8u, but now we need to store the 16-bit value 0xbb01 into it. To do this, the code above creates an int16u pointer.
Step-by-step simplification of the pointer expression:
*((int16u*)&buf[2]) = 0xbb01;
updates the content at ((int16u*)&buf[2]) with 0xbb01;
(int16u*)&buf[2] is a pointer to int16u, so the assignment updates the int16u it points to;
that is, it updates the values at buf[2] and buf[3] with the bytes of 0xbb01. [#]
[#]: the exact contents of buf[2] and buf[3] depend on the core architecture: big endian or little endian.
I have a really weird problem. Basically I just convert a char to an NSString and store the strings in an NSMutableArray.
The code runs fine on the simulator but crashes on the device.
Here is the crashing code:
char t = 'A' + i;
NSString* alphabetString = [NSString stringWithUTF8String:&t]; //substringToIndex:1];
[tempArray addObject:alphabetString];
Basically stringWithUTF8String returns NULL on the device but a valid value on the simulator.
The device is an iPhone 4S.
I did not see any notice of changes to NSString stringWithUTF8String in the iOS 5 release notes.
Thanks.
The address of a single char is not a C-style string. You need to ensure it's null-terminated, with something like:
char t = 'A' + i;
char s[2]; s[0] = t; s[1] = '\0';
NSString* alphabetString = [NSString stringWithUTF8String:s];
From the docpage:
Parameters
bytes : A NULL-terminated C array of bytes in UTF8 encoding.
You can't pass the address of a single char value to -stringWithUTF8String. That function is expecting a null-terminated string, and you're not passing it one. This results in undefined behavior: anything at all could happen. It might appear to succeed, it might fail benignly, or it might erase your file system. But more likely, it will just crash your program.
You should create a two-character array that's null-terminated instead:
char t[2] = {'A' + i, 0}; // Two-character null-terminated array
NSString* alphabetString = [NSString stringWithUTF8String:t];
Alternatively, you can also use -stringWithFormat: with the %c format specifier to get a string containing a single character:
NSString* alphabetString = [NSString stringWithFormat:@"%c", 'A' + i];