Problem with stringByTrimmingCharactersInSet: - nsstring

I'm doing some very simple trimming on a string. Basically it goes like this:
int main (int argc, const char * argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSString *s = [[NSString alloc] initWithString:@"Some other string"];
    s = [s stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
    NSLog(@"%@", s);
    [pool drain];
    return 0;
}
No matter how many times I check, I can't find a mistake in the code. But the output is always:
Running…
2009-12-30 08:49:22.086 Untitled[34904:a0f] Some other string
It doesn't trim the string at all. Is there a mistake in my code? I can't believe something this simple doesn't work. Thanks!
Edit:
I think I've figured out my mistake. stringByTrimmingCharactersInSet: only trims from the leftmost and rightmost ends of the string. Can anybody confirm this?

You are correct - with your code stringByTrimmingCharactersInSet: will trim the leading and trailing whitespace but leave internal whitespace alone.
Note that there's also a memory leak in your code sample:
NSString *s = [[NSString alloc] initWithString:@"Some other string"];
This will leak when you re-assign to s in the next line.
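If the goal is to remove the internal whitespace too, one common approach is to split on the whitespace set and rejoin the pieces. A minimal sketch (note that a plain string literal also avoids the leak mentioned above):

NSString *s = @"Some other string"; // literal: no alloc/init, nothing to leak
NSArray *parts = [s componentsSeparatedByCharactersInSet:
                        [NSCharacterSet whitespaceAndNewlineCharacterSet]];
NSString *collapsed = [parts componentsJoinedByString:@""]; // "Someotherstring"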


Parsing a hex number byte by byte

I'm trying to parse a hex number byte by byte, and concatenate to a string the representation of each byte, in the order they're stored in memory (for a little test on endianness, but that's not important I guess).
Here is the code (please ignore the glaring unit-test issues with it :D; also, some of the code might look weird since initially the display_bytes method took in a char* rather than an int8_t*, but I thought using an int8_t might make it more obvious to me what the issue is):
TEST_CLASS(My001littlebigendian)
{
public:
    TEST_METHOD(TestMethod1)
    {
        int i = 0x12345678;
        display_bytes((int8_t*)&i, sizeof(i));
    }

    void display_bytes(int8_t* b, int length)
    {
        std::stringstream ss;
        for (int i = 0; i < length; ++i)
        {
            int8_t signedCharRepresentation = *(b + i); //signed char has 1 byte
            int8_t signed8ByteInt = (int8_t)signedCharRepresentation; //this is not ok
            int32_t signed32ByteInt = (int32_t)signedCharRepresentation; //this is ok. why?
            //ss << std::hex << signed8ByteInt; //this is not ok. why?
            ss << std::hex << signed32ByteInt; //this is ok
        }
        std::string stringRepresentation = ss.str();
        if (stringRepresentation.compare("78563412") == 0)
        {
            Assert::IsTrue(true, L"machine is little-endian");
        }
        else if (stringRepresentation.compare("12345678") == 0)
        {
            Assert::IsTrue(true, L"machine is big-endian");
        }
        else
        {
            Assert::IsTrue(true, L"machine is other-endian");
        }
    }
};
Now, what I don't understand (as hopefully the comments make clear) is why this only works when I cast each byte to a 4-byte int, and not to a 1-byte int, since I am working with chunks of 1 byte. Intuitively it feels like this should cause some sort of overflow, but it seems not to.
I've not dug deeper into why this happens yet, since I was hoping not to need to. Maybe someone with more knowledge in this area can give me a nudge in the right direction, or even an outright answer if I'm missing something very obvious (which I do feel I might be, since I'm not used to working at this low level).
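For what it's worth: on most platforms int8_t is a typedef for signed char, and a stream's operator<< for char types prints a character rather than a number; widening to int32_t is what makes the integer overload kick in. Going through a signed byte would also sign-extend values above 0x7F. A minimal plain-C sketch of a sign-safe byte dump (names here are illustrative, not from the original test):

#include <stdio.h>

/* Print the bytes of an object in memory order. Reading through
   unsigned char avoids both the character-printing overload and
   sign extension for bytes >= 0x80. */
static void dump_bytes(const void *p, size_t length) {
    const unsigned char *b = (const unsigned char *)p;
    for (size_t i = 0; i < length; ++i)
        printf("%02x", b[i]); /* %02x keeps the leading zero of small bytes */
    printf("\n");             /* "78563412" on a little-endian machine */
}

int main(void) {
    int i = 0x12345678;
    dump_bytes(&i, sizeof i);
    return 0;
}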

Don't laugh, but what on earth am I missing?

OK, I probably have no business trying to learn OOP, and I'm having trouble with the simplest little first program. I am getting a message that my implementation is incomplete (I commented the line that is giving 4 errors below). What is wrong? It wants a type specifier among other things, but don't I give it one with NSString? I do notice that NSString doesn't turn the green type color in Xcode in the implementation like it does in the interface.
ALSO, why do we need to declare the method in the interface and type the exact same thing in the implementation? That is, why the need to type startDrinking: (NSString*) newBeverage in both?
#import <Foundation/Foundation.h>

@interface Drinks : NSObject {
    NSString *beverage;
}
- (void) startDrinking: (NSString*) newBeverage; // setter
- (void) printDrink;
@end

@implementation Drinks
{
    //THIS NEXT LINE IS WHERE I GET 4 ERRORS
    - (void) startDrinking: (NSString *) newBeverage {
        beverage = [[NSString alloc]initwithString:newBeverage]
    }
    -(void) printDrink {
        NSLog(@"How is your", beverage);
    }
}
@end

int main (int argc, const char * argv[]) {
    Drinks *beverage = [[Drinks alloc] init];
    [beverage startDrinking:@"Lemonade"];
    return 0;
}
Your question is too chatty.
You missed a semicolon in the line beverage = [[NSString alloc]initwithString:newBeverage].
The line should be:
beverage = newBeverage;
and the NSLog line should be:
NSLog(@"How is your %@", beverage);
As for declaring the method signature in the header as well as the implementation: that convention is inherited from C and C++. You can think of it as the compiler needing to know which functions are available before they are used.
Your mistake is the { right below @implementation Drinks.
That's why the alignment is messed up too.
In general, if you can't find an error on the line it is reported on, check for extraneous or missing parentheses, brackets, or braces.
The weird alignment is another clue to this.
Hope this helps. Also, like some others said, it helps if your subject line is more meaningful - not just for yourself, but also for any others that might be having a similar problem.
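Putting those fixes together, a minimal corrected version might look something like this (a sketch assuming manual reference counting, as in the original post):

#import <Foundation/Foundation.h>

@interface Drinks : NSObject {
    NSString *beverage;
}
- (void) startDrinking: (NSString *) newBeverage; // setter
- (void) printDrink;
@end

@implementation Drinks   // note: no stray brace here

- (void) startDrinking: (NSString *) newBeverage {
    beverage = [[NSString alloc] initWithString:newBeverage]; // capital W, trailing semicolon
}

- (void) printDrink {
    NSLog(@"How is your %@", beverage); // %@ is the placeholder for an object
}

@end

int main (int argc, const char * argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    Drinks *drink = [[Drinks alloc] init];
    [drink startDrinking:@"Lemonade"];
    [drink printDrink];
    [drink release];
    [pool drain];
    return 0;
}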

Setting a text = a string crashes my program

I am new at this so be gentle.
I have this function:
- (void) Morepoint {
    gscore++;
    scoreString = [NSString stringWithFormat:@"%i", gscore];
    lblscore.text = scoreString;
}
where gscore is a global, scoreString is an NSString, and lblscore is a label.
Every time I call the function in my game loop, the program stops running.
Can anyone figure that out?
If I call the function from outside my game loop, everything works fine. Why?
You own the object returned by initWithFormat:, and you are responsible for releasing it. You don't own the object returned by stringWithFormat:, which returns an autoreleased string, so you do not need to release it (but if you want to keep it around, you must retain it).
So to resolve your issue, try assigning the value like this:
- (void) Morepoint
{
    gscore++;
    scoreString = [[NSString alloc] initWithFormat:@"%i", gscore];
    lblscore.text = scoreString;
}
Hope this helps you. Just give it a try :)
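The crash happens because the autoreleased string from stringWithFormat: has been deallocated by the time the game loop touches scoreString again (assuming scoreString is a plain ivar under manual reference counting). An equivalent fix is to take ownership explicitly, releasing the old value first so repeated calls don't leak:

- (void) Morepoint {
    gscore++;
    [scoreString release]; // let go of the previous value before overwriting it
    scoreString = [[NSString stringWithFormat:@"%i", gscore] retain]; // take ownership
    lblscore.text = scoreString;
}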

NSString stringWithUTF8String return null on device, but ok on simulator

I have a really weird problem. Basically I just convert a char to an NSString and store it in an NSMutableArray.
The code runs OK on the simulator, but crashes on the device.
Here is the crashing code:
char t = 'A' + i;
NSString* alphabetString = [NSString stringWithUTF8String:&t]; //substringToIndex:1];
[tempArray addObject:alphabetString];
Basically stringWithUTF8String: returns NULL on the device, but a valid value on the simulator.
The device is an iPhone 4S.
I did not see any notice of changes to NSString stringWithUTF8String: in the iOS 5 release notes.
Thanks.
The address of a single char is not a C-style string. You need to ensure it's null-terminated, with something like:
char t = 'A' + i;
char s[2]; s[0] = t; s[1] = '\0';
NSString* alphabetString = [NSString stringWithUTF8String:s];
From the docpage:
Parameters
bytes : A NULL-terminated C array of bytes in UTF8 encoding.
You can't pass the address of a single char value to -stringWithUTF8String. That function is expecting a null-terminated string, and you're not passing it one. This results in undefined behavior: anything at all could happen. It might appear to succeed, it might fail benignly, or it might erase your file system. But more likely, it will just crash your program.
You should create a two-character array that's null-terminated instead:
char t[2] = {'A' + i, 0}; // Two-character null-terminated array
NSString* alphabetString = [NSString stringWithUTF8String:t];
Alternatively, you can also use -stringWithFormat: with the %c format specifier to get a string containing a single character:
NSString* alphabetString = [NSString stringWithFormat:@"%c", 'A' + i];
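Putting it together, a sketch of the surrounding loop (assuming tempArray is an NSMutableArray and the goal is the 26 uppercase letters):

NSMutableArray *tempArray = [NSMutableArray arrayWithCapacity:26];
for (int i = 0; i < 26; i++) {
    // %c formats a single character, so no null terminator is needed
    [tempArray addObject:[NSString stringWithFormat:@"%c", 'A' + i]];
}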

How to convert a XZ compression output to a NSString?

I've successfully set up a small XZ compressor which returns a std::string containing the compressed output. To process the result I need to "convert" the std::string to an NSString. Unfortunately there are (encoding?) problems.
Even though "abc" is not a good example, it shows the difficulty pretty well:
NSString *text = [_text string]; // Length is 3
std::string content = xz_compress([text UTF8String]); // Length is 60
NSString *convertedContent = [NSString stringWithCString:content.c_str() encoding:NSUTF8StringEncoding];
// convertedContent is (null) and 0 characters long
Using NSUnicodeStringEncoding makes the NSString at least 2 characters long but they don't match the ones from std::string.
My questions:
Is this way possible / the preferable way (already unsure about that)?
If so: What encoding do I need to use?
Thanks in advance!
Paul
I assume that the xz_compress output is binary data, so why don't you try the NSData dataWithBytes:length: method? You could also use string::data() instead of string::c_str() for the same reason (compressed bytes are arbitrary binary, not text):
NSData* convertedContent = [NSData dataWithBytes:content.data() length:content.length()];
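If a textual representation really is required (for logging, JSON, and so on), one option is to encode the bytes rather than reinterpret them, since compressed output is generally not valid UTF-8. A sketch using Base64 (base64EncodedStringWithOptions: is available from OS X 10.9 / iOS 7; older systems need a third-party encoder):

NSData *convertedContent = [NSData dataWithBytes:content.data()
                                          length:content.length()];
// Encode the binary data instead of trying to decode it as UTF-8.
NSString *text = [convertedContent base64EncodedStringWithOptions:0];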
