Confusing output when using CLLocationCoordinate2D - NSString

Consider this code:
NSString *coordinateString = [NSString stringWithFormat:@"(%@,%@)", [inPoint objectAtIndex:0], [inPoint objectAtIndex:1]];
CLLocationCoordinate2D *myPoint = (__bridge CLLocationCoordinate2D *)(coordinateString);
My question is:
In this example, the values of
[inPoint objectAtIndex:0] and [inPoint objectAtIndex:1]
are respectively 16.4006 and 38.254.
If I log the value of myPoint, I get "myPoint=(16.400583,38.254028)".
Just as I expected. However, in the Debug Panel, myPoint is represented with a latitude of 7304.....-304 and a longitude of endless zeroes.
Is this an internal representation, or is something lost in translation from double to NSString to CLLocationCoordinate2D?

I hope this is relevant to somebody: I eventually fixed this with the following code:
double lat = [[inPoint objectAtIndex:0] doubleValue];
double lng = [[inPoint objectAtIndex:1] doubleValue];
CLLocationCoordinate2D myPoint = CLLocationCoordinate2DMake(lat, lng);
The values in the array needed to be converted to numeric types; since CLLocationDegrees is a typedef for double, doubleValue keeps the full precision.
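The underlying lesson - parse coordinate strings back into numbers instead of reinterpreting a string object's memory as a struct - can be sketched in plain Python (the values here are the hypothetical ones from the question):

```python
# Convert numeric strings to double-precision floats before building the pair,
# rather than casting the string itself to a coordinate type.
lat_str, lng_str = "16.400583", "38.254028"

lat, lng = float(lat_str), float(lng_str)
point = (lat, lng)

print(point)  # (16.400583, 38.254028)
```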

Related

Point a RigidBody to look in a direction (LookAt)

In Godot I have a RigidBody that I would like to turn in 3D space so that it points towards a certain direction. Pretty much a LookAt. I found this tutorial, and I've written the below based on it.
Something I find a little confusing is that I'm not sure why it's only worrying about the x axis in the rotationAngle - should I be somehow implementing the others too?
public override void _IntegrateForces(PhysicsDirectBodyState state)
{
    // Push RigidBody in direction that it's facing
    Vector3 forwardForce = GetForwardForce();
    this.ApplyCentralImpulse(forwardForce);

    // Turn RigidBody to point in direction of world origin
    var targetPosition = Vector3.Zero;
    var currentDirection = -this.Transform.basis.z;
    var currentDistanceFromTarget = this.Translation.DistanceSquaredTo(targetPosition);
    GD.Print(currentDistanceFromTarget + " " + currentDirection);

    if (currentDistanceFromTarget > 50)
    {
        var targetDir = (targetPosition - this.Transform.origin).Normalized();
        var rotationAngle = Mathf.Acos(currentDirection.x) - Mathf.Acos(targetDir.x); // Why only x?
        var angularVelocity = Vector3.Up * (rotationAngle / state.GetStep());
        state.SetAngularVelocity(angularVelocity);
    }
}
This will work up to a point - if the new currentDirection happens to match the positive Z axis, then it just continues in that way.
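A full 3D look-at normally derives both the rotation axis and the angle from the two directions themselves - the cross product gives the axis, the dot product gives the angle - instead of comparing only the x components. A minimal sketch of that vector math in plain Python (no Godot types assumed; inputs are unit-length tuples):

```python
import math

def look_at_rotation(current_dir, target_dir):
    """Return (axis, angle) that rotates current_dir onto target_dir.

    Both inputs are assumed to be unit-length (x, y, z) tuples."""
    # Rotation axis: the cross product, perpendicular to both directions.
    axis = (current_dir[1] * target_dir[2] - current_dir[2] * target_dir[1],
            current_dir[2] * target_dir[0] - current_dir[0] * target_dir[2],
            current_dir[0] * target_dir[1] - current_dir[1] * target_dir[0])
    # Rotation angle: from the dot product, clamped against rounding error.
    dot = sum(c * t for c, t in zip(current_dir, target_dir))
    angle = math.acos(max(-1.0, min(1.0, dot)))
    return axis, angle

# Facing +X but wanting to face +Z: rotate 90 degrees about -Y.
axis, angle = look_at_rotation((1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(axis, round(math.degrees(angle), 1))  # (0.0, -1.0, 0.0) 90.0
```

Using an axis computed this way (rather than a fixed Vector3.Up) also avoids the stuck-at-positive-Z behavior described above, since the axis flips sign with the geometry.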

How to draw a line with x length in the direction of the mouse position in Unity3D?

My question seems rather simple but I can't figure it out myself.
I want to draw a line with a fixed length from my transform.position in the direction of the mouse cursor.
The things I've figured out:
var mousePos = Camera.main.ScreenToWorldPoint(Input.mousePosition);
lazer.SetPosition(0, transform.position);
// here is where the failing starts. I need to calculate the end position.
lazer.SetPosition(1, ?)
Thanks A.
I think what you are looking for is the normalized property on the Vector2 or Vector3 struct. Something like this will give you an end point at the same distance (magnitude, actually) from your position every time:
Vector3 mousePos = Camera.main.ScreenToWorldPoint(Input.mousePosition);
Vector3 offsetPos = mousePos - transform.position;
Vector3 newVec = offsetPos.normalized * scale; // the important line; scale is the desired length
newVec += transform.position;
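The normalize-then-scale idea can be sketched in plain Python, independent of Unity (a hypothetical 2D helper, but the math is identical in 3D):

```python
import math

def point_at_distance(origin, target, length):
    """Point `length` units from origin toward target (2D (x, y) tuples)."""
    dx, dy = target[0] - origin[0], target[1] - origin[1]
    dist = math.hypot(dx, dy)
    # Normalize the offset, then scale it to the desired length.
    return (origin[0] + dx / dist * length,
            origin[1] + dy / dist * length)

print(point_at_distance((0.0, 0.0), (3.0, 4.0), 10.0))  # (6.0, 8.0)
```

However far away the mouse is, the returned point always sits exactly `length` units from the origin along the same direction.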

Unity3D - How to get the angle between 2 points as a vector

I have run into problems while trying to get my Unity code done (using JavaScript/UnityScript), so the question is:
How can I get the angle between two points (as Vector3s), with the angle as another Vector3?
My code is:
var Offset : Vector3 = (0,0,3);
var angle1 : Vector3;
angle1 = Vector3.Angle(Vector3(0,0,0), Offset);
The error that I got is:
BCE0022: Cannot convert 'float' to 'UnityEngine.Vector3'.
I looked around, but was only able to find what I already knew.
Thanks for any help in advance!
-Etaash
In addition to the syntax error letiagoalves mentioned, Vector3.Angle returns a float, not a Vector3. You don't need to pre-define the angle variable; you can do it in one line. Your final code should look like this:
var Offset : Vector3 = Vector3(0,0,3);
var angle1 = Vector3.Angle(Vector3(0,0,0), Offset);
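Vector3.Angle computes that single float from the dot product of the two vectors, which is why it cannot return a Vector3. A sketch of the same formula in plain Python (note that comparing against Vector3(0,0,0) is degenerate, since the zero vector has no direction):

```python
import math

def angle_between(a, b):
    """Angle in degrees between two 3D vectors, like Unity's Vector3.Angle."""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    # Clamp to guard acos against floating-point rounding.
    cos_theta = max(-1.0, min(1.0, dot / (mag_a * mag_b)))
    return math.degrees(math.acos(cos_theta))

print(angle_between((1, 0, 0), (0, 1, 0)))  # 90.0
```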

Convert from NSString to float?

How can I convert this number to a float? It is 35.77091272818393, type NSString. The problem is that, if I use
[str doubleValue]
it returns 35.770912.
I want to convert this string
NSString *tt = [NSString stringWithFormat:@"%.14f", 35.77091272818393];
to a float using
var = [tt floatValue];
but it does not have enough precision - it only returns 35.7709127 and not all the digits.
Your problem is that the value 35.770912 you are seeing is the double-precision floating-point value rounded for display. If you want more decimal places, you can use something like the following:
NSString *formatted = [NSString stringWithFormat:@"%.14f", [str doubleValue]];
This will print the value with 14 decimal places, and you will see that the precision has not been lost in the floating-point value.
You are having problems with the second part of your question because you're calling floatValue on the string instead of doubleValue. The number you reference requires more precision than float provides, so you must store it in a double-precision floating-point type. Something like this should fix your problem:
NSString *tt = [NSString stringWithFormat:@"%.14f", 35.77091272818393];
double var = [tt doubleValue];
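The precision gap between single and double can be demonstrated outside Objective-C too. A sketch in plain Python, using struct to round-trip the value through a 32-bit float the way floatValue would:

```python
import struct

value = 35.77091272818393  # a double-precision literal

# Round-trip through a 32-bit (single-precision) float.
as_float32 = struct.unpack('f', struct.pack('f', value))[0]

print('%.14f' % value)       # 35.77091272818393
print('%.14f' % as_float32)  # digits beyond ~7 significant figures are lost
```

A single-precision float carries roughly 7 significant decimal digits, while a double carries roughly 15-16, which is why floatValue truncates this particular number.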

Problem with stringByTrimmingCharactersInSet:

I'm doing a very simple trimming from a string. Basically it goes like this:
int main (int argc, const char * argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSString *s = [[NSString alloc] initWithString:@"Some other string"];
    s = [s stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
    NSLog(@"%@", s);
    [pool drain];
    return 0;
}
No matter how many times I check, I can't find a mistake in the code. But the console keeps printing:
Running…
2009-12-30 08:49:22.086 Untitled[34904:a0f] Some other string
It doesn't trim the string at all. Is there a mistake in my code, because I can't believe something this simple doesn't work. Thanks!
Edit:
I think I've figured my mistake. stringByTrimmingCharactersInSet only trims the string from the leftmost and rightmost ends of a string. Can anybody confirm this?
You are correct - with your code, stringByTrimmingCharactersInSet will trim leading and trailing whitespace but leave internal whitespace alone.
Note that there's also a memory leak in your code sample:
NSString *s = [[NSString alloc] initWithString:@"Some other string"];
This alloc'd string will leak when you re-assign to s on the next line.
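The trim-only-at-the-ends behavior is the same as string strip in most languages; a quick Python sketch of the distinction:

```python
# strip() behaves like stringByTrimmingCharactersInSet: with the
# whitespace set: only leading and trailing characters are removed.
s = "  Some other string  "
print(repr(s.strip()))  # 'Some other string' - inner spaces survive

# Removing internal whitespace needs a replace (or split/join) instead:
print("Some other string".replace(" ", ""))  # Someotherstring
```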