How to project a curve onto a plane in VTK - projection

I have used the vtkDijkstraGraphGeodesicPath class (example: http://www.cmake.org/Wiki/VTK/Examples/Cxx/PolyData/DijkstraGraphGeodesicPath) to find the shortest path between two points on a mesh, and my next step is to project the path (curve) onto a plane. Is there a class or function in VTK to project a curve onto a plane?
The other way would be to sample the path (curve) and then project the sampled points onto the plane, so how do I sample the curve and get the sampled points? Thank you in advance!
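For the projection part, a minimal sketch of a direct approach (not from the original thread, and assuming the target plane is given by an origin and a unit normal) is to project each point of the Dijkstra output polyline with vtkPlane::ProjectPoint; if you also need to resample the curve first, vtkSplineFilter can subdivide a polyline into a chosen number of segments:
#include <vtkPlane.h>
#include <vtkPoints.h>
#include <vtkPolyData.h>
#include <vtkSmartPointer.h>
// Hypothetical helper: project every point of a polyline (e.g. the output of
// vtkDijkstraGraphGeodesicPath::GetOutput()) onto the plane defined by
// planeOrigin / planeNormal (the normal is assumed to be normalized).
vtkSmartPointer<vtkPoints> ProjectPathToPlane(vtkPolyData* path,
                                              double planeOrigin[3],
                                              double planeNormal[3])
{
  vtkSmartPointer<vtkPoints> projected = vtkSmartPointer<vtkPoints>::New();
  for (vtkIdType i = 0; i < path->GetNumberOfPoints(); ++i)
  {
    double p[3];
    double proj[3];
    path->GetPoint(i, p);
    vtkPlane::ProjectPoint(p, planeOrigin, planeNormal, proj); // orthogonal projection
    projected->InsertNextPoint(proj);
  }
  return projected;
}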

I never found a method that directly projects a 3D mesh, but I needed one, so I used texturing methods that allow projecting a mesh (in order to texture it) onto a plane/cylinder/sphere.
The main class used in this case is vtkTextureMapToPlane.
// your mesh
vtkSmartPointer<vtkPolyData> mainMesh = myFilter->GetOutput ();
// extract your path from the poly data
// retrieve selected ids from dijkstra algo
vtkSmartPointer<vtkIdList> idList = dijkstra->GetIdList ();
vtkSmartPointer<vtkIdTypeArray> ids = convertToIdTypeArr (idList); // custom method to convert id array
// Once the ID selection is done, the extraction is invoked
vtkSmartPointer<vtkSelectionNode> selectionNode = vtkSmartPointer<vtkSelectionNode>::New ();
selectionNode->SetFieldType (vtkSelectionNode::POINT);
selectionNode->SetContentType (vtkSelectionNode::INDICES);
selectionNode->SetSelectionList (ids);
vtkSmartPointer<vtkSelection> selection = vtkSmartPointer<vtkSelection>::New ();
selection->AddNode (selectionNode);
vtkSmartPointer<vtkExtractSelection> extract = vtkSmartPointer<vtkExtractSelection>::New ();
extract->SetInputData (0, mainMesh);
extract->SetInputData (1, selection);
// convert result to polydata
vtkSmartPointer<vtkGeometryFilter> geoFilter = vtkSmartPointer<vtkGeometryFilter>::New ();
geoFilter->SetInputConnection (extract->GetOutputPort());
geoFilter->Update();
vtkSmartPointer<vtkPolyData> selected = geoFilter->GetOutput();
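(convertToIdTypeArr above is the answerer's own helper; a minimal sketch of what such a conversion might look like, assuming it simply copies the ids over:)
#include <vtkIdList.h>
#include <vtkIdTypeArray.h>
#include <vtkSmartPointer.h>
// Hypothetical conversion helper: copy the point ids returned by the
// Dijkstra filter into a vtkIdTypeArray usable as a selection list.
vtkSmartPointer<vtkIdTypeArray> convertToIdTypeArr(vtkIdList* idList)
{
  vtkSmartPointer<vtkIdTypeArray> ids = vtkSmartPointer<vtkIdTypeArray>::New();
  ids->SetNumberOfComponents(1);
  for (vtkIdType i = 0; i < idList->GetNumberOfIds(); ++i)
  {
    ids->InsertNextValue(idList->GetId(i));
  }
  return ids;
}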
You now have a vtkPolyData with the vertices from the path. You need to create the plane and project onto it.
// plane is sized with 800x600, on y-z directions
double orig[3] = {0, 0, 0};
double pt1[3] = {0, 600, 0};
double pt2[3] = {0, 0, 800};
// create vtkTextureMapToPlane instance
vtkSmartPointer<vtkTextureMapToPlane> planeMapper = vtkSmartPointer<vtkTextureMapToPlane>::New ();
planeMapper->SetOrigin(orig);
planeMapper->SetPoint1(pt1);
planeMapper->SetPoint2(pt2);
planeMapper->SetInputData (selected);
planeMapper->Update (); // project
vtkSmartPointer<vtkPolyData> d = planeMapper->GetPolyDataOutput(); // retrieve result
As this algorithm is meant for texturing, you need to retrieve the texture coordinates and convert them into plane coordinates (texture coordinates are defined in [0, 1] as a ratio of the height and width).
vtkSmartPointer<vtkDataArray> textCoord = d->GetPointData()->GetTCoords ();
vtkSmartPointer<vtkPoints> textPoints = vtkSmartPointer<vtkPoints>::New ();
for (int i = 0; i < textCoord->GetNumberOfTuples (); ++i)
{
textPoints->InsertNextPoint (textCoord->GetTuple2(i)[0] * 800,
textCoord->GetTuple2(i)[1] * 600, 0);
}
textPoints now holds the 2D coordinates of the projection of your path onto the plane. /!\ These coordinates depend on your plane coordinates.
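For reference, if I read vtkTextureMapToPlane right, a texture coordinate pair (s, t) corresponds to the 3D point on the plane
$$ \mathbf{p}(s, t) = \mathbf{O} + s\,(\mathbf{P}_1 - \mathbf{O}) + t\,(\mathbf{P}_2 - \mathbf{O}) $$
with O, P1, P2 the values passed to SetOrigin/SetPoint1/SetPoint2, so the scale factors in the loop above should match the lengths of the corresponding plane axes.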

Related

Draw an ellipse arc between two points in Three.js

I've been trying to draw an ellipse arc between two arbitrary points, but my implementation is not working in some situations.
Because part of this involves mathematics, I started by asking this question.
Basically, given two points and the center, you can always get an ellipse if you allow rotation, except for cases where the two points and the center are collinear.
The solution proposed to that question is to:
Translate the center to the origin, translating both points by the same vector.
Rotate both points by the angle -alpha, which is the negative of the angle the larger vector makes with the positive x semi-axis.
Solve the ellipse equation to find its radii (a system of two equations with two unknowns, written out after this list).
Define the ellipse
Rotate the ellipse back by the angle alpha and translate it back to its center.
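Written out, step 3 is the following system in the unknown semi-axes a and b, where (x1, y1) and (x2, y2) are the two points after the translation and rotation (assuming none of the denominators vanish):
$$ \frac{x_1^2}{a^2} + \frac{y_1^2}{b^2} = 1, \qquad \frac{x_2^2}{a^2} + \frac{y_2^2}{b^2} = 1 $$
which gives
$$ a^2 = \frac{x_1^2 y_2^2 - x_2^2 y_1^2}{y_2^2 - y_1^2}, \qquad b^2 = \frac{x_1^2 y_2^2 - x_2^2 y_1^2}{x_1^2 - x_2^2} $$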
However, I'm having trouble implementing this in Three.js.
The documentation for the EllipseCurve lists the expected parameters. I assume the starting angle to always be zero and then set the end angle to either the angle between the two vectors or its symmetric. I also want the arc to always be the smallest (i.e., if the angle is bigger than 180º, I use the complementary arc). I assume the center of the ellipse to be the midpoint between the centers of the shapes' bounding boxes.
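(In formula form: for an angle $\theta$ between the two vectors, the drawn arc spans $\min(\theta,\, 2\pi - \theta)$.)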
This is my example code:
https://jsfiddle.net/at5dc7yk/1/
This example tries to create an arc from a vertex in the original shape and the same vertex in the modified shape.
Code regarding the ellipse arc is under the class EllipseArc and you can mess with the transformation applied to the object in line 190.
It works for some cases, but not all.
Just an idea from scratch, not the ultimate solution.
When you clone and translate an object, to build an arc between two corresponding points you'll need their coordinates in the world coordinate system, and the coordinates of the midpoint between the centroids of the objects.
Find the midpoint between the points in world space (between the start and end vectors).
Find its projection onto the translation vector (this is the center of the arc).
Find the angle between the vectors that you get by subtracting the resulting center vector from each of them.
Divide the angle by the number of divisions - you'll get the step value.
Take the end vector as the base and rotate it around an axis (the normal of the triangle built from the start, center and end vectors) in a loop, multiplying the step angle by the number of the current iteration.
Code example:
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 10000);
camera.position.set(0, 0, 150);
var renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
var controls = new THREE.OrbitControls(camera, renderer.domElement);
var shapeGeom = new THREE.ShapeBufferGeometry(new THREE.Shape(californiaPts));
shapeGeom.center();
shapeGeom.scale(0.1, 0.1, 0.1);
var shapeMat = new THREE.MeshBasicMaterial({
color: "orange"
});
var shape = new THREE.Mesh(shapeGeom, shapeMat);
shape.updateMatrixWorld();
scene.add(shape);
var shapeClone = shape.clone();
shapeClone.position.set(25, 25, 0);
shapeClone.updateMatrixWorld();
scene.add(shapeClone);
var center = new THREE.Vector3().lerpVectors(shapeClone.position, shape.position, 0.5);
var vecStart = new THREE.Vector3();
var vecEnd = new THREE.Vector3();
var pos = shapeGeom.getAttribute("position");
for (let i = 0; i < pos.count; i++) {
vecStart.fromBufferAttribute(pos, i);
shape.localToWorld(vecStart);
vecEnd.fromBufferAttribute(pos, i);
shapeClone.localToWorld(vecEnd);
makeArc(center, vecStart, vecEnd);
}
function makeArc(center, start, end) {
console.log(center, start, end);
let vM = new THREE.Vector3().addVectors(start, end).multiplyScalar(0.5);
let dir = new THREE.Vector3().subVectors(end, start).normalize();
let c = new THREE.Vector3().subVectors(vM, center);
let d = c.dot(dir);
c.copy(dir).multiplyScalar(d).add(center); // get a center of an arc
let vS = new THREE.Vector3().subVectors(start, c);
let vE = new THREE.Vector3().subVectors(end, c);
let a = vS.angleTo(vE); // angle between start and end, relative to the new center
let divisions = 100;
let aStep = a / divisions;
let pts = [];
let vecTemp = new THREE.Vector3();
let tri = new THREE.Triangle(start, c, end);
let axis = new THREE.Vector3();
tri.getNormal(axis); // get the axis to rotate around
for (let i = 0; i <= divisions; i++) {
vecTemp.copy(vE);
vecTemp.applyAxisAngle(axis, aStep * i);
pts.push(vecTemp.clone());
}
let g = new THREE.BufferGeometry().setFromPoints(pts);
let m = new THREE.LineDashedMaterial({
color: 0xff0000,
dashSize: 1,
gapSize: 1
});
let l = new THREE.Line(g, m);
l.computeLineDistances();
l.position.copy(c);
scene.add(l);
}
renderer.setAnimationLoop(() => {
renderer.render(scene, camera);
});
body {
overflow: hidden;
margin: 0;
}
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>
<script>
var californiaPts = [
new THREE.Vector2(610, 320),
new THREE.Vector2(450, 300),
new THREE.Vector2(392, 392),
new THREE.Vector2(266, 438),
new THREE.Vector2(190, 570),
new THREE.Vector2(190, 600),
new THREE.Vector2(160, 620),
new THREE.Vector2(160, 650),
new THREE.Vector2(180, 640),
new THREE.Vector2(165, 680),
new THREE.Vector2(150, 670),
new THREE.Vector2(90, 737),
new THREE.Vector2(80, 795),
new THREE.Vector2(50, 835),
new THREE.Vector2(64, 870),
new THREE.Vector2(60, 945),
new THREE.Vector2(300, 945),
new THREE.Vector2(300, 743),
new THREE.Vector2(600, 473),
new THREE.Vector2(626, 425),
new THREE.Vector2(600, 370),
new THREE.Vector2(610, 320)
];
</script>
If you don't translate but only rotate an object, then you don't need to compute a new center for each arc; just omit that step, as all the centers are equal to the centroid of the object.
I hope I explained it in a more or less understandable way ^^

Generate socket junctions for 3D model / Draw cylinders along edges

I have this Processing sketch in which I'm trying to load a model (.stl or .obj) and build parametric socket junctions for every edge intersection. These will be 3D printed and rods of the appropriate gauge will slot in their holes.
I've been able to draw Spheres at the vertices (in red, on the model), but I can't generate the green sockets or even the rods in pink:
I can get the coordinates of the model's edges:
id: 0 0 {-25.805634,-23.170607,13.6315975} -> 1 {-16.868328,-10.148323,6.785455} f: 2
id: 1 1 {-16.868328,-10.148323,6.785455} -> 2 {-52.833824,-10.148322,16.799314} f: 2
....
and with these I could draw a Line3D(Vec3D a, Vec3D b) (toxi.geom.Line3D)
But I don't want a Line3D, I need to draw a cylinder between them, thereby getting the rods. To sort of extrude or inflate the line into a 3D volume, if you will...
WETriangleMesh whale, redmesh;
[bla bla bla]
Then inside void setup():
for(WingedEdge e : whale.edges.values()) {
edges.add(e);
drawSocket(e);
}
void drawSocket(WingedEdge e) {
// draw size 2 balls at model vertices
Sphere ball = new Sphere(e.a, 2);
// convert to mesh at resolution 6 and add to redmesh
ball.toMesh(redmesh, 6);
}
Then inside void draw():
// draw the mesh with the whale
gfx.mesh(whale);
// color the next mesh red
fill(255,0,0);
// draw the mesh with the spheres
gfx.mesh(redmesh);
I have found no 3D shape class that can be constructed in place from the Vec3D a, Vec3D b arguments. Perhaps creating a class to make use of those two coordinates?
If building a shape in place is too difficult, maybe a transform then? pushMatrix() translate() and popMatrix()?
EDIT: Solved, see below. Thanks!
The trick is to translate the sockets to one vertex of the edge and have them face the opposite vertex. Here's a Processing function that'll take a WETriangleMesh like your whale and return meshes for the points, sockets and rods. I've written a more complete example here.
float socketLength = 30;
float socketRadius = 6;
float rodRadius = 5;
float pointRadius = 10.0;
int resolution = 10;
WETriangleMesh[] convertMeshToRodSockets(WETriangleMesh inMesh) {
WETriangleMesh[] meshes = new WETriangleMesh[3];
meshes[0] = new WETriangleMesh();
meshes[1] = new WETriangleMesh();
meshes[2] = new WETriangleMesh();
for (Vertex vert : inMesh.getVertices ()) {
//Sphere points
Sphere sphere = new Sphere(vert, pointRadius);
WETriangleMesh sphereMesh = new WETriangleMesh();
sphere.toMesh(sphereMesh, resolution);
meshes[2].addMesh((WETriangleMesh)sphereMesh);
}
for (WingedEdge edge : inMesh.edges.values ()) {
Vec3D pointA = edge.a;
Vec3D pointB = edge.b;
//Meshes to store socket shapes
ZAxisCylinder socket;
ZAxisCylinder rod;
WETriangleMesh socketAMesh = new WETriangleMesh();
WETriangleMesh socketBMesh = new WETriangleMesh();
WETriangleMesh rodMesh = new WETriangleMesh();
float distanceBetweenPoints = pointA.distanceTo(pointB);
//Create sockets and point towards target
socket = new ZAxisCylinder(new Vec3D(0.0, 0.0, 0.0), socketRadius, socketLength);
socket.toMesh(socketAMesh, resolution, 0.0);
socketAMesh.pointTowards(pointB.sub(pointA));
//Translate socket to start from the center of the point
socketAMesh.translate( offsetTranslation(pointA, pointB, socketLength ));
//Create second socket and look in the opposite direction to face the start point
socket.toMesh(socketBMesh, resolution, 0.0);
socketBMesh.pointTowards(pointA.sub(pointB));
socketBMesh.translate( offsetTranslation(pointB, pointA, socketLength ));
//Create rod with a length matching the distance between the two points
rod = new ZAxisCylinder(new Vec3D(0.0, 0.0, 0.0), rodRadius, distanceBetweenPoints);
rod.toMesh(rodMesh, resolution, 0.0);
rodMesh.pointTowards(pointB.sub(pointA));
//Translate the rod to the midpoint of both points
rodMesh.translate(pointA.add(pointB).scale(0.5));
//Combine meshes together
meshes[0].addMesh((WETriangleMesh)rodMesh);
meshes[1].addMesh((WETriangleMesh)socketAMesh);
meshes[1].addMesh((WETriangleMesh)socketBMesh);
}
return meshes;
}
//Helper that returns the point offset from a toward b by half the given length l.
Vec3D offsetTranslation(Vec3D a, Vec3D b, float l) {
return a.interpolateTo(b, (l*0.5 / a.distanceTo(b)) );
}

Three.js - Rotation of a cylinder that represents a vector

I am using three.js to create a simple 3D vector environment. I am using lines to represent all 3 vector components x, y, z, and a line for the final vector representation. The problem is that setting the width of a line does not work on Windows. The workaround I'm trying to implement is placing a cylinder onto the line (see the red object in the image below).
That is my current result:
As you can see, I am not able to rotate the cylinder to the correct position.
I faced the problem that the rotation center of the cylinder is in the middle of the object, so I moved the rotation point to the beginning of the cylinder. But still, the rotation is not working correctly. I guess the rotations around the axes influence each other.
Here is the code:
// VEKTOR
var vektor = {};
vektor._x = 2;
vektor._y = 1.5;
vektor._z = 1;
vektor._length = Math.sqrt(vektor._x*vektor._x + vektor._y*vektor._y + vektor._z*vektor._z);
// CYLINDER
var cyl_material = new THREE.MeshBasicMaterial( { color: 0xff0000 } );
// cylinder which is our line that represents the vector
var cyl_width = 0.025; // default line width
var cyl_height = vektor._length;
// THREE.CylinderGeometry(radiusTop, radiusBottom, height, radiusSegments, heightSegments, openEnded)
var cylGeometry = new THREE.CylinderGeometry(cyl_width, cyl_width, cyl_height, 20, 1, false);
// translate the cylinder geometry so that the desired point within the geometry is now at the origin
// https://stackoverflow.com/questions/12746011/three-js-how-do-i-rotate-a-cylinder-around-a-specific-point
cylGeometry.applyMatrix( new THREE.Matrix4().makeTranslation( 0, cyl_height/2, 0 ) );
var cylinder = new THREE.Mesh(cylGeometry, cyl_material);
updateCylinder();
scene.add( cylinder );
And the function updateCylinder tries to do the rotation.
function updateCylinder() {
// ... stuff, then:
cylinder.rotation.x = Math.atan2(vektor._z,vektor._y);
cylinder.rotation.y = 0.5*Math.PI+Math.atan2(vektor._x,vektor._z);
cylinder.rotation.z = Math.atan2(vektor._x,vektor._y);
}
Here is the current demo: http://www.matheretter.de/3d/vektoren/komponenten/
What am I doing wrong with the rotation? How do I implement it so that the cylinder follows the vector line?
Thanks for your help.
If you want to transform a cylinder so that one end is at the origin and the other end points toward a specific point, here is the pattern you can follow:
First, transform your geometry so one end of the cylinder is at the origin, and the other end (the top) is on the positive z-axis.
var geometry = new THREE.CylinderGeometry( 0, 1, length, 8, 1, true );
geometry.applyMatrix( new THREE.Matrix4().makeTranslation( 0, length / 2, 0 ) );
geometry.applyMatrix( new THREE.Matrix4().makeRotationX( Math.PI / 2 ) );
Then create your mesh, and call the lookAt() method:
var mesh = new THREE.Mesh( geometry, material );
mesh.lookAt( point );
three.js r.67

Entity look at for cameras and objects

I'm currently extending my 3D engine with a better entity system which includes cameras. This allows me to put cameras into parent entities which may themselves be inside another entity (and so on...).
--Entity
----Entity
------Camera
Now I want to set the camera's look direction. I'm doing this with the following method, which is also used to set the look-at of an entity:
public void LookAt(Vector3 target, Vector3 up)
{
Matrix4x4 oldValue = _Matrix;
_Matrix = Matrix4x4.LookAt(_Position, target, up) * Matrix4x4.Scale(_Scale);
Vector3 p, s;
Quaternion r;
Quaternion oldRotation = _Rotation;
_Matrix.Decompose(out p, out s, out r);
_Rotation = r;
// Update dependency properties
ForceUpdate(RotationDeclaration, _Rotation, oldRotation);
ForceUpdate(MatrixDeclaration, _Matrix, oldValue);
}
But the code only works for cameras and not for other entities; when using this method for other entities, the object rotates in place at its position (the entity is at a root node, so it has no parent). The matrix's look-at method looks like this:
public static Matrix4x4 LookAt(Vector3 position, Vector3 target, Vector3 up)
{
// Calculate and normalize forward vector
Vector3 forward = position - target;
forward.Normalize();
// Calculate and normalize side vector ( side = up x forward )
Vector3 side = Vector3.Cross(up, forward);
side.Normalize();
// Recompute up as: up = forward x side
up = Vector3.Cross(forward, side);
up.Normalize();
//------------------
Matrix4x4 result = new Matrix4x4(false)
{
M11 = side.X,
M21 = side.Y,
M31 = side.Z,
M41 = 0,
M12 = up.X,
M22 = up.Y,
M32 = up.Z,
M42 = 0,
M13 = forward.X,
M23 = forward.Y,
M33 = forward.Z,
M43 = 0,
M14 = 0,
M24 = 0,
M34 = 0,
M44 = 1
};
result.Multiply(Matrix4x4.Translation(-position.X, -position.Y, -position.Z));
return result;
}
The decompose method also returns the wrong value for the position variable p. So why is the camera working and the entity not?
I had a similar problem a while ago. The problem is that camera transformations are different from the others, as they are negated. What I did was to first transform the camera eye, center and up into world space (transforming those vectors by the parent's model matrix) and then calculate the lookAt(). That is how it worked for me.
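A sketch of that idea in formulas (my reading of the answer, assuming column vectors, M_parent the parent's accumulated model matrix and R_parent its rotational part, since direction vectors must not pick up the translation):
$$ \mathbf{e}_w = M_{parent}\,\mathbf{e}, \qquad \mathbf{c}_w = M_{parent}\,\mathbf{c}, \qquad \mathbf{u}_w = R_{parent}\,\mathbf{u} $$
$$ V = \mathrm{lookAt}(\mathbf{e}_w,\ \mathbf{c}_w,\ \mathbf{u}_w) $$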

Bounding Spheres move farther than sphere object

I have a program that I'm making with others and I ran into a problem. I'm working on adding polygon models into our scene in an XNA window. I have that part complete. I also have bounding spheres (I know I tagged this as bounding-box, but there is no bounding-sphere tag) drawn around each polygon. My problem is that when I move the polygons around the 3D space, the bounding spheres move twice as much as the polygons. I imagine it's something within my polygon matrices that I use to create the bounding sphere that makes it move twice as much, but that is only speculation.
So just to clarify, I'll give you an example of my problem. If I hold down D to move a polygon along the X axis (model.position.X--;), the polygon moves as expected, but the bounding sphere around the polygon moves twice as much. Thanks for the help guys!
Here is how I draw the models and the bounding spheres:
public void Draw(Matrix view, Matrix projection, bool drawBoundingSphere)
{
Matrix translateMatrix = Matrix.CreateTranslation(position);
Matrix worldMatrix = translateMatrix * Matrix.CreateScale(scaleRatio);
foreach (ModelMesh mesh in model.Meshes)
{
foreach (BasicEffect effect in mesh.Effects)
{
effect.World = worldMatrix * modelAbsoluteBoneTransforms[mesh.ParentBone.Index];
effect.View = view;
effect.Projection = projection;
effect.EnableDefaultLighting();
effect.PreferPerPixelLighting = true;
}
mesh.Draw();
if (drawBoundingSphere)
{
// the mesh's BoundingSphere is stored relative to the mesh itself.
// (Mesh space). We want to get this BoundingSphere in terms of world
// coordinates. To do this, we calculate a matrix that will transform
// from coordinates from mesh space into world space....
Matrix world = modelAbsoluteBoneTransforms[mesh.ParentBone.Index] * worldMatrix;
// ... and then transform the BoundingSphere using that matrix.
BoundingSphere sphere = BoundingSphereRenderer.TransformBoundingSphere(mesh.BoundingSphere, world);
// now draw the sphere with our renderer
BoundingSphereRenderer.Draw(sphere, view, projection);
}
}
}
And here is the BoundingSphereRenderer Code:
private static VertexBuffer vertexBuffer;
private static BasicEffect effect;
private static int lineCount;
public static void Initialize(GraphicsDevice graphicsDevice, int sphereResolution)
{
// create our effect
effect = new BasicEffect(graphicsDevice);
effect.LightingEnabled = false;
effect.VertexColorEnabled = true;
// calculate the number of lines to draw for all circles
lineCount = (sphereResolution + 1) * 3;
// we need two vertices per line, so we can allocate our vertices
VertexPositionColor[] vertices = new VertexPositionColor[lineCount * 2];
// compute our step around each circle
float step = MathHelper.TwoPi / sphereResolution;
// used to track the index into our vertex array
int index = 0;
//create the loop on the XY plane first
for (float angle = 0f; angle < MathHelper.TwoPi; angle += step)
{
vertices[index++] = new VertexPositionColor(new Vector3((float)Math.Cos(angle), (float)Math.Sin(angle), 0f), Color.Blue);
vertices[index++] = new VertexPositionColor(new Vector3((float)Math.Cos(angle + step), (float)Math.Sin(angle + step), 0f), Color.Blue);
}
//next on the XZ plane
for (float angle = 0f; angle < MathHelper.TwoPi; angle += step)
{
vertices[index++] = new VertexPositionColor(new Vector3((float)Math.Cos(angle), 0f, (float)Math.Sin(angle)), Color.Red);
vertices[index++] = new VertexPositionColor(new Vector3((float)Math.Cos(angle + step), 0f, (float)Math.Sin(angle + step)), Color.Red);
}
//finally on the YZ plane
for (float angle = 0f; angle < MathHelper.TwoPi; angle += step)
{
vertices[index++] = new VertexPositionColor(new Vector3(0f, (float)Math.Cos(angle), (float)Math.Sin(angle)), Color.Green);
vertices[index++] = new VertexPositionColor(new Vector3(0f, (float)Math.Cos(angle + step), (float)Math.Sin(angle + step)), Color.Green);
}
// now we create the vertex buffer and put the vertices in it
vertexBuffer = new VertexBuffer(graphicsDevice, typeof(VertexPositionColor), vertices.Length, BufferUsage.WriteOnly);
vertexBuffer.SetData(vertices);
}
public static void Draw(this BoundingSphere sphere, Matrix view, Matrix projection)
{
if (effect == null)
throw new InvalidOperationException("You must call Initialize before you can render any spheres.");
// set the vertex buffer
effect.GraphicsDevice.SetVertexBuffer(vertexBuffer);
// update our effect matrices
effect.World = Matrix.CreateScale(sphere.Radius) * Matrix.CreateTranslation(sphere.Center);
effect.View = view;
effect.Projection = projection;
// draw the primitives with our effect
effect.CurrentTechnique.Passes[0].Apply();
effect.GraphicsDevice.DrawPrimitives(PrimitiveType.LineList, 0, lineCount);
}
public static BoundingSphere TransformBoundingSphere(BoundingSphere sphere, Matrix transform)
{
BoundingSphere transformedSphere;
// the transform can contain different scales on the x, y, and z components.
// this has the effect of stretching and squishing our bounding sphere along
// different axes. Obviously, this is no good: a bounding sphere has to be a
// SPHERE. so, the transformed sphere's radius must be the maximum of the
// scaled x, y, and z radii.
// to calculate how the transform matrix will affect the x, y, and z
// components of the sphere, we'll create a vector3 with x y and z equal
// to the sphere's radius...
Vector3 scale3 = new Vector3(sphere.Radius, sphere.Radius, sphere.Radius);
// then transform that vector using the transform matrix. we use
// TransformNormal because we don't want to take translation into account.
scale3 = Vector3.TransformNormal(scale3, transform);
// scale3 contains the x, y, and z radii of a squished and stretched sphere.
// we'll set the finished sphere's radius to the maximum of the x y and z
// radii, creating a sphere that is large enough to contain the original
// squished sphere.
transformedSphere.Radius = Math.Max(scale3.X, Math.Max(scale3.Y, scale3.Z));
// transforming the center of the sphere is much easier. we can just use
// Vector3.Transform to transform the center vector. notice that we're using
// Transform instead of TransformNormal because in this case we DO want to
// take translation into account.
transformedSphere.Center = Vector3.Transform(sphere.Center, transform);
return transformedSphere;
}
