How to repeat or tile a texture in A-Frame?

I have a material
<a-entity material="src:#imgGround; repeat: 400"></a-entity>
that I want to repeat or tile for the ground, but it is not working. How do I repeat or tile a texture in A-Frame?
I also want to do it for displacement and normal maps:
<a-entity id="ground_model" geometry="primitive:circle;radius:1" position="0 0 0" rotation="-90 0 0" scale="30 30 30" color="green" material="src:#imgGround;flatShading:false;transparent:false;repeat:44;normalMap:#imgGroundN;displacementMap:#imgGroundZ">

https://aframe.io/docs/master/components/material.html#built_in_materials_repeat
The repeat property takes two values (one each for the U and V mapping directions, i.e., X and Y), not one:
<a-entity material="src:#imgGround; repeat: 20 20"></a-entity>
https://aframe.io/docs/master/components/material.html#built_in_materials_normaltexturerepeat
Similarly, displacement and normal maps have their own repeat properties, displacementTextureRepeat and normalTextureRepeat:
<a-entity material="displacementMap:#imgGroundZ; displacementTextureRepeat: 4 4">

Related

A-frame x and z object rotations are actually moving object position instead

I am trying to simply rotate an object in A-frame.
When I rotate it around the Y-axis with rotation="0 90 0" it works fine.
But when I try to rotate it around the X or Z axes, the object moves position along those axes instead.
For instance, with rotation="10 0 0" below, it moves the head downwards without rotating it at all.
Here is the code in question:
<a-scene embedded arjs='trackingMethod: best; sourceType: webcam; debugUIEnabled: false;'>
  <a-anchor hit-testing-enabled='true'>
    <a-entity
      scale="5 5 5"
      position="0 -8 -0"
      rotation="10 0 0"
      obj-model="obj: url(models/free_head.obj);">
    </a-entity>
  </a-anchor>
  <a-camera-static/>
</a-scene>
Thanks for any insights you can give me

How to map UV coordinates of a texture to the "inner" UV coordinates of a sub rectangle?

In a fragment shader I get the UV coordinates of the current pixel in the range of [0.0, 1.0]. However, I don't render the whole texture (i.e., UV from 0.0 to 1.0), but only a sub rectangle of it. So the UV coordinates I get in this shader cover only a sub range of [0.0, 1.0], as the other parts are outside the rectangle that I render.
In my shader I do want to work with UV coordinates in the range of [0.0,1.0], but mapped to the sub rectangle of the texture that I render.
So my question is: given the total size of the texture and the sub rectangle of that texture that I render, how can I convert the given UV coordinates for the overall texture to the "inner" UV coordinates of my sub rectangle, but again also in the range [0.0,1.0]?
In the following I will use GLSL and the GLSL type vec2 which corresponds to the HLSL type float2.
Suppose you have a texture with a size (vec2 tSize) and a rectangular area in the texture from vec2 tMin to vec2 tMax. You want to map the texture coordinate (vec2 uv) in the range [0.0, 1.0] to that rectangular area. Either do the mapping with an expression using the formula val * (max - min) + min:
vec2 uvMapped = (uv * (tMax - tMin) + tMin) / tSize;
Or use the GLSL function mix (which corresponds to HLSL lerp):
vec2 uvMapped = mix(tMin, tMax, uv) / tSize;
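For the direction the question literally asks for, mapping a UV coordinate of the overall texture back to the "inner" [0.0, 1.0] coordinates of the sub rectangle, invert the expression above; a sketch using the same tSize, tMin and tMax:
vec2 uvInner = (uv * tSize - tMin) / (tMax - tMin);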

Iterating toward a specific angle on a circle

I'm building a robot vehicle that has an on-board compass. Starting from the current orientation of the vehicle in degrees, I would like to rotate it 90 degrees in either direction, using the compass.
I'm supposing that the best way to do this is to rotate the vehicle in increments within a "while" loop, testing after each increment whether it has moved 90 degrees.
However, while dealing with transitions between two positive points is simple, it becomes challenging when the rotation crosses the 0/360 boundary.
In other words, this code for left rotation fails for obvious reasons:
let startingPoint = 30 // in degrees
let endPoint = startingPoint - 90
while currentPoint > endPoint {
    rotateLeft()
}
Is there an equation that will enable this comparison when crossing the 360/0 boundary?
You can check the difference rather than the absolute value. To find whether the difference is outside a 90-degree range, you can use a formula that accounts for arbitrary angles, including transitions through 0 (don't forget the angles should be in radians):
if cos(startingAngle - currentAngle) <= 0 then
    the absolute difference is equal to or more than Pi/2 (90 degrees)
(The original answer included a plot of the cosine of the difference between a zero start angle and angles of ±30 degrees; the relation holds for any start angle.)
Python demo:
import math

def AngleInRange(value, central, arange):
    value = math.radians(value)
    central = math.radians(central)
    arange = math.radians(arange)
    return (math.cos(value - central) >= math.cos(arange))

for a in range(100, 220, 15):  # through 180
    print(a, AngleInRange(a, 150, 45))
for a in range(-40, 40, 10):   # through 0
    print(a, AngleInRange(a, -10, 20))
100 False
115 True
130 True
145 True
160 True
175 True
190 True
205 False
-40 False
-30 True
-20 True
-10 True
0 True
10 True
20 False
30 False
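Applied to the original vehicle loop, the rotation can stop as soon as the cosine of the heading difference is no longer positive, with no special handling of the 0/360 boundary. A minimal sketch, where readCompass() and rotateLeft() are hypothetical stand-ins for the vehicle's API:
import math

def readCompass():
    ...  # hypothetical: return the current heading in degrees

def rotateLeft():
    ...  # hypothetical: rotate the vehicle by one small increment

start = readCompass()
# cos of the difference handles the wrap at 0/360 automatically:
# it stays positive while the vehicle is within 90 degrees of the start.
while math.cos(math.radians(readCompass() - start)) > 0:
    rotateLeft()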

Measure rotation around one axis with Gyroscope

The gyroscope of a device returns values in the following ranges:
Alpha: 0 - 360 around the Z-axis
Beta: -180 - 180 around the X-axis
Gamma: -90 - 90 around the Y-axis
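These ranges match the browser's deviceorientation event; assuming that is the source, the raw values can be read like this:
window.addEventListener('deviceorientation', (event) => {
  const alpha = event.alpha; // 0 to 360, around the Z-axis
  const beta  = event.beta;  // -180 to 180, around the X-axis
  const gamma = event.gamma; // -90 to 90, around the Y-axis
  console.log(alpha, beta, gamma);
});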
When I rotate the device around the Y-axis, the value 'flips' at a certain point from -90 to 90.
I know that this is due to the fact that the gyroscope functions like a gimbal, where Alpha is the outer ring, Beta is the middle ring and Gamma is the inner ring.
My question is: how do I measure the rotation JUST around the Y-axis, without the 'flipping' effect?
Is there a mathematical way to process the alpha/beta/gamma values into more 'useful' values for this case, so that it is easier to get the amount of rotation around the X, Y, or Z axis?

How to plot the camera and image positions from camera calibration data?

I have the intrinsic and extrinsic parameters of the camera.
The extrinsic is a 4 x 4 matrix with rotation and translation.
Sample data is given below; I have one such matrix per captured camera image.
2.11e-001 -3.06e-001 -9.28e-001 7.89e-001
6.62e-001 7.42e-001 -9.47e-002 1.47e-001
7.18e-001 -5.95e-001 3.60e-001 3.26e+000
0.00e+000 0.00e+000 0.00e+000 1.00e+000
I would like to plot the camera and image positions as shown on the Matlab calibration toolkit page.
However, I'm unable to figure out the math needed to produce these two plots.
The only lead I have is from this page, http://en.wikipedia.org/wiki/Camera_resectioning, which tells me that the camera position can be found by C = -R' * T.
Any idea how to achieve this task?
Assume the corners of the plane that you want to draw are 3x1 column vectors, a = [0 0 0]', b = [w 0 0]', c = [w h 0]' and d = [0 h 0]'.
Assume that the calibration matrix that you provide is A and consists of a rotation matrix R = A(1:3, 1:3) and a translation T = A(1:3, 4).
To draw the first view
For every pose A_i with rotation R_i and translation T_i, transform each corner x_w (that is a, b, c or d) of the plane to its coordinates x_c in the camera by
x_c = R_i*x_w + T_i
Then draw the plane with transformed corners.
To draw the camera, its centre of projection in camera coordinates is [0 0 0]' and the camera x axis is [1 0 0]', y axis is [0 1 0]' and z axis is [0 0 1]'.
Note that in the drawing, the camera y-axis is pointing down, so you might want to apply an additional rotation on all the computed coordinates by multiplication with B = [1 0 0; 0 0 1; 0 -1 0].
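A sketch of that transformation in Python with NumPy, using the pose from the question (the plane's width and height are arbitrary choices here):
import numpy as np

# Extrinsic pose from the question (world -> camera).
A = np.array([[0.211, -0.306, -0.928,  0.789],
              [0.662,  0.742, -0.0947, 0.147],
              [0.718, -0.595,  0.360,  3.26 ],
              [0.0,    0.0,    0.0,    1.0  ]])
R_i, T_i = A[:3, :3], A[:3, 3]

# Plane corners a, b, c, d as columns (w = h = 1 chosen arbitrarily).
w = h = 1.0
x_w = np.array([[0, 0, 0], [w, 0, 0], [w, h, 0], [0, h, 0]], dtype=float).T

# x_c = R_i * x_w + T_i, applied to every corner at once.
x_c = R_i @ x_w + T_i[:, None]

# Optional flip so the drawing's y-axis points up.
B = np.array([[1, 0, 0], [0, 0, 1], [0, -1, 0]])
x_c_drawn = B @ x_c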
To draw the second view
Drawing the plane is trivial since we are in world coordinates. Just draw the plane using a, b, c and d.
To draw the cameras, each camera centre is c = -R'*T. The camera axes are the rows of the rotation matrix R, so, for instance, in the matrix you provided, the x-axis is
[2.11e-001 -3.06e-001 -9.28e-001]'. Alternatively, you can draw a camera by transforming each point x_c given in camera coordinates to world coordinates with x_w = R'*(x_c - T).
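A sketch of the second view in Python with NumPy and Matplotlib: compute the camera centre and its axes in world coordinates and draw them (again using the pose from the question; the axis length, plane size and styling are arbitrary choices):
import numpy as np
import matplotlib.pyplot as plt

# Extrinsic pose from the question (world -> camera).
A = np.array([[0.211, -0.306, -0.928,  0.789],
              [0.662,  0.742, -0.0947, 0.147],
              [0.718, -0.595,  0.360,  3.26 ],
              [0.0,    0.0,    0.0,    1.0  ]])
R, T = A[:3, :3], A[:3, 3]

cam_centre = -R.T @ T  # c = -R' * T

fig = plt.figure()
ax = fig.add_subplot(projection='3d')

# The rows of R are the camera's x, y and z axes in world coordinates.
for axis, colour in zip(R, ('r', 'g', 'b')):
    ax.quiver(*cam_centre, *axis, length=0.5, color=colour)

# The calibration plane in world coordinates (w = h = 1 chosen arbitrarily).
w = h = 1.0
plane = np.array([[0, 0, 0], [w, 0, 0], [w, h, 0], [0, h, 0], [0, 0, 0]])
ax.plot(plane[:, 0], plane[:, 1], plane[:, 2], 'k-')

plt.show()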
There is now an example in OpenCV for visualizing the extrinsics generated from their camera calibration example. It outputs something similar to what the original question asks for.
