I am looking for a way to convert an image, which is represented as an af::array (because I used the ArrayFire function loadImage()), to a QVector. Any advice on how I can do this?
Thank you!
Let a be your af::array. Since you can retrieve a pointer to the actual data via a.host() and the number of array elements via a.elements() (a.bytes() returns the size in bytes, not the element count), the question reduces to "How can I convert a C array to a QVector?"
That is already answered here: Initialize QVector from array
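A minimal sketch of that approach, assuming the image holds single-channel float (f32) data; adjust the element type to whatever your array actually contains:

#include <arrayfire.h>
#include <QVector>

// Copy a float af::array into a QVector<float>.
QVector<float> toQVector(const af::array& a) {
    QVector<float> v(static_cast<int>(a.elements()));
    a.host(v.data()); // copies the device data into the QVector's buffer
    return v;
}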
I've got an ArrayD which always has 2 dimensions, but it is an ArrayD because it results from a calculation; I need to change this to an Array2 for storage.
I've been looking through the documentation, but can't seem to find a way.
Is there a function to do it?
If you have a variable arr of type ArrayD<f32>, you can do:
// convert the n-dimensional array into a 2d array
let arr2d: Array2<f32> = arr.into_dimensionality::<Ix2>()?;
to convert it into a variable of type Array2<f32>.
If the dimensions don't match, it returns an ndarray::ShapeError.
Also see the ndarray documentation: into_dimensionality
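For context, here is a small self-contained sketch; the 3×4 array of zeros is just a placeholder for the result of your calculation:

use ndarray::{Array2, ArrayD, Ix2, IxDyn};

fn main() -> Result<(), ndarray::ShapeError> {
    // A dynamically-dimensioned array that happens to be 2-D.
    let arr: ArrayD<f32> = ArrayD::zeros(IxDyn(&[3, 4]));

    // Fails with a ShapeError if arr is not 2-D.
    let arr2d: Array2<f32> = arr.into_dimensionality::<Ix2>()?;
    assert_eq!(arr2d.dim(), (3, 4));
    Ok(())
}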
I'm trying to create a 2D array from a Vec of 1D arrays using the ndarray crate. In the current implementation, I have Vec<Array1<u32>> as the collection of 1D arrays, and I'm having a hard time figuring out how to convert it to Array2<u32>. I've tried from_vec() on Vec<Array1<u32>> but it yielded Array1<Array1<u32>>. I thought of using the stack! macro, but I'm not sure how to call it on the above Vec. I'm using ndarray 0.12.1 and Rust 1.31.0.
I'm not hugely familiar with ndarray, but it looks like you have to flatten the data as an intermediate step and then rebuild from that. An iterator would probably have been more efficient but I don't see a method to build from an iterator that also lets you specify a shape.
It likely isn't the most performant way to do this, but it does at least work:
fn to_array2<T: Copy>(source: &[Array1<T>]) -> Result<Array2<T>, impl std::error::Error> {
    let rows = source.len();
    let flattened: Array1<T> = source.iter().flat_map(|row| row.to_vec()).collect();
    let cols = flattened.len() / rows;
    flattened.into_shape((rows, cols))
}
Note that it can fail if the source arrays have different lengths. This solution is not 100% robust, though: it won't fail if one array is shorter but another is longer by the same amount, so the total length still matches. It is probably worth adding a check in there to prevent that, but I'll leave that to you.
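For example (assuming the to_array2 above is in scope):

use ndarray::{arr1, Array1};

fn main() {
    let rows: Vec<Array1<u32>> = vec![arr1(&[1, 2, 3]), arr1(&[4, 5, 6])];
    let matrix = to_array2(&rows).expect("rows must all have the same length");
    assert_eq!(matrix.shape(), &[2, 3]);
}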
I am wondering how I can find the range between two elements in a QVector using C++.
When using C# it's easier, and looks like the following:
QVector aaa;
aaa.getRange(item1, item2);
Your question is not very clear. Judging by what .NET's getRange actually does, it returns a given count of elements starting from a given position. QVector<T> QVector::mid(int pos, int length = -1) does the same for QVector.
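For example:

#include <QVector>
#include <QDebug>

int main() {
    QVector<int> aaa {1, 2, 3, 4, 5};

    // Three elements starting at index 1 -> {2, 3, 4}
    QVector<int> range = aaa.mid(1, 3);

    // Everything from index 2 to the end -> {3, 4, 5}
    QVector<int> tail = aaa.mid(2);

    qDebug() << range << tail;
    return 0;
}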
Is it possible to convert a pointer to certain value to a slice?
For example, I want to read single byte from io.Reader into uint8 variable. io.Reader.Read accepts a slice as its argument, so I cannot simply provide it a pointer to my variable as I'd do in C.
I think that creating a slice of length 1, capacity 1 from a pointer is a safe operation. Obviously, it should be the same as creating a slice from an array of length 1, which is an allowed operation. Is there an easy way to do this with a plain variable? Or maybe I do not understand something and there are reasons why this is prohibited?
A slice is not only a pointer, like an array in C. It also contains the length and capacity of the data, like this:
struct {
    ptr *uint8
    len int
    cap int
}
So, yes, you will need to create a slice. The simplest way to create a slice from a variable a of type uint8 would be []uint8{a}:
a := uint8(42)
fmt.Printf("%#v\n", []uint8{a})
(But after rereading your question, this is not a solution at all, since the composite literal copies the value.)
But if you wish to create the slice from the variable, pointing to the same space of memory, you could use the unsafe package. This is strongly discouraged, though.
fmt.Printf("%#v\n", (*[1]uint8)(unsafe.Pointer(&a))[:])
Instead of (over)complicating this trivial task, why not use the simple solution? That is, pass Read a length-1 slice and then assign its zeroth element to your variable, as in the sketch below.
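A minimal sketch of that approach (the bytes.Reader just stands in for whatever io.Reader you actually have):

package main

import (
    "bytes"
    "fmt"
    "io"
    "log"
)

func main() {
    var r io.Reader = bytes.NewReader([]byte{0x2a, 0xff})

    // Read exactly one byte through a length-1 slice.
    buf := make([]byte, 1)
    if _, err := io.ReadFull(r, buf); err != nil {
        log.Fatal(err)
    }

    x := buf[0] // uint8
    fmt.Println(x) // 42
}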
I found a way to handle my case of reading into a plain variable from an io.Reader. The Go standard library is wonderful!
import (
    "encoding/binary"
    "io"
)
...
var x uint8
binary.Read(reader, binary.LittleEndian, &x)
As a bonus, this works for any fixed-size basic type and even for some non-basic ones.
Edit: Jeremy Wall helped me realize I had asked a question more specific than I intended; here's a better version.
Say I want to represent a table associating values of some type B with sequences of values of some type A for which equality is defined. What is the best way to do that in Go?
Obviously for the table I'd want to use a Go map, but what can I use for the sequences of values of type A? Slices cannot be used as keys for maps in Go; arrays can, but the length of an array is part of its type, and I'm interested in being able to use sequences whose length is determined at runtime. I could (1) use arrays of A, declaring a maximum length for them, or (2) use slices of A and serialize them to strings for use as keys (this technique is familiar to Awk and Lua programmers...). Is there a better workaround for this "feature" of Go than the ones I've described?
As pointed out by Jeremy Wall in answer to my original version of the question, where I had A = int, option (2) is pretty good for integers, since you can use slices of runes for which conversion to string is just a cast.
Will a sequence of rune instead of integers work for you? rune is an alias for int32, and the conversion to a string is just a cast:
package main

import "fmt"

func main() {
    m := make(map[string]string)
    key := []rune{1, 2}
    m[string(key)] = "foo"
    fmt.Print("lookup: ", m[string(key)])
}
You can play with this code here: http://play.golang.org/p/Kct1dum8A0
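If the element type doesn't cast directly to string, option (2) from the question still works with an explicit serialization step. Here is a minimal sketch (the key helper and the comma separator are illustrative choices, not part of the original answer):

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// key serializes an int sequence into a string usable as a map key.
// The separator avoids collisions such as [1, 23] vs [12, 3].
func key(seq []int) string {
    parts := make([]string, len(seq))
    for i, v := range seq {
        parts[i] = strconv.Itoa(v)
    }
    return strings.Join(parts, ",")
}

func main() {
    m := make(map[string]string)
    m[key([]int{1, 2})] = "foo"
    fmt.Println("lookup:", m[key([]int{1, 2})])
}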