How to set the plot object name? - bokeh

Here it says we can select a plot object by name:
# These two are equivalent
p.select({"type": HoverTool})
p.select(HoverTool)
# These two are also equivalent
p.select({"name": "mycircle"})
p.select("mycircle")
# Keyword arguments can be supplied in place of selector dict
p.select({"name": "foo", "type": HoverTool})
p.select(name="foo", type=HoverTool)
E.g. after fig.circle(x, y), we can then select the circle marker with fig.select(Circle).
However, I am a bit interested in the name parameter. How do we set the name of each model so it is easy to query?

The PlotObject class that all of the models/glyphs are subclassed from has a "name" attr, so you can add name='foo' as an argument when creating a plot object.
(source: https://github.com/bokeh/bokeh/blob/master/bokeh/plot_object.py)
This is an example of naming some models and passing their names to the HoverTool to select them:
http://nbviewer.ipython.org/urls/gist.githubusercontent.com/canavandl/7bae1a47e40e4d44b5da/raw/named_objects_example.ipynb
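For example, a minimal sketch (the data and the name "mycircle" are just placeholders) of naming a glyph when it is created and selecting it back later:
from bokeh.plotting import figure
p = figure()
# name= is accepted by every plot object, including the renderer created here
p.circle([1, 2, 3], [4, 5, 6], name="mycircle")
# both calls return the renderer(s) whose name attribute is "mycircle"
by_name = p.select(name="mycircle")
also_by_name = p.select("mycircle")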

Related

How do I get a list of all elements and their attributes via XQuery

I am quite new to XQuery and I am trying to get a list of all elements and all attributes.
It should look like this:
element1 #attribute x, #attribute y, ...
element2 #attribute x, #attribute y, ...
element3 #attribute x, #attribute y, ...
I have tried this so far, but I get the error "Item expected, sequence found":
for $x in collection("XYZ")
let $att := local-name(//@*)
let $ele := local-name(//*)
let $eleatt := string-join($ele, $att)
return $eleatt
I feel like I am turning an easy step into a complicated one. Please help.
Thanks in advance, Eleonore
//@* gives you a sequence of attribute nodes, //* a sequence of element nodes. In general, to apply a function like local-name() to each item in a sequence of nodes, you have three options:
Use a final step /local-name(), e.g. //@*/local-name() or //*/local-name()
In XQuery 3.0 or later, use the simple map operator !, e.g. //@*!local-name()
Use a for .. return expression, e.g. for $att in //@* return local-name($att)
The local-name() function takes a single node as its argument, not a sequence of nodes. To apply the same function to every node in a sequence, use the "!" operator: //*!local-name().
The string-join() function takes two arguments: a list of strings and a separator. You're trying to pass two lists of strings. You want
string-join((//*!local-name(), //@*!local-name()), ',')
Of course you might also want to de-duplicate the list using distinct-values(), and to distinguish element from attribute names, or to associate attribute names with the element they appear on. That's all eminently possible. But for that, you'll have to ask a more precise question.
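For what it's worth, a rough sketch along those lines (untested; it emits nothing after the element name when an element has no attributes) that lists each element together with its own attributes, in the spirit of the desired output above, might look like:
for $e in //*
let $atts := $e/@*/local-name()
return concat(local-name($e), ' ',
              string-join((for $a in $atts return concat('#', $a)), ', '))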

R only specify optional parameters if specified

I have an R function with optional parameters like so:
myFunc <- function(
    requiredParam,
    optionalParam1 = optionalValue1,
    optionalParam2 = optionalValue2,
    ...
    optionalParamN = optionalValueN) {
  # implementation
}
I have another function which calls this function and has the necessary parameters stored in a dataframe:
optionalParam1 optionalParam3 optionalParam10
1 "val1" "val2" "val3"
I only want to pass the optional parameters specified in the dataframe. For the others, I want it to use the default values. How can I accomplish this without typing up all permutations of optionalParameters existing/not existing?
Call the function using do.call (not knowing what your data.frame is called, I will just assume you have a list or something of the parameters called myParams):
do.call(myFunc, as.list(myParams))
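As a concrete sketch (the column names, values, and the required-argument value below are just placeholders), turn the one-row data frame into a named list and splice in the required argument; anything not present in the data frame keeps its default:
# one-row data frame holding only the optional arguments to override
myParams <- data.frame(optionalParam1 = "val1",
                       optionalParam3 = "val2",
                       stringsAsFactors = FALSE)
# combine the required argument with the optional ones and call myFunc;
# optional parameters missing from myParams keep their defaults
args <- c(list(requiredParam = 42), as.list(myParams[1, , drop = FALSE]))
do.call(myFunc, args)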
You can also build your function call as a string by parsing your dataframe column names and using paste.
Then, use eval(parse(text="your string"))
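A rough sketch of that string-building approach, reusing the myParams data frame from the previous sketch and assuming all optional values are character strings (do.call is usually the cleaner and safer option):
# builds e.g. 'optionalParam1 = "val1", optionalParam3 = "val2"'
arg_str <- paste0(names(myParams), ' = "', unlist(myParams[1, ]), '"',
                  collapse = ", ")
call_str <- sprintf("myFunc(requiredParam = 42, %s)", arg_str)
eval(parse(text = call_str))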

Assign a variable by name in scilab

Given that I have a matrix of variable name strings and the respective values in another matrix (both come from a csv file), how can I create variables in the workspace that have the names from the name matrix and the values from the value matrix?
I have found global to define a variable's scope so that I can write to it in a function, but I haven't found a way to handle runtime variable names.
You should use the execstr function (see: https://help.scilab.org/docs/5.5.2/en_US/execstr.html).
For example, with the matrix of names stored in the variable MatrixNames and the matrix of values stored in the variable MatrixContent, you would simply have:
execstr(MatrixNames(i)+'= MatrixContent');
where i is the index of the corresponding matrix name you want to treat.
As @david-dorchies suggested, you should use execstr. To make sure the variables are globally accessible, use globals if you want to do it inside a function.
See below for an example implementation.
funcprot(0);
clear;
function assign_to_globals(names, values)
    for i = 1:length(values)
        execstr(sprintf('clearglobal %s; global %s;', names(i), names(i)))
        execstr(sprintf('%s = %s;', names(i), string(values(i))))
    end;
endfunction
function disp_all_globals(names)
    for i = 1:(size(names, 1) * size(names, 2))
        disp(names(i))
        execstr(sprintf('global %s; disp(%s)', names(i), names(i)))
    end;
endfunction
values = list(23, 5.6, 6/10, "[1,2,3]");
names = ['a', 'my_long_var_name', 'c1', 'my_sub_mat'];
assign_to_globals(names, values)
disp_all_globals(names)
clearglobal()

How to extract keys in a nested json array object in Presto?

I'm using the latest (0.117) Presto and trying to execute CROSS JOIN UNNEST with a complex JSON array like this.
[{"id": 1, "value":"xxx"}, {"id":2, "value":"yy"}, ...]
To do that, first I tried to make an ARRAY with the values of id by
SELECT CAST(JSON_EXTRACT('[{"id": 1, "value":"xxx"}, {"id":2, "value":"yy"}]', '$..id') AS ARRAY<BIGINT>)
but it doesn't work.
What is the best JSON Path to extract the values of id?
This will solve your problem. It is a more generic cast to an ARRAY of JSON (less prone to errors given an arbitrary map structure):
select
    TRANSFORM(CAST(JSON_PARSE(arr1) AS ARRAY<JSON>),
              x -> JSON_EXTRACT_SCALAR(x, '$.id'))
from
    (values ('[{"id": 1, "value":"xxx"}, {"id":2, "value":"yy"}]')) t(arr1)
Output in presto:
[1,2]
... I ran into a situation where a list of JSON objects was nested within a JSON object. My list of JSON objects had an ambiguous nested map structure. The following code returns an array of values given a specific key in a list of JSON objects.
Extract the list using JSON_EXTRACT
Cast the list as an array of JSON
Loop through the JSON elements in the array using the TRANSFORM function and extract the value of the key that you are interested in.
TRANSFORM(CAST(JSON_EXTRACT(json, '$.path.toListOfJSONs') AS ARRAY<JSON>),
x -> JSON_EXTRACT_SCALAR(x, '$.id')) as id
You can cast the JSON into an ARRAY of MAP and use the transform lambda function to extract the "id" key:
select
TRANSFORM(CAST(JSON_PARSE(arr1) AS ARRAY<MAP<VARCHAR, VARCHAR>>), entry->entry['id'])
from
(values ('[{"id": 1, "value":"xxx"}, {"id":2, "value":"yy"}]')) t(arr1)
output:
[1, 2]
Now you can use presto-third-functions. It provides a json_array_extract function, so you can extract JSON array info like this:
select
json_array_extract_scalar(arr1, '$.book.id')
from
(values ('[{"book":{"id":"12"}}, {"book":{"id":"14"}}]')) t(arr1)
output is:
[12, 14]
I finally gave up finding a simple JSON Path to extract them.
Instead, I wrote a redundant, dirty query like the following to get the task done.
SELECT
    ...
FROM
    (
        SELECT
            SLICE(ARRAY[
                JSON_EXTRACT(json_column, '$[0].id'),
                JSON_EXTRACT(json_column, '$[1].id'),
                JSON_EXTRACT(json_column, '$[2].id'),
                ...
            ], 1, JSON_ARRAY_LENGTH(json_column)) ids
        FROM
            the.table
    ) t1
CROSS JOIN UNNEST(ids) AS t2(id)
WHERE
    ...
I still want to know the best practice if you know another good way to CROSS JOIN them!
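For reference, a sketch that combines the TRANSFORM approach above with CROSS JOIN UNNEST (assuming json_column is a VARCHAR holding the JSON array, as in the query above) might look like:
SELECT
    t2.id
FROM
    the.table t1
CROSS JOIN UNNEST(
    TRANSFORM(CAST(JSON_PARSE(t1.json_column) AS ARRAY<JSON>),
              x -> CAST(JSON_EXTRACT_SCALAR(x, '$.id') AS BIGINT))
) AS t2(id)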

Parametric Type Creation

I'm struggling to understand parametric type creation in julia. I know that I can create a type with the following:
type EconData
    values
    dates::Array{Date}
    colnames::Array{ASCIIString}
    function EconData(values, dates, colnames)
        if size(values, 1) != size(dates, 1)
            error("Date/data dimension mismatch.")
        end
        if size(values, 2) != size(colnames, 2)
            error("Name/data dimension mismatch.")
        end
        new(values, dates, colnames)
    end
end
ed1 = EconData([1;2;3], [Date(2014,1), Date(2014,2), Date(2014,3)], ["series"])
However, I can't figure out how to specify how values will be typed. It seems reasonable to me to do something like
type EconData{T}
    values::Array{T}
    ...
    function EconData(values::Array{T}, dates, colnames)
    ...
However, this (and similar attempts) simply produces an error:
ERROR: `EconData{T}` has no method matching EconData{T}(::Array{Int64,1}, ::Array{Date,1}, ::Array{ASCIIString,2})
How can I specify the type of values?
The answer is that things get funky with parametric types and inner constructors - in fact, I think it's probably the most confusing thing in Julia. The immediate solution is to provide a suitable outer constructor:
using Dates
type EconData{T}
    values::Vector{T}
    dates::Array{Date}
    colnames::Array{ASCIIString}
    function EconData(values, dates, colnames)
        if size(values, 1) != size(dates, 1)
            error("Date/data dimension mismatch.")
        end
        if size(values, 2) != size(colnames, 2)
            error("Name/data dimension mismatch.")
        end
        new(values, dates, colnames)
    end
end
EconData{T}(v::Vector{T},d,n) = EconData{T}(v,d,n)
ed1 = EconData([1,2,3], [Date(2014,1), Date(2014,2), Date(2014,3)], ["series"])
What also would have worked is to have done
ed1 = EconData{Int}([1,2,3], [Date(2014,1), Date(2014,2), Date(2014,3)], ["series"])
My explanation might be wrong, but I think the problem is that there is no parametric type constructor method made by default, so you have to call the constructor for a specific instantiation of the type (my second version) or add the outer constructor yourself (first version).
Some other comments: you should be explicit about dimensions, i.e. if all your fields are vectors (1D), use Vector{T} or Array{T,1}, and if they are matrices (2D), use Matrix{T} or Array{T,2}. Make it parametric on the dimension if you need to. If you don't, slow code could be generated because functions using this type aren't really sure about the actual data structure until runtime, so they will have lots of checks.
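As a small sketch of that last point (using the same pre-1.0 syntax as the snippets above, and assuming using Dates; the type name EconData2 and the sample data are just placeholders), you can parameterize on both the element type and the number of dimensions so the concrete array layout is known to the compiler:
# hypothetical variant, parametric on element type T and dimensionality N
type EconData2{T, N}
    values::Array{T, N}
    dates::Vector{Date}
    colnames::Vector{ASCIIString}
end
# with no inner constructor, the default constructors infer T and N
ed2 = EconData2([1 2; 3 4], [Date(2014,1), Date(2014,2)], ["a", "b"])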
