I need a 2D bone engine for JavaScript

I'm looking for a way to define a bone animation system. I need a basic one, since my objective is to apply it to inverse kinematics, much like Flash supports.
The desirable feature is: I can set bones (as positions in 2D, each defined by 2 points), each having an ID, so I can make an animation based on frames, i.e.:
['l_leg', [10, 0], [13, 30]] ['r_leg', [30, 0], [13, 30]] // Frame 1 (standing)
['l_leg', [10, 0], [13, 30]] ['r_leg', [35, 30], [13, 30]] // Frame 2 (lifting right leg)
...
I'm confident that defining joints isn't necessary.
The lib may be in Ruby, since I can port it to JS, but one that's already in JS is better :)
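For reference, a minimal sketch of the frame format described above, with linear interpolation between two frames. This is not an existing library; the names (frames, lerpFrames, etc.) are made up for illustration:

```javascript
// Minimal sketch of a frame-based 2D bone store (all names hypothetical).
// A bone is an id mapped to two 2D points (start and end of the segment).
const frames = [
  { l_leg: [[10, 0], [13, 30]], r_leg: [[30, 0], [13, 30]] }, // Frame 1 (standing)
  { l_leg: [[10, 0], [13, 30]], r_leg: [[35, 30], [13, 30]] } // Frame 2 (lifting right leg)
];

// Linearly interpolate a single coordinate.
const lerp = (a, b, t) => a + (b - a) * t;

// Interpolate every bone between two frames; t in [0, 1].
function lerpFrames(f1, f2, t) {
  const out = {};
  for (const id of Object.keys(f1)) {
    out[id] = f1[id].map((pt, i) => pt.map((c, j) => lerp(c, f2[id][i][j], t)));
  }
  return out;
}

const mid = lerpFrames(frames[0], frames[1], 0.5);
console.log(mid.r_leg[0]); // start point of r_leg halfway between the frames
```

A renderer would then just draw a line segment per bone for the interpolated frame; IK would sit on top by solving for the end points before interpolation.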

UPDATE: this has been deprecated for a long time now.
I was developing my own: http://github.com/flockonus/javascriptinmotion

See Wikipedia: Express Animator.
It's the first Google result for "javascript skeletal".


group_by behavior when using --stream

Given this (simplified for learning) input file:
{"type":"a","id":"1"}
{"type":"a","id":"2"}
{"type":"b","id":"1"}
{"type":"c","id":"3"}
I'd like to turn it into:
{
"a": [1,2],
"b": [1],
"c": [3]
}
using the --stream option. It's not needed here, this is just for learning. Or at least it does not seem viable to use group_by or reduce without it on bigger files (even a few GB seems rather slow).
I understand that I can write something like:
jq --stream -cn 'reduce (inputs|select(length==2)) as $i([]; . + ..... )' test3
but that would just process the data item-by-item as the stream is parsed, i.e. at any point I can see either a type or an id, and there is no place to create the pairing. I could cram it all into one big array, but that's the opposite of what I need to do.
How can I create such pairings? I don't even know how to produce (using --stream):
{"a":1}
{"a":2}
...
I know both transformations (the first target one, and the one just above this paragraph) are probably some trivial usage of foreach. I have a working example of one here, but all its .accumulator and .complete keywords (IIUC) are just magic to me now. I understood it once, but... Sorry for the trivial questions.
UPDATE regarding performance:
#pmf provided two solutions in his answer: a streaming and a non-streaming one. Thanks for that; I was able to write the non-streaming version myself, but not the streaming one. When testing, though, the streaming variant was (I'm not 100% sure now, but roughly) 2-4 times slower. That makes sense if the data does not fit into memory, but luckily in my case it does. So I ran the non-streaming version on a ~1 GB file on a laptop that is not actually that slow (i7-9850H CPU @ 2.60GHz). To my surprise it wasn't done within 16 hours, so I killed it as not viable for my use case of potentially much bigger input files. Considering the simplicity of the input, I decided to write a pipeline using just bash, grep, sed, paste and tr, and even though it used some regexes and was inefficient as hell overall, without any parallelism, the whole file was correctly crunched in 55 seconds. I understand that character manipulation is faster than parsing JSON, but that much of a difference? Isn't there some better approach while still parsing JSON? I don't mind spending more CPU power, but if I'm using jq, I'd like to use its functions and process JSON as JSON, not just as characters, as I did with bash.
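Purely for illustration, here is a guess at what such a character-level pipeline could look like. The actual script isn't shown above, and this sketch uses awk for the grouping step, which the tool list above doesn't mention:

```shell
# Build a small sample input (same shape as the question's test3 file).
printf '%s\n' '{"type":"a","id":"1"}' '{"type":"a","id":"2"}' \
              '{"type":"b","id":"1"}' '{"type":"c","id":"3"}' > test3

# Extract "type id" pairs with a regex, then collect ids per type.
sed -E 's/.*"type":"([^"]+)".*"id":"([^"]+)".*/\1 \2/' test3 |
  awk '{ ids[$1] = ids[$1] " " $2 } END { for (t in ids) print t ":" ids[t] }' |
  sort
```

This prints one line per type (e.g. `a: 1 2`), which is close to, but not exactly, the JSON target above; the speed comes precisely from never building JSON at all.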
In the "unstreamed" case I'd use
jq -n 'reduce inputs as $i ({}; .[$i.type] += [$i.id | tonumber])'
With the --stream option set, just re-create the streamed items using fromstream:
jq --stream -n 'reduce fromstream(inputs) as $i ({}; .[$i.type] += [$i.id | tonumber])'
{
"a": [1,2],
"b": [1],
"c": [3]
}
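The same fromstream idiom also answers the smaller sub-question above (emitting one {"a":1}-style object per input record with --stream): reassemble each record first, then build the small object from it.

```
jq --stream -cn 'fromstream(inputs) | {(.type): (.id | tonumber)}' test3
{"a":1}
{"a":2}
{"b":1}
{"c":3}
```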

Counting interactions with linked nodes [netlogo]

My model has links with defined duration, and I am trying to register the new links and the old ones in two different vectors.
The problem: when I run the simulation, the new links are stored correctly, but the old ones appear duplicated in the CSV file. I am making a mistake at some point and I really need some help. If there is a more elegant way of doing it, I appreciate the tips! Thanks all for the collaboration!
ifelse not link-neighbor? myself
[
  create-new-links-with partner in-radius 0.1
  ask new-links
  [
    set registernew []
    set link-duration max (list duration maxduration)
    set link-creation time:copy dt
    set link-end time:plus link-creation link-duration "year"
    set link-installment invtransf
    set meets 1
    set registernew ([ (list link-creation link-duration link-end link-installment meets end1 end2) ] of new-links)
    add-records
    set breed old-links
  ]
]
[
  ask old-links
  [
    set registerold []
    set meets time:copy dt
    set registerold ([ (list link-creation link-duration link-end link-installment meets end1 end2) ] of old-links)
  ]
]
I haven't checked this, but I suspect the problem is that you have bidirectional links (WITH), not unidirectional links (TO or FROM), so the old-links get reported by each end separately, which duplicates their presence in the list.
Again, I haven't tried this, but you might set a flag variable in each link at the start of each large pass, say reported?, initialized to false. Then, deep in the loop, when you are about to report a link, check whether it has already been reported: if not, go ahead (report it and set the flag); if it has, don't report it again, but interrupt the run with a user-message and examine what just happened.
Wade

How to import multiple json objects from a json file into R as a dataframe in such a way that all values are consecutive rows and names are columns

I need it as a dataframe in order to create predictive and classification models on it in R. The json file looks as follows:
{"reviewerID": "A2IBPI20UZIR0U", "asin": "1384719342", "reviewerName": "cassandra tu \"Yeah, well, that's just like, u...", "helpful": [0, 0], "reviewText": "Not much to write about here, but it does exactly what it's supposed to. filters out the pop sounds. now my recordings are much more crisp. it is one of the lowest prices pop filters on amazon so might as well buy it, they honestly work the same despite their pricing,", "overall": 5.0, "summary": "good", "unixReviewTime": 1393545600, "reviewTime": "02 28, 2014"}
{"reviewerID": "A14VAT5EAX3D9S", "asin": "1384719342", "reviewerName": "Jake", "helpful": [13, 14], "reviewText": "The product does exactly as it should and is quite affordable.I did not realized it was double screened until it arrived, so it was even better than I had expected.As an added bonus, one of the screens carries a small hint of the smell of an old grape candy I used to buy, so for reminiscent's sake, I cannot stop putting the pop filter next to my nose and smelling it after recording. :DIf you needed a pop filter, this will work just as well as the expensive ones, and it may even come with a pleasing aroma like mine did!Buy this product! :]", "overall": 5.0, "summary": "Jake", "unixReviewTime": 1363392000, "reviewTime": "03 16, 2013"}
{"reviewerID": "A195EZSQDW3E21", "asin": "1384719342", "reviewerName": "Rick Bennette \"Rick Bennette\"", "helpful": [1, 1], "reviewText": "The primary job of this device is to block the breath that would otherwise produce a popping sound, while allowing your voice to pass through with no noticeable reduction of volume or high frequencies. The double cloth filter blocks the pops and lets the voice through with no coloration. The metal clamp mount attaches to the mike stand secure enough to keep it attached. The goose neck needs a little coaxing to stay where you put it.", "overall": 5.0, "summary": "It Does The Job Well", "unixReviewTime": 1377648000, "reviewTime": "08 28, 2013"}
{"reviewerID": "A2C00NNG1ZQQG2", "asin": "1384719342", "reviewerName": "RustyBill \"Sunday Rocker\"", "helpful": [0, 0], "reviewText": "Nice windscreen protects my MXL mic and prevents pops. Only thing is that the gooseneck is only marginally able to hold the screen in position and requires careful positioning of the clamp to avoid sagging.", "overall": 5.0, "summary": "GOOD WINDSCREEN FOR THE MONEY", "unixReviewTime": 1392336000, "reviewTime": "02 14, 2014"}
... and 100 more.
I tried the rjson package, but that just imports the first JSON object from the file and not the others.
library("rjson")
json_file <- "reviews_Musical_Instruments_5.json"
json_data <- fromJSON(paste(readLines(json_file), collapse=""))
Expected result should be a dataframe with "reviewerID", "asin", "reviewerName", etc. as columns and their values as consecutive rows.
Here I fix the json to be regular json and use jsonlite::fromJSON() to get the data.frame:
library(jsonlite)
json_file <- "reviews_Musical_Instruments_5.json"
json_file_contents <- readLines(json_file)
json_file_contents <- paste(json_file_contents, collapse = ",")
json_file_contents <- paste(c("[", json_file_contents, "]"), collapse = "")
fromJSON(json_file_contents)
(Based on the comments.)
The data is not regular JSON; it's "ndjson", which is newline-delimited JSON. The (only?) difference is that each line is self-sufficient JSON.
If this were regular JSON, one would need to encapsulate all of these within a list (or similar) by prepending an opening bracket [, putting commas between each element (each dict here), and appending a closing bracket ]. JSON-structured streaming data is nice, but if a client connects after the leading [ then everything else is invalid. Ergo, NDJSON.
For you, just use jsonlite::stream_in(file(json_file)) and all should work.

Difference between 2 DateTime in hours in Elixir

My goal is to find the difference in hours between now and some provided date_time. I'm trying to do it this way:
pry(1)> dt1
#DateTime<2017-10-24 05:12:46.000000Z>
pry(2)> Ecto.DateTime.to_erl(dt1)
** (FunctionClauseError) no function clause matching in Ecto.DateTime.to_erl/1
The following arguments were given to Ecto.DateTime.to_erl/1:
# 1
#DateTime<2017-10-24 05:12:46.000000Z>
Attempted function clauses (showing 1 out of 1):
def to_erl(%Ecto.DateTime{year: year, month: month, day: day, hour: hour, min: min, sec: sec})
(ecto) lib/ecto/date_time.ex:608: Ecto.DateTime.to_erl/1
# ............
How to fix that? Or is there a better way to achieve my goal?
Note that I don't use timex and won't use it, nor any other third-party library - only the ones built into Elixir/Erlang/Phoenix.
You can use DateTime.diff/3:
iex> dt1 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "AMT",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: -14400, std_offset: 0, time_zone: "America/Manaus"}
iex> dt2 = %DateTime{year: 2000, month: 2, day: 29, zone_abbr: "CET",
...> hour: 23, minute: 0, second: 7, microsecond: {0, 0},
...> utc_offset: 3600, std_offset: 0, time_zone: "Europe/Warsaw"}
iex> DateTime.diff(dt1, dt2)
18000
iex> DateTime.diff(dt2, dt1)
-18000
Since DateTime.diff/3 returns seconds, you have to calculate the hours from the result, like this:
result_in_hours = result_in_seconds / (60 * 60)
(Note that / returns a float in Elixir; use div(result_in_seconds, 3600) if you want an integer.)
This answer was added before the topic author edited his question and excluded external libs from the scope. However, I'm not deleting it, as I find Timex extremely useful, and maybe someone else will be interested in using it as well (I have nothing to do with Timex's creators).
I strongly recommend using Timex library. It's perfect for date/time calculations with different formats and time zones.
So in your case to easily calculate hour difference you just need to:
Timex.diff(dt1, DateTime.utc_now, :hours)
You can find diff/3 docs here.
Ignoring timezones entirely here (for that, it is easiest to convert everything to UTC and then cast to whatever is local), the pure Erlang way is to do something along these lines:
1> Event = {{2017,05,23},{13,11,23}}.
{{2017,5,23},{13,11,23}}
2> Now = calendar:local_time().
{{2017,10,24},{14,44,1}}
3> EventGSec = calendar:datetime_to_gregorian_seconds(Event).
63662764283
4> NowGSec = calendar:datetime_to_gregorian_seconds(Now).
63676075441
5> Elapsed = NowGSec - EventGSec.
13311158
6> calendar:seconds_to_daystime(Elapsed).
{154,{1,32,38}}
The exact solution would depend, of course, on the resolution that you require. Extracting that into a function that returns a tuple of the form {Days, {Hours, Minutes, Seconds}} gives us:
-spec elapsed_time(calendar:datetime()) -> {Days, Time}
    when Days :: non_neg_integer(),
         Time :: calendar:time().
elapsed_time(Event) ->
    Now = calendar:local_time(),
    EventGSec = calendar:datetime_to_gregorian_seconds(Event),
    NowGSec = calendar:datetime_to_gregorian_seconds(Now),
    Elapsed = NowGSec - EventGSec,
    calendar:seconds_to_daystime(Elapsed).
(Of course, the above could be composed on a single line -- but why would you ever do that?)
I'm sure there are some wazoo Elixir libraries that do things in a totally different way. Have fun with those. This answer only addresses the underlying native libraries. Most of the time functions in the Erlang standard library are now based on integer time units (seconds, milliseconds, or nanoseconds) instead of the older erlang:now() timestamp form (which is now deprecated). The exact way you should write your version of the function above depends on the resolution you require and the formatting of the original input (Unix-epoch nanoseconds are quite common for the type of timestamps I deal with myself, for example, but are inapplicable to a datetime data type).
Remember: time calculations are tricky, subtle and hard to get right in edge cases. The standard libraries for most languages actually get quite a lot of TZ and time diff issues wrong -- and if that's OK for your case then just don't worry about it. In any case, I suggest at least skimming the Time and Time Correction in Erlang page of the standard docs -- even if it does not apply to your current situation, eventually you'll probably be in a situation where subtle timing issues will matter.

Is there any charting library in elixir?

I want to plot a bar graph in elixir and save it as an image.
Is there any good charting library for doing this ?
I tried searching on elixir.libhunt.com and github.com/h4cc/awesome-elixir, but didn't find a single package for my need.
Thanks in advance.
Yes - you can interface to gnuplot with gnuplot-elixir.
As an example, to generate the bar graph in this answer, the code would be:
import Gnuplot
chart = [
  [:set, :term, :png, :size, '512,512'],
  [:set, :output, Path.join("/tmp", "barchart.PNG")],
  [:set, :boxwidth, 0.5],
  ~w(set style fill solid)a,
  [:plot, "-", :using, '1:3:xtic(2)', :with, :boxes]
]

dataset = [[0, "label", 100], [1, "label2", 450], [2, "bar label", 75]]

plot(chart, [dataset])
I don't think there's any such control for Elixir, nothing native anyway. Graphics is not exactly in Elixir's wheelhouse. However, I think you could probably build something yourself with wxErlang. You can see what sorts of things you can do with wxErlang in Elixir by typing :wx.demo() from within iex. I don't know of a graph primitive in wxErlang, but it may be that I simply haven't found it yet.
As an update to this question: you can now use the vega_lite package and Livebook to easily plot with Elixir.
https://livebook.dev/
https://github.com/livebook-dev/vega_lite
