I have the following Test collection where each document looks like:
firstName: "Jeff",
lastname: "Harper",
scores:[ {'period':'week one', 'score':90},
{'period':'week two', 'score':85},
{'period':'week three','score':92},
{'period':'week four', 'score':87}
I would like to iterate through the scores array and console.log each score. As a trial, I have tried:
Test.find().forEach(function(doc){ console.log(doc.firstName); });
This works fine for printing the first name. However, when I try to print the first score in the array of objects with the statement:
Test.find().forEach(function(doc){ console.log(doc.scores[0].score); });
which doesn't work. How do I gain access to the elements in the array of objects?
Thanks everyone for your input. Christian Fritz identified my problem: I now limit my search to only documents that have the object array, and both the forEach method and the fetch() method work. However, Ethaan, I had to include an inner for-loop inside the primary for-loop to gain access to each individual score. Thanks for your help and your editing.
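For reference, a minimal sketch of that nested loop (assuming an $exists filter is how the search gets limited to documents that actually have the scores array):

Test.find({ scores: { $exists: true } }).forEach(function (doc) {
  // outer loop: one pass per matching document
  for (var i = 0; i < doc.scores.length; i++) {
    // inner loop: each element is an object with period and score
    console.log(doc.firstName, doc.scores[i].period, doc.scores[i].score);
  }
});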
Getting Taxonomy term name from Taxonomy target ID:
I have a taxonomy term reference field that accepts multiple values; it is rendered as a multi-select field. I am trying to read the target IDs of the field and work out the term names from them, using the code below in a preprocess function:
$granttype = $user_entity->field_user_grant_type->getValue();
foreach ($granttype as $gt) {
  $granttype_name = \Drupal\taxonomy\Entity\Term::load($gt)->label();
}
dd($granttype_name);
$variables['grant_type'] = $granttype_name;
dd($granttype) shows that $granttype is an array of arrays, with each item containing a target_id.
However, the foreach loop to figure out the term name is not working correctly.
dd($granttype_name) results in:
The website encountered an unexpected error. Please try again later.
TypeError: Illegal offset type in Drupal\Core\Entity\EntityStorageBase->load() (line 297 of core/lib/Drupal/Core/Entity/EntityStorageBase.php).
I am looping through the target IDs and trying to get the term names, but it's not working. Any help, please?
UPDATE: I tried the following code:
$term = term::load($gt);
$name = $term->getName();
Still no luck :( I get the same error.
Here is an example of how to do this:
$grant_type = $user_entity->field_user_grant_type->entity;
if ($grant_type instanceof \Drupal\taxonomy\TermInterface) {
var_dump($grant_type->label());
}
If you have multiple referenced terms, use:
$grant_types = $user_entity->field_user_grant_type->referencedEntities();
foreach ($grant_types as $grant_type) {
var_dump($grant_type->label());
}
Explanation:
The generic way to get an entity's title is the label() method defined on EntityInterface:
$term->label();
There is also a helpful referencedEntities() method on entity reference fields for retrieving the referenced entities.
First, you need to include the required use statements:
use Drupal\taxonomy\Entity\Term;
use Drupal\taxonomy\TermInterface;
Second, retrieve the value(s) stored in your field (field_user_grant_type) and store them in an array:
$myArray = array();
$granttype = $user_entity->get('field_user_grant_type')->getValue();
$granttype will now contain an array of arrays. Next, you need to gather the actual term IDs
foreach($granttype as $type){
$myArray[] = $type['target_id'];
}
Finally, loop through $myArray, take each term ID stored there, and use it to fetch the corresponding term name. Here, the names are stored in a new array called $grantTypeNames:
$grantTypeNames = array();
foreach($myArray as $term_id){
$grantTypeNames[] = Term::load($term_id)->get('name')->value;
}
The array $grantTypeNames will now contain the term names you want. I hope that helps.
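For completeness, here is an untested sketch of how these steps might be combined inside a preprocess function, using the field_user_grant_type field name from the question (the null check is just a defensive assumption):

use Drupal\taxonomy\Entity\Term; // at the top of the .theme/.module file

$grantTypeNames = array();
foreach ($user_entity->get('field_user_grant_type')->getValue() as $item) {
  // Each item is an array such as ['target_id' => 12]; load the term by that ID.
  $term = Term::load($item['target_id']);
  if ($term) {
    $grantTypeNames[] = $term->label();
  }
}
$variables['grant_type'] = $grantTypeNames;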
I'm writing out some functions for Inventory management. I've recently wanted to add a "photo url column" to my spreadsheet by using an API I've used successfully while initially building my inventory. My Spreadsheet header looks like the following:
SKU | NAME | OTHER STUFF
I have a getProductInfo function that returns a list of product info from an API I'm calling.
getProductInfo <- function(barcode) {
  # Input: a UPC barcode
  # Output: a list of product info
  info <- CallAPI(barcode)
  # ... process the API return, remove garbage ...
  return(info)
}
I made a new function that takes my inventory csv as input, and attempts to add a new column with product photo url.
get_photo_url_from_product_info_output <- function(in_list){
#Input GetProductInfo Output. Returns Photo URL, or nothing if
#it doesn't exist
if(in_list$DisplayStockPhotos == TRUE){
return(in_list$StockPhotoURL)
} else {
return("")
}
}
add_Photo_URL <- function(in_csv){
  # Input: CSV data frame; appends a photo url column.
  # Requires SKU (UPC); assumes no photo url column exists yet.
  out_csv <- mutate(in_csv, photo =
    get_photo_url_from_product_info_output(
      getProductInfo(SKU)
    )
  )
  return(out_csv)
}
#Call it
new <- add_Photo_URL(old)
My thinking was that R would simply take the SKU from each row, pass it through the double function call as-is, and that the vectorized dplyr function mutate would handle the rest. Unfortunately, I ran into all sorts of problems I couldn't understand. Eventually I figured out that the API call was crashing because the SKU argument was mangled as it was passed in. I put in a breakpoint and found that it wasn't being passed a single SKU but an entire vector of SKUs, every row all at once. Something like this:
#Variable 'barcode' inside getProductInfo function contains:
[1] 7.869368e+11 1.438175e+10 1.256983e+10 2.454357e+10 3.139814e+10 1.256983e+10 1.313260e+10 4.339643e+10 2.454328e+10
[10] 1.313243e+10 6.839046e+11 2.454367e+10 2.454363e+10 2.454367e+10 2.454348e+10 8.418870e+11 2.519211e+10 2.454375e+10
[19] 2.454381e+10 2.454381e+10 2.454383e+10 2.454384e+10 7.869368e+11 2.454370e+10 2.454390e+10 1.913290e+11 2.454397e+10
[28] 2.454399e+10 2.519202e+10 2.519205e+10 7.742121e+11 8.839291e+11 8.539116e+10 2.519211e+10 2.519211e+10 2.519211e+10
Obviously my initial getProductInfo function can't handle that, so it'll crash.
How should I modify my code, whether it be in the input or API call to avoid this vectorized operation issue?
Well, it's not totally elegant but it works.
I figured out I needed to use lapply, which is usually not my strong suit. Initially I tried to nest the calls like so:
lapply(SKU, get_photo_url_from_product_info_output(getProductInfo()))
But that didn't work, so I came up with the bright idea of making another function:
get_photo_url_from_sku <- function(barcode){
return(get_photo_url_from_product_info_output(getProductInfo(barcode)))
}
Call that in the lapply:
out_csv<- mutate(in_csv, photocolumn = lapply(SKU, get_photo_url_from_sku))
And it works great. My speed is only limited by my API calls.
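As a side note, if a plain character column is preferred over the list column that lapply produces, a variant along these lines (untested) should also work, assuming get_photo_url_from_sku always returns a single string:

library(dplyr)

out_csv <- mutate(in_csv,
  # vapply returns a character vector, so photocolumn is a regular column
  photocolumn = vapply(SKU, get_photo_url_from_sku, FUN.VALUE = character(1), USE.NAMES = FALSE)
)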
In eXist-db I have hundreds of documents in /db/apps/foo/resources/documents like so:
...
BNF9992-J305-1.xml
BNF9992-J305-5.xml
BNF9992-J308-9.xml
BNF9992-J310-8.xml
BNF9992-J311-1.xml
BNF9992-J312-6.xml
BNF9992-J312-7.xml
BNF9992-J315-9.xml
BNF9992-J316-2.xml
BNF9992-J317-2.xml
BNF9992-J319-3.xml
...
Imagine I want to present to the user a list of 3 documents appearing before and after a specific document (based on alpha-numeric sort). So, my 'current document' is BNF9992-J312-7.xml, and I want to show the user something like:
BNF9992-J310-8.xml
BNF9992-J311-1.xml
BNF9992-J312-6.xml
BNF9992-J312-7.xml (current document)
BNF9992-J315-9.xml
BNF9992-J316-2.xml
BNF9992-J317-2.xml
Is there a function/method in XQuery 3.1 for iterating up/down a list of documents once they've been retrieved? The most I've been able to do is a simple retrieval of document names from a collection:
for $resource in collection("/db/apps/foo/resources/documents")
let $uri := base-uri($resource)
return util:unescape-uri(replace($uri, ".+/(.+)$","$1"), "UTF-8")
But I don't know how to iterate up and down the list from a given document.
Perhaps writing the list into nodes and applying a formula to node ordinals?
Many thanks.
If this were a list of strings $list, and the "current string" were $s, then I would do:
let $i := index-of($list, $s)
return subsequence($list, $i - 3, 7)
I'm not sure whether the fact that you have a list of documents (rather than strings) changes this.
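Applied to the document names from the question, an untested sketch might look like this (it assumes the current document's name is available as a string and occurs exactly once in the collection):

let $names :=
  (: build the alpha-numerically sorted list of document names :)
  for $resource in collection("/db/apps/foo/resources/documents")
  let $name := util:unescape-uri(replace(base-uri($resource), ".+/(.+)$", "$1"), "UTF-8")
  order by $name
  return $name
let $i := index-of($names, "BNF9992-J312-7.xml")
return subsequence($names, $i - 3, 7)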
I am using knex and bookshelf, and my table consists of author, title, content, and count columns. Each row looks like this:
{
  author: 'John Doe',
  title: 'aaaaa',
  content: 'aaaaaaaa',
  count: 54
}
I want to retrieve data based on the value of count; specifically, I want the 4 rows with the highest count values.
To retrieve all data, I am currently doing this:
router.get('/', (req, res) => {
Article.forge().fetchAll().then(article => {
res.json(article);
})
})
Is there any way to do something like forge({ count: the 3 rows with the highest count value })? What should I add to the code to achieve this?
Combine orderBy with fetchPage
Article
  .forge()
  .orderBy('-count')
  .fetchPage({
    pageSize: 3
  })
This highlights a reason why my team is removing bookshelf and just using basic knex: unless you want to fetch related models, it's simpler to work without the ORM layer. The equivalent knex code is:
knex('articles')
.orderBy('count', 'desc')
.limit(3)
This is slightly simpler, and the resulting rows' properties can be accessed directly, i.e. rows[0].id rather than rows[0].get('id').
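For context, a rough sketch of how that query might be wired into the route from the question (the '/top' path and the error handling are illustrative assumptions; knex is assumed to be a configured instance, and the limit of 4 matches the four highest counts asked for):

router.get('/top', (req, res) => {
  knex('articles')
    .orderBy('count', 'desc')
    .limit(4)
    .then(rows => res.json(rows))
    .catch(err => res.status(500).json({ error: err.message }));
});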
EDIT: I accidentally misrepresented the problem when trying to pare down the example code. A key part of my code is that I am attempting to sort the array after adding elements to it; the hang appears on sort, not insert. The following abstracted code will consistently hang:
<?=
local('a' = array)
#a->insert('test1' = map('a'='1'))
#a->insert('test2' = map('b'='2')) // comment-out to make work
#a->sort
#a
?>
I have a result set for which I want to insert a pair of values into an array for each unique key, as follows:
resultset(2) => {
  records => {
    if(!$logTypeClasses->contains(field('logTypeClass'))) => {
      local(i) = pair(field('logTypeClass'), map('title' = field('logType'), 'class' = field('logTypeClass')))
      log_critical(#i)
      $logTypeClasses->insert(#i) // Lasso hangs on this line, will return if commented-out
    }
  }
}
Strangely, I cannot insert the #i local variable into the thread variable without Lasso hanging. I never receive an error, and the page never returns; it just hangs indefinitely.
I do see the pairs logged correctly, which leads me to believe that the pair-generating syntax is correct.
I can make the code work as long as the value side of the pair is not a map with values. In other words, it works when the value side of the pair is a string, or even an empty map. As soon as I add key=value parameters to the map, it fails.
I must be missing something obvious. Any pointers? Thanks in advance for your time and consideration.
I can verify the bug with the basic sorting code you posted. The question does arise of how exactly one sorts pairs. I'm betting you want them sorted by the first element of each pair, but I could also see an argument that they should be sorted by the last element (by values instead of by keys).
One thing that might work better is to keep it as a map of maps. If you need the sorted data for some reason, you could do map->keys->asArray->sort
Ex:
local(data) = map('test1' = map('a'=2,'b'=3))
#data->insert('test2' = map('c'=33, 'd'=42))
local(keys) = #data->keys->asArray
#keys->sort
#keys
Even better, if you're only going to iterate through a sorted set, you can use a query expression:
local(data) = map('test1' = map('a'=2,'b'=3))
#data->insert('test2' = map('c'=33, 'd'=42))
with elm in #data->eachPair
let key = #elm->first
let value = #elm->second
order by #key
do { ... }
I doubt your problem is the pair-with-map construct per se.
This test code works as expected:
var(testcontainer = array)
inline(-database = 'mysql', -table = 'help_topic', -findall) => {
  resultset(1) => {
    records => {
      if(!$testcontainer->contains(field('name'))) => {
        local(i) = pair(field('name'), map('description' = field('description'), 'name' = field('name')))
        $testcontainer->insert(#i)
      }
    }
  }
}
$testcontainer
When Lasso hangs like that, with no feedback and no immediate crash, it is usually trapped in some kind of infinite loop. I'm speculating that it might have to do with Lasso using references whenever possible; maybe some part of your code is using a reference that references itself, or something along those lines.