Limit, offset and starting ID in Symfony 6 entity repositories - symfony

I have an entity and repository to fetch details:
$subscriptions = $userSubscriptionRepo->findBy(
    ['isDeleted' => false, 'disabled' => false],
    null,
    $this->paginationLimit, // 5
    0
);
On the initial load this retrieves the first 5 items from the database. In our product, though, new subscriptions are created all the time, and my plan is to lazy-load further subscriptions with a separate call.
The problem is that by the time the next call runs, newly added rows (anywhere from 2 to 5) have shifted the result set, so the next page of five can repeat items I already have or jump well past them, which isn't an accurate representation of the list. Is there a condition within findBy that lets me say something like "start from this ID", so that the limit and offset are applied from that point no matter how many rows were added?
Example:
$subscriptions = $userSubscriptionRepo->findBy(
    ['isDeleted' => false, 'disabled' => false],
    null,
    $this->paginationLimit, // 5
    0,
    ['startFromID' => 5] // so that no matter how many rows are added, it starts from this ID
);
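findBy only accepts equality criteria, so a startFromID option like the one above doesn't exist out of the box. One way to get that behaviour is keyset pagination in a custom repository method using Doctrine's QueryBuilder. A minimal sketch, assuming an auto-incrementing integer id column; the method name fetchPageAfterId is hypothetical, not part of the original question:

// In UserSubscriptionRepository — a sketch, assuming an integer auto-increment "id" column.
public function fetchPageAfterId(int $lastSeenId, int $limit): array
{
    return $this->createQueryBuilder('s')
        ->andWhere('s.isDeleted = :deleted')
        ->andWhere('s.disabled = :disabled')
        ->andWhere('s.id > :lastSeenId') // keyset anchor: only rows after the last ID already loaded
        ->setParameter('deleted', false)
        ->setParameter('disabled', false)
        ->setParameter('lastSeenId', $lastSeenId)
        ->orderBy('s.id', 'ASC')
        ->setMaxResults($limit)
        ->getQuery()
        ->getResult();
}

// Usage: pass the highest ID already loaded (0 on the first call).
$subscriptions = $userSubscriptionRepo->fetchPageAfterId($lastLoadedId, $this->paginationLimit);

Because each page is anchored on the last ID the client already has rather than on a numeric offset, rows inserted in the meantime no longer shift the page boundaries.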

Related

Woocommerce REST API Retrieve Order By Transaction ID?

I have a problem with the WooCommerce REST API.
My goal is simple: to check whether an order exists by its transaction ID. So far, the only parameter that works is the order status.
This is my script:
$param = array('status' => 'on-hold', 'transaction_id' => 'XXXXXXXXXXXXXX');
//OR $param = array('search' => 'XXXXXXXX');
$cek = $woocommerce->get('orders', $param);
print_r($cek);
but when I add another parameter such as 'transaction_id', the result is odd: it returns all orders.
You can get order data in the following way:
$order_details = $woocommerce->get('orders/1'); // 1 = Transaction id
I hope it works for you.
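Since transaction_id isn't one of the supported list filters on the orders endpoint, it appears to simply be ignored, which explains why all orders come back. One workaround, a sketch rather than anything from the original answer, is to narrow the list with supported filters and compare the transaction ID on the client side:

// Sketch: filter with supported parameters, then match the transaction ID in PHP.
$param  = array('status' => 'on-hold', 'per_page' => 100);
$orders = $woocommerce->get('orders', $param);

$match = null;
foreach ($orders as $order) {
    // Depending on the client configuration, $order may be an object or an associative array.
    if ($order->transaction_id === 'XXXXXXXXXXXXXX') {
        $match = $order;
        break;
    }
}
// $match is the matching order if it exists, or null otherwise.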

How to create Field Collection Item by code in Drupal 8

How do I create a Field Collection item for a node programmatically in Drupal 8? I have tried the code below, but it doesn't work. 'field_abc_inside' is a field on the Field Collection 'field_abc'.
$field_collection_item = entity_create('field_collection_item', array(
  'field_name' => 'field_abc',
  'field_abc_inside' => array('value' => 'Test data'),
));
$field_collection_item->setHostEntity($node);
$field_collection_item->save();
// Namespaces for the core user entity and the contrib field_collection module.
use Drupal\user\Entity\User;
use Drupal\field_collection\Entity\FieldCollectionItem;

$user_id = \Drupal::currentUser()->getAccount()->id();
$user = User::load($user_id);
$fc = FieldCollectionItem::create(array(
  "field_name" => "field_hobbies",
));
$fc->set('field_hobby_name', 'Watch TV');
$fc->setHostEntity($user);
$fc->save();
That code helped me, but I have one observation.
\Drupal::currentUser() already gives you the current user object, which is clear since it is used to retrieve the id().
Therefore there is no need to User::load() it again on the next line; that is redundant processing that only makes the function take longer to run.
// Get the current user object.
$user = \Drupal::currentUser();
// Prepare the new Field Collection item object.
$fc = FieldCollectionItem::create(array(
  "field_name" => "field_hobbies",
));
// Set more field values.
$fc->set('field_hobby_name', 'Watch TV');
// Set the host entity; this is what the field collection item will be attached to.
$fc->setHostEntity($user);
// Save it as an actual field collection record.
$fc->save();

Gravity Forms API exits the page when fetching more than 2000 entries

I'm seeing weird behaviour when trying to fetch more than 2000 records via the Gravity Forms API. The method used to fetch the records is as follows:
$search_criteria["field_filters"]["mode"] = "all";
$all_entries_submitted = GFAPI::get_entries(2,$search_criteria,null,array('offset' => 0, 'page_size' => 3000 ));
Any ideas why this could be?
Thanks!
Found the answer to my problem:
When I increased the allowed PHP memory limit to a large value (1 GB), the script eventually returned the records (after a few minutes).
ini_set('memory_limit','1000M');
$search_criteria["field_filters"]["mode"] = "all";
$all_entries_submitted = GFAPI::get_entries(2,$search_criteria,null,array('offset' => 0, 'page_size' => 3000 ));
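If raising the memory limit that far isn't desirable, an alternative, offered here only as a sketch rather than part of the original answer, is to page through the entries in smaller batches so that only one batch is held in memory at a time. The page size of 200 is arbitrary; form ID 2 and the filter mode are taken from the question:

$search_criteria = array();
$search_criteria["field_filters"]["mode"] = "all";

$page_size = 200; // smaller batches keep memory usage low
$offset    = 0;

do {
    $batch = GFAPI::get_entries(2, $search_criteria, null, array('offset' => $offset, 'page_size' => $page_size));

    if (is_wp_error($batch)) {
        break; // bail out on API errors instead of looping forever
    }

    foreach ($batch as $entry) {
        // Process or write out each entry here instead of accumulating all of them in memory.
    }

    $offset += $page_size;
} while (count($batch) === $page_size);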

How to scan data from AWS dynamoDb incrementally

I've created a table in AWS DynamoDB with only one hash key. It currently holds over 20 million items, and every day a few thousand more are inserted.
Recently I have wanted to copy this data from DynamoDB to the local hard disk every day. I wrote a small program that uses scan operations to save it. The total size of the data is not very large, about 10 GB, but the scan takes nearly 5 hours each day. Of course, to keep costs down, I didn't provision much read throughput.
My question is: is there a way to scan this data incrementally, meaning I only copy the newly inserted items rather than the entire table? I tried using withExclusiveStartKey, but it couldn't find the newly inserted items, probably because LastEvaluatedKey only describes the last key of a specific segment.
You can create an LSI (local secondary index) on the table and then query the table with it.
ScanIndexForward is true by default, which returns the results in ascending order; if you want them in descending order, set "ScanIndexForward" => false.
E.g.:
$response = $this->dbClient->query(array(
    "TableName" => $this->tableName,
    "IndexName" => "TableNameIndex",
    "KeyConditions" => array(
        "Id" => array(
            "ComparisonOperator" => ComparisonOperator::EQ,
            "AttributeValueList" => array(
                array(Type::NUMBER => $this->getId()),
            ),
        ),
    ),
    "ScanIndexForward" => false,
));
You will get the results in descending order.
If you only want, say, the top 50 records, you can also set a limit:
"Limit" => 50,
Hope it helps.
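For the incremental part of the question specifically, one common pattern is to store an insertion-date attribute on each item and back it with a secondary index, so the daily export only queries that day's items instead of scanning the whole table. Below is a rough sketch in the same SDK style as the answer above; the attribute name insertDate and the index name insertDate-index are hypothetical, not part of the original table:

$response = $this->dbClient->query(array(
    "TableName" => $this->tableName,
    "IndexName" => "insertDate-index", // hypothetical GSI keyed on the insert date
    "KeyConditions" => array(
        "insertDate" => array(
            "ComparisonOperator" => ComparisonOperator::EQ,
            "AttributeValueList" => array(
                array(Type::STRING => date('Y-m-d')), // only items written today
            ),
        ),
    ),
));

Like a scan, the query response is paginated, so a large day's worth of items may still require following LastEvaluatedKey across several calls.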

Creating a Pods relationship programmatically produces error in Advanced Custom Fields

I'm trying to create a pods relationship using the following code:
$data = array(
    "pod_id" => esc_attr(strip_tags($_POST['customMetaAutorID'])),
    "field_id" => 1073,
    "item_id" => $post_id,
    "related_item_id" => $_POST["customMetaAutorID"],
    "related_pod_id" => 0,
    "related_field_id" => 0,
    "weight" => 0,
);
$wpdb->insert("wp_podsrel", $data);
The row gets added to the table; however, after a few page refreshes I start getting the error:
Strict Standards: Declaration of acf_taxonomy_field_walker::start_el() should be compatible with Walker::start_el(&$output, $object, $depth = 0, $args = Array, $current_object_id = 0)
After that, all I have is the white screen of death, and the only thing I can do is restore the database.
What's the way to add a Pods relationship field value without breaking everything else?
Found the answer myself.
It turns out each pod item has an add_to function which adds values to related fields given the field name (much more convenient than hardcoding the field ID).
The code I ended up using is this:
$postPod->add_to("field_name", $related_element_id);
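For completeness, $postPod in that snippet is a Pods object for the item being updated; a minimal sketch of how it might be obtained (the pod name and field name below are placeholders, not from the original answer):

// Load the pod item for the current post, then append the related item to the relationship field.
$postPod = pods('your_pod_name', $post_id);
$postPod->add_to('your_relationship_field', $related_element_id);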
