Maximum execution time error for DomPDF report generation in a Laravel project

I have to use the DomPDF package in a Laravel (5.8) project, but we hit a maximum execution time error during the report creation process.
The query itself returns 1971 records in about 6 seconds.
My package details:
barryvdh/laravel-dompdf: "0.8.7"
laravel/framework": "5.8"
My sample code for report creation:
$pdf = PDF::loadView('reports.demodetails.viewreport', compact('alldetails'));
return $pdf->download('demoreport.pdf');
But the report creation process takes far too long.
I would really appreciate any suggestions.

I used the "barryvdh/laravel-snappy" package instead of "barryvdh/laravel-dompdf" and got the result in 16 seconds (1971 records in total).
Note: with DomPDF the same records took about half an hour to generate the PDF.
If anyone has this problem, please find below the steps to install the "barryvdh/laravel-snappy" package.
Step 1:
composer require barryvdh/laravel-snappy
a) Add the service provider (in config/app.php):
Barryvdh\Snappy\ServiceProvider::class,
b) Add the aliases (in config/app.php):
'PDF' => Barryvdh\Snappy\Facades\SnappyPdf::class,
'SnappyImage' => Barryvdh\Snappy\Facades\SnappyImage::class,
c) Publish the package config:
php artisan vendor:publish --provider="Barryvdh\Snappy\ServiceProvider"
Step 2:
composer require wemersonjanuario/wkhtmltopdf-windows
a) Set the binary paths in the snappy config file; see the sketch after this list.
On a live (Linux) server, follow the link below to install wkhtmltopdf and wkhtmltoimage, then set the binary path values accordingly.
https://askubuntu.com/questions/959152/how-can-i-install-the-latest-wkhtmltopdf-on-ubuntu-16-04
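A minimal sketch of the published config (config/snappy.php); the Windows binary path is an assumption based on the wemersonjanuario/wkhtmltopdf-windows package layout, so adjust it to your install:
<?php
// config/snappy.php
return [
    'pdf' => [
        'enabled' => true,
        // Windows (assumed vendor path from wemersonjanuario/wkhtmltopdf-windows):
        // 'binary' => base_path('vendor/wemersonjanuario/wkhtmltopdf-windows/bin/64bit/wkhtmltopdf.exe'),
        'binary'  => '/usr/local/bin/wkhtmltopdf',   // Linux
        'timeout' => false,
        'options' => [],
        'env'     => [],
    ],
    'image' => [
        'enabled' => true,
        'binary'  => '/usr/local/bin/wkhtmltoimage', // Linux
        'timeout' => false,
        'options' => [],
        'env'     => [],
    ],
];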

Related

How to convert Tensorflow Object Detection API model to TFLite?

I am trying to convert a TensorFlow Object Detection model (ssd-mobilenet-v2-fpnlite, from the TensorFlow 2 Detection Model Zoo) to TFLite. First I train the model using model_main_tf2.py, and then I use export_tflite_graph_tf2.py to export a SavedModel (.pb). However, when it comes to converting the .pb file to .tflite, it throws this error:
OSError: SavedModel file does not exist at: /content/gdrive/My Drive/models/research/object_detection/fine_tuned_model/saved_model/saved_model.pb/{saved_model.pbtxt|saved_model.pb}
To convert the .pb file I used:
import os
import tensorflow as tf
SAVED_MODEL_PATH = os.path.join(os.getcwd(), 'object_detection', 'fine_tuned_model', 'saved_model', 'saved_model.pb')
# SAVED_MODEL_PATH: '/content/gdrive/My Drive/models/research/object_detection/exported_model/saved_model/saved_model.pb'
converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_PATH)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.experimental_new_converter = True
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()
open("detect.tflite", "wb").write(tflite_model)
or "tflite_convert" from command line, but with the same error. I also tried to run it with the latest tf-nightly version as it suggests here, but the outcome is the same. I tried to pass the path with various ways, it seems like the .pd is not well written (not the right file). Is there a way to manage to convert the model to tflite so as to implement it to android? Thank you!
Your saved_model path should be "/content/gdrive/My Drive/models/research/object_detection/fine_tuned_model/saved_model/". TFLiteConverter expects the SavedModel folder, not a file inside that folder.
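A minimal sketch of the corrected conversion, based on the question's own code; the only substantive change is pointing SAVED_MODEL_PATH at the SavedModel directory:
import os
import tensorflow as tf

# Point at the SavedModel directory, not the saved_model.pb file inside it
SAVED_MODEL_PATH = os.path.join(os.getcwd(), 'object_detection', 'fine_tuned_model', 'saved_model')

converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_PATH)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()

with open("detect.tflite", "wb") as f:
    f.write(tflite_model)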
For a quick test, try typing in a terminal:
tflite_convert \
--saved_model_dir="path to saved_folder" \
--output_file="path to tflite file u want to save"
I don't have enough reputation to just comment, but the problem here seems to be your SAVED_MODEL_PATH.
You could try hardcoding the path and removing the .pb filename from the end of it. I don't remember exactly what the trick is here, but it's definitely due to the path.

How to use rdflib to query WikiData?

I want to use rdflib to query Wikidata on my local computer, but rdflib.Graph() needs to parse the namespace first. How can I get the Wikidata namespace so I can use rdflib locally?
I think the goal was:
from rdflib import Graph
g = Graph()
g.parse('wikidata-link')
or
g.load('wikidata-link')
I haven't spent much time on it, but here are my tryouts, just to round out the question and maybe find an answer.
The failing variants below produced errors ranging from timeouts, 'not well formed (invalid token)', TypeErrors, and '.. not a valid NCName ...' up to missing-plugin errors when 'text/html' or '.../json' came back. I marked what worked and what didn't.
CODE SAMPLES I'VE TRIED
g.parse('https://www.wikidata.org/wiki/Special:EntityData/Q42.n3') # WORKS
g.parse('https://www.wikidata.org/wiki/Special:EntityData/Q42.json') # FAILS
g.parse('https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl') # WORKS
g.parse('https://www.wikidata.org/wiki/Special:EntityData/Q42.rdf') # FAILS
g.parse('https://www.wikidata.org/wiki/Special:EntityData/Q64') # FAILS
g.parse('https://www.wikidata.org/wiki/Q42') # FAILS
g.load('https://www.wikidata.org/wiki/Special:EntityData/Q42.n3') # FAILS
g.load('https://www.wikidata.org/wiki/Special:EntityData/Q42.json') # FAILS
g.load('https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl') # FAILS
g.load('https://www.wikidata.org/wiki/Special:EntityData/Q42.rdf') # FAILS
g.load('https://www.wikidata.org/wiki/Special:EntityData/Q42') # FAILS
g.load('https://www.wikidata.org/wiki/Q42') # FAILS
I tried these out based on Wikidata Access
VERSIONS USED
RDFLib 6.1.1
Python 3.10.1
Last additional thoughts
You could query Wikidata via the SPARQL endpoint and build your rdflib graph from the results, as sketched below.
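A minimal sketch of that approach, assuming the SPARQLWrapper package (separate from rdflib); a CONSTRUCT query returns triples that queryAndConvert() hands back as an rdflib Graph:
from SPARQLWrapper import SPARQLWrapper, RDFXML

# Wikidata asks clients to send a descriptive User-Agent
sparql = SPARQLWrapper("https://query.wikidata.org/sparql",
                       agent="rdflib-test/0.1 (example)")
sparql.setReturnFormat(RDFXML)
sparql.setQuery("""
    PREFIX wd: <http://www.wikidata.org/entity/>
    CONSTRUCT { wd:Q42 ?p ?o } WHERE { wd:Q42 ?p ?o }
""")
g = sparql.queryAndConvert()  # an rdflib Graph for CONSTRUCT queries
print(len(g))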

Creating a parameter in neo4j through R driver

I am trying to generate a graph using the neo4r R driver. I have no problems preforming standard queries such as
"MATCH (n:Node {nodeName: ‘A Name’}) RETURN COUNT(n)” %>% call_neo4j(con)
However when I try to create a parameter with the following query
":params {Testnode: {testNodeName: 'Node Name'}}" %>% call_neo4j(con)
I get the following syntax error
$error_code
[1] "Neo.ClientError.Statement.SyntaxError"
$error_message
[1] "Invalid input ':': expected <init> (line 1, column 1 (offset: 0))\n\":params {Testnode: {testNodeName: 'Node Name'}}\"\n ^"
The parameter query works fine when I run it directly in the Neo4j browser, so I do not understand why there is a syntax error.
Any ideas on how to fix this gratefully accepted!
:params only works in the Neo4j Browser; it's not really Cypher.
Worse, the R Neo4j driver doesn't seem to support passing parameters: there's an open GitHub issue that points to a fork containing the relevant changes, but that fork also has other changes that make it deviate from the main driver.
I'd try using the fork to see if it gets you anywhere; if it does, either create the relevant PR against the project or maintain a local fork that tracks the main driver but contains just that parameter change.
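Until parameters land in the main driver, a crude workaround is to splice the values into the Cypher string yourself; a sketch (it has none of the injection-safety of real parameters, and test_node_name is a made-up example value):
library(neo4r)
library(magrittr)

test_node_name <- "Node Name"
# Hypothetical stand-in for a real query parameter: build the Cypher text
# with the value already inlined (beware quoting and injection)
query <- sprintf("CREATE (n:Testnode {testNodeName: '%s'}) RETURN n",
                 test_node_name)
query %>% call_neo4j(con)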

Use Azure custom-vision trained model with tensorflow.js

I've trained a model with Azure Custom Vision and downloaded the TensorFlow files for Android
(see: https://learn.microsoft.com/en-au/azure/cognitive-services/custom-vision-service/export-your-model). How can I use this with tensorflow.js?
I need a model (.pb file) and weights (.json file). However, Azure gives me a .pb file and a text file with tags.
From my research I also understand that there are different kinds of .pb files, but I can't find out which type Azure Custom Vision exports.
I found the tfjs converter, which converts a TensorFlow SavedModel (is the *.pb file from Azure a SavedModel?) or a Keras model to a web-friendly format. However, I need to fill in "output_node_names" (how do I get these?). I'm also not 100% sure that my .pb file for Android counts as a "tf_saved_model".
I hope someone has a tip or a starting point.
Just parroting what I said here to save you a click. I do hope that the option to export directly to tfjs is available soon.
These are the steps I did to get an exported TensorFlow model working for me:
Replace PadV2 operations with Pad. The Python function below should do it; input_filepath is the path to the .pb model file, and output_filepath is the full path of the updated .pb file that will be created.
import tensorflow as tf

def ReplacePadV2(input_filepath, output_filepath):
    graph_def = tf.GraphDef()
    with open(input_filepath, 'rb') as f:
        graph_def.ParseFromString(f.read())
    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            del node.input[-1]
            print("Replaced PadV2 node: {}".format(node.name))
    with open(output_filepath, 'wb') as f:
        f.write(graph_def.SerializeToString())
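For example (the paths are placeholders; note that tf.GraphDef requires TensorFlow 1.x, which matches the tensorflowjs 0.8.6 toolchain below):
ReplacePadV2('path/to/model.pb', 'path/to/model_updated.pb')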
Install tensorflowjs 0.8.6 or earlier, since converting frozen models is deprecated in later versions.
When calling the converter, set --input_format to tf_frozen_model and set output_node_names to model_outputs. This is the command I used:
tensorflowjs_converter --input_format=tf_frozen_model --output_json=true --output_node_names='model_outputs' --saved_model_tags=serve path\to\modified\model.pb folder\to\save\converted\output
Ideally, tf.loadGraphModel('path/to/converted/model.json') should now work (tested for tfjs 1.0.0 and above).
Partial answer:
Trying to achieve the same thing, here is the start of an answer that makes use of the output_node_names:
tensorflowjs_converter --input_format=tf_frozen_model --output_node_names='model_outputs' model.pb web_model
I am not yet sure how to incorporate this into the same code. Do you have anything, @Kasper Kamperman?

Schedule R Script Every 15 min (cronR)

I need to schedule/run an R script every 15 min. Running on an AWS RStudio instance.
I have played a bit with 'cronR', including loading the add-in. I can figure out how to get it to run "minutely", "hourly", etc., but not every 15 minutes.
What's the best way to get this done, either in RStudio via cronR, or via some other method?
So I followed @r2evans' advice and opened an issue on GitHub. It was addressed almost immediately with a fix to the code and an update to the readme. Figured I would answer this for completeness in case someone else ever finds their way here.
You can now schedule a job every 15 min with 'cronR' either in the RStudio add-in, or with the following code:
cron_add(script, frequency = '*/15 * * * *', id = 'Job ID', description = 'Every 15 min')
One note is that you might need to reinstall the package using devtools to pull in the most recent changes:
devtools::install_github("bnosac/cronR")
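For completeness, a minimal end-to-end sketch; the script path and job id are placeholders:
library(cronR)
# cron_rscript() builds the shell command that cron will execute
cmd <- cron_rscript("/path/to/script.R")
cron_add(cmd, frequency = '*/15 * * * *', id = 'my_15min_job',
         description = 'Every 15 min')
cron_ls()   # verify the job was registered
# cron_rm('my_15min_job')  # remove it when no longer needed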
