I'm trying to import a CSV with a scheduled task using cronR. Unfortunately, even though I don't get any error in the log file, I don't see any data frame in the global environment.
Below is the very simple script:
#Test_cronR
require("readr")
setwd("~/projects/my_proj")
df <- read.csv("~/projects/my_proj/df.csv")
I also tried using the absolute path, but it's still not working.
I'm working on RStudio Server in a VM on Google Cloud Platform.
Can somebody help me?
Searching the web, I actually wasn't able to find anything about importing this kind of file, only some tips about the full path vs. the relative one.
What I expect is simply to have the CSV loaded in my global environment.
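For context, cronR runs the script in a separate, non-interactive R session, so any object it creates cannot show up in the RStudio global environment; the job has to persist its result to disk and the interactive session has to load it back. A minimal sketch (same paths as above; the .rds file name is just an illustration):

# Test_cronR: run by the scheduled job; save the result instead of relying on the global environment
df <- read.csv("~/projects/my_proj/df.csv")
saveRDS(df, "~/projects/my_proj/df.rds")

# Then, in the interactive RStudio session:
df <- readRDS("~/projects/my_proj/df.rds")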
I have the following problem. I have a data pipeline at work that transforms raw data and loads it into a cloud database for various projects. There are Python scripts for the project-based transformations, but everything must be done manually (defining the transformer's project-based inputs, running the transformer, loading the data).
I want to automate this process with Airflow, and I created the above steps as tasks in Python. The Airflow instance runs on a machine that must reach a network drive where the raw data and the transformer scripts are located. The required connection type is Samba.
I managed to connect to the drive and create a SambaHook object:
samba_file_share: Final[object] = SambaHook(connection_id, file_share_name)
In one task, I need to call and run the transformer script. In an earlier solution (without Samba) I used Popen, which worked fine. However, I must use Samba now, and I face the following problem.
I get the path of the transformer script by reading the root folder of the file share from the Samba object and joining the transformer's path to it:
samba_file_share._join_path(transformer_path)
If I print this out, the path is correct and the network share is reachable. But if I feed the path to Popen as a string (or as a byte string or a path-like object), I get the error "No such file or directory".
Can anyone help with this? How can I feed the path to Popen to run the script, or should I use something other than Popen to run it? The Samba documentation is very sparse; I could not find anything helpful there so far.
Thanks,
Marci
This automated Airflow solution works perfectly if I connect from a machine that can access the network drive directly.
However, that setup is only for development; in production it must run on another machine which has no direct access to the drive. I must use Samba to connect to it, and that breaks everything.
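For what it's worth, Popen resolves the command path against the local filesystem of the machine running the task, so a path that only exists on the SMB share (and is not mounted locally) will always fail with "No such file or directory". One hedged sketch, assuming the provider's SambaHook exposes an open_file wrapper around smbclient (recent apache-airflow-providers-samba versions do), is to copy the script to a local temporary file through the hook and run that copy:

import subprocess
import sys
import tempfile

from airflow.providers.samba.hooks.samba import SambaHook

# connection_id, file_share_name and transformer_path are the same values used in the question.
hook = SambaHook(connection_id, file_share_name)

# Copy the remote script to a local temp file: subprocess can only execute
# files that exist on the local filesystem of the worker.
with hook.open_file(transformer_path, mode="rb") as remote_script:
    with tempfile.NamedTemporaryFile(suffix=".py", delete=False) as local_script:
        local_script.write(remote_script.read())

# Run the local copy with the same interpreter that runs Airflow.
subprocess.run([sys.executable, local_script.name], check=True)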
I'm attempting to make a public shinyapps.io website, and I'm trying to use image_write to save a file into a local directory.
The following code works in my local RStudio session:
image_write(im.resized, path = paste0(output_file_directory, file_name), format = "jpg")
When I run the code on the shinyapps.io website, it runs without error, but I'm not sure where it writes the file to. I know that the output_file_directory part isn't the issue, so I'm a little lost. Any help would be much appreciated!
On shinyapps.io it is not possible to store data permanently, because:
"Shinyapps.io is a popular server for hosting Shiny apps. It is designed to distribute your Shiny app across different servers, which means that if a file is saved during one session on some server, then loading the app again later will probably direct you to a different server where the previously saved file doesn’t exist."
See here:
https://shiny.rstudio.com/articles/persistent-data-storage.html
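If the file only needs to exist for the current session, a minimal sketch (reusing im.resized and file_name from the question) is to write it into the app's temporary directory and read it back from there; anything that must survive across sessions has to go to remote storage, as the article above describes:

# Write to a per-session temporary location; this works on shinyapps.io,
# but the file disappears when the session/instance goes away.
tmp_path <- file.path(tempdir(), file_name)
image_write(im.resized, path = tmp_path, format = "jpg")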
Earlier, when using AzureML from the Notebooks blade of the Azure ML UI, we could access the local files in AzureML using simple relative paths.
For example, to access a CSV sitting next to test.ipynb, we could just use the relative path:
import pandas
df = pandas.read_csv('WHO-COVID-19-global-data.csv')
However, we are not able to do that anymore.
Also when we run
import os
os.getcwd()
We see the output as
'/mnt/batch/tasks/shared/LS_root/mounts/clusters/<cluster-name>'.
Hence, we are unable to access the files in the FileStore, which was not the case earlier.
When we run the same from the JupyterLab environment of the compute environment we get:
'/mnt/batch/tasks/shared/LS_root/mounts/clusters/<cluster-name>/code/Users/<current-user-name>/temp'.
We can work around this by appending '/code/Users/<current-user-name>/temp' to that base path and using the result instead. But this is not ideal: whenever the environment we use changes, the code has to change as well. How do we resolve this without resorting to this path-appending workaround?
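One hedged workaround (not an official AzureML recommendation; the file name is the one from the question) is to locate the file by searching under the mounted cluster root instead of hard-coding the per-user path, so the same code works from both the Notebooks blade and JupyterLab:

import os
from pathlib import Path

import pandas


def find_data_file(name: str) -> Path:
    # Search for the file under the current working directory (the cluster mount).
    # Note: this can be slow on very large mounts; it is only a sketch.
    root = Path(os.getcwd())
    matches = list(root.rglob(name))
    if not matches:
        raise FileNotFoundError(f"{name} not found under {root}")
    return matches[0]


df = pandas.read_csv(find_data_file('WHO-COVID-19-global-data.csv'))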
I work on the Notebooks team in AzureML, and I just tried this. Did this just start happening today?
It seems like things are working as expected.
I have noticed a strange behaviour in Firestore Cloud Functions: if I try to break my code up into separate files, I start to get this error:
info: Worker for app closed due to file changes.
I just created a simple Express server, hosted it in a Cloud Function, and was emulating it locally as described here:
https://www.youtube.com/watch?v=LOeioOKUKI8&t=244s
I even wrote tests for it. Everything was working fine until I split the source code of my Express app into individual routes (contained in separate .js files).
The only thing that message means is that the emulator noticed when a code file changed, and performed a hot reload of that code. Note that it's just an "info" level message, not an error and not even a warning.
If your project is not working the way you expect, then edit your question with the symptoms you're observing, along with the code.
I'm trying to set up one of my AX environments to have an XPO imported whenever the server process is started up. Because the environment is updated from production on a regular basis, and the code needs to be unique to this (non-production) environment, it seems the best option would be to use the AOTImport command on startup. However, I'm having trouble identifying the exact syntax/setup to get this to work.
Looking through the system code, it seems the syntax should be aotimport_[path to file]. Is this correct? The server does not seem to complain about this command, but the file in question does not get imported. I have also tried variations on this command, but have not seen it work.
I suppose you're trying to execute a command via the SysStartupCmd classes. If so, these methods are fired when the AX client starts, not the AOS. It's documented on this page:
http://msdn.microsoft.com/en-us/library/aa569641(v=ax.50).aspx
If you want to automate this import, it can be done by scheduling an execution of the AX client (ax32.exe) in your build workflow that runs the import (it's recommended to run a full compilation after importing). This is discussed in other questions here on SO.
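As a rough illustration only (the exact flag syntax can vary by AX version, and the XPO path is a placeholder), the client can be launched with the startup command on its command line so the SysStartupCmd mechanism picks it up:

rem Launch the AX client and import the XPO via the startup command
ax32.exe -startupcmd=aotimport_C:\Builds\MyCustomizations.xpo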