Is it possible to communicate with a client/server application by calling the system() command in R?
I use BaseX for storing XML databases, and I want to call a BaseX client from R using the system() command after launching the BaseX server manually:
setwd("C:/Program Files/BaseX/bin/")
system("basexclient -U admin -P admin",wait = TRUE)
BaseX 8.1 [Client]
Try help to get more information.
The problem is that R can't communicate with the BaseX client, and as a consequence I get this error:
Child process not responding. R will terminate it.
I tried changing the wait parameter to wait = FALSE and then executing a BaseX command, but it seems that it still can't communicate with the client:
system("OPEN mydatabse",wait = FALSE)
object "mydatabse" not found
Any suggestions you can provide will be appreciated.
N.B.: The same problem occurs with Java.
Related
I have just started to explore DVC and am trying S3 as my DVC remote.
When I run the dvc push command, I get the generic error saying
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I know for a fact that I get that error when I don't specify the encryption.
It is similar to running aws s3 cp with the --sse flag or specifying ServerSideEncryption when using the boto3 library. How can I specify the encryption type when using DVC? Since DVC uses boto3 underneath, there must be an easy way to do this.
I got the answer for this immediately in the DVC Discord channel! By default, no encryption is used; we need to specify which server-side encryption algorithm should be used.
Running dvc remote modify worked for me!
dvc remote modify my-s3-remote sse AES256
There are a bunch of things we can configure here. All this does is add an entry sse = AES256 under the ['remote "my-s3-remote"'] section inside the .dvc/config file.
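For reference, the resulting section of the .dvc/config file should look roughly like this (the bucket URL is just a placeholder, not taken from the question):

['remote "my-s3-remote"']
    url = s3://my-bucket/dvc-storage
    sse = AES256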
More on this here: https://dvc.org/doc/command-reference/remote/modify
I've logged into a Microsoft R Server using mrsdeploy::remoteLogin()
Next I start a remote session with mrsdeploy::remoteCommandLine()
If I try to use system("pwd") I get no response.
I'm guessing access to the shell is blocked - does anyone know where this is controlled?
We found the answer to this.
The remote session does have access to the shell. You need to use intern = TRUE to see the result.
For example system("pwd", intern = TRUE)
My original problem is that I want to increase my DynamoDB write throughput before I run the pipeline, and then decrease it when I'm done uploading (I do this at most once a day, so I'm fine with the limitations on decreasing).
The only way I found to do it is through a shell script that issues the API commands to alter the throughput. How does that work with my access_key and secret_key when it's a resource that the pipeline creates for me? (I can't log in to set the ~/.aws/config file and don't really want to create an AMI just for this.)
Should I write the script in bash? Can I use the Ruby/Python AWS SDK packages, for example? (I prefer the latter.)
How do I pass my credentials to the script? Do I have runtime variables (like #startedDate) that I can pass as arguments to the activity with my key and secret? Do I have any other way to authenticate with either the command-line tools or the SDK package?
If there is another way to solve my original problem, please let me know. I only arrived at the ShellActivity solution because I couldn't find anything else in the documentation/forums.
Thanks!
OK, found it: http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-roles.html
The resourceRole in the default object in your pipeline will be the one assigned to resources (Ec2Resource) that are created as part of the pipeline activation.
The default one is configured to have all your permissions, and the AWS command-line tools and SDK packages automatically look for those credentials, so there is no need to update ~/.aws/config or pass credentials manually.
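As an illustration, here is a minimal sketch of what such a throughput-bumping step could look like with the Python SDK (boto3), run on the pipeline's Ec2Resource so the resourceRole credentials are picked up automatically; the table name, region, and capacity numbers are placeholders, not taken from the question:

import boto3

# boto3 resolves credentials from the EC2 instance profile (the resourceRole),
# so no explicit access_key/secret_key is needed here.
dynamodb = boto3.client("dynamodb", region_name="us-east-1")  # placeholder region

# Raise the provisioned write capacity before the bulk upload.
dynamodb.update_table(
    TableName="my-table",  # placeholder table name
    ProvisionedThroughput={
        "ReadCapacityUnits": 5,
        "WriteCapacityUnits": 100,  # temporarily raised for the upload
    },
)

# Wait until the table is ACTIVE again before starting the upload.
dynamodb.get_waiter("table_exists").wait(TableName="my-table")

A matching update_table call with a lower WriteCapacityUnits at the end of the pipeline would scale it back down.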
I am using an SQLite3 database for database management on my AM1808 ARM9-based microprocessor.
I am using embedded Linux (v10.10 Lucid) and the GCC compiler for ARM.
My scenario is as follows:
I have a GSM module interfaced over UART. I continuously synchronize my data with the server in the background.
I am also accessing the SQLite database for other operations like read, write, view, etc.
I have a single database connection.
I have simultaneous access to SQLite3. To handle multiple uses of SQLite3 over a single connection, I have used mutexes for the database lock. I have also used the SQLITE_BUSY check and all that.
Still, I am missing inserted records in the database, even though the database does not give any error when inserting them.
So I can't find the problem. I am stuck here and cannot proceed further.
Please guide me. If you need it, please tell me and I will provide my code snippet.
I am very new to AS400, and I am stuck. I have read the documentation but cannot find what I need.
I have an odbc connection to an AS400 server. When I run this command I get an Outfile with everything I need:
CALL QSYS.QCMDEXC('DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)', 0000000061.00000)
Instead of the results going to an outfile, I need to receive the results of this command in my script, which connects through ODBC. If I change OUTPUT(*OUTFILE) to OUTPUT(*), I get no results when I try fetchall().
Is there any way to get this information through the odbc connection to my script?
EDIT: I am on a Linux server, in a Python script using pyodbc to connect. I can run SQL queries successfully using this connection, but I can't figure out how to get the results of a command to come through as some sort of record set.
I hope I'm interpreting what you're asking correctly. It looks like you're accessing user profile data and dumping it to a file, and that you then want to use the contents of that file in a script or something that's running on Windows. If that's the case:
In general, when accessing data in a file from the Windows world, whether through ODBC and VBScript, or .NET, the AS/400 is treated like a database. All files in libraries are exposed via the built-in DB2 database. It's all automatic, and part of the Universal DB2 database.
So, after creating this file, you should have a file named TEST3 in library CHHFLE.
You'd create a connection and execute the following SQL statement to read the contents:
Select * From CHHFLE.TEST3
This, of course, assumes that you have proper permissions to access this. You should be able to test this using the iSeries Navigator tool, which includes the ability to run SQL Scripts against the database before doing it in your script.
Added after reading comments above
There's info at this question on connecting to the DB2 from Python. I hope it's helpful.
OUTPUT(*) is not stdout, unfortunately. That means you won't be able to redirect OUTPUT(*) to an ODBC connection. Dumping to a DB2 table via OUTPUT(*OUTFILE) is a good plan. Once that's done, use a standard cursor / fetch loop as though you were working with any other DB2 table.
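Putting the two pieces together, a minimal sketch of that pattern from the Python/pyodbc side might look like the following; the DSN, user, and password are placeholders, while the CL command and outfile names are the ones from the question:

import pyodbc

# Placeholder connection details for the IBM i / AS400 ODBC data source.
conn = pyodbc.connect("DSN=AS400;UID=MYUSER;PWD=MYPASS")
cursor = conn.cursor()

# 1) Run the CL command exactly as in the question, dumping the user
#    profiles into the outfile CHHFLE/TEST3.
cursor.execute(
    "CALL QSYS.QCMDEXC("
    "'DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)', "
    "0000000061.00000)"
)
conn.commit()

# 2) Read the outfile back with a normal cursor/fetch loop, like any DB2 table.
cursor.execute("SELECT * FROM CHHFLE.TEST3")
for row in cursor.fetchall():
    print(row)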