Connecting Excel with 'ABC' through DDE works, and connecting R with Excel by DDE also works, but how do I connect R with the 'ABC' application?
I have an application providing a DDE interface; from Excel I can retrieve a value from it with this DDE reference:
='ABC'|DDE!_nazwa_value
From R I've tried to use the tcltk2 package, as follows:
tcltk2::tk2dde.request(service="ABC", topic="DDE", item="_nazwa_value")
but an error occurs:
Error in structure(.External(.C_dotTcl, ...), class = "tclObj") :
[tcl] remote server cannot handle this command.
[1] "Error in structure(.External(.C_dotTcl, ...), class = \"tclObj\") : \n [tcl] remote server cannot handle this command.\n\n"
attr(,"class")
[1] "try-error"
attr(,"condition")
<simpleError in structure(.External(.C_dotTcl, ...), class = "tclObj"): [tcl] remote server cannot handle this command.
I'm only trying to use the tcltk DDE functions to retrieve data from the application. I think the item part of my tk2dde.request call is wrong, but I've tried various modifications (without the leading _, for example). Do you know any clues or resources for solving this problem?
EDIT
Something is wrong: I don't see the ABC server nor the DDE topic in the server/topic list returned by tk2dde.services(), yet Excel can still connect and retrieve the value using ='ABC'|DDE!_nazwa_value. DDE Query also does not see it.
You can read the following in the article about DDE in the Tcl/Tk wiki:
(Talking about using Internet Explorer) ... All of the above experiments should "work" reliably, in that, from the user perspective, IE indeed acts as described. However, back in the Tcl process, [dde] typically throws a "remote server cannot handle this command" exception. That's because, in KBK's analysis, DDE gives no "way to distinguish 'result expected, but the server failed to provide it' from 'no result is expected'." The only way not to receive a DMLERR_NOTPROCESSED is to invoke "dde exec -async ..."
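If you want to experiment with the -async route the wiki describes, tcltk2 exposes it as well. A minimal sketch, assuming the service/topic names from the question; note that exec only sends a command and returns no value, so the command string below is entirely hypothetical (replace it with whatever 'ABC' actually accepts):

```r
library(tcltk2)

# With async = TRUE the call returns immediately and no result is expected,
# so the DMLERR_NOTPROCESSED ("remote server cannot handle this command")
# error described above is not raised.
tk2dde.exec(service = "ABC", topic = "DDE",
            command = "[refresh]",  # hypothetical command string
            async = TRUE)
```

This does not retrieve a value, so it is not a drop-in replacement for tk2dde.request, but it can help confirm whether the server is reachable from Tcl's DDE at all.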
Note: I tried the Tcl DDE examples using Excel + "request" and I got the same error as you: "remote server cannot handle this command".
Regards!
Related
I cannot use save_kable() to save tables created with knitr::kable() and the kableExtra package as images. It looks like this is due to PhantomJS being blocked by admins. Is there another function that could help me save them? This is the error I get:
Error in process_initialize(self, private, command, args, stdin, stdout, …: ! Native call to processx_exec failed Caused by error in
chain_call(c_processx_exec, command, c(command, args), pty, pty_options, …: ! create process
'C:\Users\user1\AppData\Roaming/PhantomJS/phantomjs.exe' (system error
1260, This program is blocked due to group policy. Contact the systems responsible person for more information. )
#win/processx.c:1040 (processx_exec)
P.S.: I translated into English the message that comes after "system error 1260...".
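One workaround sometimes suggested is to write the table out as HTML (which does not need PhantomJS) and then screenshot it with the Chrome-based webshot2 package instead. A sketch, assuming webshot2 is installed and a Chrome/Chromium binary is available and not blocked by the same policy:

```r
library(kableExtra)
library(webshot2)  # drives headless Chrome rather than PhantomJS

# Build an example HTML table (head(mtcars) is just illustrative data).
html_tbl <- kable_styling(knitr::kable(head(mtcars), format = "html"))

save_kable(html_tbl, "table.html")            # writing HTML needs no screenshot tool
webshot2::webshot("table.html", "table.png")  # capture the rendered HTML as an image
```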
I have a simple function that should fetch a table using a DBI::dbConnect() connection. I am having trouble with the call to tbl() that works fine in an interactive session.
My function:
a2_db_read <- function(con, tbl_name, schema = "dbo") {
  if (schema == "dbo") {
    dplyr::tbl(con, tbl_name)
  } else {
    dplyr::tbl(con, dbplyr::in_schema(schema, tbl_name))
  }
}
If I make the call dplyr::tbl() I get:
Error in UseMethod("tbl") :
no applicable method for 'tbl' applied to an object of class "Microsoft SQL Server"
If I make the call dbplyr::tbl() I get:
a2_db_read(a2_con_uat, "AVL Data")
Error: 'tbl' is not an exported object from 'namespace:dbplyr'
How can I get that call to succeed in a function? My package Imports is:
Imports:
DBI,
dbplyr,
dplyr,
ggplot2,
odbc
I got it working with dplyr::tbl(), which is the correct usage.
The problem was that my connection was stored as an object in the package, when in fact the connection needs to be re-made every time I restart R.
On a fresh R environment, the stored stale connection caused the error:
Error in UseMethod("tbl") :
no applicable method for 'tbl' applied to an object of class "Microsoft SQL Server"
When I regenerated the connection, it worked.
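A minimal sketch of the fix described above: create the connection at call time instead of storing it in the package (the DSN here is hypothetical; the table name is the one from the question):

```r
# A live connection, made fresh in this R session, dispatches correctly.
con <- DBI::dbConnect(odbc::odbc(), dsn = "my_dsn")  # hypothetical DSN

a2_db_read(con, "AVL Data")  # dplyr::tbl() now sees a real connection object

DBI::dbDisconnect(con)
```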
So I have a collection of JSON files located on my local machine that I am currently reading in using the command
file <- tbl_df(ndjson::stream_in("path/to/file.json"))
I have copied these files to a Linux server (using WinSCP) and I want to stream them into my R session just as I did in the code above with ndjson. When searching for ways to do this I came across one method using RCurl that looked like this
file <- scp(host = "hostname", "path/to/file.json", "pass", "user")
but that returned an error
Error in function (type, msg, asError = TRUE) : Authentication failure
but either way I want to avoid putting my passphrase into my R script, as others will see it. I also came across a method suggesting this
d <- read.table(pipe('ssh -l user host "cat path/to/file.json"'))
however this command returned the error
no lines available in input
and I believe read.table would cause me issues anyway. Does anyone know a way I could read newline-delimited JSON files from a remote server into an R session? Thank you in advance! Let me know if I can make my question clearer.
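One possible approach is to keep the ssh pipe idea but hand the connection to an NDJSON-aware reader instead of read.table. A sketch, assuming key-based ssh authentication is configured (so no passphrase needs to appear in the script) and that jsonlite is installed; host and path are the placeholders from the question:

```r
# pipe() runs the ssh command and exposes its stdout as an R connection;
# jsonlite::stream_in() parses newline-delimited JSON from a connection
# line by line into a data frame.
con <- pipe('ssh user@host "cat path/to/file.json"')
d <- jsonlite::stream_in(con)
```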
I am currently moving a (localhost) Shiny app from Windows 32-bit to Windows 64-bit. Google didn't manage to answer my problem :( so I'm asking the community!
This app worked fine on 32-bit; I had to re-install R, all packages, and Java on the 64-bit machine (each in 64-bit mode). My app has the following file architecture:
a global.R file where I load libraries
a server.R
a ui.R
another file which formats data to be sent to d3/nvd3
a JS file to display a line chart.
The error I have is the following:
Warning in file(con, "rb") : file("") only supports open = "w+" and
open = "w+b": using the former Warning: Error in readChar: invalid
'nchars' argument Stack trace (innermost first):
1: runApp Error : invalid 'nchars' argument
When I'm running the code without Shiny, everything works fine: everything is processed and the results are good.
Has anyone ever faced this?
If you need anything, just ask and I will be more specific. I am not posting the code; it is a little tricky and spread across multiple files... And I think the problem is specific to my new environment.
Reinstalling only the shiny package fixed the problem for me. I am using RStudio and did the operation through the Packages pane (deleted and reinstalled).
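The same operation from the R console, for those not using the RStudio Packages pane (assumes the default library path and a working internet connection):

```r
# Delete the installed copy of shiny, then fetch a fresh one from CRAN.
remove.packages("shiny")
install.packages("shiny")
```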
I have the following code
drv <- RPostgreSQL::PostgreSQL()
con <- DBI::dbConnect(drv, dbname = 'dbname', user = 'user',
host = 'host.name', port = 5432, password = 'password')
When I run it on the server (Ubuntu Server 16.04 with the latest updates) running the database, I get the following error:
Error in .valueClassTest(ans, "data.frame", "dbGetQuery") :
invalid value from generic function ‘dbGetQuery’, class “NULL”, expected “data.frame”
But when I run R from the command line with sudo, it works; when I run it from my laptop connecting to the DB on the server, it also works. So it shouldn't be a connection problem. I am thinking it might be a problem with access rights to some libraries/executables/configs on the system. Any help will be appreciated.
When I run dbConnect multiple times and it ends with the error, and then run drv_info <- RPostgreSQL::dbGetInfo(drv), I still get multiple connection IDs in drv_info:
drv_info <- RPostgreSQL::dbGetInfo(drv)
> drv_info
$drvName
[1] "PostgreSQL"
$connectionIds
$connectionIds[[1]]
<PostgreSQLConnection>
$connectionIds[[2]]
<PostgreSQLConnection>
$fetch_default_rec
[1] 500
$managerId
<PostgreSQLDriver>
$length
[1] 16
$num_con
[1] 2
$counter
[1] 2
Found a source of confusion, but not necessarily the root problem. (I was assuming RPostgres, while you are using RPostgreSQL (a GitHub mirror).)
If you check the source, you'll find that the method calls postgresqlNewConnection, which includes a call to dbGetQuery. The problem you're seeing is that your call to dbConnect is failing (my guess is at line 100) and returns something unexpected, but postgresqlNewConnection continues anyway.
I see three options for you:
try calling dbConnect(..., forceISOdate=FALSE) to bypass that one call to dbGetQuery (note that this doesn't fix the connection problem, but at least it will not give you the unexpected query error on the connection attempt);
raise an issue for the package maintainers; or
switch to RPostgres, which is still DBI-based and actively developed (it looks like RPostgreSQL has not had significant commits in the last few years; not sure if that's a sign of good code stability or of development stagnation)
One lesson you may take from this is that you should check the value returned from dbConnect; if is.null(con) is TRUE, something is wrong. (Again, this does not solve the connection problem.)
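A minimal sketch of such a guard, using the placeholder connection parameters from the question; dbIsValid() is the DBI generic, and if the driver does not implement it, the is.null() check alone is the fallback:

```r
drv <- RPostgreSQL::PostgreSQL()
con <- DBI::dbConnect(drv, dbname = "dbname", user = "user",
                      host = "host.name", port = 5432, password = "password")

# Fail fast instead of letting later queries hit a broken connection.
if (is.null(con) || !DBI::dbIsValid(con)) {
  stop("dbConnect() did not return a usable connection")
}
```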