Publish R Shiny application in intranet [closed]

I am trying to build an RStudio/Shiny app and post it on our intranet so that everyone else in our office can see it. I am a Windows person, and the instructions online about how to set up a Shiny server in a Linux environment are a bit difficult for me. Is there an easy way I could accomplish this goal without messing with Linux? Even if I have to use Linux, is there an easy way to make my webpage available only to people within our company, not to everyone on the internet? Thanks!

You can use shinyapps.io and add password authentication by using, for example, the shinymanager package.
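For instance, here is a minimal sketch of a shinymanager-protected app (the user name, password, and page content are purely illustrative):

library(shiny)
library(shinymanager)

# Illustrative credentials; in practice store these somewhere safer,
# e.g. shinymanager's encrypted SQLite database
credentials <- data.frame(
  user = "alice",
  password = "change-me",
  stringsAsFactors = FALSE
)

# Wrap the normal UI in a login screen
ui <- secure_app(fluidPage(h1("Internal dashboard")))

server <- function(input, output, session) {
  # Check the login against the credentials table
  secure_server(check_credentials = check_credentials(credentials))
}

shinyApp(ui, server)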

If you have a computer that's always on, you can run R on it and run the command shiny::runApp(host = "0.0.0.0", port = xxxx), where xxxx is a port number you can experiment with.
You will have to know the computer's IP address. You can then direct your colleagues to http://<ip-address>:<xxxx>, where you replace <ip-address> and <xxxx> with the respective IP address and port. You might have to unblock the port in Windows Firewall.
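Put together, a minimal sketch of what you would run on the always-on machine (the app path and port 8080 are just examples; pick any free port):

library(shiny)

# Serve the app on all network interfaces so colleagues on the
# intranet can reach it at http://<ip-address>:8080
runApp("path/to/my-app", host = "0.0.0.0", port = 8080)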
If you do not have a computer available that is always turned on, you can actually install R onto a network drive, install the required packages there, and then have users run R from the network drive and run the app themselves.

Related

rstudio server and access of remote databases [closed]

I understand that several users can use an RStudio server. Usually, when we work locally, we are already connected to our databases (SQL Server) via Active Directory. How would this work in the server scenario? Would users have to enter credentials? I think one can also deploy plumber APIs. How would they supply credentials to the remote database? Thanks.
It probably depends on whether your solution is the free or the Pro server. If Pro, it is better to ask RStudio directly.
For the free server, here are some hints in case they help.
Docs are available about that:
ODBC Pro drivers: https://docs.rstudio.com/pro-drivers/installation/ (there is a section for MS SQL Server)
You can specify databases in an ODBC config file or just use the odbc::odbc() function inside R.
Credentials can be passed through the odbc package, and you can prompt for the password interactively (for instance against an Oracle DB); a minimal sketch is shown after these hints.
For credentials related to AD auto-connect, I don't know; in my case, users need to authenticate to the database individually. It depends; the first question is maybe: is your server configured to use AD?
For plumber APIs, unsecured ones are easily done, but securing them yourself (JWT tokens or other identification) without RStudio Pro does not seem so easy.
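To illustrate the interactive-credentials route, here is a minimal sketch with DBI and odbc, assuming a SQL Server ODBC driver named "ODBC Driver 17 for SQL Server" and illustrative server and database names:

library(DBI)
library(odbc)

# Prompt for credentials instead of hard-coding them
# (rstudioapi::askForPassword() works in RStudio and RStudio Server)
con <- dbConnect(
  odbc::odbc(),
  Driver   = "ODBC Driver 17 for SQL Server",   # assumed driver name
  Server   = "my-sql-server.example.com",       # illustrative host
  Database = "my_database",                     # illustrative database
  UID      = rstudioapi::askForPassword("Database user"),
  PWD      = rstudioapi::askForPassword("Database password"),
  Port     = 1433
)

# If the server is configured for Active Directory, a trusted connection
# may work instead of explicit credentials (depends on your setup):
# con <- dbConnect(odbc::odbc(),
#                  Driver = "ODBC Driver 17 for SQL Server",
#                  Server = "my-sql-server.example.com",
#                  Database = "my_database",
#                  Trusted_Connection = "yes")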

Is R shiny secure by doing so? [closed]

I want to create a Shiny app that I could share within my company. My company laptop doesn't allow shinyapps.io, so I used my personal laptop to successfully run a Shiny test.
My question is: is there any potential security issue if I use my personal laptop to run the company data through Shiny and share the output with my coworkers?
If doing so violates security, what other options would make it accessible only within the company?
Anything hosted on Shiny Server will be available to anyone who is able to connect to your laptop. If you want to work around this, you can use NGINX and require people to authenticate on another page before they can gain access to Shiny, which you host locally and connect to through a websocket. However, you're likely to mess up some security detail (because it's incredibly easy to get security wrong), and people will gain access either to the raw data (which is extremely terrifying) or to whatever visualizations you create in Shiny (which is still terrifying). If you just use runApp("my-app") and then screenshot visualizations from your computer to send around, you're fine, but if the data goes anywhere on the internet, I wouldn't consider it safe.
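For the screenshot-only workflow, a minimal sketch that keeps the app bound to the loopback interface (the app directory and port are illustrative):

library(shiny)

# 127.0.0.1 means the app is only reachable from this machine,
# not from the rest of the network or the internet
runApp("my-app", host = "127.0.0.1", port = 8080)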
To piggyback off of @ConCave on the issue of data privacy: the main issue with sharing your Shiny apps over shinyapps.io is that the data will need to be hosted on those external servers. If your company happens to have an IT support team that can recreate/host an entire Shiny Server setup on their own servers, you could port your apps onto their server.

How do I access data from my personal computer on my AMI instance running RStudio server [closed]

I have recently set up RStudio on an AMI EC2 instance using the process generously laid out by Louis Aslett on his website. But in an embarrassing turn of events, I can't access the data I need because it resides on my personal computer. I am new to cloud computing and have zero functional knowledge of Linux, but I do know SQL and R well. Any help or suggestions would be greatly appreciated.
Have you tried the "Upload" button in the "Files" window of RStudio?
Use scp in a terminal.
To put files onto your remote server
Example: if the files are located locally in ~/mylocalfolder and you want to put them in /home/rstudio/myData, you would execute in a terminal:
scp ~/mylocalfolder/*.csv ubuntu@<your address>:/home/rstudio/myData/
Note that if you want to access them under a different user, e.g. rstudio, you need to change the owner of the files with chown.
To grab data from your remote server
Example: if the files are located in /home/rstudio/myData and you want to put them locally in ~/mylocalfolder, you would use:
scp ubuntu@<your address>:/home/rstudio/myData/*.Rda ~/mylocalfolder
I use the RStudio AMI all the time and what works for me is to use Dropbox. I can't remember exactly how I did it but I think I may have started the shell from within RStudio and installed Dropbox from the command line.
This link has a little more info:
http://www.louisaslett.com/RStudio_AMI/#comment-1041983219

Can I install New Relic on a Bluehost Shared Hosting Plan? [closed]

I want some more insights into my wordpress site so I signed up for a New Relic account. But they're asking me to install some agent by typing in commands and I have no idea where to do that or how to access a command line.
I use shared hosting with Bluehost, so I have access to cPanel. I've never typed anything into a console to manage the server - the icons have covered everything I need so far.
Is it even possible to install this on my shared hosting plan? If so, where do I get the command prompt, and do I need FTP? I've tried to follow the instructions, but there are some strange words like "yum", "rpm" and whatnot. What are these, and how do I run them?
Is there a wordpress plugin that I can just install and have everything done automatically for me?
If anyone could point me to some clear step by step instructions as to how to go about this, I'd be very grateful...
Thanks!
What you need is shell access. According to their features page at http://www.bluehost.com/cgi/info/hosting_features, they support shell access via Secure Shell (SSH). This is fairly common.
Download an SSH client like PuTTY and connect to yourdomain.com, or if you are on a Mac, open the Terminal app and type "ssh yourdomain.com" (with your web site's domain name). You can then run commands.
However, this probably will not get you what you need. You mentioned yum and rpm commands, which are system-level software installation tools. You'll need root access to do that, which you certainly cannot do on a shared hosting account.
These types of tools are really intended for application service providers. I'm fairly familiar with New Relic's products. I'm guessing you are trying to install the server monitor tool. You probably don't need to worry about that one since Bluehost should be monitoring their servers for you.

using Amazon-EC2 or Linode for hosting a debian server with R regularly extracting information from the web [closed]

I have hacked my NAS-01g into a Debian server and use it to regularly download stock quote and earthquake information from the web. I was on a trip last week and turned off my server at home, but when I came back, I could no longer gain access to the server. Since re-configuring the server is very time-consuming, I am thinking of migrating my existing server to the cloud.
I have a few requirements here:
server on 24/7
use cron to call R hourly to extract data from somewhere, say Yahoo Finance (a sketch of such a script is shown after this list)
(optional) back up and encrypt my Gmail account
(optional) host a Django server, which I am learning to use now
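As a sketch of the hourly extraction, cron could invoke a script like the following with Rscript (the quantmod package, ticker, and file paths are assumptions for illustration):

# fetch_quotes.R -- run hourly via cron, e.g.:
# 0 * * * * Rscript /home/user/fetch_quotes.R
library(quantmod)

# Pull the latest quotes for an illustrative ticker from Yahoo Finance
quotes <- getSymbols("AAPL", src = "yahoo", auto.assign = FALSE)

# Append a timestamped snapshot of the most recent row to a CSV
snapshot <- data.frame(time = Sys.time(), tail(quotes, 1))
write.table(snapshot, "/home/user/quotes.csv", sep = ",", append = TRUE,
            col.names = !file.exists("/home/user/quotes.csv"),
            row.names = FALSE)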
I am thinking of using Amazon EC2 or Linode. I have tried Amazon EC2 a bit, but the pricing scheme seems very complicated to me, and I want the server to be as cheap as possible since it is not really mission-critical work. I wonder whether Linode is simpler for a non-sysadmin like me.
Hope my question won't be considered as off-topic here.
Thanks in advance.
This belongs on Server Fault, but I think Linode is a good candidate. They have monthly plans that don't cost a fortune, and you can get a Debian install on it (we just set one up for work and selected Debian, so I'm 100% sure).
Amazon's pricing scheme is more complicated, but also more transparent.
They separate it into machine hours (EC2 hours), data transfer in/out, and storage.
You can also have Elastic IPs (static IPs) and Load Balancers (LBs).
And you can choose between reserved instances (1-year/3-year terms) or on-demand instances (more expensive, but you can terminate instances whenever you want).
They have their own monthly calculator, you can check it here: http://calculator.s3.amazonaws.com/calc5.html
