Hello,
I have some files on an SSH server that I need to open from my local R session.
Does anyone know a way to load these files directly from the server, instead of first copying them with scp to my local computer and then loading them in R?
Thank you for your help
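One approach, as a minimal sketch: with key-based SSH authentication set up, R can stream the remote file through a pipe() connection, so nothing has to be scp'd to disk first. The user, host, and path below are placeholders.
df <- read.csv(pipe("ssh user@server cat /remote/path/data.csv"))  # stream the file straight into R
head(df)
The same pipe() trick works with readLines() and most other read.* functions.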
Related
I am running a CDH 5.10.0 VM.
When I create .sql files using gedit in /home/cloudera, I can see the file being created under Desktop -> Cloudera's Home, but it does not appear when I run hadoop fs -ls /home/cloudera.
Similarly, when I execute INSERT OVERWRITE INTO DIRECTORY /home/cloudera/somefolder, nothing shows up physically in Desktop -> Cloudera's Home, but it is listed by hadoop fs -ls /home/cloudera.
Is it a permissions issue, or is my VM corrupted?
The Hadoop file system (HDFS) is separate from your OS file system (the local filesystem), so the path Desktop -> Cloudera's Home is completely different from /home/cloudera in HDFS.
Hive in Cloudera is configured to use HDFS by default, so the query you issued:
INSERT OVERWRITE INTO DIRECTORY /home/cloudera/somefolder
ran against HDFS, not your local file system.
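To make the split concrete, here is a hedged illustration from R using system() (the same two commands can equally be run in a shell on the VM): the identical path names two unrelated directory trees.
system("ls /home/cloudera")             # local (OS) filesystem
system("hadoop fs -ls /home/cloudera")  # HDFS namespace, a different tree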
I created a Google Compute Engine (virtual machine) instance with RStudio Server, unaware that RStudio Server is licensed software. Now my trial license for RStudio has expired, and I cannot log in to my R sessions anymore.
However, I had written some code which I need to recover. How do I download the files?
I have SSH-ed into my virtual machine but cannot find the relevant files or a way to download them.
I had a similar issue and I was able to recover the files by performing the following steps:
SSH to the virtual machine
Once you are in the virtual machine, run the following command: cd ../rstudio-user/
Run ls there; you will see the file structure you used to see in the RStudio Server interface
Navigate using cd and ls between the folders to get to the desired file
Once you are in the desired location (where an ls shows the files you want to recover), run the following command: pwd
In the browser SSH window, click the settings (gear) icon and go to Download file
Enter the full path of the file you want to download; it will be something like: /home/rstudio-user/FILENAME.R
Click on Download
You can do this for each of the files you want to recover.
If you want to recover a whole folder, it is easier to compress it into a zip file and download that; see the sketch below.
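As a hedged sketch of that last step: R itself still runs from the SSH session even when the RStudio Server UI is locked, and utils::zip() shells out to the system's zip binary, so this assumes one is installed. The folder name my_project is a placeholder.
setwd("/home/rstudio-user")             # where the RStudio files live
utils::zip(zipfile = "recovered.zip", files = "my_project")  # bundle the whole folder
Then enter /home/rstudio-user/recovered.zip in the same Download file dialog.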
What is the right syntax to copy from Windows to a remote HDFS?
I'm trying to copy a file from my local machine to a remote Hadoop cluster using RStudio:
rxHadoopCopyFromLocal("C:/path/to/file.csv", "/target/on/hdfs/")
This throws
copyFromLocal '/path/to/file.csv': no such file or directory
Notice the C:/ disappeared.
This syntax also fails
rxHadoopCopyFromLocal("C:\\path\\to\\file.csv", "/target/on/hdfs/")
with the error
-copyFromLocal: Can not create a Path from a null string
This is a common mistake.
It turns out the rxHadoopCopyFromLocal command is a wrapper around hadoop fs -copyFromLocal. All it does is copy from a local filesystem to an HDFS target.
In this case, rxSetComputeContext(remotehost) had set the compute context to a remote cluster, and on that remote machine there is no C:\path\to\file.csv.
Here are a couple of ways to get the files there.
Configure local hdfs-site.xml for remote HDFS cluster
Ensure you have hadoop tools installed on your local machine
Edit your local hdfs-site.xml to point to the remote cluster
Ensure rxSetComputeContext("local")
Run rxHadoopCopyFromLocal("C:\local\path\to\file.csv", "/target/on/hdfs/")
SCP and Remote Compute Context
Copy your file to the remote machine with scp C:\local\path\to\file.csv user@remotehost:/tmp
Ensure rxSetComputeContext(remotehost)
Run rxHadoopCopyFromLocal("/tmp/file.csv", "/target/on/hdfs/")
The dev version of dplyrXdf now supports files in HDFS. You can upload a file from the native filesystem as follows; this works both from the edge node and from a remote client.
hdfs_upload("C:\\path\\to\\file.csv", "/target/on/hdfs")
If you have a dataset (an R object) that you want to upload, you can also use the standard dplyr copy_to verb. This will import the data to an Xdf file and upload it, returning an RxXdfData data source pointing to the uploaded file.
txt <- RxTextData("file.csv")
hd <- RxHdfsFileSystem()
hdfs_xdf <- copy_to(hd, txt, name="uploaded_xdf")
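As a quick sanity check on the result, something like the following should summarise the uploaded Xdf; rxGetInfo() is a RevoScaleR function, so this assumes RevoScaleR is loaded alongside dplyrXdf.
rxGetInfo(hdfs_xdf, getVarInfo = TRUE)  # rows, location, and column types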
I need to copy a file from a Windows share to an AIX box.
I am not able to get into the Windows share when I am connected to the AIX box.
Can someone please tell me which Unix commands to use to access the Windows share so that I can copy the file over to the AIX box?
You can copy by using an NFS share: create the share on Windows and mount it on AIX.
smbclient (if installed) can browse the share and fetch files directly; for example (file.txt is a placeholder):
$ smbclient //host/share
Enter password:
smb: \> help
smb: \> get file.txt /tmp/file.txt
Just wondering if there is any method to copy a file into a live VM created on KVM using libvirt tools. My objective is to assign a static IP address to the VM without modifying the img file and without using DHCP. As I understand it, there needs to be a file in /etc/sysconfig/network-scripts/ corresponding to the interface in the VM where the IP address is to be assigned. I am wondering if I can copy this file in after the VM is created and booted up.
Update: I am using CentOS 7 for both guest and host.
Thanks
I'd suggest using a kickstart file for installing the machine. That way the installer automatically sets the IP address wherever it is needed (even though you know where it needs to be set in the current version). Copying the file onto the disk while the VM is running must be done in a way the VM knows about, which means you need access to the machine; I'm guessing you don't, since getting access is presumably what you're trying to do.
If the machine is already installed and you want to configure it without access to it and without reinstalling, I'd suggest cleanly shutting down the VM and then using libguestfs (mainly the guestfish command), which lets you access the machine's disk; see the sketch below.
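As a minimal sketch of that approach: libguestfs ships a virt-copy-in helper that injects a local file into a shut-down guest's disk image. The domain name centos7-guest and the prepared ifcfg-eth0 file are placeholders, and the call is wrapped in R's system2() purely for illustration; it is an ordinary shell command.
# Guest must be cleanly shut down first; requires the libguestfs tools on the host.
system2("virt-copy-in", c("-d", "centos7-guest", "ifcfg-eth0", "/etc/sysconfig/network-scripts/"))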
This works really well: http://www.linux-kvm.org/page/9p_virtio
Basically, in the guest: mkdir /tmp/share && echo '/hostshare /tmp/share 9p trans=virtio,version=9p2000.L 0 2' >> /etc/fstab. On the host: mkdir /tmp/share.
Then in virt-manager, Add Hardware > Filesystem, change Driver to Path, set Source to /tmp/share and Target to /hostshare. In the guest, run mount -a.
Or mount it directly with: mount -t 9p -o trans=virtio,version=9p2000.L hostshare /tmp/share.