Data Error Cyclic Redundancy Check - 7 zip - unzip

I have downloaded a .zip file from the internet, and when I try to extract it using 7-Zip it reports that a certain file has a Data Error (Cyclic Redundancy Check).
I also tried unzipping it with the built-in Windows unzipper, which also shows an error when extracting that file. Please help me.

Related

Error reading .xls file with format_from_signature NA in R

I have a .xls file which is sent to me daily and I'd like to automate the handling I do with it through R.
When I try to read the file using read_xls, it throws the following error: libxls error: Unable to open file. For anyone familiar with Excel, this is one of those files that Excel warns you about on opening because the contents don't match the file extension. That, together with the fact that format_from_signature returns NA for this file, tells me it isn't a real .xls file but rather something else forced into the .xls extension.
I can fix the issue for a single file by opening it in Excel and saving it as a normal .xlsx; the problem is having to do that every day. I would like to replicate that process entirely in R so it can be automated.
Edit: I've added an Excel file that meets the criteria I'm describing.
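A minimal sketch of one way to probe such a file in R, assuming the mislabelled .xls is really HTML or delimited text underneath (the file name and the final writexl step are placeholders/assumptions, not part of the question):
library(readr)
library(rvest)
path <- "daily_report.xls"                    # placeholder name for the forced .xls file
first_line <- readLines(path, n = 1, warn = FALSE)
if (grepl("^\\s*<", first_line)) {
  # Looks like HTML masquerading as .xls: parse the first table out of it
  df <- html_table(read_html(path))[[1]]
} else {
  # Otherwise treat it as delimited text (tab-delimited exports are common here)
  df <- read_tsv(path)
}
# If a genuine Excel copy is needed afterwards, write one back out
writexl::write_xlsx(df, "daily_report.xlsx")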

R: uploading (.csv) file to ftp server

Does anyone have an idea of how to upload an R object (a data frame sitting in my R environment) to an FTP server?
It would be perfect to export the object directly as a .csv or .json file, so I already tried solutions like:
write.csv2(my.data, file = url("ftp://user:password#ftpserver.net/mydata.csv"))
or
write_json(my.data, path = url("ftp://user:password#ftpserver.net/folder/mydata.csv"))
but, unfortunately, in both cases R gives me the error:
"can only open URLs for reading"
which is kind of sad, but makes sense anyway.
Ok, after that I tried to work with the RCurl library - like this:
library("RCurl")
userpwd <- "user:password"
ftpUpload(I(my.data), "ftp://ftpserver.net/folder/mydata.csv", userpwd=userpwd, asText = T)
which doesn't work properly either. Somehow this command causes a "fatal error" that terminates my R session, and oddly enough an empty mydata.csv file does get uploaded to the FTP server.
To be honest, I have no clue how to deal with this.
Does anyone have a smart idea?
Would a temp file be a solution? Ideally I would avoid storing the data locally on my laptop and instead save (and export) it directly to the FTP server.
Many, many thanks in advance!
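A minimal sketch of the temp-file route asked about above, assuming RCurl; the server address and credentials are placeholders:
library(RCurl)
tmp <- tempfile(fileext = ".csv")             # temporary local copy
write.csv2(my.data, tmp, row.names = FALSE)   # write the data frame to disk
ftpUpload(what = tmp,
          to = "ftp://ftpserver.net/folder/mydata.csv",
          userpwd = "user:password")
unlink(tmp)                                   # remove the local copy again
This still touches the local disk briefly, but only via a throwaway temp file rather than a permanent export.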

Error in reading URL in R

I would like to read a URL (csv file) online in R but my code does not work
test = read.csv('https://s3.amazonaws.com/folder/abcd/file.csv')
Here is the error: Error in file(file, "rt") : for https:// URLs use setInternet2(TRUE)
I have no idea how to fix this. I'd really appreciate any help.
The error message suggests a network constraint (e.g., you are attempting to access the AWS S3 bucket from your work location...but the SysAdmins have locked that protocol/port/etc. down).
If you are attempting to download a specific file, then you could download the file outside of R (say, using cURL), save it on your hard drive, then modify read.csv to access the file from your workstation.
If you need to [programmatically] access multiple files for download, then it's to your advantage to research 'setInternet2' (which switches R to the MS-Windows internet DLL) and the existence of any network access limitations imposed by your organization's SysAdmins.
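For completeness, a hedged sketch of both routes the answer mentions; note that setInternet2() only applies to older Windows builds of R (it is no longer available in current versions, where https URLs work out of the box):
# Older Windows R: switch to the Windows internet routines, then read directly
# setInternet2(TRUE)
# test <- read.csv("https://s3.amazonaws.com/folder/abcd/file.csv")
# Alternative: download the file first, then read the local copy
download.file("https://s3.amazonaws.com/folder/abcd/file.csv", destfile = "file.csv")
test <- read.csv("file.csv")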

Accessing Excel file from Sharepoint with R

I am trying to write an R script that will access an Excel file stored on my company's SharePoint page so that I can make a few calculations and plot the results. I've tried various ways to do this (download.file, RCurl getURL(), gdata), but I can't seem to figure it out. The URL is HTTPS, and a username and password should be required. I've gotten the closest with this code:
require(RCurl)
URL <- "https://companyname.sharepoint.com/sites/folder/_layouts/15/WopiFrame.aspx?sourcedoc={2DCC2ED7-1C13-4910-AFAD-4A9ACFF1C797}&file=myfile.xlsx&action=default"
f<-getURL(URL,verbose=T,ssl.verifyhost=F,ssl.verifypeer=F,userpwd="mylogin:mypw")
This seems to connect (although the username and password don't seem to matter) and returns
> f
[1] "<html><head><title>Object moved</title></head><body>\r\n<h2>Object moved to here.</h2>\r\n</body></html>\r\n"
However, I'm not sure what to do at this point, or even if I'm on the right track. Any help will be greatly appreciated.
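One hedged tweak to the attempt above: the "Object moved" page is just an HTTP redirect, so asking libcurl to follow redirects may get further; whether it ultimately reaches the workbook depends on how the SharePoint site authenticates:
f <- getURL(URL, verbose = TRUE, followlocation = TRUE,
            ssl.verifyhost = FALSE, ssl.verifypeer = FALSE,
            userpwd = "mylogin:mypw")
Note that for the binary .xlsx content itself, getBinaryURL() would be needed rather than getURL().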
I use
library(readxl)
read_excel('//companySharepointSite/project/.../ExcelFilename.xlsx', 'Sheet1', skip=1)
Note, no https:, and sometimes I have to open the file first (i.e., cut and paste //companySharepointSite/project/.../ExcelFilename.xlsx into my browser's address bar)
I found that other answers did not work for me, perhaps because I am on a Mac, which obviously does not play as well with Microsoft products such as Sharepoint.
Ended up having to split it into two pieces: first download the Excel file to disk and then separately read that Excel file.
library(httr)
library(readxl)
# the URL of your sharepoint file
file_url <- "https://yoursharepointsite/Documents/yourfile.xlsx"
# save the excel file to disk
GET(file_url,
    authenticate(active_directory_username, active_directory_password, "ntlm"),
    write_disk("tempfile.xlsx", overwrite = TRUE))
# save to dataframe
df <- read_excel("tempfile.xlsx")
df
# remove excel file from disk
file.remove("tempfile.xlsx")
This gets the job done, though would be interested if anyone knows how to avoid the interim step of writing to disk.
N.B. Depending on your specific machine/network/Sharepoint configuration, you may also be able to just use authenticate(":",":","ntlm") per this answer.
I was unable to accomplish this in R using hints from the answers above (I tried many approaches found on this site). However, just to highlight the response by @RyanBradley above and especially the response by @ZS27:
I instead had to use the OneDrive Desktop client (Windows) to allow me to sync the folder to my computer. Newer versions of SharePoint (like that found in MS Teams) have a sync button or feature in the document libraries / folders that interfaces with OneDrive.
This is the functional equivalent of mounting the folder as a network drive, so R interfaces with the file as if it was a part of your file system. Works for me.
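A minimal sketch of what that looks like once the library is synced, assuming the default OneDrive folder layout (the local path below is a placeholder):
library(readxl)
# The synced SharePoint library behaves like any other local folder
df <- read_excel("C:/Users/USER_NAME/OneDrive - CompanyName/Shared Documents/yourfile.xlsx")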
You may need to map a network drive to the SharePoint library so that you can connect to it directly. Or if you don't want to map a network drive you could also place a shortcut to the folder in your startup folder.
Example file path:
\\company_sharepoint_site\ssp\site_name\sub_site_name\library_name
Example start up folder location (Windows 10):
C:\Users\USER_NAME\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup
Note that the direction of the slashes ("\" rather than "/") is important so that the path is interpreted as a file location rather than a browser location. By placing such a path in a mapped network drive, or as a shortcut in your startup folder, your PC should connect to it when it boots.
# Load or install readxl
if (require(readxl) == FALSE) {
  install.packages("readxl")
  if (require(readxl) == FALSE) { stop("Unable to install and load readxl") }
}
# Define path to data
data_path <- "\\\\company_sharepoint_site\\ssp\\site_name\\sub_site_name\\library_name\\Example.xlsx"
# Pull data
df_employees <- read_xlsx(data_path)
I had a situation exactly like yours: I wanted to access an Excel file on a SharePoint site from R.
I also searched a lot on the internet and didn't find anything relevant to my requirement.
Then I tried the following:
I mapped the SharePoint folder as a network drive in my local system.
Then I could access that Excel file (on the SharePoint site) from my machine without opening a web browser.
So I copied the network path shown on my system (it is the same as your SharePoint site address, except that it has no https/http;
the path starts with "\\", like the following: "\\sharepoint.test.com\folder\path").
Launch RStudio and select the Import Dataset option under the Environment pane.
Choose 'From Excel'; the 'Import Excel Data' dialog will open.
In the File/URL field, paste the network path to the SharePoint file (copied from your machine).
Click Import, and the Excel file on SharePoint will be imported into R successfully.
Make sure the path contains no URL-encoded characters (like %20) and that backslashes are used as separators.
While importing the file, enter the folder names exactly as you see them.
For example:
Sharepoint.microsoft.com - Sharepoint's Domain
Department name - name of the Folder
Project name - name of the folder
Sample.xlsx - name of the file
So, your URL to import dataset should be:
"\Sharepoint.microsoft.com\Department name\Project name\Sample.xlsx".
Thank you!
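The same import can be done in code rather than through the Import Dataset dialog; a minimal sketch using the hypothetical path from the example above (note the doubled backslashes required inside an R string):
library(readxl)
df <- read_excel("\\\\Sharepoint.microsoft.com\\Department name\\Project name\\Sample.xlsx")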
Try using the link in this format:
http://site/_layouts/download.aspx?SourceUrl=url-of-document-in-library
If the above doesn't work, try this syntax (note the slash directions):
"\\gov.sharepoint.com#SSL/DavWWWRoot/sites/SomePath/SomePath/SomePath/SomeFile"
See this for more info about syntax and what's going on:
Connect to a site via SSL/DavWWWRoot not usual URL? Why does this make a difference?
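A hedged sketch combining the first link format above with download.file(); the site, library, and file names are placeholders:
url <- "http://site/_layouts/download.aspx?SourceUrl=/sites/SomeSite/Library/SomeFile.xlsx"
download.file(url, destfile = "SomeFile.xlsx", mode = "wb")   # mode = "wb" keeps the binary workbook intact
df <- readxl::read_excel("SomeFile.xlsx")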

csv files in opencpu

If I put a very small csv file in my GitHub directory so that it gets copied to /ocpu/github/username/projectname/www/ , will I be able to access the contents of the csv for use in a R function? I tried to ajax the file, but I get a 404 error even though I can see the csv file sitting in the www directory of my local server. I need to have the csv on the server as a static file rather than being uploaded by a function. Thanks
You should be able to access them like any other file. Can you post an example that shows what you are doing and what error you are getting?
That said, if you just want to use this data in your R functions, it is better to include it in the R package as an actual data file. Also see section 1.1.6 of Writing R Extensions. An example is the mapapp package, which includes a dataset called countryExData. Also see the live app.
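A minimal sketch of the package-data route, assuming the csv ships inside the package under inst/extdata/ (file and package names are hypothetical); the data/ mechanism described in Writing R Extensions is the other common option:
# In the package's R code: locate and read the csv that ships with the package
read_my_csv <- function() {
  path <- system.file("extdata", "mydata.csv", package = "projectname")
  read.csv(path, stringsAsFactors = FALSE)
}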
