I'm using PHPExcel to read a remote file which is hosted on a public dropbox folder.
How can I get the file's last modified date and time?
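One approach worth trying (a sketch, independent of PHPExcel): ask the server for the HTTP Last-Modified header before downloading the file. Whether Dropbox returns a meaningful value for public-folder links is not guaranteed, so treat the URL and the header's presence as assumptions to verify.

$url = 'https://dl.dropboxusercontent.com/s/xxxx/report.xlsx';  // placeholder link
$headers = get_headers($url, 1);  // associative array of response headers
if (isset($headers['Last-Modified'])) {
    $timestamp = strtotime($headers['Last-Modified']);
    echo 'Last modified: ' . date('Y-m-d H:i:s', $timestamp);
}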
I would like to read a CSV file from a website into my R Shiny app.
This CSV file contains data that is regularly updated on the website, so downloading the file once and accessing it on my local computer will not suffice.
Is there a way to automatically download and read the CSV file into the app every time it is launched?
If the website updates the CSV file but keeps the same filename, try this:
importedData <- read.csv(url("http://mywebsite.com/dataName.csv"))
Whenever you execute the above code, it will download the updated CSV file afresh and store it in the importedData variable.
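If you want this to happen inside the Shiny app itself, here is a minimal sketch (the URL is a placeholder): putting the read.csv call inside the server function means the file is downloaded afresh each time a session starts.

library(shiny)

ui <- fluidPage(
  tableOutput("preview")
)

server <- function(input, output, session) {
  # Re-reads the remote CSV whenever a new session is launched
  importedData <- read.csv("http://mywebsite.com/dataName.csv")
  output$preview <- renderTable(head(importedData))
}

shinyApp(ui, server)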
Refer to this link for more details: https://stackoverflow.com/a/6299291/5365437
Hope this helps.
I'm trying to get data for RAIS (a Brazilian employee registry dataset) that is shared using a Google Drive public folder. This is the address:
https://drive.google.com/folderview?id=0ByKsqUnItyBhZmNwaXpnNXBHMzQ&usp=sharing&tid=0ByKsqUnItyBhU2RmdUloTnJGRGM#list
Data is divided into one folder per year, and within each folder there is one file per state to download. I would like to automate the downloading process in R for all years, or at least within each year's folder. The downloaded file names should match the names used when downloading manually.
I know a little R, but no web programming or web scraping. This is what I've got so far:
By manually downloading the first of the 2012 files, I could see the URL my browser used to download:
https://drive.google.com/uc?id=0ByKsqUnItyBhS2RQdFJ2Q0RrN0k&export=download
Thus, I suppose the file ID is 0ByKsqUnItyBhS2RQdFJ2Q0RrN0k.
Searching the HTML code of the 2012 page, I was able to find that ID and the file name associated with it: AC2012.7z.
All the other IDs and file names are in that section of the HTML code. So, assuming I can download one file correctly, I suppose I could generalize to the other files.
In R, I tried the following code to download the file:
url <- "https://drive.google.com/uc?id=0ByKsqUnItyBhS2RQdFJ2Q0RrN0k&export=download"
download.file(url,"AC2012.7z")
unzip("AC2012.7z")
It does download, but I get an error when trying to uncompress the file (both within R and manually with 7-Zip). There must be something wrong with the file downloaded in R, as the file size (3.412 KB) does not match what I get from manually downloading the file (3.399 KB).
For anyone trying to solve this problem today, you can use the googledrive package.
library(googledrive)

# List all files in the shared folder (replace the placeholder with the real folder URL)
ls_tibble <- googledrive::drive_ls(GOOGLE_DRIVE_URL_FOR_THE_TARGET_FOLDER)

# Download each file into the current working directory
for (file_id in ls_tibble$id) {
  googledrive::drive_download(as_id(file_id))
}
This will (1) open an authentication page in your browser so you can authorise the tidyverse libraries (via gargle) to access Google Drive on behalf of your account, and (2) download all the files in the folder at that URL to the current working directory of your R session.
I have a website that creates a .txt file and saves it with a timestamp to the web server directory.
I need a client-based app that lists all the files in the directory and downloads them to the client server for processing, but I cannot find a way to list them in a ListBox without specifying the full file name
(e.g. TB2014-09-08_11h48m25_765.txt is a full name; TB stays constant and the files are always .txt).
You want the GetFiles method of the Directory class (System.IO namespace). Something along these lines:
Dim files As String() = Directory.GetFiles("c:\YourFolder", "TB*.txt")
For Each filename In files
    Console.WriteLine(filename)
Next
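Since you mentioned a ListBox, here is a minimal sketch of populating one with just the file names (ListBox1 is an assumed control name; adjust the folder path to yours):

' At the top of the file: Imports System.IO
Dim files As String() = Directory.GetFiles("c:\YourFolder", "TB*.txt")
For Each filename In files
    ' Path.GetFileName strips the directory, leaving e.g. TB2014-09-08_11h48m25_765.txt
    ListBox1.Items.Add(Path.GetFileName(filename))
Next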
I'm very new to this, so please help. I'm trying to mass-update files in a static folder location: many files in one folder.
What I want to do is run a VBA macro in Excel 2010 that goes to a network folder, opens the first file in the folder, unprotects the workbook and worksheets, calls another macro to run the changes, then re-protects the worksheets, closes the file, and moves on to the next file in the folder until all files have been corrected.
I have created the macro that makes the changes; it is called "Edit".
The file type is .xlsm, and the workbook and worksheets are password-protected.
How can I automatically go to the network location and, in series, open each file, unprotect it, call the macro, re-protect the document, close the file, and move on to the next file until they are all updated?
Have you tried running the Macro Recorder while performing the tasks you've listed? You can then pinpoint the exact lines of code you need to add.
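For the looping itself, here is a minimal sketch, assuming a placeholder network path and password, and that your existing "Edit" macro operates on the active workbook:

Sub UpdateAllFiles()
    Const FOLDER_PATH As String = "\\server\share\folder\"  ' placeholder path
    Const PWD As String = "yourpassword"                    ' placeholder password
    Dim fileName As String
    Dim wb As Workbook
    Dim ws As Worksheet

    fileName = Dir(FOLDER_PATH & "*.xlsm")
    Do While fileName <> ""
        Set wb = Workbooks.Open(FOLDER_PATH & fileName, Password:=PWD)
        wb.Unprotect Password:=PWD            ' unprotect the workbook structure
        For Each ws In wb.Worksheets
            ws.Unprotect Password:=PWD        ' unprotect each sheet
        Next ws

        Call Edit                             ' your existing macro

        For Each ws In wb.Worksheets
            ws.Protect Password:=PWD          ' re-protect each sheet
        Next ws
        wb.Protect Password:=PWD              ' re-protect the workbook
        wb.Close SaveChanges:=True
        fileName = Dir                        ' move to the next file
    Loop
End Sub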
I have a problem with uploading files. I followed these instructions: http://symfony.com/doc/current/cookbook/doctrine/file_uploads.html and everything works fine, but I want to save uploaded files in a non-public folder under a random name and then send them to the browser under their original names.
I know how to put a file in a non-public folder with a random name and how to save the original name to the database, but what should I do next to read the file's content and send it to the browser with the original name? How can I achieve that?
Store the hidden path in the database, then read it in your publicly available PHP script and send the file to the user. If you don't know how to send files in PHP, you can find out here on Stack Overflow.
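In Symfony, one way to do this (a sketch; the entity getters and the upload_dir parameter are assumptions) is a BinaryFileResponse with a Content-Disposition header carrying the original name:

use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;

// In a controller action: $document holds the random stored name and the original name
$path = $this->getParameter('upload_dir') . '/' . $document->getStoredName();
$response = new BinaryFileResponse($path);
$response->setContentDisposition(
    ResponseHeaderBag::DISPOSITION_ATTACHMENT,
    $document->getOriginalName()   // the browser saves the file under this name
);
return $response;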