Time series (xts) strptime; ONLY month and day - r

I've been trying to build a time series from my data frame, and I need to parse the dates from my csv. This is what I've got:
library(xts)
campbell <- read.csv("campbell.csv")
campbell$date <- strptime(campbell$date, "%m/%d")
campbell.ts <- xts(campbell[, -1], order.by = campbell[, 1])
First, what I'm trying to do is just get xts to store the dates as "xx/xx", meaning just the month and day. I have no year for my data. When I try that second line of code and call upon the date column, it converts it to "2013-xx-xx." These months and days have no year associated with them, and I can't figure out how to get rid of the 2013. (The csv file I'm calling on has the dates in the format "9/30", "10/1", etc.)
Secondly, once I try and make a time series (the third line), I am unsure what the "order.by" command is calling on. What am I indexing?
Any help??
Thanks!

For strptime, you need to provide the full date, i.e. day, month and year. If any of these is missing, the current value is taken from the system clock and appended to the incomplete date. So, if you want to retain the date format exactly as you read it in, first make a copy of the column in a temporary variable, then use strptime on campbell$date to convert it into a date format R can work with. Since the year doesn't matter for your analysis, you can ignore the one that strptime appends automatically.
campbell <- read.csv("campbell.csv")
date <- campbell$date                              # keep a copy of the original "m/d" strings
campbell$date <- strptime(campbell$date, "%m/%d")  # parsed dates (the current year gets appended)
Secondly, what 'the third line' (xts(campbell[,-1], order.by=campbell[,1])) does is take all of the campbell data except the first column (campbell[,-1]) and order it by the index provided by the time data in the first column of campbell (campbell[,1]). So it only works if the date is in the first column.
After ordering the data as a time series, you can put the date column back (replacing campbell$date) to recover the date format you wanted, although first you have to order date the same way, as shown below:
date <- xts(date, order.by=campbell[,1]) # assuming campbell$date is campbell[,1]
campbell.ts <- xts(campbell[,-1], order.by=campbell[,1])
campbell.ts <- cbind(date, campbell.ts)
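If all you need is a month/day label for printing or plotting, one option (a sketch of my own, not part of the answer above) is to keep the full dates in the xts index and only format them for display:
labels <- format(index(campbell.ts), "%m/%d")  # "mm/dd" strings; the stored index still carries a year
head(labels)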

format(as.Date(campbell$date, "%m/%d"), "%m/%d")

Related

My data does not convert to time series in R

My data contains several measurements per day. It is stored in a CSV file and looks like this:
[screenshot of the data frame read from the CSV]
The V1 column is of factor type, so I'm adding an extra column of date-time type: vd$Vdate <- as_datetime(vd$V1) :
[screenshot of the data frame with the new Vdate column]
Then I'm trying to convert the vd data into a time series: vd.ts <- ts(vd, frequency = 365)
But then the dates are gone:
[screenshot of the resulting ts object, with the dates gone]
I just cannot figure out what I am doing wrong. Could someone help me, please?
Your dates are gone because you need to build the ts object from your variables (V1, ..., V7) while disregarding the date field; the ts() call itself tells R how to structure the dates.
Also, I noticed that you appear to have hourly data, so you need to provide a frequency appropriate to your time step, not 365. Given what you posted, your frequency looks a bit off; I recommend working out the correct one. For example, if I have hourly data for every day of the year, then the frequency is 365.25 * 24 (the 0.25 accounts for leap years).
So the following is just an example; it still won't work properly with what I can see (I only have a limited view of your dataset, so I am not 100% sure):
# Build ts data (univariate)
vd.ts <- ts(vd$V1, frequency = 365, start = c(2019, 4))
# check to see if it is structured correctly
print(vd.ts, calendar = T)
Finally my time series is working properly. I used
ts <- zoo(measurements, date_times)
and I found out that date_times had to be converted with as_datetime(), as otherwise they were of character type. The measurements are kept in a data.frame.
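Putting that together, a minimal reproducible sketch (the column names V1 for the timestamps and V2–V7 for the measurements are assumptions based on the screenshots) could look like this:
library(zoo)        # zoo objects indexed by arbitrary timestamps
library(lubridate)  # as_datetime()

# Assumed layout: V1 holds the timestamp strings, the remaining columns are measurements
date_times   <- as_datetime(vd$V1)
measurements <- vd[, setdiff(names(vd), c("V1", "Vdate"))]

vd.zoo <- zoo(measurements, order.by = date_times)
head(vd.zoo)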

How do I stop implicit date conversion when using ifelse with date time data? [duplicate]

This question already has answers here: How to prevent ifelse() from turning Date objects into numeric objects (7 answers).
Closed 4 years ago.
I have a data frame that contains one column that is a series of dates, collected via a Google form. The date and time were collected separately: the date was entered by selecting a day from a calendar, and the time was entered manually. It should have been a 24-hour clock, but the field appears to have only checked that the hour and minute were in the correct range.
I've read the file in from .csv . I converted the date time character field (as read in from the .csv) to a date time format in a new variable by using as.POSIXct(foo$When, tz="NZ", format="%Y-%m-%d %H:%M"). The dates and times were correctly constructed.
Except: I have some incorrect date/time entries in the original data. These have all been set to NA in the new field, as you would expect. For those that do include a time, I have been trying to fix them while still retaining the POSIXct format.
I have been unsuccessful.
Here is an example of the data I have, and what I have tried to do:
TestDataForHelp <- data.frame(OldDateTime =
c("2013-12-04 21:10", "2013-12-15 09:07", "2014-01-01 06:27",
"2014-11-02 21:15", "2014-11-07 23:00", "2015-01-04 21:42",
"201508-11-02 20:15", "201508-11-02 20:15", "2017-11-02"))
TestDataForHelp$ActualDateTime <-
as.POSIXct(TestDataForHelp$OldDateTime, tz="NZ", format="%Y-%m-%d %H:%M")
TestDataForHelp$FixedDateTime <-
ifelse(TestDataForHelp$OldDateTime=="201508-11-02 20:15",
as.POSIXct("2015-11-02 20:15", tz="NZ", format="%Y-%m-%d %H:%M"),
TestDataForHelp$ActualDateTime)
The new variable, FixedDateTime, does not have a POSIXct type. It has been implicitly converted to a numeric type. How can I retain the POSIXct format from ActualDateTime and not have the implicit type conversion?
I would like to not have FixedDateTime but, rather, put the corrected data into ActualDateTime. The ifelse() seems to be the part of the code causing the format to shift from POSIXct to numeric. If I do:
TestDataForHelp$CopiedDateTime <- TestDataForHelp$ActualDateTime
The new variable, that is simply a copy of the original, retains the POSIXct type.
The previous question linked in the comments relates to date values only, not date-time values. The data manipulation becomes more complicated when dealing with date-time values, given that mine also do not include seconds. The other difference is that my original variable contains a mix of date, date-time, and incorrect date-time values, whereas that previous question had values that were all in the same format. It was unclear whether the non-uniform content of the variable was causing the problem.
Edit: I fixed the problem by fixing the strings before I converted them to dates. This removed the need to try to loop through the dates.
I can replicate the numeric result, but not explain it. It is, however, calculating the results correctly for you. I'm not sure why it returns a numeric, but the conversion from numeric back to a date-time is easy enough if you know the origin, which should be 1970-01-01. So I believe the following does the trick:
(Note, the first block is just what you already have)
TestDataForHelp$FixedDateTime <- ifelse(TestDataForHelp$OldDateTime == "201508-11-02 20:15",
                                        as.POSIXct("2015-11-02 20:15", tz = "NZ", format = "%Y-%m-%d %H:%M"),
                                        TestDataForHelp$ActualDateTime)
# Convert the numeric result of ifelse() back to POSIXct
TestDataForHelp$FixedDateTime <- as.POSIXct(TestDataForHelp$FixedDateTime,
                                            origin = as.POSIXct("1970-01-01", tz = "NZ"))
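As a side note (my own sketch, not from the answer above): you can avoid ifelse() entirely by assigning with a logical index, which keeps the POSIXct class from being dropped in the first place:
# Fix the known-bad rows in place; sub-assignment preserves the POSIXct class
bad <- TestDataForHelp$OldDateTime == "201508-11-02 20:15"
TestDataForHelp$ActualDateTime[bad] <- as.POSIXct("2015-11-02 20:15",
                                                  tz = "NZ", format = "%Y-%m-%d %H:%M")
class(TestDataForHelp$ActualDateTime)
#[1] "POSIXct" "POSIXt"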

function in R that creates dummies for given time period

There is a data frame like this:
The first two columns in the df describe the start date (month and year) and the end date (month and year). Column names describe every single month and year of a certain time period.
I need a function/loop that inserts "1" or "0" in each cell: "1" when the date from the given column name falls within the period described by the first two columns, and "0" if not.
I would appreciate any help.
You want to do two different things. (a) create a dummy variable and (b) see if a particular date is in an interval.
Making a dummy variable is the easy part; in base R you can use ifelse. For example, with the iris data frame:
iris$dummy <- ifelse(iris$Sepal.Width > 2.5, 1, 0)
Now, working with dates is more complicated. In this answer we will use the lubridate library. First you need to convert all those dates from the 'Month Year' format into something that R can understand. For example, for February you could do:
new_format_february_2016 <- interval(ymd('2016-02-01'), ymd('2016-03-01') - dseconds(1))
#[1] 2016-02-01 UTC--2016-02-29 23:59:59 UTC
This is February: the interval of time from the 1st of February to one second before the 1st of March. You can do the same with your start date column and your end date column.
To compare two intervals of time (that is, to see whether a particular month falls into one of your other intervals) you can do:
int_overlaps(new_format_february_2016, other_interval)
If this returns TRUE, the two intervals (one particular month and the other one) overlap. This is not the same as one being inside the other, but in your case it will work. Using this you can iterate over the different columns and rows and build your dummy variable, as in the sketch below.
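For instance, a minimal sketch of that iteration (the column names start and end and the "YYYY-MM" month columns are made up for illustration; adapt them to your actual data frame):
library(lubridate)

# Hypothetical data: "start" and "end" hold the period boundaries as "YYYY-MM" strings
df <- data.frame(start = c("2016-01", "2016-03"),
                 end   = c("2016-04", "2016-05"))
month_cols <- c("2016-01", "2016-02", "2016-03", "2016-04", "2016-05")

# Interval covered by each row: first day of the start month up to
# one second before the first day of the month after the end month
row_int <- interval(ymd(paste0(df$start, "-01")),
                    ymd(paste0(df$end, "-01")) %m+% months(1) - dseconds(1))

# One dummy column per month: 1 if that month overlaps the row's period, 0 otherwise
for (m in month_cols) {
  month_int <- interval(ymd(paste0(m, "-01")),
                        ymd(paste0(m, "-01")) %m+% months(1) - dseconds(1))
  df[[m]] <- ifelse(int_overlaps(month_int, row_int), 1, 0)
}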
But before doing so, I would recommend cleaning your data, as your current format is complicated to work with. To get all the power that R's vector types provide, ideally you would want one row per observation and one variable per column. This does not seem to be the case with your data frame. Take a look at the 'Tidy data' chapter of 'R for Data Science', especially the spreading and gathering subsection:
Tidy data

Convert from dd/mm/yyyy to dd/mm in r

I have data spread over a period of two months. When I graph data points for each day, the dates (dd/mm/yyyy) overlap and it is not possible to tell which date a certain point refers to. I tried to remove the years from the dates, since they are not useful for the info I have, and dd/mm should leave enough space.
df$date<-as.Date(df$date, format="%d/%m")
However, it transforms 01/09/2014 into 2015-09-01. I read that when the year is missing, as.Date assumes the current year and inserts it. Can I avoid this automatic insertion somehow?
something like this?
date <- as.Date("01/09/2014", format = %d/%m/%Y)
format(date, "%d/%m")
"01/09"

Creating a single timestamp from separate DAY OF YEAR, Year and Time columns in R

I have a time series dataset for several meteorological variables. The time data is logged in three separate columns:
Year (e.g. 2012)
Day of year (e.g. 261 representing 17-September in a Leap Year)
Hrs:Mins (e.g. 1610)
Is there a way I can merge the three columns to create a single timestamp in R? I'm not very familiar with how R deals with the Day of Year variable.
Thanks for any help with this!
It looks like the timeDate package can handle Gregorian time frames. I haven't used it personally, but it looks straightforward. There is a shift argument in some methods that allows you to set the offset from your data.
http://cran.r-project.org/web/packages/timeDate/timeDate.pdf
Because you mentioned it, I thought I'd show the actual code to merge together separate columns. When you have the values you need in separate columns you can use paste to bring them together and lubridate::mdy to parse them.
library(lubridate)
col.month <- "Jan"
col.year <- "2012"
col.day <- "23"
date <- mdy(paste(col.month, col.day, col.year, sep = "-"))
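Since paste() and mdy() are both vectorised, the same call works on whole data frame columns at once (the column names month, day and year here are hypothetical):
# Hypothetical columns holding the month, day and year parts
df$date <- mdy(paste(df$month, df$day, df$year, sep = "-"))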
Lubridate is a great package; here's the official page: https://github.com/hadley/lubridate
And here is a nice set of examples: http://www.r-statistics.com/2012/03/do-more-with-dates-and-times-in-r-with-lubridate-1-1-0/
You should get quite far using ISOdatetime. This function takes vectors of year, month, day, hour, minute, and second as input and returns a POSIXct object representing the time. You just have to split the third column into separate hour and minute columns before you can use the function.
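Alternatively, a sketch of my own (not from the answers above, and with made-up column names Year, DOY and Time): base R's strptime format code %j parses the day of year directly, so the three columns can be pasted together and parsed in one step:
# Hypothetical columns: Year = 2012, DOY = 261, Time = 1610 (HHMM)
met <- data.frame(Year = 2012, DOY = 261, Time = 1610)

# Zero-pad the day of year and the clock time, then parse with %j (day of year)
met$timestamp <- as.POSIXct(sprintf("%d-%03d %04d", met$Year, met$DOY, met$Time),
                            format = "%Y-%j %H%M", tz = "UTC")
met$timestamp
#[1] "2012-09-17 16:10:00 UTC"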
