I'm working on a client using the Google Calendar API, and I need to retrieve the full series of a recurring event.
https://developers.google.com/calendar/v3/reference/events/instances does that for a pristine, unchanged series. By sending the eventId and some optional params such as timeMin, timeMax and maxResults, I get an iterable containing each instance in the series.
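For context, a minimal sketch of that instances call (shown here with the Node.js googleapis client purely for illustration; calendarId and eventId are placeholders):

const { google } = require('googleapis');

async function listSeriesInstances(auth, eventId) {
  const calendar = google.calendar({ version: 'v3', auth });
  // Returns one entry per instance of the recurring event
  const res = await calendar.events.instances({
    calendarId: 'primary',            // placeholder calendar
    eventId: eventId,                 // id of the recurring (parent) event
    timeMin: '2019-01-01T00:00:00Z',  // optional
    maxResults: 250                   // optional
  });
  return res.data.items;
}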
The problem is when the user edits one or more events in the series. When saving the changes, a prompt asks whether to apply the changes to the whole series, to only that instance, or to that instance and all following ones.
When you select one of the last two options, the series is effectively split into two distinct series: the original one is edited (its UNTIL is shortened) and a new series is created containing the instance or instances removed from the original series.
My question is how to retrieve the full original series starting from the eventId of one instance after the series has been modified/split.
We only want to show our live chat tag Monday to Friday, from 09.00 until 17.00. Is this possible with Google Tag Manager please?
Thanks!
Preferred solution
The best solution is probably to get the current date on the server side and push a variable to the dataLayer that determines whether the chat tag should be triggered, depending on that date.
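For example, a minimal sketch of that push (the event and variable names here are made up; the value would be computed server-side from your business timezone and opening hours):

window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  event: 'chatAvailability',
  chatOpen: true // injected by the server template based on the current server date/time
});

Your chat tag would then fire on a trigger that checks the chatOpen data layer variable.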
Alternative
The alternative is to retrieve the date on the client side, which won't be 100% reliable. To use this solution, create a Custom JavaScript variable with the following code (replace "America/New_York" with your business's timezone):
function () {
  // Current time expressed in the business timezone (replace with yours)
  const currentDate = new Date(new Date().toLocaleString("en-US", {timeZone: "America/New_York"}));
  const day = currentDate.getDay();     // 0 = Sunday ... 6 = Saturday
  const hours = currentDate.getHours(); // 0-23
  // Monday to Friday, 09:00 until 16:59
  return day > 0 && day < 6 && hours >= 9 && hours < 17;
}
Then, trigger your chat tag only when this variable's value is "true".
If you want to get a consistent, reliable date, ListenLayer.com returns a timestamp with every dataLayer push. The time can appear in a single account timezone based on your account settings, so if you align your ListenLayer timezone to your chat timezone it will work.
You can then write rules in the platform to determine if the time returned should allow for chat. You will output a yes/no into a new data layer variable based on the rule. Then you will register this variable in GTM and use it to determine if the chat tag should fire.
Here is the process I would use, at a high level:
1. Create an account and set the account time to match your chat timezone.
2. Turn on the User Source Listener and all its features, and enable it to push on every page load. This will push the user's traffic source into the data layer, along with a bunch of other structured data, on each page load. You could use another listener, but this one lets us ensure it pushes on every page load (container load), so GTM will know on every page load whether to trigger the chat script.
3. Create a custom data layer variable called chatTime. We will write a rule in ListenLayer to push a yes or a no into this variable based on the time that is returned in the userSource data layer event. This will happen in real time, in the same event in the data layer.
4. Publish everything in ListenLayer and then go to your website with the console open (the data layer is written there for ease). You'll see the userSource dataLayer event in the console on every page load. It will contain an eventTimestamp. You want the data inside the listenlayerAccountTime node, because it will match your ListenLayer timezone, which should match the timezone you operate your chat hours from (or, if all of this is based on the user's timezone, use that one to be localized to them). You will need to focus on these two automatically provided variables, because we will write a rule based on them in step 5:
sourceAutomaticValues.eventTimestamp.listenlayerAccountTime.time.timeOfDay
sourceAutomaticValues.eventTimestamp.listenlayerAccountTime.time.dayOfWeek
Here is a visual showing the variables I have given paths to above: one is the day of the week and one is the time of day.
5. Inside ListenLayer, under the User Source listener, you will create two rules; they will be lookup table rules. The first rule runs first, then the second, and if a match is found they stop. This means the first rule can simply look at the day of the week and set chatTime to no if the value is Sat or Sun. The second rule then focuses on the time of day, since it will only be reached if it is not Saturday or Sunday. In this rule we look at the first part of the timeOfDay variable using simple regex. The rule has 24 rows, one for each hour of the day, and returns a yes only for the hours you are open for chat. This rule could certainly be made simpler with some additional regex, but this gives a better illustration. There is an import/export feature, so you could build the 24-row rule in a CSV file and import it.
After you save and publish this simple logic, your website will have a value in the data layer telling Google Tag Manager whether chat should be on. It will appear in the data layer every time the userSource event is pushed, which is every page load.
I assume things are easy from here. You would create a data layer variable inside GTM referencing sourceCustomValues.chatTime. You'll also create a custom event trigger referencing userSource and firing only if chatTime = yes. Then set your chat tag to fire on this trigger.
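For reference, here is a guess at the rough shape of the userSource push, based only on the paths mentioned in steps 4 and 5 above (the actual keys and value formats come from ListenLayer and may differ):

window.dataLayer.push({
  event: 'userSource',
  sourceAutomaticValues: {
    eventTimestamp: {
      listenlayerAccountTime: {
        time: {
          dayOfWeek: 'Tue',     // example value; the real format may differ
          timeOfDay: '14:32:05' // example value; the real format may differ
        }
      }
    }
  },
  sourceCustomValues: {
    chatTime: 'yes' // filled in by the lookup table rules from step 5
  }
});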
I added the 'ga:eventLabel' dimension to my script and the sum of sessions decreased from 2238 to 994. Why?
I expected the same result from both scripts:
# First query: includes the ga:eventLabel dimension
viewID='*********'
dim=['ga:eventLabel', 'ga:source', 'ga:sourceMedium']
met=['ga:sessions', 'ga:users']
start_date='2019-07-01'
end_date='2019-07-03'
transaction_type='Goal'
goal_number=''
refresh_token=token  # token is obtained earlier in the script
condition=''
data_2=google_analytics_reporting_api_data_extraction(viewID,dim,met,start_date,end_date,refresh_token,transaction_type,goal_number,condition)
# Second query: identical, but without ga:eventLabel
viewID='*********'
dim=['ga:source', 'ga:sourceMedium']
met=['ga:sessions', 'ga:users']
start_date='2019-07-01'
end_date='2019-07-03'
transaction_type='Goal'
goal_number=''
refresh_token=token
condition=''
data=google_analytics_reporting_api_data_extraction(viewID,dim,met,start_date,end_date,refresh_token,transaction_type,goal_number,condition)
Here are the results:
The two queries have two different meanings and won't give you the same result, unless you have a data set where every session has at least one event-type hit associated with it.
The first query says: count all my users and sessions for the given date range, breaking them down by event label, source, source/medium and date. In this case you implicitly filter for any known event labels, where (not set) is an empty but existing label for a recorded event. Sessions without any events are excluded.
The second query says: count all my users and sessions for the given date range, breaking them down by source, source/medium and date, regardless of whether they had any events.
In your case, the drop from 2238 to 994 reflects that a large share of sessions in that date range had no event hits at all.
You can verify this behavior by creating these custom reports in the Google Analytics web UI. It is similar to querying custom dimensions: if no value was set for a given custom dimension, those records are excluded.
My situation is this. I have a data-link that continually appends new snapshot data to a table, and upon the arrival of a new snapshot, it runs an R data function (script) which does some calculations with results that append to an output table. The R calculations are quite expensive and the input data is large, and more importantly, the snapshots are independent of each other, so there is no need to re-process previously received snapshots every time a new snapshot arrives.
I can't make a data function that takes its own results as input (i.e. to filter by previously processed dates), and my other idea also throws up cyclic dependencies (creating a second data function to generate a second table with previously processed dates).
Has anyone experienced this issue, and could you please give me some ideas on safe ways to address it? I'm new to Spotfire (and dash-boarding generally).
I had the same situation - I got around this with a few steps:
Added a new filtering scheme just for data processing.
Added a calculated column to my output table that was just "X", which I could filter out, i.e. filter out the entire output table.
Added a relationship between my input table and output table on a key column, which excluded any filtered-out rows.
In my data function input parameters, selected to filter the input data by the data-processing filtering scheme.
This meant that every time my function runs, it only runs on the unprocessed data.
I am currently appending the data so that it adds rows on update rather than replacing them. This works fine when I open the file every day and save it; however, it is not working with the scheduled updates in the web player - it doesn't add a "data function add rows" action each time. I'm still looking for a way around this... possibly an IronPython script will do the trick.
I am trying to add a new calculated column to a SharePoint list that will show the elapsed days. I enter a name and write a formula like:
=ABS(ROUND(Today-Created;0))
The data type returned from this formula is: Single line of text
When I try to save, I get an error like:
Calculated columns cannot contain volatile functions like Today and Me.
Calculated Column Values Only Recalculate As Needed
The values in SharePoint columns, even in calculated columns, are stored in SharePoint's underlying SQL Server database.
The calculations in calculated columns are not performed upon page load; rather, they are recalculated only whenever an item is changed (in which case the formula is recalculated just for that specific item), or whenever the column formula is changed (in which case the formula is recalculated for all items).
(As a side note, this is the reason why in SharePoint 2010 you cannot create or change a calculated column on a list that has more than the list view threshold of 5000 items; it would require a mass update of values in all those items, which could impact database performance.)
Thus, in order for calculated columns to accurately store "volatile" values like "Me" and "Today", SharePoint would need to somehow constantly recalculate those column values and continuously update the column values in the database. This simply isn't possible.
Alternatives to Calculated Columns
I suggest taking a different approach entirely instead of using a calculated column for this purpose.
Conditional Formatting: You can apply conditional formatting to highlight records that meet certain criteria. This can be done using SharePoint Designer or HTML/JavaScript.
Filtered List views: Since views of lists are queried and generated in real time, you can use volatile values in list view filters. You can set up a list view web part that only shows items where Created is equal to [Today]. Since you can place multiple list view web parts on one page, you could have one section for today's items, and another web part for all the other items, giving you a visual separation.
A workflow, timer job, or scheduled task: You can use a repeating process to set the value of a normal (non-calculated) column on a daily basis. You need to be careful with this approach to ensure good performance; you wouldn't want it to query for and update every item in the list if the list has surpassed the list view threshold, for example. (A rough sketch of this approach follows below.)
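As a rough sketch of the scheduled-task idea, a script run once a day against the SharePoint REST API could stamp the elapsed days into an ordinary number column. The list name ('Tasks'), the column name ('DaysElapsed') and the authentication are placeholders/assumptions you would adapt:

// Assumes the script runs in a context already authenticated against the site
// (a real scheduled task would need its own authentication).
async function updateElapsedDays(siteUrl) {
  const headers = { Accept: 'application/json;odata=verbose' };
  const list = "/_api/web/lists/getbytitle('Tasks')";

  // Read items with their Created date. In a real list you would page/filter
  // this query rather than touch every item (see the caveat above).
  const items = await fetch(siteUrl + list + "/items?$select=Id,Created",
    { headers, credentials: 'include' }).then(r => r.json());

  // A form digest is required for write operations
  const digest = await fetch(siteUrl + "/_api/contextinfo",
    { method: 'POST', headers, credentials: 'include' })
    .then(r => r.json())
    .then(d => d.d.GetContextWebInformation.FormDigestValue);

  for (const item of items.d.results) {
    const days = Math.round((Date.now() - Date.parse(item.Created)) / 86400000);
    await fetch(siteUrl + list + "/items(" + item.Id + ")", {
      method: 'POST',
      credentials: 'include',
      headers: {
        ...headers,
        'Content-Type': 'application/json;odata=verbose',
        'X-RequestDigest': digest,
        'X-HTTP-Method': 'MERGE', // update the existing item in place
        'IF-MATCH': '*'
      },
      body: JSON.stringify({
        __metadata: { type: 'SP.Data.TasksListItem' }, // item type name depends on the list
        DaysElapsed: days
      })
    });
  }
}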
I found some conversations about this issue. Many people suggest creating a new Date and Time column named Today, not visible, with a default value of Today's Date. Then we can use this column in our formulas.
I tried this suggestion and yes, the error is gone and the formula is accepted, but the calculated column's values are wrong. I set the Today column to visible and checked: it was empty; the default value Today's Date was not working. While looking for a solution to this, I carelessly deleted the Today column. Then I realized the calculated column's values were right.
Finally: I don't know what the trick is, but if you create a column named Today before using the Today keyword in your formulas, and delete the Today column after saving your formula, it works.
UPDATE
After Thriggle's answer, I realized this approach doesn't work like a charm. Yes, the formula doesn't cause an error when the calculated column is saved, but it only works correctly the first time; the next day the calculated column shows old values, because its values are static, as Thriggle explained.
I am new to this forum, so I hope I am asking my question in the right place.
I have a problem inserting a datetime into a Google Spreadsheet from a form created in App Inventor 2.
In App Inventor 2 I created a form that fills in a Google spreadsheet. Basically, I merged the Pizza Party example (http://appinventor.mit.edu/explore/ai2/pizzaparty.html) with this example http://puravidaapps.com/spreadsheet.php to use a Google spreadsheet instead of a fusion table.
The user selects in how many minutes he wants his order and then sees all the orders in a table sorted by delivery time.
Problem A)
Firstly, I want to save the current datetime plus the desired delay into the Google spreadsheet and sort the table by this new datetime.
1) When I use the block "call clock format time" + "call clock addminutes", the spreadsheet is populated with text, but then I can't sort the table by delivery datetime. In fact, I believe the sorting is done on the number regardless of the am/pm or the day of the month, so for example instead of 4am, 6am, 2pm, 3pm I get: 2pm, 3pm, 4am, 6am.
2) I then tried removing the block "call clock format time", and in the Google form I kept the field format as text, but the Google spreadsheet is populated with the following:
java.util.GregorianCalendar[time=1395531335908,areFieldsSet=true,lenient=true,zone=Europe/Dublin,firstDayOfWeek=2,minimalDaysInFirstWeek=4,ERA=1,YEAR=2014,MONTH=2,WEEK_OF_YEAR=12,WEEK_OF_MONTH=4,DAY_OF_MONTH=22,DAY_OF_YEAR=81,DAY_OF_WEEK=7,DAY_OF_WEEK_IN_MONTH=4,AM_PM=1,HOUR=11,HOUR_OF_DAY=23,MINUTE=35,SECOND=35,MILLISECOND=908,ZONE_OFFSET=0,DST_OFFSET=0]
3) I then tried removing the block "call clock format time", and in the Google form I changed the field format to time, but then the Google spreadsheet isn't populated with anything.
4) I tried using the segment block, but after a while I realised that the block "format time" actually returns this format: "hh:mm:ss AM/PM", so selecting the first 5 characters is not good enough because it does not take into account the am/pm element or the day of the month.
5) I found a temporary solution by defining the desired delivery time as a new global variable and extracting a string in the format hh:mm by joining the blocks ".hour instant" and ".minute instant". However, this is not a final solution, because what I extracted is of course a string of text, and when sorting, 01:10 will always be considered smaller than 23:50, for example, regardless of the date.
So, is there a way of actually saving in the Google spreadsheet not a string of text, but the actual date and time?
Problem B)
Secondly, I would like to filter/show only the rows of the Google spreadsheet whose delivery time expired no more than 1 hour ago (as well as orders with a delivery time in the future, e.g. 2 hours from now()).
I tried using some Google Visualization API Query Language commands, altering the URL of the Google spreadsheet (something like WHERE "now() - Delivery Time < 60 mins"; I cannot remember the exact code I wrote), but unsuccessfully.
Would anyone know how to filter my results?
thanks in advance
alterettore
So there's a few things to note.
If you're using Taifun's example as you mention, you'll notice that when you submit data to Google Spreadsheets using a form, the first column is always a timestamp, even if you're not submitting a date or time. Trying to send the current date/time is redundant - go ahead and make use of what Google provided.
Google Spreadsheets (and Excel) store Date/Time as a number. If you want to store a date in GS, the best way to do so is not as formatted text, but by sending a number. Use App Inventor to calculate the number you need. For example, today (April 27) in GS is 41756; noon today would be 41756.5.
To generate this number, start with AI's Millisecond function. NOTE: GS and AI both count time from a fixed starting point, but they use different units (GS uses days, AI uses milliseconds) and different zero points, so you have to manipulate the result a bit. The formula I've used in AI in the past is this:
GS Date/Time = (Clock1.GetMillis(Clock1.Now) / 86400000) + 25569
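As a quick sanity check of that formula in plain JavaScript (not AppInventor blocks, shown only to illustrate the arithmetic):

// Milliseconds since 1970-01-01 UTC, divided by 86,400,000 ms per day,
// gives days since the Unix epoch; adding 25,569 shifts to the
// 1899-12-30 epoch that Google Sheets uses for date/time serial numbers.
var gsSerial = Date.now() / 86400000 + 25569;
// e.g. noon UTC on 2014-04-27: 16187.5 + 25569 = 41756.5
// Note this is the UTC serial; adjust if your times are in a local timezone.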
Hope this helps!