Retrieve a list of deleted Clockify time entry ids

We're syncing time entries through to an ERP system via the API, but if a time entry has been deleted in Clockify, we need to remove it from the ERP as well. How can we find deleted time entry ids from the Clockify API?

Without knowing more about the ERP you are using (and without having used an ERP myself), it should be straightforward to check individual entry IDs from your ERP against Clockify using the "get single time entry" API call. Furthermore, if you enable the Clockify feature that freezes entries after a certain amount of time, you can ensure that you only need to check entries from, say, the past two weeks.
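A minimal sketch of that check, assuming the Clockify v1 "get a time entry" endpoint and that a missing entry answers 404 (the workspace id and API key are placeholders you would fill in):

```python
import urllib.error
import urllib.request

API_BASE = "https://api.clockify.me/api/v1"  # Clockify's public API base URL

def entry_exists(workspace_id, entry_id, api_key):
    """Return True if Clockify still has the time entry; a 404 means it was deleted."""
    req = urllib.request.Request(
        f"{API_BASE}/workspaces/{workspace_id}/time-entries/{entry_id}",
        headers={"X-Api-Key": api_key},
    )
    try:
        with urllib.request.urlopen(req):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise  # any other status is a real error, not a deletion

def deleted_ids(candidate_ids, exists):
    """Given entry ids known to the ERP and an existence check,
    return the ids that should be removed from the ERP."""
    return [eid for eid in candidate_ids if not exists(eid)]
```

With the freeze feature enabled, `candidate_ids` only needs to contain entries from the unfrozen window, which keeps the number of API calls small.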

Related

What Firebase mechanism to use for system logging, Fabric, Firebase Analytics or RealTime DB

I have trouble understanding my data in the Firebase Realtime Database; it is not always clear or easy to navigate and query.
The database entries are hard to read and interpret, and adding extra data to make them clearer would increase storage costs.
Extra data added for a limited duration cannot be cleared out easily, and it is not visible when database read operations are performed. Dates are stored as integers and cannot be interpreted at a glance; I have to use the bash command "date -r ", etc.
Clearly the database alone is not enough to debug the flow of events, and as the database grows, analysing the data will only get harder.
I have no record of the sequence of events that produced the database entries.
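As an aside, the "date -r" step can be replaced by a couple of lines of Python, assuming the integers are millisecond epoch timestamps (which is what Firebase's ServerValue.TIMESTAMP produces):

```python
from datetime import datetime, timezone

def human_time(epoch_ms):
    """Render a Firebase-style millisecond timestamp as a readable UTC string."""
    return datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc).strftime(
        "%Y-%m-%d %H:%M:%S"
    )
```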
Possible solutions:
(1) Use Realtime Database
I could create another Realtime Database "node" and log all the events to it in a human-friendly way. This way I have control over the data and can clear it whenever I wish, minimising the storage costs. The only problem I see is that I will have to remember to clear this data periodically (perhaps Firebase has a scheduler that can call some process), or trigger the cleanup from the mobile clients.
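The periodic cleanup itself is simple bookkeeping. A sketch of the pruning logic, assuming each log record under the node carries a "ts" millisecond timestamp (in practice this could run from a scheduled server-side job, which would then delete the returned keys):

```python
def prune_log(entries, now_ms, retention_days=7):
    """Given a dict of log entries (push-id -> record with a 'ts' millisecond
    timestamp), return the keys older than the retention window, i.e. the
    children that should be deleted from the log node."""
    cutoff = now_ms - retention_days * 24 * 60 * 60 * 1000
    return [key for key, rec in entries.items() if rec["ts"] < cutoff]
```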
(2) Fabric
My other option is to use Fabric's Answers, but after looking at how this data is reported, it does not really suit my needs: there is no filtering, and the messages are not as detailed as I expected.
(3) Firebase Analytics
I am not sure about Firebase Analytics. I see no mechanism to clear events; will this add to my costs? Will it be easy enough to filter and query logs to analyse a certain sequence of events?
Typically I would like to see data something like this:
data_time_user_friendly,user_id,user_friendly_id,event_action,payload
What is the best practice for a remote syslog to analyse my data and flow of events?
Edit:
After some searching, I found there are numerous products that seem more suitable and are specifically developed for logging.
I did a quick comparison of some "free" tiers for system logging:
Papertrail: 100 MB/month, 7-day retention, 48-hour search
Loggly: 200 MB/day, 7-day retention
Bugfender: 100K lines/day, 1-day retention
Logz.io: 1 GB/day, 3-day retention
This is just a quick comparison; I have not evaluated any of the options.
Does Firebase have a solution for this or is it best to use one of the above mentioned products?

Project deletion date in Google API Console

As specified in the Google Platform Console Help, there is a 7-day waiting period before a project is removed once it has been selected for deletion, and this period can vary depending on the billing setup.
This period has expired, and I want to know the deletion date and the expected date for the operation to complete. Is this possible from the Google API Console? I need this in order to create more Firebase projects: I have reached the limit and cannot create new projects until the older ones are removed.
It can now take up to 30 days for a project to be completely removed. From the Google documentation: "Note that after the 7-day waiting period ends, the time it takes to completely delete a project may vary."

Firebase: Data structuring query

I am a newbie to Firebase and need some suggestions on structuring data.
I am designing a database for an application in which multiple people may share a bank account and update its status. The group of people sharing an account may also keep changing, so several people may perform actions that affect an account's balance. I decided to list the linked accounts under each user, so that a single pull is enough to get a list of all of a user's accounts at login. If the user is interested in the details of a specific account (like its balance), I then fetch that child from the accounts sub-tree. This all seems fine until I think about notifying users in a smartphone app when the balance of any of their linked accounts changes: since the balance attribute is not under any specific "user" sub-tree, how do I monitor for this change at the application level?
I don't want to move the balance attribute under the "user" sub-tree, or I will have to find all the duplicate copies and update each of them whenever an account's balance changes. Moreover, this approach would not scale well.
Any suggestions?
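One common pattern for this shape of data is to keep the balance only under /accounts, store just an index of account ids under each user, and have the app attach one listener per indexed account. A sketch of the client-side bookkeeping, with hypothetical node names:

```python
def balance_paths(user_node):
    """Given a user's subtree, whose 'accounts' child is an index of
    accountId -> true, return the database paths the app should attach
    value listeners to in order to be notified of balance changes."""
    account_index = user_node.get("accounts", {})
    return [f"/accounts/{aid}/balance" for aid in sorted(account_index)]
```

Because the user node holds only ids, there are no duplicate balance copies to keep in sync; when the shared group changes, only the index entries are added or removed, and the app re-attaches listeners accordingly.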

Sending notifications according to database value changes

I am working on a vendor portal. An owner of a shop will log in, and in the navigation bar (similar to Facebook) I would like the number of items sold to appear INSTANTLY, WITHOUT ANY REFRESH. On Facebook, new notifications pop up immediately. I am using SQL Azure as my database. Is it possible to detect a change in the database and INSTANTLY INFORM the user?
Part 2 of my project will be a mobile phone app for the vendor, and I would like the same notification mechanism there. In that case, would I be correct to look into push notifications and apply them?
At the moment my main aim is to solve the problem in paragraph 1. I am able to retrieve the number of notifications, but how on earth is it possible to show the changes INSTANTLY? Thank you very much.
First you need to define what INSTANT means to you. For some, it means within a second 90% of the time. For others, they would be happy to have a 10-20 second gap on average. And more importantly, you need to understand the implications of your requirements; in other words, is it worth it to have near zero wait time for your business? The more relaxed your requirements, the cheaper it will be to build and the easier it will be to maintain.
You should know that near-real-time notification can be very expensive in terms of computing and locking resources. The more often you refresh, the more web round-trips are needed (even if they are minimal in this case). Keeping data fresh to the second can also be costly to the database, because you are potentially creating a high volume of requests, which in turn could affect otherwise well-performing queries. For example, if your website runs with 1,000 users logged on, you may need 1,000 database requests per second (assuming that's your definition of INSTANT), which could in turn create a throttling condition in SQL Azure if not designed properly.
An approach I used in the past, for a similar requirement (although the precision wasn't to the second; more like to the minute) was to load all records from a table in memory in the local website cache. A background thread was locking and refreshing the in memory data for all records in one shot. This allowed us to reduce the database traffic by a factor of a thousand since the data presented on the screen was coming from the local cache and a single database connection was needed to refresh the cache (per web server). Because we had multiple web servers, and we needed the data to be exactly the same on all web servers within a second of each other, we synchronized the requests of all the web servers to refresh the cache every minute. Putting this together took many hours, but it allowed us to build a system that was highly scalable.
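The shape of that cache is easier to see in code. A minimal sketch in Python (the original stack was SQL Azure behind web servers, so treat this as pseudocode for the pattern; `fetch_all` stands in for the single query that loads all records):

```python
import threading
import time

class NotificationCache:
    """In-memory cache refreshed by one background thread, so page views read
    from memory instead of hitting the database on every request."""

    def __init__(self, fetch_all, interval_seconds=60):
        self._fetch_all = fetch_all      # the one query that loads all records
        self._interval = interval_seconds
        self._lock = threading.Lock()
        self._data = {}

    def refresh(self):
        fresh = self._fetch_all()        # single database round-trip per cycle
        with self._lock:                 # swap the snapshot in atomically
            self._data = fresh

    def get(self, user_id):
        with self._lock:
            return self._data.get(user_id, 0)

    def start(self):
        def loop():
            while True:
                self.refresh()
                time.sleep(self._interval)
        threading.Thread(target=loop, daemon=True).start()
```

With 1,000 logged-on users, every page view calls `get()` against memory, and only the refresh loop touches the database, which is the thousand-fold reduction described above.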
The above technique may not work for your requirements, but my point is that the higher the need for fresh data, the more design/engineering work you will need to make sure your system isn't too impacted by the freshness requirement.
Hope this helps.

Fastest way to get basic information from google analytics api

My GA account has a number (50) of profiles associated with it, and I am trying to build an API which shows me basic information like visits, bounce rates, etc. for each profile.
This query gets me what I want from GA, but for each profile:
URL ="https://www.google.com/analytics/feeds/data?ids=ga:11111&start-date=2011-07-01&end-date=2011-07-02&metrics=ga:visitors&prettyprint=true&alt=json"
The id is table id and the metrics gives me the information I want.
Now the problem is, I want to show all the information together. So every time, I would have to send 50 requests to the API, which just doesn't work out. Is there a way I can get the information for all the profiles associated with my account in a single request?
You unfortunately will be required to perform 50 requests if you want metrics for 50 different profiles. You can easily automate this, however, by using a combination of the Management API and the Data Export API.
The Management API allows you to pull information about the account. For example, you can very easily pull all profile IDs and names associated with an Analytics account through this API for use in an automated query.
The Data Export API, which I am sure you already are familiar with, is the only way to pull collected data/statistics for individual profiles.
If you are concerned about speed, you might want to build an automated process that uses both the Management API and the Data Export API. Pull all of the profiles associated with your account with the Management API, then loop through each and pull the basic data you'd like through the Data Export API. Have this run at regular intervals based on your needs and cache it between runs. This way it won't execute every time the page is hit (though you honestly might be fine, depending on your traffic - I've found it to be extremely quick).
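The loop itself is straightforward. A hedged sketch of the aggregation step: `profile_ids` would come from the Management API, and `fetch_metrics` is a hypothetical wrapper around the Data Export query shown in the question, returning the parsed metrics for one profile:

```python
def collect_metrics(profile_ids, fetch_metrics):
    """Loop over profile ids, pulling each profile's metrics with one Data
    Export API call, and merge the results into a single report."""
    report = {}
    for pid in profile_ids:
        report[pid] = fetch_metrics(pid)  # one API request per profile
    return report

def total_visitors(report):
    """Roll the per-profile figures up into one account-wide number."""
    return sum(metrics["visitors"] for metrics in report.values())
```

Run `collect_metrics` on a schedule and cache `report` between runs, so the 50 requests happen in the background rather than on every page hit.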
