In the chat history (summary) page of my app I'm using the getUnreadCount() function on MesiboProfile to get the number of currently unread messages so that I can show an indicator next to the conversation.
The problem is that the count is only correct the first time I read the summary from the read session. If a new message arrives after I have already read the summary, the count is not updated.
I noticed that the counter is corrected if I read the summary again, but is this the recommended way to update it?
I'm using the iOS SDK v1.9.55
In 1.x, the unread count can be updated manually: set it to zero once the user reads the messages, or increment it every time you receive a new message. This avoids database access. Here is the 1.x code that does the same.
Update: you can also use getUnreadMessageCount() on the user or group read session (not the summary session) to get the count from the database.
https://github.com/mesibo/ui-modules-ios/blob/master/Messaging/Messaging/UserListViewController.m#L474
In 2.x, we have moved this into the API with additional logic.
I have created a watch channel on my calendar and I am successfully receiving all updates via Google push notifications.
But I am not able to use that response to get the created/updated events.
I read a few docs and SO questions saying that I need to use X-Goog-Resource-ID from the response and hit the events list API.
But the value of X-Goog-Resource-ID is neither a calendar ID nor an event ID, so how can I use it in the events list API?
I am using Python and a Service Account for the integration.
Documentation:
https://googleapis.github.io/google-api-python-client/docs/dyn/calendar_v3.events.html#list
https://developers.google.com/calendar/api/guides/push#making-watch-requests
Response (headers) from the push notification:
"X-Goog-Channel-Expiration": "",
"X-Goog-Channel-ID": "",
"X-Goog-Channel-Token": "",
"X-Goog-Message-Number": "",
"X-Goog-Resource-ID": <resource id>,
"X-Goog-Resource-State": "exists",
"X-Goog-Resource-URI": <calender UI>
Google Calendar API calls I tried (resource_id holds the X-Goog-Resource-ID value):
service = build('calendar', 'v3', credentials=credentials)
service.calendars().get(calendarId=resource_id).execute()
service.events().list(calendarId=calendar_id, eventId=resource_id).execute()
Is there any reference Python example of digesting a Calendar push notification, or which API/function do I need to call, and with what params, to get the created/updated events?
The X-Goog-Resource-ID header holds a value that identifies that particular resource across the APIs. The push notification basically informs you that something has changed on that calendar.
Now, if you want to know exactly what changed, I strongly advise you to perform a synchronisation. One way to do this is to perform a full synchronisation and store the nextSyncToken. Then, when you receive a push notification telling you about a change in the calendar, you only have to use the syncToken to find out what has changed since your last synchronisation. You can see a full working example in the linked docs.
UPDATE
If you are watching multiple calendars through push notifications, you will need a system in place to track which calendar is being modified at any given time. The X-Goog-Resource-ID header maps to the calendar ID, and it can be used along with the syncToken to run an events.list() request and receive the updated events.
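For reference, here is a rough Python sketch of that flow, using the same service object as in the question. The helper names, and the assumption that you stored the calendar ID and sync token for each watch channel when you created it, are mine and not part of the Calendar API:

def full_sync(calendar_id):
    # Initial full sync: page through all events once and return the
    # nextSyncToken to store alongside the watch channel.
    page_token = None
    while True:
        response = service.events().list(
            calendarId=calendar_id, pageToken=page_token).execute()
        # ... store/process response.get('items', []) as needed ...
        page_token = response.get('nextPageToken')
        if not page_token:
            return response['nextSyncToken']

def incremental_sync(calendar_id, sync_token):
    # Called on each push notification: returns only the events created,
    # updated or deleted since the stored syncToken, plus the new token.
    changed, page_token = [], None
    while True:
        response = service.events().list(
            calendarId=calendar_id, syncToken=sync_token,
            pageToken=page_token).execute()
        changed.extend(response.get('items', []))
        page_token = response.get('nextPageToken')
        if not page_token:
            return changed, response['nextSyncToken']

If events.list() ever returns a 410 GONE error, the stored syncToken has expired and you need to repeat the full sync to get a fresh one.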
I want to create a Telegram bot to send updates to the groups/channels it is added to. I used BotFather to create the bot. However, at https://api.telegram.org/bot<BOTAPI>/getUpdates, I'm getting all the messages sent in a channel, like this: "channel_post":{"message_id":59,"chat":{"id":-1001192794322,"title":"Nseindia","username":"nseindia_updates","type":"channel"},"date":1588581996,"text":"AMBUJACEM : Bear\nAPOLLOHOSP : Bullish Reversal\nKOTAKBANK : Bullish\nMOTHERSUMI : Bear"}}
This is not a problem now, but as time goes on, the JSON response could get very large and could pose a problem.
Is there any way to avoid getting all the old messages in the JSON returned by https://api.telegram.org/bot<BOTAPI>/getUpdates?
You should pass an offset parameter to getUpdates equal to the update_id of the latest update you've processed plus one. That marks all updates with a lower or equal update_id as processed, so they won't come up the next time you call getUpdates.
In Telegram's Bot API docs it says:
By default, updates starting with the earliest unconfirmed update are returned. An update is considered confirmed as soon as getUpdates is called with an offset higher than its update_id.
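For example, a minimal long-polling loop in Python using the requests library (the handler comment is just a placeholder for your own processing):

import time
import requests

API = "https://api.telegram.org/bot<BOTAPI>"  # your bot token

offset = None
while True:
    params = {"timeout": 30}              # long polling
    if offset is not None:
        params["offset"] = offset         # confirms everything already processed
    updates = requests.get(API + "/getUpdates", params=params).json().get("result", [])
    for update in updates:
        # ... handle update.get("channel_post") or update.get("message") here ...
        offset = update["update_id"] + 1  # next call will skip this update
    time.sleep(1)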
I am using the Change Feed Processor library to read the Change Feed on a partitioned collection, and below is how I have configured it. I am mostly using the default options.
ChangeFeedProcessorOptions feedProcessorOptions = new ChangeFeedProcessorOptions
{
    LeaseRenewInterval = TimeSpan.FromSeconds(15),
};
var docObserverFactory = DocumentFeedObserverFactory.Create(this.destinationCollectionInfo, this.dbRepository);
this.builder
.WithHostName(hostName)
.WithFeedCollection(this.monitoredCollectionInfo)
.WithLeaseCollection(this.leaseCollectionInfo)
.WithProcessorOptions(feedProcessorOptions)
.WithObserverFactory(docObserverFactory);
This runs fine as long as the Change Feed application is running and documents are being inserted/updated in the collection; the Change Feed app picks them up as expected.
The problem happens when I stop the Change Feed app for some time and insert/update a few documents in the collection. When I start the Change Feed app again, it doesn't pick up changes from where it left off. The changes that were inserted while the Change Feed app was stopped are lost. But when I set the StartFromBeginning flag to true, it picks up everything from the start, including the changes that were inserted while the app was stopped.
My understanding of reading from current (StartFromBeginning set to false) is that the Change Feed picks up documents from where it last left off. But that doesn't seem to happen. Please help.
There are two ways to continue from exactly where you left it.
The first, and more accurate one, is to store the Continuation token of the last thing you read. That way you can specify it when you start again and it will win over both the StartTime and the StartFromBeginning flags.
The second one is to provide the StartTime property, which will try to find the continuation token for a given time automatically. It has approximately five-second precision, so there is a chance that you might miss some documents though.
Given the starting time/date and duration, how can I make a server-side calculation that determines whether an object is "finished", "in progress", or "upcoming"?
--Show
--duration: "144"
--startDate: "2015-11-10"
--startTime: "14:00"
--status: "?"
Client-side JavaScript to determine if the show has started yet:
// if negative, then show hasn't started yet
var time = (-(startdate.getTime() - currentdate.getTime()) / 1000 / 60);
Client-side JavaScript to determine if the show has finished running yet:
// if negative, then show has finished
var timeLeft = channelDuration - timerStartTime;
There is no way to run your own server-side code on Firebase. See:
Common Firebase application architectures
Firebase Hosting with own server node.js
How would I run server-side code in Firebase?
How to write custom code (logic) when using firebase
But you can store a server-side timestamp, which seems to be what you're trying to do:
ref.child('Show/startTimestamp').set(Firebase.ServerValue.TIMESTAMP);
You can then get the shows that haven't started yet with:
var shows = ref.child('Shows');
shows.orderByChild('startTimestamp').startAt(Date.now()).on(...
For anyone passing by: Firebase now lets you do this with Cloud Functions. For this case, you can create a function that determines the status from the other parameters when the data is added to your database.
Please check out
https://firebase.google.com/docs/functions/
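Whichever way you run it server-side, the status calculation itself only needs the current time. Here is a rough sketch of that logic, shown in Python purely for illustration; the assumption that duration is in minutes and that startDate/startTime are in the server's timezone is mine, based on the fields shown in the question:

from datetime import datetime, timedelta

def show_status(start_date, start_time, duration_minutes, now=None):
    # e.g. show_status("2015-11-10", "14:00", 144) -> "finished"
    now = now or datetime.now()
    start = datetime.strptime(start_date + " " + start_time, "%Y-%m-%d %H:%M")
    end = start + timedelta(minutes=int(duration_minutes))
    if now < start:
        return "upcoming"
    if now < end:
        return "in progress"
    return "finished"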
I have not seen any discussion or awareness so far that Firebase does in fact make available a unique identifier (in fact the full URL) to each specific data record via the "snapshot" it returns, i.e. the wrapper around the data record (accessed via snapshot.val()). By doing a basic property examination of the snapshot I discovered that the unique URL is available (see examples below). However, it seems that, for some reason, Firebase keeps changing the name of the key every few days, causing my application to break. I have to go in, re-discover the new URL property key, and change it so that it works again.
Here are three examples of how I have seen the key change so far. Each value is the same, but the key keeps changing over time (i.e. "Wb", "Xb", "bc"):
getMemberBySnapshot - snapshot has prop Wb with value https://prototype1.firebaseio.com/users/-IwohKfw1l5F3gFqyJJ5
getMemberBySnapshot - snapshot has prop Xb with value https://prototype1.firebaseio.com/users/-IwohKfw1l5F3gFqyJJ5
getMemberBySnapshot - snapshot has prop bc with value https://prototype1.firebaseio.com/users/-IwohKfw1l5F3gFqyJJ5
I have read Firebase's suggestions that developers should use an email address if they want a unique key (what if my model does not use an email field? What if a user wants to change their email?), or, alternatively, to retrieve all existing records and then search through them on the client. Neither of these solutions is satisfying. But I'm seeing that they do provide the unique URL to each data record in the 'snapshot'. Why do they not provide a stabilized key so that a developer can call it consistently?
Firebase.js is a compiled script. The names of internal variables will change every time we compile it and release a new version, so you should definitely not be relying on any properties that are not documented on our website.
For your specific case, you should be using:
snapshot.ref().toString()
in order to get the URL.