Telegram, tracking message edit/delete and editing my own messages (Client, not Bot API)

So I'm trying to implement proper logging of Telegram chats into my ELK storage, and the existing solution built on tgcli is too old (I also have a PoC which logs message edits from the Android client via Xposed, but it's implemented at the UI level and is ineffective).
I need to receive message edits/deletions, and do it with the client Telegram API.
I spent a day researching it:
support for editing messages appeared on May 15, 2016 (Telegram blog)
telegram-cli's tgl library is 2 years old and most likely has no support for that layer
I looked into the telegramdesktop source as it was very promising; unfortunately, its git history has no scheme changes pointing to edit support.
And the official layer version list is truncated. Security through obscurity, eh?
from some tests done with the Go library used in shelomentsevd/telegramgo, edits in a supergroup are reported via a TL_updateChannelTooLong message
Now I don't want to lose more time picking libraries/sources, so I'm asking about experience with any of the following libraries. I'm looking for exactly one library that will let me implement the required features fast, for someone who doesn't want to dive deep into MTProto's specifics.
sochix/TLSharp is missing explicit examples of receiving edits; it would probably be hard
danog/MadelineProto seems like a good place to start
there are also tdlib, libqtelegram, TelegramAPI

It's much easier to do this with Telethon.
Here is sample code I've put together from snippets taken directly from the docs.
from telethon import TelegramClient, events

API_ID = ...
API_HASH = " ... "

client = TelegramClient('session', api_id=API_ID, api_hash=API_HASH)

@client.on(events.MessageDeleted)
async def deleted_handler(event):
    # Log all deleted message IDs
    for msg_id in event.deleted_ids:
        print('Message', msg_id, 'was deleted in', event.chat_id)

@client.on(events.MessageEdited)
async def edited_handler(event):
    # Log the date of new edits
    print('Message', event.id, 'changed at', event.date)

with client:
    client.run_until_disconnected()
Docs for: MessageEdited, MessageDeleted.

Related

Missing videoTrack in a multitrack stream in Ant media server 2.4.1

We have a multitrack web conference implementation using AMS version 2.4.1. It's working great for our use case, except in one scenario: when there are N (< 3) users and they turn on their cameras simultaneously, a few remote users are not rendered, as we don't receive the video tracks for those users in newStreamAvailable. We only receive the audio track for those users. We are able to reproduce this quite frequently.
As a backup, I am trying to poll AMS using getTrackList with the main track ID to get all available streams, but I am not getting any trackList message back:
var jsCmd = {
    command: "getTrackList",
    streamId: streamId,  // this is roomId or main track id
    token: token
};
Any insight would be helpful.
Thanks,
We were able to resolve the issue; posting here to help anyone who might be facing something similar.
With push notifications from the server, we might encounter issues when, for some reason, a push operation doesn't succeed. In that case, it's better to have a backup plan to pull from the server and sync.
Ant Media Server suggests polling the server periodically for the room info. The server will respond with the active streams, and the application should synchronize.
For reference, see https://resources.antmedia.io/docs/webrtc-websocket-messaging-reference

telegram use schedule message

I want to schedule a Telegram bot message to be sent at a specific unixtime.
According to Telegram's official API (https://core.telegram.org/api/scheduled-messages), that should be possible by setting the schedule_date flag.
To schedule a message, simply provide a future unixtime in the schedule_date flag of messages.sendMessage or messages.sendMedia.
However, I was not able to set that flag. To be more precise, I do not even know how to set a flag, or whether I am using the correct API.
What I have tried is to use the API directly via the browser (could use curl as well), like so: https://api.telegram.org/botBOT:TOKEN/sendMessage?chat_id=ID&text=Test&schedule_date=1653503351
I also did not find any way to access this flag via https://pypi.org/project/pyTelegramBotAPI/#description, https://telepot.readthedocs.io/en/latest/#send-a-message, or https://github.com/nickoala/telepot.
I want to implement this feature in a python environment, but any working suggestion would be much appreciated.
EDIT:
I decided to store the intention to send a Telegram bot message at a certain unixtime in a database. I then run an infinite loop that checks whether there are any unsent messages due before the current timestamp. If the loop detects such a message, it sends it and sets a flag marking that message as sent (a minimal sketch of this loop appears below).
And as promised, here is a fully dockerized example of that behaviour in action: https://github.com/Sokrates1989/nameTheCountDown-lightweight
It creates a bot to which you can pass a name and a duration. Once the duration has passed, it sends a message with that name: basically a simple countdown that you can give several names, running simultaneously. As it is a Telegram chat, you can modify the way you are informed about the end of a countdown by changing the notification settings of that chat.
And here is the Bot in action: http://t.me/NameTheCountdownBot
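For reference, a minimal sketch of that database-polling loop, assuming a hypothetical SQLite table scheduled_messages(chat_id, text, send_at, sent) and the plain Bot API sendMessage endpoint from the question; the token and the poll interval are placeholders:

import sqlite3
import time

import requests

BOT_TOKEN = "BOT:TOKEN"  # placeholder, as in the question
API_URL = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"

db = sqlite3.connect("scheduler.db")
db.execute("""CREATE TABLE IF NOT EXISTS scheduled_messages (
                  id INTEGER PRIMARY KEY,
                  chat_id INTEGER,
                  text TEXT,
                  send_at INTEGER,        -- unixtime at which to send
                  sent INTEGER DEFAULT 0)""")
db.commit()

while True:
    now = int(time.time())
    # Select all messages that are due and not yet sent
    rows = db.execute(
        "SELECT id, chat_id, text FROM scheduled_messages "
        "WHERE send_at <= ? AND sent = 0", (now,)).fetchall()
    for row_id, chat_id, text in rows:
        # Send via the plain Bot API endpoint shown in the question
        requests.post(API_URL, data={"chat_id": chat_id, "text": text})
        # Flag the row so the message is not delivered twice
        db.execute("UPDATE scheduled_messages SET sent = 1 WHERE id = ?", (row_id,))
        db.commit()
    time.sleep(10)  # poll interval (placeholder)

A production version would want error handling around the HTTP request; the structure of the loop is the point here.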
We can't do this with the Bot API itself; there is no schedule_date parameter in the sendMessage method:
https://core.telegram.org/bots/api#sendmessage
What you've read is for Telegram clients, not Bot API consumers.
If you don't really need unixtime, you can simply create a table for scheduled messages with text, chat_id and publish_time columns (like 22:15), and run a command every minute to check whether there's a message to send for the current time. Then send the message and delete the record.
Note that the python-telegram-bot library has a built-in solution for scheduling tasks: the JobQueue. This feature is based on the APScheduler library, which you can of course also use without python-telegram-bot.
Disclaimer: I'm currently the maintainer of python-telegram-bot.
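A minimal sketch of the JobQueue approach mentioned above, assuming python-telegram-bot v20+ installed with the job-queue extra (python-telegram-bot[job-queue]); the token, chat id and unixtime are placeholders:

import datetime

from telegram.ext import Application, ContextTypes

BOT_TOKEN = "BOT:TOKEN"   # placeholder
CHAT_ID = 123456789       # placeholder chat id

async def send_scheduled(context: ContextTypes.DEFAULT_TYPE):
    # The job carries the chat id and text we attached when scheduling it
    await context.bot.send_message(chat_id=context.job.chat_id, text=context.job.data)

application = Application.builder().token(BOT_TOKEN).build()

# Schedule a one-off job at a specific unixtime
when = datetime.datetime.fromtimestamp(1653503351, tz=datetime.timezone.utc)
application.job_queue.run_once(send_scheduled, when=when, chat_id=CHAT_ID, data="Test")

application.run_polling()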
https://core.telegram.org/method/messages.sendScheduledMessages
Now you can send scheduled messages right away
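Since schedule_date is an MTProto-level flag rather than part of the Bot API, a client library has to set it. A minimal sketch with Telethon (the library used in the answer to the first question above), which exposes the flag via the schedule parameter of send_message; API_ID, API_HASH and the unixtime are placeholders:

import datetime

from telethon import TelegramClient

API_ID = ...        # placeholder
API_HASH = '...'    # placeholder

client = TelegramClient('session', api_id=API_ID, api_hash=API_HASH)

async def main():
    # Schedule the message for a specific unixtime
    when = datetime.datetime.fromtimestamp(1653503351, tz=datetime.timezone.utc)
    await client.send_message('me', 'Test', schedule=when)

with client:
    client.loop.run_until_complete(main())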

How do you "restart" receiving Firebase Alerts on short code 44398

I often use Firebase phone auth in my applications. I have one user who religiously sends "STOP" to all text messages. Big oops. Now she cannot log in to any application using the system...
Short code phone # = 44398
If the user types "STOP" to that short code, the system responds with:
Firebase: You are opted out and will receive no further messages. For
HELP, reply HELP. Msg & Data rates may apply
Type "HELP", the response is:
Firebase: For more info: https://firebase.google.com/support/ -
Msg&Data rates may apply.
My question: how do you "RESTART" the service? The Firebase support page offers no help here.
I've tried "GO", "RESTART", "UNSTOP". All of those fail.
Here's a posting on how Twilio addresses the topic. Twilio uses START, YES and UNSTOP to restart a service (on a long code source). Each of those fails here.
Twilio also provides a link to standards for short code expressions, but I'm not seeing anything on restarting a service.
Here is a screenshot (of my phone):

Can Cloud Functions for Firebase be used across projects?

I was hoping to trigger a Pub/Sub function (using functions.pubsub / onPublish) whenever a new Pub/Sub message is sent to a topic/subscription in a third-party project, i.e. cross-project.
After some research and experimentation I found that TopicBuilder throws an error if the topic name contains a / and it defaults to "projects/" + process.env.GCLOUD_PROJECT + "/topics/" + topic (https://github.com/firebase/firebase-functions/blob/master/src/providers/pubsub.ts).
I also found a post in Stack Overflow that says that "Firebase provides a (relatively thin) wrapper around Google Cloud Functions"
(What is the difference between Cloud Function and Firebase Functions?)
This led me to look into Google Cloud Functions. Whilst I was able to create a subscription in a project I own to a topic in a third-party project (after changing permissions in IAM), I could not find a way to associate a function with that topic. Nor was I successful in associating a function with a topic and subscription in a third-party project. In the console I only see the topics in my own project, and I had no success using gcloud.
Has anyone had any success in using a function across projects and, if so, how did you achieve this? Is there a documentation URL you could provide? If a function can't be triggered by a message to a topic and subscription in a third-party project, can you think of a way that I could ingest third-party Pub/Sub data?
As Pub/Sub fees are billed to the project that contains the subscription, I would prefer that the subscription resides in the third-party project with the topic.
Thank you
Google Cloud Functions currently does not allow a function to listen to a resource in another project. For Cloud Pub/Sub triggers specifically, you can get around this by deploying an HTTP function and adding a Pub/Sub push subscription (pointing at that function's URL) to the topic that you want to fire the cross-project function.
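A minimal sketch of that workaround, assuming an HTTP-triggered Cloud Function (Flask-style request object) used as the push endpoint of a Pub/Sub push subscription; the function name is hypothetical:

import base64
import json

def pubsub_push_handler(request):
    # HTTP Cloud Function acting as the push endpoint of a Pub/Sub push subscription
    envelope = request.get_json(silent=True)
    if not envelope or 'message' not in envelope:
        return ('Bad Request: invalid Pub/Sub push envelope', 400)

    message = envelope['message']
    # The payload published to the topic arrives base64-encoded in message['data']
    data = base64.b64decode(message.get('data', '')).decode('utf-8')
    attributes = message.get('attributes', {})

    print('Received message:', data, 'attributes:', json.dumps(attributes))
    # Returning a 2xx status acknowledges the message to Pub/Sub
    return ('', 204)

The push subscription on the third-party topic would then be configured with this function's HTTPS URL as its push endpoint.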
A Google Cloud Function can't be triggered by a subscription to a topic of another project (since you can't subscribe to another project's topic).
But a Google Cloud Function can publish to a topic of another project (and then subscribers of that topic will be triggered).
I solved it by establishing a Google Cloud Function in the original project which listens to the original topic and reacts by publishing to a new topic in the other project. For this, the service account (...#appspot.gserviceaccount.com) of this function "in the middle" needs to be authorized on the new topic (console.cloud.google.com/cloudpubsub/topic/detail/...?project=...), i.e. add the principal with the role "Pub/Sub Publisher".
import base64
import json
import os

from google.cloud import pubsub_v1

# https://cloud.google.com/functions/docs/calling/pubsub?hl=en#publishing_a_message_from_within_a_function

# Instantiates a Pub/Sub client
publisher = pubsub_v1.PublisherClient()

def notify(event, context):
    project_id = os.environ.get('project_id')
    topic_name = os.environ.get('topic_name')

    # References an existing topic
    topic_path = publisher.topic_path(project_id, topic_name)

    message_json = json.dumps({
        'data': {'message': 'here would be the message'},  # or you can pass the message of event/context
    })
    message_bytes = message_json.encode('utf-8')

    # Publishes a message
    try:
        publish_future = publisher.publish(topic_path, data=message_bytes)
        publish_future.result()  # Verify the publish succeeded
        return 'Message published.'
    except Exception as e:
        print(e)
        return (e, 500)
Google Cloud Endpoints can be an easier solution for adding auth to the HTTP function.

Python receive Google Drive push notification

Since the Drive SDK v3 we are able to receive push notifications from Google Drive whenever a file has changed. At the moment I'm working on a Drive application in Python and I would like to receive such notifications. Do I really need a web server for this or can I implement this maybe with a socket or something like this?
I know that I can get changes by polling the changes.list method, but I want to avoid this because of the many API calls it requires. Is there maybe a better way to get informed when a file has changed?
EDIT: I captured my web traffic and saw that the official Google Drive client for Windows uses push notifications. So in some way it must be possible to get push notifications in a desktop application, but is this maybe some sort of Google magic which we can't use with the current API?
For Google Drive apps that need to keep track of changes to files, the Changes collection provides an efficient way to detect changes to all files, including those that have been shared with a user. The collection works by providing the current state of each file, if and only if the file has changed since a given point in time.
Retrieving changes requires a pageToken to indicate a point in time to fetch changes from.
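The initial token can be fetched once with changes.getStartPageToken and persisted per user; a minimal sketch, assuming a drive_service client has already been built:

# Fetch the starting page token and save it for later polling
response = drive_service.changes().getStartPageToken().execute()
saved_start_page_token = response.get('startPageToken')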
# Begin with our last saved start token for this user or the
# current token from getStartPageToken()
page_token = saved_start_page_token
while page_token is not None:
    response = drive_service.changes().list(pageToken=page_token,
                                            fields='*',
                                            spaces='drive').execute()
    for change in response.get('changes'):
        # Process change
        print('Change found for file: %s' % change.get('fileId'))
    if 'newStartPageToken' in response:
        # Last page, save this token for the next polling interval
        saved_start_page_token = response.get('newStartPageToken')
    page_token = response.get('nextPageToken')
