Why is my export sink from Stackdriver only loading the latest audit logs into BigQuery and not historical ones? - stackdriver

I created an export sink in Stackdriver to load audit logs into BigQuery. I want to be able to see audit logs from the past 3 months. However, when I query the tables in BigQuery, I only see logs from today and nothing earlier.
I applied the following filters to my export sink. I also tried removing the timestamp filter, but I still see only logs from today and none prior.
resource.type="bigquery_dataset"
timestamp > "2019-05-01T23:59:09.739Z"

Exports only work for new entries.
Per the documentation:
"exporting happens for new log entries only, you cannot export log entries that Logging received before your sink was created."
https://cloud.google.com/logging/docs/export/#how_sinks_work
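As a workaround for the history you already have, you can backfill: read the older entries through the Logging API and load them into BigQuery yourself. A minimal sketch, assuming the google-cloud-logging and google-cloud-bigquery client libraries; the destination table name is a placeholder and the table must already exist with a matching schema:

from google.cloud import bigquery
from google.cloud import logging as cloud_logging

log_client = cloud_logging.Client()
bq_client = bigquery.Client()

# Pull the historical entries the sink never saw, using the same filter.
entries = log_client.list_entries(
    filter_='resource.type="bigquery_dataset" '
            'AND timestamp > "2019-05-01T23:59:09.739Z"'
)

rows = [{"timestamp": e.timestamp.isoformat(), "payload": str(e.payload)}
        for e in entries]

# Stream the rows into a placeholder backfill table.
errors = bq_client.insert_rows_json("my_dataset.audit_logs_backfill", rows)
if errors:
    print("BigQuery insert errors:", errors)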

Related

Google Cloud Function logs showing error icon for Info log types

I am new to Google Cloud and to Stackdriver Logging. I deployed a Python script as a Cloud Function and enabled the built-in logging for Python. I set multiple log statements based on the execution.
logging.info("Cloud function was triggered at {}".format(datetime.utcnow()))
While viewing the Cloud Function logs, I see many entries, even the statement above for which I set INFO as the log level, displayed with the !! icon, which according to the legend should appear only for error types.
As I am new to Stackdriver logging, I am not sure of the reason; can anyone please explain? Thanks.
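A likely cause: Python's standard logging module writes to stderr by default, and Stackdriver marks stderr output from Cloud Functions as error-level regardless of the Python log level. A minimal sketch, assuming the google-cloud-logging library, that routes Python levels to the matching Stackdriver severities:

import logging
from datetime import datetime

import google.cloud.logging

# Attach the Cloud Logging handler so logging.info() maps to INFO severity
# instead of going to stderr (which is treated as an error).
client = google.cloud.logging.Client()
client.setup_logging()

def main(request):  # hypothetical HTTP Cloud Function entry point
    logging.info("Cloud function was triggered at {}".format(datetime.utcnow()))
    return "ok"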

How many clients are connected to my firestore?

I am working on a Flutter app that fetches 341 documents from Firestore. After 2 days of analysis I found out that my read requests are increasing too much. So I made a chart in the Stackdriver metrics explorer, from which I learned that my app reads the 341 docs only a single time; it's the Firebase console that is increasing my reads.
Now, here are the questions that are bothering me:
1) How are reads counted when we view data in the console, and how can I reduce my read requests? Basically there are 341 docs, but it shows more than 600 reads whenever I refresh my console.
2) As you can see in the picture, there are two types of document reads, 'LOOKUP' and 'QUERY'; what is the exact difference between them?
3) I am getting data from Firestore with a single instance, and when I open my app the chart shows 1 active client, which is cool, but in the next 5 minutes the number of active clients starts to increase.
Can anybody please explain to me why this is happening?
For the last question, I tried disabling all the service accounts and then opened my app again, but saw the same thing again.
Firestore.instance.collection("Lectures").snapshots(includeMetadataChanges: true).listen((d) {
  print(d.metadata.isFromCache); // prints false every time
  print(d.documents.length); // 341
  print(d.documentChanges.length); // 341
});
This is the snippet I am using. When the app starts it runs only once.
I will try to answer your questions:
How are reads counted when we view data in the console, and how can I reduce my read requests? Basically there are 341 docs, but it shows more than 600 reads whenever I refresh my console.
Reads are counted based on how you query your Firestore database, plus your access to the database from the console. Using the Firebase console itself incurs reads, and even if you leave the console open while doing other things, new changes to the database will automatically incur reads as well. Any document read from the server is billed; it doesn't matter where the read came from, and the console is included in that.
Check the official documentation: under the "Manage data" title you can see there is a note: "Note: Read, write, and delete operations performed in the console count towards your Cloud Firestore usage."
That said, if you think there is an issue with this, you can contact Firebase support directly for more detailed answers.
However, if you check the free plan of Firebase, you can see that you have 50K free reads per day.
A workaround that I found for this (thanks to Dependar Sethi):
Bookmark the Usage tab of the Firestore page, so you basically skip the Data tab.
Add a dummy collection named so that it is the first collection (alphabetically), which then gets loaded by default on the Firestore page.
You can find his full solution here.
Also, you can optimise your queries to retrieve only the data that you want, using the where() method and pagination with Firebase.
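For illustration, a minimal sketch of filtered, cursor-based pagination using the google-cloud-firestore Python client; the collection and filter field below are placeholders, and the same idea applies in the Flutter SDK:

from google.cloud import firestore

db = firestore.Client()

# Fetch only the documents you need, 50 at a time.
query = (
    db.collection("Lectures")
    .where("published", "==", True)  # hypothetical filter field
    .order_by("title")
    .limit(50)
)
first_page = list(query.stream())

# Resume after the last document of the previous page.
if first_page:
    next_page = list(query.start_after(first_page[-1]).stream())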
As you can see in the picture, there are two types of document reads, 'LOOKUP' and 'QUERY'; what is the exact difference between them?
I guess there is no important difference between them, but "QUERY" is getting the actual data (when you call the data() method) while "LOOKUP" is getting a reference to that data (without calling the data() method).
I am getting data from Firestore with a single instance, and when I open my app the chart shows 1 active client, which is cool, but in the next 5 minutes the number of active clients starts to increase.
For this question, considering the metric that you chose in Stackdriver, I can see 3 connected clients, and per the description of the "connected clients" metric:
The number of active connections. Each mobile client will have one connection. Each listener in admin SDK will be one connection. Sampled every 60 seconds. After sampling, data is not visible for up to 240 seconds.
So please check how many mobile devices are connected to this instance and how many listeners you have in your app. The sum of all of them is the actual number of connected clients that you see in Stackdriver.
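To make that concrete, a sketch using the google-cloud-firestore Python client, where each registered listener counts as one connection in the metric (the collection name is a placeholder):

from google.cloud import firestore

db = firestore.Client()

def on_change(col_snapshot, changes, read_time):
    print(len(col_snapshot), "documents,", len(changes), "changes")

# Each on_snapshot registration shows up as one connected client.
watch = db.collection("Lectures").on_snapshot(on_change)
# Detaching the listener closes the connection:
# watch.unsubscribe()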

Flutter Firebase Analytics Events missing

I am attempting to publish events via Firebase Analytics from Flutter, but I am experiencing very sporadic behaviour.
Using latest firebase_core and firebase_analytics packages
Using Firebase project on PAYG Blaze plan
Added export of events to BigQuery
Using vanilla flutter create project for testing
Downloaded and added google-services.json to android/app and android/app/debug folder
Added firebaseAnalytics.logEvent(name: 'testevent'); in onPressed where counter is incremented
Clicked the button until the counter reached 100
Expected to see 100 events in Firebase Analytics, but I see none
Looking in StreamView, after 5 minutes some of them show up, alongside the automatically collected screen_view etc.
Looking in DebugView (after activating adb), they show up instantly
Looking in the Events tab, nothing
Looking in BigQuery, nothing, not even tables created
They say events don't show up instantly, wait up to 24h, okay:
Waited 24h: no events in the Events tab beyond the automatically collected ones
No BigQuery table generated
Waited 48h: still no events showing up
I then proceeded to create several other test firebase projects, with varying degrees of events showing up:
One project has 12 events out of 100 in BigQuery and 100 in Events tab
Another project has no events
Another project has 27 events in Events tab and 12 in BigQuery
Is anybody getting better mileage out of Firebase Analytics? It cannot be a misconfiguration on my part on a vanilla project, as then no events at all would show up, rather than this sporadic behaviour across all the projects.
Since you are seeing varying numbers of events logged, we need to determine whether each event is sent to the server for processing. This can be checked by enabling verbose mode:
adb shell setprop log.tag.FA VERBOSE
adb shell setprop log.tag.FA-SVC VERBOSE
adb logcat -v time -s FA FA-SVC
This lets you verify immediately whether an event is logged, instead of waiting 24-48 hours for the UI to show it.
If the events are not logged in verbose mode, then you might want to restructure your code to send 100 sequential events, if that is what you require. Another thought: many identical events from the same user, fired in quick succession, get packaged together for processing, which can result in varying counts. Always use verbose mode to ensure your events are created and sent as intended before analyzing further.

Google Analytics export to BigQuery

I have a question about the data export from Google Analytics into BigQuery.
Basically, I have configured the streaming export on the Google Analytics side to export the data into BigQuery in real time (table ga_realtime_sessions_YYYYMMDD). This streaming is working fine.
At some point at the end of the day, the data from this real-time table is exported into ga_sessions_YYYYMMDD.
What I need explained is how this export (from the real-time table into the ga_sessions one) works.
I have several automatic processes that run around 8 AM (Portugal timezone) and, in the last days, these processes are failing due to the fact that the ga_sessions for the previous day are not created yet.
I checked the time at which the ga_sessions tables are created each day, and this time is very volatile: in some cases it is around 2 AM or 3 AM, but in others around 7 AM or 8 AM. Could this time difference be due to the size of the data that needs to be exported from the real-time table into the ga_sessions one?
The exports of daily sessions into BigQuery are indeed not completed at the same time every day. This is due to a fully managed backend, whose timing depends on workloads worldwide.
I suggest that you create an event listener on file creation for ga_sessions_YYYYMMDD, so that your dependent processes run only once the table has been created.
E.g. you can export the file to a Cloud Storage bucket, then use a trigger with a Cloud Function.
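A minimal sketch of such a trigger, assuming a Python background Cloud Function wired to a Cloud Storage bucket (the function name and file prefix are placeholders):

# Triggered by google.storage.object.finalize on the export bucket.
def on_export_created(event, context):
    name = event["name"]  # e.g. the daily ga_sessions export file
    if name.startswith("ga_sessions_"):
        # Kick off the dependent processes here.
        print("Daily export {} arrived; starting downstream jobs.".format(name))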

How to Export Firebase Cloud Functions Logs to a file?

Is there a way to download cloud functions logs as a text file?
Using the Stackdriver Logging UI, you can view your logs and click the "Download logs" button to download your latest logs in JSON or CSV format (up to 300 entries at the moment; if you need more, consider exporting your logs).
Using the Firebase CLI, you should be able to pipe the output of the functions:log command to a text file:
firebase functions:log > logs.txt
You can also use:
gcloud logging read "resource.type=cloud_function" --limit 1000 --format json > logs.json
to get the last 1000 log entries. You can also use --freshness=1d to get entries no older than one day. More info and examples with gcloud logging read --help.
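If you need more than the CLI or UI limits, a sketch using the google-cloud-logging Python client, which pages through results automatically (the filter and filename are just examples):

from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
with open("logs.txt", "w") as f:
    for entry in client.list_entries(filter_='resource.type="cloud_function"'):
        f.write("{} {} {}\n".format(entry.timestamp, entry.severity, entry.payload))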
