Retrieving a previous version of a Google Cloud Function - Firebase

Is it possible to retrieve a previous version of a Google Cloud Function?
I am aware that this question has been asked before, but the answers are not working.
What I have tried is searching for a bucket containing my older version, but I can't seem to find it.

A workaround to retrieve an older version of a Cloud Function is to download it from the GCS bucket named gcf-sources-${projectId}-${regionId}. Opening that bucket in the Cloud Console's Storage browser lets you download copies of the files used for each Cloud Function deployment.
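If you prefer the command line, here is a minimal sketch using gsutil (the bucket contents shown are illustrative; the project, region, function, and version names are placeholders to replace with your own):

```sh
# List the source archives kept for past deployments (placeholder bucket name).
gsutil ls -r gs://gcf-sources-my-project-us-central1/

# Copy one deployment's source archive to the current directory.
gsutil cp gs://gcf-sources-my-project-us-central1/myFunction-abc123/version-42/function-source.zip .
```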

Related

Resume download of _.gstmp files after downloading Sentinel-2 SAFE products using the sen2r R package

I have downloaded a large number of Sentinel-2 SAFE files using the R package 'sen2r', which implements a Google Cloud download method to retrieve products stored in the Long Term Archive. This has worked for me, but after checking the files I have found a decent number of empty files appended with _.gstmp, which according to this represent partially downloaded temporary files that are supposed to be resumed by gsutil.

I have re-run the sen2r() command (with the server = "gcloud" setting), but it does not resume and correct the downloads, as the folders are already there. I would like to resume downloading just the _.gstmp files, as it took over a week to download all of the SAFE products and I don't want to start all over again.

I'm guessing I can fix this by using 'gsutil' directly, but I'm a bit out of my element, as this is my first experience with Google Cloud, and I can't ask the sen2r author, as they no longer have time to respond to issues on GitHub. If you have any tips for resuming these downloads manually using the gsutil command line, it would be much appreciated.
I have searched Stack Exchange as well as the sen2r manual and GitHub issues and have not found any other reports of this problem.
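A minimal sketch of the manual gsutil route described above (not an answer from the original thread; the tile/product path is a placeholder, and gcp-public-data-sentinel-2 is the public bucket sen2r downloads from):

```sh
# Find the leftover temp files to see which products are incomplete.
find . -name "*_.gstmp"

# Re-fetching into the same destination should let gsutil's resumable
# downloads pick up from the _.gstmp temp files; rsync also re-fetches any
# object whose final local file is missing or does not match the remote.
gsutil -m rsync -r \
  gs://gcp-public-data-sentinel-2/tiles/32/T/NS/PRODUCT_NAME.SAFE \
  ./PRODUCT_NAME.SAFE
```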

Is it possible to see a Firebase Cloud Function's source code in a browser for a single function?

My team is using a bunch of cloud functions and when I try to inspect the source on Firebase Cloud Functions, I'm told that "Preview unavailable for archives larger than 512 KB". When I download the archive, it downloads all of the functions instead of the singular function I want to inspect. Is this expected? Is something wrong with my setup?
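One workaround sketch (not from the original thread): the downloaded archive bundles the whole codebase, so a single function's file can be inspected by extracting just that file locally. The archive and file names below are placeholders:

```sh
# List the archive's contents without extracting anything.
unzip -l function-source.zip

# Extract only the file backing the function of interest.
unzip function-source.zip src/myFunction.js
```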

Does admin.storage.object contain/process multiple files if multiple files are uploaded to Firebase Storage?

I am trying to move files into separate folders in Firebase Storage once they have been uploaded. As it turns out, you cannot achieve this with the JavaScript Web Client SDK for Storage. However, it appears that you could do so with the Admin SDK for Storage using Firebase Functions. So that is what I am trying to do. I understand that you need to first download a file into your Firebase Functions and then re-upload it into a new folder in Storage.
To download a file, I need to pass its reference from the client, and here is where it gets confusing to me. I am currently getting all the uploaded files in the client via the listAll() function, which returns items and prefixes. I am wondering whether I can use either the items or the prefixes to download the files in Firebase Functions. Alternatively, I can pass the URLs. However, the question is, which method do I use to get and download them in Functions afterwards?
I know of admin.storage.object as explained in https://firebase.google.com/docs/storage/extend-with-functions#trigger_a_function_on_changes. However, does it handle multiple files? In other words, the object, as I understand it, is one file uploaded to Storage, and you can use its attributes such as object.bucket or object.name to access more information. However, what if multiple files are uploaded at the same time, does it handle them one by one? Also, if I am passing the references or URLs of the files that need to be downloaded from the client, is admin.storage.object the right choice? Because it seems to simply process all the files uploaded to Storage, instead of taking any references from the client.
Further, there is a description of how to download a file (https://firebase.google.com/docs/storage/extend-with-functions#example_image_transformation) which is this code: await bucket.file(filePath).download({destination: tempFilePath});
I understand that the filepath is basically the name of the file that is already in Storage (ex. /someimage). But what if there are other files with the same name? Might the wrong file be downloaded? And how do I make sure that the filepath is the file that I passed from the client?
Let me know what your thoughts are and whether or not I am heading in the right direction. If you include code in your answer, please write it in JavaScript for the Web. Thank you!
Here are some points that could help:
In GCP Storage there are technically no folders; GCS emulates a directory structure by using / in object names.
When you set up a Cloud Function triggered by a GCS object change, each object change is an event, and each event triggers one invocation of the function (you might have a bucket for unprocessed files that triggers the function, and move files to a different bucket once processed). A sketch of this pattern follows below.
You might consider using the REST API to move/copy/rename the objects without having to download them.
As a side note, the question is a little too broad; hopefully these points help clarify things for you.
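A minimal sketch of the one-invocation-per-object trigger and the server-side move mentioned above, assuming the Firebase Functions v1 storage trigger; the uploads/ and processed/ prefixes are placeholders:

```js
const functions = require("firebase-functions");
const admin = require("firebase-admin");

admin.initializeApp();

// Each uploaded file fires its own invocation: `object` is always a single
// file, even when many files are uploaded at once. object.name is the full
// path, so two files only collide if their full paths are identical.
exports.moveUpload = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name;
  // Guard: moving a file creates a new object, which fires this trigger
  // again, so only act on files under the (placeholder) uploads/ prefix.
  if (!filePath || !filePath.startsWith("uploads/")) return;

  const bucket = admin.storage().bucket(object.bucket);
  const destination = filePath.replace("uploads/", "processed/");

  // move() performs a server-side copy and then deletes the original, so no
  // download/re-upload through the function is needed.
  await bucket.file(filePath).move(destination);
});
```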

Clarification for locating .RDS OAuth tokens generated by the `gargle` package

Not strictly an RStudio question, but the relevant packages are maintained by this community, I think.
Apologies for not providing a working example of the problem. The question pertains to use within a container, and I am not sure if an example will provide clarity. :|
I am reading this about non-interactive authentication to google, for googlesheets/googledrive (either will do).
https://cran.r-project.org/web/packages/gargle/vignettes/non-interactive-auth.html
I am confused by the following:
If you know the filepath to the token you want to use, you could use readRDS() to read it and provide as the token argument to the wrapper’s auth function.
How would you know this filepath? That requires some attention to the location of gargle’s OAuth token cache folder, which is described in the next section.
I can't find a clear description in this or the linked documents of where to find the .RDS file.
My use-case is to read or download a googlesheet (via the googlesheets or googledrive packages) within a Docker container.
I was previously doing this just fine with the old token setup, before Google made the changes on their side.
Also, I have no problem running non-interactively within a script on my machine, using httr::oauth_app and drive_auth_configure, then drive_auth, as described in the link above.
I am just not clear as to how I can locate the .RDS file from the above documentation.
If you could give a brief description, or be so kind as to provide a link to other documents, or point to sections of the above that I have overlooked or misunderstood, I would be very grateful.
Thank you!
Simon
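No answer was recorded here, but a short R sketch using gargle's own helpers to locate the cache (these functions exist in gargle; the default cache path varies by platform and version, and the token file naming is an assumption to verify against your own cache):

```r
library(gargle)

# Print the directory where gargle caches OAuth tokens
# (often something like ~/.cache/gargle, depending on platform/version).
gargle::gargle_oauth_cache()

# Summarise the cached tokens: email, scopes, and the hash used in filenames.
gargle::gargle_oauth_sitrep()

# List the token files in the cache directory. They are RDS-serialized even
# if the extension is absent; names typically look like <hash>_<email>.
list.files(gargle::gargle_oauth_cache(), full.names = TRUE)

# A token can then be read and handed to the wrapper's auth function:
# token <- readRDS("/path/to/cache/<hash>_<email>")
# googledrive::drive_auth(token = token)
```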

DocDb Emulator missing functions

I'm using the DocumentDB emulator in my dev environment. I simply downloaded it from the provided link, and the local version is 1.11.72.11.
In the Explorer view, I don't see any of the functions to manipulate my database or documents (e.g. save, create document, etc.) that appear in the intro video.
(Screenshots comparing the Explorer view in the video with my local version were attached to the original post.)
Any idea why those functions are missing and how I can fix this?
Just got an email response from the DocDb team that they were able to reproduce this issue and they're working to correct it.
