Can files stored on a Chromebook be accessed after the user profile is deleted? - chromebook

Can files stored on a Chromebook be accessed if the user profile they were created and saved under has been deleted? The Chromebook owner stated that several user accounts were deleted to better organize management. He then provided me with a new account (a new Gmail account) and stated the files are on the Chromebook. I attempted to recover the deleted accounts but was unsuccessful.

Related

R/googlesheets4 non-interactive session

When I use googlesheets4 in R, I use sheets_auth() in the console and it works fine. But when I run it in an R Markdown document and try to knit, I cannot seem to get the credentials. Can someone walk me through the process? I've gone through the vignettes for googlesheets4 but cannot seem to understand them.
This is working for me
gs4_auth(path = "xxxxxxxxxxxxxxxx.json")
It doesn't return anything, but after that I'm able to write data in my sheet with sheet_write()
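For the non-interactive knitting case, here is a minimal sketch of what the setup chunk can look like; the token path and the sheet ID are placeholders you would replace with your own:
# setup chunk of the .Rmd: authenticate with the service account token,
# then write to the sheet; no browser prompt is needed when knitting
library(googlesheets4)
gs4_auth(path = "path/to/service-account-token.json")
sheet_write(mtcars, ss = "your-sheet-id", sheet = "test")
Because the token comes from a file rather than an interactive OAuth flow, the same chunk also runs on a scheduler or CI machine.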
To get the credentials in a JSON file you have to follow these steps:
1) From the Developers Console, in the target GCP Project, go to IAM & Admin > Service accounts.
2) Give it a decent name and description. For example, the service account used to create the googledrive docs has name “googledrive-docs” and description “Used when generating googledrive documentation”.
3) Service account permissions. Whether you need to do anything here depends on the API(s) you are targeting. You can also modify roles later and iteratively sort this out. For example, the service account used to create the googledrive docs does not have any explicit roles. The service account used to test bigrquery has roles BigQuery Admin and Storage Admin.
4) Grant users access to this service account? So far, I have not done this, so feel free to do nothing here. Or if you know this is useful to you, then by all means do so.
5) Do Create key and download as JSON. This file is what we mean when we talk about a “service account token” in the documentation of gargle and packages that use gargle. gargle::credentials_service_account() expects the path to this file.
6) Appreciate that this JSON file holds sensitive information. Treat it like a username & password combo! This file holds credentials that potentially have a lot of power and that don’t expire.
7) Consider storing this file in such a way that it will be automatically discovered by the Application Default Credentials search strategy. See credentials_app_default() for details.
8) You will notice the downloaded JSON file has an awful name, so sometimes I create a symlink that uses the service account’s name, to make it easier to tell what this file is.
9) Remember to grant this service account the necessary permissions on any resources you plan to access, e.g., read or write permission on a specific Google Sheet. The service account has no formal relationship to you as a Google user and won’t automatically inherit permissions.
(copied from https://gargle.r-lib.org/articles/get-api-credentials.html#service-account-token)
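Tying the steps above back to googlesheets4: once the JSON key is downloaded and the Sheet has been shared with the service account, the token can also be loaded through gargle rather than passed straight to gs4_auth(path = ...). A sketch, where the token path, sheet ID and service-account email are all placeholders:
# load the service account token via gargle and hand it to googlesheets4
library(googlesheets4)
token <- gargle::credentials_service_account(
  path   = "path/to/service-account-token.json",
  scopes = "https://www.googleapis.com/auth/spreadsheets"
)
gs4_auth(token = token)
gs4_user()  # should report the ...@...iam.gserviceaccount.com address

# step 9 above: the service account is its own Google identity, so the target
# Sheet has to be shared with it; the Sheets UI works, or googledrive while
# authenticated as a user who is allowed to share the file
googledrive::drive_share(
  file = googledrive::as_id("your-sheet-id"),
  role = "writer",
  type = "user",
  emailAddress = "my-service-account@my-project.iam.gserviceaccount.com"
)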

Linking Google Analytics 360 to Big Query, permissions issue

I have linked GA360 to BigQuery. I do have a service account added to GCP as per the documentation. The account I used has Project Owner permissions, as required to link to said project.
Can I remove the Project Owner permissions from the GCP account once the link has been established in GA360? I do not want that account to have such a high access level to the project.
I did run a test on a small scale and it worked but I am not willing to risk a transfer failure on all of the data in production.
Yes, you can remove the permissions from the account you used to link GA360 to BQ.
The Project Owner permission is only required while setting up the link; afterwards it is not checked whether the account that set up the connection is still active or still has the same rights.
We have had multiple views linked by different accounts, most of which are no longer on the team and therefore no longer have "owner" rights. The exports still work though (which makes sense, given that a company might keep using GA and the exports but part ways with the internal/external employee who set it up).

Firebase Storage: Is it possible to auto-delete old files?

I use firebase cloud storage to upload images.
The app I am working on allows users to send images to one another (a chat thing): one user uploads a photo, another one downloads it, and once it has been downloaded it should be deleted from storage.
Example of what I am talking about
User A sends a photo to User B by uploading it to Firebase Storage; then User B notices that User A sent him an image and decides to download it; after User B has downloaded the image it should be deleted from storage.
My question
What if User A sends too many images and User B never downloads any of them? Then I will end up with useless images taking up storage space.
So in this case, is there a way in Firebase to auto-delete a file a certain amount of time (n) after it has been uploaded (not client side)?
I am still in the middle of doing this research, but it seems like you can use lifecycle rules to delete files based on the age of the file; a sketch of such a rule is below.
Here are a few examples listed in the intro of the doc:
Downgrade the storage class of objects older than 365 days to Coldline Storage.
Delete objects created before January 1, 2013.
Keep only the 3 most recent versions of each object in a bucket with versioning enabled.
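I have not wired this up end to end yet, but the rule itself is just a small JSON document attached to the bucket (a Firebase Storage bucket is an ordinary Cloud Storage bucket underneath). A sketch in R that only builds that document for a hypothetical 7-day retention; applying it to the bucket happens outside R, e.g. in the Cloud console or with gsutil lifecycle set:
# build a lifecycle rule that deletes objects 7 days after they were created
library(jsonlite)
lifecycle <- list(
  rule = list(
    list(
      action    = list(type = "Delete"),
      condition = list(age = 7)   # age is in days since object creation
    )
  )
)
cat(toJSON(lifecycle, auto_unbox = TRUE, pretty = TRUE))
# prints the JSON you would save to a file and apply to the bucket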

OpenEdge SQL DBA Account Setup

I'm setting up SQL access in a newly created OpenEdge 11.5 database.
In checking the contents of the sysdbauth table using "select * from sysprogress.sysdbauth", I see that there are two users setup by default: sysprogress and a user with the name of the Linux user account that was used to create the database.
I'm looking for recommendations as to how to handle these two accounts. Obviously I want to have an account to use for DBA tasks. Should I use one of these accounts for the purpose? If so, what should I do with the other account?
Is it possible (and safe) to delete either of these predefined accounts?
On page 175 of the Database Administration guide you can read about default users and why they are created:
Tables used from SQL only
An SQL database administrator (DBA) is a person assigned a sysdbauth record in the database. SQL DBAs have access to all meta data and data in the database. To support internal schema caching, every OpenEdge database begins with a DBA defined as "sysprogress." However, OpenEdge restricts the use of "sysprogress."
When you create an OpenEdge database using the PROCOPY or PRODB commands, and the database does not have any _User records defined (from the source database, for example), then a DBA is automatically designated with the login ID of the person who creates the database. This person can log into the database and use the GRANT statement to designate additional SQL DBAs, and use the CREATE USER and DROP USER statements to add and delete user IDs. When creating users, this DBA can also specify users as SQL-only users, who can only access the database through SQL.
There are several knowledge base entries around the task of deleting or disabling the default users.
http://knowledgebase.progress.com/articles/Article/P5094
http://knowledgebase.progress.com/articles/Article/P161411
This suggests that it's really safe to delete or disable these accounts but you should:
1) Create replacement accounts first.
2) As always: test in a separate environment first and not in production!
Yes, in fact Progress kind of expects you to do so. Create a root account and get rid of both. It's fine.

Track deleted records in LDAP

We are using OpenLDAP 2.3 to store contacts.
We have built a Java project using Spring LDAP to do a weekly export of the contacts. In the export file we flag whether a contact is newly added or whether an existing contact was modified. This works fine. The issue is when a contact is deleted in LDAP: the whole record is removed from LDAP.
Is there a way in LDAP to keep a track of deleted records?
OpenLDAP has an audit log overlay you can add in. You would have to query that to see what was deleted in the last period. Another option outside of LDAP is to keep the list of contacts exported in the previous run and compare the two at the end of the current run, as sketched below.
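For that second option, a minimal sketch of the compare step (shown in R for brevity; the same diff is easy to do inside the Java exporter). The file names and the dn column are assumptions about what your weekly export contains:
# contacts present in last week's export but absent from this week's
# are the ones that were deleted in LDAP in between
previous <- read.csv("contacts_previous_week.csv", stringsAsFactors = FALSE)
current  <- read.csv("contacts_current_week.csv", stringsAsFactors = FALSE)

deleted <- previous[!(previous$dn %in% current$dn), ]
deleted$change <- "deleted"   # flag them alongside the added/modified entries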
