How can I use cctrlapp without constantly entering credentials? - cloudcontrol

I've started experimenting with cloudcontrol.com. They provide a CLI application called cctrlapp for managing projects.
However, many useful operations require a login. It is cumbersome and frustrating to have to enter my email address and password every time I push the current state of my app.
Can cctrlapp be configured to use stored credentials?

Recommended: We now support authentication via public keys between the CLI and the API, which is more secure and more convenient. To set it up, simply run:
$ cctrluser setup
Read more about this here: http://www.paasfinder.com/introducing-public-key-authentication-for-cli-and-api/
Alternatively: You can set your credentials via the 'CCTRL_EMAIL' and 'CCTRL_PASSWORD' environment variables. If set, they're automatically used by cctrlapp.
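For scripted use, those two variables can be set once and inherited by every cctrlapp invocation. A minimal Python sketch of that pattern; the email, password, and app name below are placeholders, not real values:

```python
import os
import subprocess

# Placeholder credentials -- substitute your own cloudControl login.
env = dict(os.environ)
env["CCTRL_EMAIL"] = "me@example.com"
env["CCTRL_PASSWORD"] = "s3cret"

# Any cctrlapp process started with this environment picks the
# credentials up automatically without prompting, e.g.:
# subprocess.run(["cctrlapp", "myapp/default", "push"], env=env, check=True)
print(env["CCTRL_EMAIL"])
```

The same effect can be had in a shell profile with two export lines; the point is simply that cctrlapp reads both variables from its environment.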

Related

Why does `airflow connections list` show unencrypted results?

Airflow version: 2.1.0
I set FERNET_KEY and confirmed that the login/password fields are encrypted when I add connections via the web UI.
However, when I add a connection via CLI:
airflow connections add 'site_account' \
--conn-type 'login' \
--conn-login 'my_user_id' \
--conn-password 'wowpassword'
And when I run airflow connections list, it shows everything as raw values (not encrypted at all).
This seems dangerous if I manage all my connections through CLI commands (I want my Airflow infrastructure to be restorable, which is why I use the CLI to manage connections).
How can I solve this?
Airflow decrypts the connection passwords while processing your CLI commands.
You can use airflow connections list -o yaml to see whether your record was actually stored encrypted in the database or not.
Furthermore, if you are able to access the CLI, you are also able to access the config, meaning you can always extract the database connection and fernet_key and recover the full password on your own.
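The encrypt-at-rest / decrypt-on-read behaviour can be sketched with the same Fernet primitive Airflow uses (from the cryptography package, which Airflow depends on). The key is freshly generated and the password is made up, purely for illustration:

```python
from cryptography.fernet import Fernet

# Stands in for the fernet_key you configure via FERNET_KEY.
key = Fernet.generate_key()
f = Fernet(key)

# What lands in the metadata database: an opaque encrypted token,
# not the raw password.
stored = f.encrypt(b"wowpassword")

# What the CLI (or a DAG) sees: Airflow decrypts on read, which is
# why `airflow connections list` prints the plaintext.
print(f.decrypt(stored).decode())
```

So the record in the database is encrypted; the CLI output is plaintext only because decryption happens before display.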
Jorrick's answer is correct; however, I want to elaborate on the background, as I feel it bridges the gap between the question and the answer.
It's very understandable that Airflow needs to be able to decrypt a connection when a DAG or user asks it to. This is needed for normal operation of the app, so Airflow must assume that if a user can author DAGs, he is permitted to use the system's resources (Connections, Variables).
The security measures are on a different level. If you enable them (using Fernet), Airflow encrypts sensitive information (like connection passwords), which means the value is encrypted in the database itself. The security concerns here are: where is the fernet_key stored? Is it rotated? And so on.
There are many other security layers that handle different aspects, like access control and hiding sensitive information in the UI, but that's a different topic.
I think the important thing to understand is that security handles two types of users:
A user that is permitted, where you just want to limit what actions he can take or what he can see. (This is more what Airflow itself handles; see the security docs.)
A user that is malicious and wants to do damage. While Airflow does provide some features in this area, this is more a question of where you set up Airflow and how well you protect it (IP allow-lists, etc.).
Keep in mind that if a malicious user has gained access to the Airflow server, there is little you can do about it. Such a user can simply use his admin privileges to do anything. This is no different from a user who has hacked into any other server you own.

R/googlesheets4 non-interactive session

When I use googlesheets4 in R, I call sheets_auth() in the console and it works fine. But when I try to run it in R Markdown and knit, I cannot seem to get the credentials. Can someone walk me through the process? I've gone through the vignettes for googlesheets4 but cannot seem to understand them.
This is working for me:
gs4_auth(path = "xxxxxxxxxxxxxxxx.json")
It doesn't return anything, but after that I'm able to write data to my sheet with sheet_write().
To get the credentials in a JSON file, follow these steps:
From the Developers Console, in the target GCP Project, go to IAM & Admin > Service accounts.
Give it a decent name and description. For example, the service account used to create the googledrive docs has the name “googledrive-docs” and the description “Used when generating googledrive documentation”.
Service account permissions: whether you need to do anything here depends on the API(s) you are targeting; you can also modify roles later and sort this out iteratively. For example, the service account used to create the googledrive docs does not have any explicit roles, while the service account used to test bigrquery has the roles BigQuery Admin and Storage Admin.
Grant users access to this service account? So far I have not done this, so feel free to do nothing here. Or, if you know this is useful to you, by all means do so.
Do Create key and download it as JSON. This file is what we mean by a “service account token” in the documentation of gargle and the packages that use gargle. gargle::credentials_service_account() expects the path to this file.
Appreciate that this JSON file holds sensitive information. Treat it like a username and password combo! It holds credentials that potentially have a lot of power and that don’t expire.
Consider storing this file in such a way that it will be automatically discovered by the Application Default Credentials search strategy; see credentials_app_default() for details.
You will notice the downloaded JSON file has an awful name, so I sometimes create a symlink that uses the service account’s name, to make it easier to tell what the file is.
Remember to grant this service account the necessary permissions on any resources you plan to access, e.g., read or write permission on a specific Google Sheet. The service account has no formal relationship to you as a Google user and won’t automatically inherit permissions.
(copied from here https://gargle.r-lib.org/articles/get-api-credentials.html#service-account-token)
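One gotcha with that last step: the identity you share the Sheet with is the client_email field inside the downloaded JSON key file. A small Python sketch for pulling it out; the key contents below are fabricated stand-ins for a real downloaded key:

```python
import json

# A service-account key file is plain JSON; every field here is a
# fabricated example, not a real credential.
key_json = """{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "abc123",
  "client_email": "googledrive-docs@my-project.iam.gserviceaccount.com"
}"""

key = json.loads(key_json)
assert key["type"] == "service_account"

# Share the target Google Sheet with this address to give the
# service account read/write access:
print(key["client_email"])
# -> googledrive-docs@my-project.iam.gserviceaccount.com
```

Until the Sheet is shared with that address, gs4_auth(path = ...) will authenticate fine but reads and writes will fail with a permission error.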

Is it possible to retrieve Firebase Cloud Function source code?

I'm writing some Firebase Cloud Functions, but I need to hide a private key, including from Firebase project admins.
If I embedded this key into my source code and uploaded the code myself, would it be possible for anyone to retrieve the source code and thus the key? Either via Firebase or Google?
Many thanks
The code for your Cloud Functions is never accessible to users of your app.
It is, however, accessible to collaborators on your Firebase project. See Get code from firebase console which I deployed earlier.
I don't think there's any way to hide such configuration values from collaborators. Since they can see and deploy the code, and the code needs access to the private key, they by definition have access to the key too.
Answering your question precisely: yes, they can.
The steps to do so are relatively simple:
Go into the GCP Functions page
Select the function you want to inspect
Click on Source (from there you should be able to see all the files and code used by that function), or;
Click on Variables (from there you should see all the environment variables used by your function).
If people being able to see environment variables is a problem for you, here's a way to make things more secure:
You can build on what you already have and encrypt those keys before adding them to the codebase or the environment variables. Then use an encryption service such as KMS to decrypt those keys at runtime. In KMS itself you can set a stricter policy, allowing only yourself and the function to access that service.
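The encrypt-before-deploy / decrypt-at-runtime pattern looks roughly like this. Note the hedging: base64 below is only a runnable stand-in for a real KMS decrypt call (which would go through the Cloud KMS API with its own IAM policy), and the variable names are made up:

```python
import base64
import os

def kms_decrypt(ciphertext: bytes) -> bytes:
    """Stand-in for a real KMS decrypt call.

    In production this would call the Cloud KMS API; base64 is used
    here only so the sketch runs without GCP credentials.
    """
    return base64.b64decode(ciphertext)

# At deploy time, only the encrypted form of the key is stored,
# e.g. as an environment variable on the function.
os.environ["PRIVATE_KEY_ENC"] = base64.b64encode(b"my-private-key").decode()

# At runtime, the function decrypts it in memory; with real KMS the
# plaintext never appears in the codebase or the Variables page.
private_key = kms_decrypt(os.environ["PRIVATE_KEY_ENC"].encode())
print(private_key.decode())
# -> my-private-key
```

The security gain comes from the KMS IAM policy: collaborators who can read the encrypted variable still cannot decrypt it unless they are also granted access to the key in KMS.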
Another great service from GCP is Google's Secret Manager
Another option is setting an environment variable:
Official Doc

How to make the user also the author of a task in Phabricator's Maniphest via the Conduit API?

The Conduit API in Phabricator does not support setting the authorPHID parameter when calling maniphest.createtask. I imagine this is for security or some logical reason.
But I am developing my own frontend for Maniphest, where users (logged in through Phabricator, so they are Phabricator users and have a PHID) will add and edit tasks. What I need is that when a user creates a task, he is also the author of that task.
The problem is that I can't connect to Conduit as any user other than "apibot", because I don't have the other users' certificates in my front-end. But if I log in as "apibot", then "apibot" is set as the author of the task.
Three possible solutions came to my mind:
1. retrieve certificate directly from phab's database
2. keep a list of certificates in some file in my front-end and update it manually every time somebody registers
I guess neither of those is really smart...
The third solution would be nice, but I haven't found a way to do it:
3. log in as "apibot", get the certificate of userXY, and then log in as userXY
What would you suggest?

Meteor, get LoginWithService data without creating account

In Meteor, how can I proceed with LoginWithService (or LinkWithService) and get the service data without actually creating an account?
In my app, I use the service API keys to do certain tasks; for that, I use LinkWithService().
But I also allow users to log in / create accounts with the LoginWithService() function. These two functions conflict with each other, because if an account already exists with password + service, it forces a log-out followed by another log-in.
I'm not sure if that made sense; anyhow, I would like to just get the login service data without actually creating an account. How can I do this?
