Manage permissions at the branch level in Bitbucket - .net-core

I am new to Bitbucket and am using it to manage code versions.
When I add a new member to a user group, at the time of permission assignment only Read, Write, or Admin is available.
I want users to have both read and write permissions so they can pull and push code to the repo, but not admin rights.
Can someone suggest how to configure this, and also let me know if there is any misunderstanding in this statement?

If you choose "Write," it will give read and write permissions, so that's probably the option you want.

Related

Keep business variables which should be defined by an admin

I have a multi-vendor project in which some variables should be set by an admin. For instance, when a user wants to pay for his/her cart, a fee has to be specified, and that fee is defined by the admin of the system. (It could also change over time.)
So what's the best approach for keeping these variables?
Note:
I'm running the server with Node.js and I use MongoDB as the database.
I have the following ideas, each of which has pros and cons in my opinion:
Save these variables in a document (in the database). I guess this isn't great, since for each payment (or other action that needs these variables) I have to send a request to the database. These variables seem to be mostly fixed and only change once in a while; it's not like user profile information, which could change frequently and has to be fetched from the database whenever the user wants to see his/her profile. (Furthermore, it seems odd to create a new collection just to store a single document.)
Save them in a .env file (as environment variables). But I think this file is meant for configuration variables (application layer), not for business values, and updating this file is not as convenient as updating the database.
Please point out if I'm making a mistake, or if there is a common approach I don't know about. (I also searched for this but couldn't find a proper keyword.)
My approach has been the following:
If the values can be updated by a business administrator in the normal course of operation, then they should have an admin UI and be stored in the database. Fees are a good example.
If the values hardly ever change, or are changed by IT staff, put them in the configuration file. The endpoint of a vendor API or the mail server configuration would go there.
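By way of illustration, here is a rough Node.js sketch of that split, where the connection string lives in .env (infrastructure) and the fee lives in a small settings collection edited through an admin UI; the collection, document, and field names (settings, global, checkoutFeePercent) are hypothetical:

// config.js: infrastructure settings come from .env, business settings from MongoDB.
require('dotenv').config();
const { MongoClient } = require('mongodb');

// Infrastructure value: connection string, set by IT staff, read from .env.
const client = new MongoClient(process.env.MONGO_URI);

let cached = null;
let cachedAt = 0;
const TTL_MS = 60 * 1000; // refresh the cached settings at most once per minute

// Business values such as the checkout fee live in a single "settings" document
// that an admin UI updates; callers read them through this cached getter, so
// not every payment triggers a database round trip.
async function getBusinessSettings() {
  if (cached && Date.now() - cachedAt < TTL_MS) {
    return cached;
  }
  const doc = await client.db('shop').collection('settings').findOne({ _id: 'global' });
  cached = doc || { checkoutFeePercent: 0 };
  cachedAt = Date.now();
  return cached;
}

module.exports = { getBusinessSettings };

That way the admin can change the fee from a dashboard without a redeploy, while the environment file stays limited to infrastructure configuration.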

R/googlesheets4 non-interactive session

When I use googlesheets4 in R, I use sheets_auth() in the console and it works fine. But when I try to run it in R Markdown and knit, I cannot seem to get the credentials. Can someone walk me through the process? I've gone through the vignettes for googlesheets4 but cannot seem to understand them.
This is working for me
gs4_auth(path = "xxxxxxxxxxxxxxxx.json")
It doesn't return anything, but after that I'm able to write data in my sheet with sheet_write()
To get the credentials in a JSON file you have to follow these steps:
From the Developers Console, in the target GCP Project, go to IAM & Admin > Service accounts. Create a new service account and give it a decent name and description. For example, the service account used to create the googledrive docs has the name “googledrive-docs” and the description “Used when generating googledrive documentation”.
Service account permissions: whether you need to do anything here depends on the API(s) you are targeting. You can also modify roles later and iteratively sort this out. For example, the service account used to create the googledrive docs does not have any explicit roles; the service account used to test bigrquery has the roles BigQuery Admin and Storage Admin.
Grant users access to this service account? So far, I have not done this, so feel free to do nothing here. Or, if you know this is useful to you, by all means do so.
Do Create key and download as JSON. This file is what we mean when we talk about a “service account token” in the documentation of gargle and packages that use gargle. gargle::credentials_service_account() expects the path to this file.
Appreciate that this JSON file holds sensitive information. Treat it like a username & password combo! This file holds credentials that potentially have a lot of power and that don’t expire.
Consider storing this file in such a way that it will be automatically discovered by the Application Default Credentials search strategy. See credentials_app_default() for details.
You will notice the downloaded JSON file has an awful name, so sometimes I create a symlink that uses the service account’s name, to make it easier to tell what this file is.
Remember to grant this service account the necessary permissions on any resources you plan to access, e.g., read or write permission on a specific Google Sheet. The service account has no formal relationship to you as a Google user and won’t automatically inherit permissions.
(copied from here https://gargle.r-lib.org/articles/get-api-credentials.html#service-account-token)
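For the non-interactive / R Markdown case specifically, a minimal sketch of a setup chunk, assuming the service-account token has been saved alongside the project; the token file name and sheet ID below are placeholders:

library(googlesheets4)

# Point gs4_auth() at the service-account token instead of letting it start
# the interactive, browser-based OAuth flow (which fails while knitting).
gs4_auth(path = "my-service-account-token.json")

# Reads and writes then work as usual during knitting, provided the service
# account was granted access to the sheet referenced by the placeholder ID.
dat <- read_sheet("placeholder-sheet-id")
sheet_write(head(iris), ss = "placeholder-sheet-id", sheet = "example")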

Too easy to delete whole database

Is there a way to protect the database from deletion? I mean, it's very easy to click on the "x" next to the root node. This would destroy the whole app and cause an enormous mess to deal with.
How do I deal with this fragility?
EDIT:
Let's assume I have two Firebase accounts: one for testing and one for the launched app. I regularly log in and out to switch between them. On the test account I delete whole nodes on a regular basis. Password protection for deletions would prevent a very expensive mix-up of the two accounts.
If you give a user edit access to the Firebase Console of your project, the user is assumed to be an administrator of the database. This means they can perform any write operation to the database they want and are not tied to your security rules.
As a developer you probably often use this fact to make changes to your data structure while developing the app. For application administrators, you should probably create a custom administrative dashboard, where they can only perform the actions that your code allows.
There is no way to remove specific permissions, such as limiting the amount of data they can remove. It could be a useful feature request, so I suggest posting it here. But at the moment: if you don't trust users to be careful enough with your data, you should not give them access to the console.
As Travis said: setting up backups may be a good way to counter some of this anxiety.

Drupal: user account is limited yet receiving workflow updates?

Is there a way in the Drupal interface to exclude a specific user from workflow status updates without having to delete his account and create a new one?
Looking over his account, he does not have any of the roles that receive status updates, yet he still receives them.
Alternatively, I'd rather be able to somehow search for his actual email in the entire system and make sure he is not listed anywhere. Is that even possible in Drupal?
Thanks
I just had to remove his extra account. He had an older account as an admin buried deep in the user list.

To use or not to use the user module

We are currently transitioning our website to Drupal 6.x from a non-Drupal source. One of the first issues we need to deal with is authentication. We have a central database where we keep member information. We will create a module to authenticate against this database; however, the question is whether or not we also need to create users in Drupal.
I'm worried that if we do not add users to the user table and have our module keep it in sync with the other database, then we will not be able to take advantage of other modules that rely on the user module.
My colleague, on the other hand, believes this is not an issue: our module can add all the necessary attributes to the global $user object at authentication time.
Is there a standard way of dealing with this problem?
Thanks!
David
Look at the LDAP integration module; it does something similar. When someone logs in and a local user cannot be loaded, the user is looked up in another application, and if the username and password match, the user is copied into the Drupal user table.
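For what it's worth, a rough Drupal 6 sketch of that pattern, assuming a custom module called mymodule and a hypothetical helper mymodule_check_central_db() that validates credentials against the central member database:

<?php
// Swap our validator into the login forms in place of Drupal's own password
// check, keeping the standard name and final validators around it.
function mymodule_form_alter(&$form, $form_state, $form_id) {
  if ($form_id == 'user_login' || $form_id == 'user_login_block') {
    $form['#validate'] = array(
      'user_login_name_validate',
      'mymodule_login_validate',
      'user_login_final_validate',
    );
  }
}

function mymodule_login_validate($form, &$form_state) {
  $name = $form_state['values']['name'];
  $pass = $form_state['values']['pass'];
  // Hypothetical helper: checks the credentials against the central database.
  if (mymodule_check_central_db($name, $pass)) {
    // Creates a row in the Drupal users table on first login, loads it on
    // later logins, and sets the global $user, so other modules see a
    // normal Drupal account.
    user_external_login_register($name, 'mymodule');
  }
}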
If you want any Drupal functionality (read: core and modules) to be associated with that user account, then you will need to use that user table.
This is especially true for anything node-related, so if you want people to be able to create nodes with referenced data you will need it. uids are stored in the node table in order to show who authored the node. Storing a uid in the node table that doesn't exist as a relational key to the user table will only return an empty object; for instance, if a person wants to see the author of node X, they will get an empty user object. Keep it. There's no sense in working harder just to remove it. Besides, you can store as little or as much as you want in the user object for each account.
I'd also suggest looking at the LDAP module. I was able to use it as a jumping-off point to interface with a custom WSAPI authentication method for an external database that we have at my company.
Do you need to have both sites running in parallel? If not, then you don't need to sync the user tables; a one-time conversion will be enough.
