I'm getting "Parameters: [unauthRoleName] must have values" when adding Cognito authorization and API key authorization to my data model in Amplify - aws-amplify

I have a graphql api with the following model:
type NewFeatureNotification @model @auth(rules: [{allow: public, operations: [read]}, {allow: groups, groups: ["AdminPortal"], operations: [read, create, update, delete]}]) {
id: ID!
title: String
description: String
date: AWSDate
featureType: String
productIdentifier: Int
}
I added an authentication option with an existing user pool and that worked fine, and I also created a user group to restrict permissions. The issue appeared when I tried to add authorization rules to my model. The rules should allow read access with the API key, while read, create, update and delete should be accessible to an authenticated user that belongs to the AdminPortal group:
rules: [{allow: public, operations: [read]}, {allow: groups, groups: ["AdminPortal"], operations: [read, create, update, delete]}]
but when I try to deploy this change with amplify push I get the following error:
Parameters: [unauthRoleName] must have values.
I also tried to do the same thing from Amplify Studio, but I get the same error.

Related

Bicep roleAssignments/write permission error when assigning a role to Keyvault

I am using GitHub Actions to deploy via Bicep:
- name: Login
  uses: azure/login@v1
  with:
    creds: ${{ secrets.AZURE_CREDENTIALS }}
- name: Deploy Bicep file
  uses: azure/arm-deploy@v1
  with:
    scope: subscription
    subscriptionId: ${{ secrets.AZURE_CREDENTIALS_subscriptionId }}
    region: ${{ env.DEPLOY_REGION }}
    template: ${{ env.BICEP_ENTRY_FILE }}
    parameters: parameters.${{ inputs.selectedEnvironment }}.json
I have used contributor access for my AZURE_CREDENTIALS, based on the output of the following command:
az ad sp create-for-rbac --n infra-bicep --role contributor --scopes /subscriptions/my-subscription-guid --sdk-auth
I am using Azure Key Vault with RBAC. This Bicep worked fine until I tried to give an Azure Web App read access to the Key Vault, as such:
var kvSecretsUser = '4633458b-17de-408a-b874-0445c86b69e6'
var kvSecretsUserRole = subscriptionResourceId('Microsoft.Authorization/roleDefinitions', kvSecretsUser)
resource kv_webapp_roleAssignments 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: 'kv-webapp-roleAssignments'
  scope: kv
  properties: {
    principalId: webappPrincipleId
    principalType: 'ServicePrincipal'
    roleDefinitionId: kvSecretsUserRole
  }
}
Then I was hit with the following error:
'Authorization failed for template resource 'kv-webapp-roleAssignments'
of type 'Microsoft.Authorization/roleAssignments'.
The client 'guid-value' with object id 'guid-value' does not have permission to perform action
'Microsoft.Authorization/roleAssignments/write' at scope
'/subscriptions/***/resourceGroups/rg-x/providers/Microsoft.KeyVault/vaults/
kv-x/providers/Microsoft.Authorization/roleAssignments/kv-webapp-roleAssignments'.'
What are the minimal permissions needed, what should my az ad sp create-for-rbac statement(s) be, and are there any other steps I need to take to assign role permissions?
To assign RBAC roles, you need either the User Access Administrator or the Owner role, which includes the below permission:
Microsoft.Authorization/roleAssignments/write
With the Contributor role, you cannot assign RBAC roles to Azure resources. To confirm that, you can check this MS Doc.
I tried to reproduce the same in my environment and got the below results.
I used the same command and created one service principal with the Contributor role as below:
az ad sp create-for-rbac --n infra-bicep --role contributor --scopes /subscriptions/my-subscription-guid --sdk-auth
Response:
I generated an access token via Postman with the below parameters:
POST https://login.microsoftonline.com/<tenantID>/oauth2/v2.0/token
grant_type:client_credentials
client_id: <clientID from above response>
client_secret: <clientSecret from above response>
scope: https://management.azure.com/.default
Response:
When I used this token to assign the Key Vault Secrets User role with the below API call, I got the same error as you:
PUT https://management.azure.com/subscriptions/d689e7fb-47d7-4fc3-b0db-xxxxxxxxxxx/providers/Microsoft.Authorization/roleAssignments/xxxxxxxxxxx?api-version=2022-04-01
{
"properties": {
"roleDefinitionId": "/subscriptions/d689e7fb-47d7-4fc3-b0db-xxxxxxxxxx/providers/Microsoft.Authorization/roleDefinitions/4633458b-17de-408a-b874-0445c86b69e6",
"principalId": "456c2d5f-12e7-4448-88ba-xxxxxxxxx",
"principalType": "ServicePrincipal"
}
}
Response:
To resolve the error, create a service principal with either User Access Administrator or Owner role.
In my case, I created a service principal with Owner role like below:
az ad sp create-for-rbac --n infra-bicep-owner --role owner --scopes /subscriptions/my-subscription-guid --sdk-auth
Response:
Now, I generated an access token again via Postman, replacing the clientId and clientSecret values like below:
POST https://login.microsoftonline.com/<tenantID>/oauth2/v2.0/token
grant_type:client_credentials
client_id: <clientID from above response>
client_secret: <clientSecret from above response>
scope: https://management.azure.com/.default
Response:
When I used this token to assign the Key Vault Secrets User role with the below API call, I got a successful response:
PUT https://management.azure.com/subscriptions/d689e7fb-47d7-4fc3-b0db-xxxxxxxxxxx/providers/Microsoft.Authorization/roleAssignments/xxxxxxxxxxx?api-version=2022-04-01
{
"properties": {
"roleDefinitionId": "/subscriptions/d689e7fb-47d7-4fc3-b0db-xxxxxxxxxx/providers/Microsoft.Authorization/roleDefinitions/4633458b-17de-408a-b874-0445c86b69e6",
"principalId": "456c2d5f-12e7-4448-88ba-xxxxxxxxx",
"principalType": "ServicePrincipal"
}
}
Response:
UPDATE:
Considering the principle of least privilege, you should create a custom RBAC role instead of assigning the Owner role.
To create a custom RBAC role, follow the below steps:
Go to Azure Portal -> Subscriptions -> Your Subscription -> Access control (IAM) -> Add -> Add custom role
Fill in the details with a name and description, making sure to select the Contributor role after choosing Clone a role, like below:
Now, remove the Microsoft.Authorization/roleAssignments/write permission from NotActions.
Now, add the Microsoft.Authorization/roleAssignments/write permission to Actions:
Now, click on Create like below:
You can create a service principal with the above custom role using this command:
az ad sp create-for-rbac --n infra_bicep_custom_role --role 'Custom Contributor' --scopes /subscriptions/my-subscription-guid --sdk-auth
Response:
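As an alternative to the portal steps, the same custom role can be sketched as a role definition JSON and created with az role definition create. This is only an illustration: "Custom Contributor" is a placeholder name, the subscription ID is the asker's placeholder, and the NotActions list here is abbreviated (clone the built-in Contributor role to get the complete list).

```json
{
  "Name": "Custom Contributor",
  "Description": "Contributor, plus permission to write role assignments",
  "Actions": [
    "*",
    "Microsoft.Authorization/roleAssignments/write"
  ],
  "NotActions": [
    "Microsoft.Authorization/*/Delete",
    "Microsoft.Authorization/elevateAccess/Action"
  ],
  "AssignableScopes": [
    "/subscriptions/my-subscription-guid"
  ]
}
```

The key point is that the roleAssignments/write permission must not remain in NotActions, since NotActions is subtracted from Actions when permissions are evaluated.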

Is there a way to override AppSync auto generated input types when using Amplify?

Let's say we have the following GraphQL schema:
type Customer
  @model
  @auth(
    rules: [
      { allow: owner }
      { allow: groups, groups: ["Admin"] }
      { allow: public, provider: iam }
    ]
  ) {
  id: ID! @primaryKey
  owner: ID!
  customer_last_name: String
}
When pushing the above schema to AppSync via AWS Amplify, the following is created in the autogenerated GraphQL schema:
type Query {
  getCustomer(id: ID!): Customer
    @aws_iam
    @aws_cognito_user_pools
  listCustomers(
    id: ID,
    filter: ModelCustomerFilterInput,
    limit: Int,
    nextToken: String,
    sortDirection: ModelSortDirection
  ): ModelCustomerConnection
    @aws_iam
    @aws_cognito_user_pools
}
Is it possible to pass and enforce a custom argument for the query input, such as getCustomer(id: ID!, owner: ID!): Customer instead of the autogenerated getCustomer(id: ID!): Customer?
This can be done by editing the autogenerated schema directly in the AppSync console, but the changes will be lost on the next Amplify push.
There can be only one hash key (aka partition key) in DynamoDB. If you want multiple attributes to make up your key, you need to combine them into a composite primary key, which is the key pattern you want here.
For more information,
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.PrimaryKey
For creating a composite key with an Amplify GraphQL schema, please check the details at this link (assuming you are using Amplify GraphQL Transformer v1):
https://docs.amplify.aws/cli-legacy/graphql-transformer/key/
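With Transformer v1, a composite primary key changes the generated query signature, which is what the asker wanted. The schema below is a sketch, not the asker's confirmed setup (the @auth rules are trimmed for brevity):

```graphql
type Customer
  @model
  @key(fields: ["id", "owner"])
  @auth(rules: [{ allow: owner }]) {
  id: ID!
  owner: ID!
  customer_last_name: String
}
```

With an unnamed @key like this, id becomes the partition key and owner the sort key, and Amplify generates getCustomer(id: ID!, owner: ID!): Customer, requiring both arguments.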

firebase auth token from Graphcool

Can I generate a custom auth token, for use with a third party, with a resolver in graph.cool? Something like this?
type FirebaseTokenPayload {
  token: String!
}

extend type Query {
  FirebaseToken(userIdentifier: String!): FirebaseTokenPayload
}
const fb = require('myNodeFirebaseAuthLib')
module.exports = event => fb.generateTokenWithPayload({ id: event.data.userId })
Authentication required - restrict who can read data in fields. Permission query:
query ($user_id: ID!, $node_firebaseIdentifier: String) {
  SomeUserExists(filter: {
    id: $user_id,
    firebaseIdentifier: $node_firebaseIdentifier
  })
}
--
I think this question boils down to two parts:
"Is it possible to install node modules in the graph.cool instance, or do we need to use a webhook for that sort of thing?" If it must be a webhook, what is the flow of identity verification, and how do I pass the payload parameters?
"can we add permissions queries and authentication to resolvers?"
notes, addendums:
According to this alligator.io blog post, it seems that using the Graphcool framework you can install node modules, so I wouldn't need to use a webhook. However, that is with an ejected app, and I lose Auth0 authentication that way: the template does not produce a createUser and signinUser that work with the same Auth0 data that the integration offers.
I forgot to post the answer to this: I had to eject Graphcool, since I could not use any node_modules I tried in my custom functions.

Firebase - Saving data

I have a question regarding Firebase.
I'm developing an app with three screens: the 'registration screen', where the user creates an account with email and password; the 'building profile screen', where the user answers some questions to fill in their profile (such as "What's your name?"); and finally the 'profile screen', where the user's information, such as their name, is displayed.
On the 'registration screen' I'm having no problem: the user fills in the email and password inputs, and on clicking "create account" (which calls .createUserWithEmailAndPassword) the account is created and the user is taken to the 'building profile screen'. My question concerns the 'building profile screen': how can I save the user's name and other data?
I read some articles on the subject but had difficulty understanding them. Can any of you help me with this?
You're going to want to create a node or multiple nodes in Firebase for each user to hold their user-specific information. The database structure could be uniform, like so:
users: {
  uid_a: {
    username: 'uid_as_username',
    email: 'uid_as_email',
    name: 'uid_as_name',
    other_attribute: 'uid_as_other_attribute_value'
    [,...]
  },
  uid_b: {
    username: 'uid_bs_username',
    email: 'uid_bs_email',
    name: 'uid_bs_name',
    other_attribute: 'uid_bs_other_attribute_value'
    [,...]
  }
  [,...]
}
or split up like so:
usernames: {
  uid_a: 'uid_as_username',
  uid_b: 'uid_bs_username'
  [,...]
},
emails: {
  uid_a: 'uid_as_email',
  uid_b: 'uid_bs_email'
  [,...]
},
names: {
  uid_a: 'uid_as_name',
  uid_b: 'uid_bs_name'
  [,...]
},
other_attribute: {
  uid_a: 'uid_as_other_attribute_value',
  uid_b: 'uid_bs_other_attribute_value'
  [,...]
}
Which you choose is a design choice, but that's the idea.
Just complementing @Vincent's answer: by default you can store the user's name, email and photoUrl within Firebase Auth (read "Get a user's profile": https://firebase.google.com/docs/auth/web/manage-users).
If you need to store more info, like postal address, phone numbers and so on, you can create a node in your database, like users, and store all the data you need there. You can even use the same UID created for Auth as the ID in your database; this way it will be easier to fetch the user's info in the future.
When you create the user with email and password, you can take the returned user and add it to your database with a script like this:
firebase.database().ref(`Users/${user.uid}`).set({
  name: this.state.name,
  email: this.state.email,
});
Consider the code above just as an example.
Prefer .set() over .push(): .push() generates a random child ID that you cannot choose, while .set() writes to the exact key you specify.
Hope it helps.
This is taken from the official documentation, which might give you a clue how to update and fetch data from the database.
Set up Firebase Realtime Database for Android
Connect your app to Firebase
Install the Firebase SDK. In the Firebase console, add your app to
your Firebase project, then add the Realtime Database to your app.
Add the dependency for Firebase Realtime Database to your app-level
build.gradle file:
compile 'com.google.firebase:firebase-database:11.2.2'
Configure Firebase Database Rules
The Realtime Database provides a declarative rules language that
allows you to define how your data should be structured, how it should
be indexed, and when your data can be read from and written to. By
default, read and write access to your database is restricted so only
authenticated users can read or write data. To get started without
setting up Authentication, you can configure your rules for public
access. This does make your database open to anyone, even people not
using your app, so be sure to restrict your database again when you
set up authentication.
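The authenticated-only default described above corresponds to rules like the following in the Rules tab of the console (this is the standard Realtime Database rules format, not anything specific to this app; replacing the conditions with true would open the database for public access):

```json
{
  "rules": {
    ".read": "auth != null",
    ".write": "auth != null"
  }
}
```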
Write to your database
Retrieve an instance of your database using getInstance() and
reference the location you want to write to.
// Write a message to the database
FirebaseDatabase database = FirebaseDatabase.getInstance();
DatabaseReference myRef = database.getReference("message");
myRef.setValue("Hello, World!");
You can save a range of data types to the database this way, including
Java objects. When you save an object the responses from any getters
will be saved as children of this location.
Read from your database
To make your app data update in realtime, you should add a
ValueEventListener to the reference you just created.
The onDataChange() method in this class is triggered once when the
listener is attached and again every time the data changes, including
the children.
// Read from the database
myRef.addValueEventListener(new ValueEventListener() {
    @Override
    public void onDataChange(DataSnapshot dataSnapshot) {
        // This method is called once with the initial value and again
        // whenever data at this location is updated.
        String value = dataSnapshot.getValue(String.class);
        Log.d(TAG, "Value is: " + value);
    }

    @Override
    public void onCancelled(DatabaseError error) {
        // Failed to read value
        Log.w(TAG, "Failed to read value.", error.toException());
    }
});

Can't create cloudsql role for Service Account via api

I have been trying to use the API to create service accounts in GCP.
To create a service account, I send the following POST request:
base_url = f"https://iam.googleapis.com/v1/projects/{project}/serviceAccounts"
auth = f"?access_token={access_token}"
data = {"accountId": name}
# Create a service Account
r = requests.post(base_url + auth, json=data)
This returns a 200 and creates a service account.
Then, this is the code that I use to assign the specific roles:
sa = f"{name}@dotmudus-service.iam.gserviceaccount.com"
sa_url = base_url + f'/{sa}:setIamPolicy' + auth
data = {"policy":
    {"bindings": [
        {
            "role": roles,
            "members": [
                f"serviceAccount:{sa}"
            ]
        }
    ]}
}
If roles is set to one of roles/viewer, roles/editor or roles/owner, this approach does work.
However, if I want to use specifically roles/cloudsql.viewer, the API tells me that this option is not supported.
Here are the roles:
https://cloud.google.com/iam/docs/understanding-roles
I don't want to give this service account full viewer rights to my project; it's against the principle of least privilege.
How can I set specific roles from the API?
EDIT:
Here is the response using the Resource Manager API, with roles/cloudsql.admin as the role:
POST https://cloudresourcemanager.googleapis.com/v1/projects/{project}:setIamPolicy?key={YOUR_API_KEY}
{
  "policy": {
    "bindings": [
      {
        "members": [
          "serviceAccount:sa@{project}.iam.gserviceaccount.com"
        ],
        "role": "roles/cloudsql.viewer"
      }
    ]
  }
}
{
  "error": {
    "code": 400,
    "message": "Request contains an invalid argument.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.cloudresourcemanager.projects.v1beta1.ProjectIamPolicyError",
        "type": "SOLO_REQUIRE_TOS_ACCEPTOR",
        "role": "roles/owner"
      }
    ]
  }
}
With the code provided, it appears that you are appending to the first base_url, which is not the correct context for modifying project roles.
This places the appended path at: https://iam.googleapis.com/v1/projects/{project}/serviceAccounts
The POST path for adding roles needs to be: https://cloudresourcemanager.googleapis.com/v1/projects/{project}:setIamPolicy
If you remove /serviceAccounts from the base_url, it should work.
Edited response to add more information due to your edit.
OK, I see the issue here; sorry, but I had to set up a new project to test this.
cloudresourcemanager.projects.setIamPolicy replaces the entire policy. It appears that you can add constraints to what you change, but you have to submit a complete policy in JSON for the project.
Note that gcloud has a --log-http option that will help you dig through some of these issues. If you run
gcloud projects add-iam-policy-binding $PROJECT --member serviceAccount:$NAME --role roles/cloudsql.viewer --log-http
It will show you how it pulls the existing policy, appends the new role and writes it back.
I would recommend using the example code provided here to make these changes if you don't want to use gcloud or the console to add the role, as a mistake here could impact the entire project.
Hopefully they improve the API for this need.
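The read-modify-write flow that gcloud performs can be sketched in Python in the same requests-based style as the question's snippets. The helper names grant_role and add_binding are hypothetical, and the HTTP calls are untested against a live project; the endpoint and role strings come from the question.

```python
def add_binding(policy, role, member):
    """Return the policy dict with member added to the binding for role."""
    for b in policy.setdefault("bindings", []):
        if b["role"] == role:
            if member not in b["members"]:
                b["members"].append(member)
            return policy
    # No existing binding for this role: create one
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

def grant_role(project, access_token, role, member):
    """Read the full project policy, modify it locally, write it back whole."""
    import requests  # same HTTP client the question's snippets use
    base = f"https://cloudresourcemanager.googleapis.com/v1/projects/{project}"
    headers = {"Authorization": f"Bearer {access_token}"}
    # 1. getIamPolicy returns the complete current policy (including its etag)
    policy = requests.post(f"{base}:getIamPolicy", headers=headers, json={}).json()
    # 2. Append the new binding locally, keeping every existing binding
    policy = add_binding(policy, role, member)
    # 3. setIamPolicy replaces the policy wholesale, so send all bindings back
    return requests.post(f"{base}:setIamPolicy", headers=headers,
                         json={"policy": policy})
```

Keeping the etag returned by getIamPolicy in the policy you send back lets the API reject the write if the policy changed concurrently.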
