RIAK-CS Unable to create bucket using s3cmd - AccessDenied

I have the following setup: riak 1.4.12, riakcs 1.5.3, stanchion 1.5.0
I am able to list bucket contents, and authentication works (I get a response when listing buckets, removing a bucket, or PUTting a file), but I get an AccessDenied error when trying to create a bucket.
I found this thread http://riak-users.197444.n3.nabble.com/RIAK-CS-Unable-to-create-bucket-using-s3cmd-AccessDenied-td4032375.html and tried adding signature_v2 = True to .s3cfg, with no success. I've also tried three versions of s3cmd (1.5.0, 1.5.0alpha, 1.0.1), as well as creating a bucket with the Python library boto, which also gives an access denied error.
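For reference, the .s3cfg change was just:
[default]
signature_v2 = True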
I'm stumped :( Any suggestions on where I should look next would be greatly appreciated! I'm also not sure where Riak CS logs individual operations - I've set the lager log level to debug and wasn't able to see anything in the logs.
Thanks!
Ambert

I posted the same question to riak-users mailing list, and got an answer!
In my case, I had to set the admin.key and admin.secret in /etc/stanchion/stanchion.conf.
After setting them, s3cmd mb succeeded.
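For anyone else hitting this, the two lines in question look like the following (values are placeholders - they must match the admin credentials already configured for Riak CS):
# /etc/stanchion/stanchion.conf
admin.key = ADMIN-KEY-GOES-HERE
admin.secret = ADMIN-SECRET-GOES-HERE
Stanchion needs a restart after the change.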

Related

R keyring package - my keyring has disappeared (maybe), what happened to it and what keyring do I have now?

I've been using keyring reliably for months. All of a sudden, a bunch of scripts failed because my code to pull keys is returning nothing. I had a named keyring with a passcode. It appears to be gone.
When I run keyring::keyring_list() I do have something, though:
  keyring num_secrets locked
1                   9  FALSE
What is "1"? I think 9 is the number of secrets I had, but I can't figure out how to access them. I've tried keyring::key_list(keyring = NAME), using both 1 and "1" as the name.
And what happened to my original keyring? How can I troubleshoot?
Edit: When the script runs via a batch file, I get this error:
Error in b_wincred_i_get(target) :
Windows credential store error in 'get': Element not found.
Calls: source ... b_wincred_parse_keyring_credential -> rawToChar -> b_wincred_i_get
Execution halted
I found one SO post (Error when using R to get credentials from Windows Cred Vault) that pointed me to make sure the credentials exist in Windows Credentials, and I think they do ('credentials' is the name of the keyring).
I had the exact same problem, and this seems to be a bug in the keyring package. I filed an issue here and I will update this answer if I hear back from the developers.
For now:
Don't bother about the 1 - that's just the row number of the data.frame returned by keyring_list(). The keyring name is empty. Existing keys don't seem to work anymore, though setting new keys via the package does work.
As you can probably also tell, the number 9 is the number of credentials you have under 'Generic Credentials'. So it is kind of working, but the retrieval fails.
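To illustrate (a minimal sketch, assuming the Windows wincred backend):
# keyring_list() returns a plain data.frame, so the leading 1 is just
# R's printed row index, not a keyring name
kr <- keyring::keyring_list()
kr$keyring      # "" - the keyring name is empty
kr$num_secrets  # 9 - matches the entries under Generic Credentials
# keys can still be listed on the unnamed keyring, even though retrieval fails
keyring::key_list(keyring = kr$keyring[1])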
I ended up having to re-create the ring, and create a process for backing up the keyring so that I can restore it if this happens again.

R- Following Error: API returned: Request had insufficient authentication scopes

I've verified my API in RStudio after hours of trying and now I've reached another error while trying to translate a sentence. Would be grateful for any help!
I'm just trying to translate "hello" to french using googleLanguageR package -
> gl_translate("Hello", "fr")
The result I get is this -
2021-01-21 17:15:36 -- Translating text: 5 characters -
ℹ 2021-01-21 17:15:36 > Request Status Code: 403
Error: API returned: Request had insufficient authentication scopes.
I'm a literal beginner in the field of computing and do not understand what scopes mean here.
Thanks for the help!
Scopes are permissions that you give to apps to access an API. For example, one app might have permission to read a user's private messages, whereas another doesn't. It's similar to when an app on your phone asks for permission to use the camera or access your contacts.
Your app is trying to do something that it doesn't have permission to do. You'll need to add the relevant scopes in whatever settings page you use to generate your keys. Presumably Google Data Studio?
Okay, I found an answer.
I needed to download a JSON version of my key and authenticate with it using -
gl_auth("filename.json")
After doing this, I needed to make sure my API is enabled. Now, it is working perfectly!
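So the full working sequence is roughly this (the JSON filename is a placeholder for your own service account key):
library(googleLanguageR)
# authenticate with the service account key downloaded from Google Cloud
gl_auth("my-service-account-key.json")
# translate into French; target takes an ISO language code
gl_translate("Hello", target = "fr")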

Error:1411809D:SSL routines - When trying to make https call from inside R module in AzureML

I have an experiment in AzureML which has an R module at its core. Additionally, I have some .RData files stored in Azure blob storage. The blob container is set as private (no anonymous access).
Now, I am trying to make an https call from inside the R script to the Azure blob storage container in order to download some files. I am using the httr package's GET() function and have properly set up the URL, authentication, etc. The code works in R on my local machine, but the same code gives me the following error when called from inside the R module in the experiment:
error:1411809D:SSL routines:SSL_CHECK_SERVERHELLO_TLSEXT:tls invalid ecpointformat list
Apparently this is an error from the underlying OpenSSL library (which was fixed a while ago). Some suggested workarounds I found here were to set sslversion = 3 and ssl_verifypeer = 1, or to turn off verification with ssl_verifypeer = 0. Both approaches returned the same error.
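Concretely, the attempts looked roughly like this (url stands for the blob URL set up as described above):
library(httr)
# force SSLv3 while keeping peer verification on
GET(url, config(sslversion = 3, ssl_verifypeer = 1))
# or turn certificate verification off entirely
GET(url, config(ssl_verifypeer = 0))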
I am guessing that this has something to do with the internal Azure certificate / validation...? Or maybe I am missing or overlooking something?
Any help or ideas would be greatly appreciated. Thanks in advance.
Regards
After a while, an answer came back from the support team, so I am posting the relevant part here for anyone who lands on this page with the same problem.
"This is a known issue. The container (a sandbox technology known as "drawbridge" running on top of Azure PaaS VM) executing the Execute R module doesn't support outbound HTTPS traffic. Please try to switch to HTTP and that should work."
They also said that a fix is on the way:
"We are actively looking at how to fix this bug. "
Here is the original link as a reference.
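In the meantime, the workaround amounts to swapping the scheme (a minimal sketch - the account, container, and signing variables are placeholders for your existing code):
library(httr)
# same request as before, but over plain http:// instead of https://
url <- "http://myaccount.blob.core.windows.net/mycontainer/my.RData"  # hypothetical
r <- GET(url, add_headers(Authorization = auth_header,   # built as in the local version
                          `x-ms-date` = request_date))
stop_for_status(r)
writeBin(content(r, "raw"), "my.RData")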
hth

Running AWS commands from commandline on a ShellCommandActivity

My original problem was that I want to increase my DynamoDB write throughput before I run the pipeline, and then decrease it when I'm done uploading (I do this at most once a day, so I'm fine with the limits on throughput decreases).
The only way I found to do it is through a shell script that issues the API commands to alter the throughput. How does it work with my AWS access_key and secret_key when it's a resource that the pipeline creates for me? (I can't log in to set the ~/.aws/config file and don't really want to create an AMI just for this.)
Should I write the script in bash? Can I use the ruby/python AWS SDK packages, for example? (I prefer the latter.)
How do I pass my credentials to the script? Are there runtime variables (like #startedDate) that I can pass as arguments to the activity with my key and secret? Do I have any other way to authenticate with either the command-line tools or an SDK package?
If there is another way to solve my original problem, please let me know. I've only arrived at the ShellCommandActivity solution because I couldn't find anything else in the documentation/forums.
Thanks!
OK, found it - http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-roles.html
The resourceRole in the default object of your pipeline is the one assigned to resources (Ec2Resource) that are created as part of the pipeline activation.
The default one is configured to have all your permissions, and the AWS command-line tools and SDK packages automatically look for those credentials, so there is no need to update ~/.aws/config or pass credentials manually.
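For reference, the relevant piece of the pipeline definition is just the default object (the role names below are the console defaults; yours may differ):
{
  "objects": [
    {
      "id": "Default",
      "name": "Default",
      "role": "DataPipelineDefaultRole",
      "resourceRole": "DataPipelineDefaultResourceRole"
    }
  ]
}
The ShellCommandActivity script can then call the command-line tools directly; the table name and capacity values below are hypothetical:
# credentials come from the instance profile, so no ~/.aws/config is needed
aws dynamodb update-table --table-name MyTable \
    --provisioned-throughput ReadCapacityUnits=10,WriteCapacityUnits=1000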

Riak Map/Reduce enableForSearch() error

I'm trying to use the Riak Java Client in an application, but I'm facing some errors. What I need is to perform a Riak Search query as input to a Map/Reduce job. According to the official tutorial, the search property must be enabled on the bucket. I'm doing so in the following code:
IRiakClient riakClient = RiakFactory.httpClient(HTTP_CLIENT);
Bucket bucket = (Bucket) riakClient.createBucket("test-bucket").enableForSearch().execute();
When I do this, the store operation on the bucket no longer works, and the following error appears:
com.basho.riak.client.RiakRetryFailedException: java.io.IOException: 500 Error:
{precommit_fail,{hook_crashed,{riak_search_kv_hook,precommit,error,badarg}}}
I've already googled the problem, but it wasn't much help!
Do you have search enabled in your app.config? Find this section:
%% Riak Search Config
{riak_search, [
%% To enable Search functionality set this 'true'.
{enabled, false}
]},
and set enabled to true.
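That is, the section should end up as:
%% Riak Search Config
{riak_search, [
    %% To enable Search functionality set this 'true'.
    {enabled, true}
]},
Restart the node afterwards for the change to take effect.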
