Trying to create Oracle Graph Instance on Cloud (DBCS)

I am trying, unsuccessfully, to create an Oracle Graph instance on my Oracle Cloud ADB! I am following Ryota Yamanaka's instructions, 'Setup Oracle Graph on Cloud (DBCS)', and it fails at the 'SSH Public Key' field with the error:
This variable is required.
Where should I get this SSH key, or how should I create it?
It would be nice to have this clarified, as it seems to me that this is my only chance to get technically involved with Oracle Graph.
Thanks, Fried

You need to generate an SSH key pair (a private key and a public key) on your client machine first.
Here are the instructions for creating an SSH key pair.
Then you can open the public key file and copy its content, as sketched below.
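For example, on Linux or macOS you can generate the pair with ssh-keygen; a minimal sketch (the file name oracle_graph_key is just an example):

# Generate a 4096-bit RSA key pair; the file name is arbitrary
ssh-keygen -t rsa -b 4096 -f ~/.ssh/oracle_graph_key

# Print the public key so it can be pasted into the "SSH Public Key" field
cat ~/.ssh/oracle_graph_key.pub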

Related

Authorization failed for URI for DBMS_CLOUD.send_request

I am facing an issue with running DBMS_CLOUD.send_request to invoke a function via Autonomous DB. In the credential I am giving the right API signing key, but it doesn't seem to work and keeps throwing "Authorization failed for URI". I am not sure what I am missing, as I am able to invoke the same function with the same credentials and the same invoke endpoint using the SDK. Also, in the private_key parameter of DBMS_CLOUD.CREATE_CREDENTIAL I am providing the private key content without the line breaks and excluding the BEGIN and END RSA PRIVATE KEY lines; I would like to know if this is the right way to provide the key content.
Also, please note that my Autonomous DB workload type is "APEX", and I have granted EXECUTE on DBMS_CLOUD to my APEX principal using ADMIN.
Is your private key protected with a passphrase? AFAIK passphrase-protected keys are not supported, so you might try working without a passphrase.
Also, you might try creating an APEX Web Credential (use the OCI type) and then use APEX_WEB_SERVICE.MAKE_REST_REQUEST to call the REST API. This would at least help to verify the credential.
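Regarding the key formatting the question describes: a one-liner like the following produces the key content with the BEGIN/END lines and the line breaks stripped (a sketch; oci_api_key.pem is a placeholder for your key file):

# Drop the -----BEGIN/END RSA PRIVATE KEY----- lines and join the rest into a single line
grep -v "PRIVATE KEY" oci_api_key.pem | tr -d '\n'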

Failed SSH into instance

I have 15 instances running with the same security group; however, I can SSH into some of them but not the others. I receive a "Permission denied (publickey)" message for those instances. I have also confirmed that all instances are using the same public key, and I am trying to SSH into all of them with the same private key.
What am I missing?
Thank you for helping out!
If you are getting "Permission denied (publickey)", it is not a security group issue. It is most likely one of the following:
1. You didn't specify the public key to use when launching some of the instances.
2. There was a problem with the metadata service on some of the instances, which meant that cloud-init was unable to retrieve the public key.
3. You are using the wrong credentials; e.g. the admin account name is different on the different instances. (The default is OS dependent.)
4. You have multiple keys in your ~/.ssh directory and they are being tried in the wrong order. If you have fail2ban set up on the server side, each key that the client supplies counts as a login attempt, so you can hit the limit before the key that would work is tried.
If you look at the respective instance's console log from its first boot, you can see which public keys were actually used. This can be used to diagnose 1 and 2.
For 3, check the OS documentation.
For 4, try using the ssh command's -i option to specify the path to the private key file, as in the example below.
There are other possibilities; e.g. if you launched instances from a non-pristine image / snapshot.
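A sketch of that -i invocation (the user name and address are placeholders; IdentitiesOnly stops the client from offering its other keys first, and -v shows what the server rejects):

# Offer only the named key, with verbose output for debugging
ssh -i ~/.ssh/my_key -o IdentitiesOnly=yes -v ubuntu@203.0.113.10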
Reference:
Troubleshooting SSH access to a NeCTAR instance

HyperLedger Composer 0.19 How to encrypt/decrypt data using private/public key?

I have a use case where I have to encrypt data using a participant's public key before adding the data to the chain.
I have already implemented issuing identities and creating cards by following the example here:
https://github.com/hyperledger/composer-knowledge-wiki/blob/latest/knowledge.md#card-api-errors--resolutions
The problem is that the example only returns a certificate and a private key.
However, I need the public key to encrypt data before adding it to the chain, so that I can decrypt it with the private key later (when retrieving it).
Can anyone please help me out? Am I missing something here?
Thanks!
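One thing worth noting: an X.509 certificate embeds the holder's public key, so it can be extracted from the certificate the identity API already returns. A minimal sketch with openssl (the file names are placeholders; this is generic X.509 handling, not a Composer-specific API):

# Extract the participant's public key from the issued certificate
openssl x509 -pubkey -noout -in participant-cert.pem > participant-pub.pem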

GCP encryption thru Beam / Dataflow APIs for Bigquery and Cloud SQL

Context: We are trying to load some CSV-format data into GCP BigQuery using GCP Dataflow (Apache Beam). As part of this, we are creating the BQ tables for the first time (for each table) through the BigQueryIO API. One of the customer requirements is that the data on GCP needs to be encrypted using customer-supplied/managed encryption keys.
Problem Statement: We are not able to find any way to specify the custom encryption keys through the APIs while creating tables. The GCP documentation details how to specify the custom encryption keys through the GCP BQ console, but we could not find anything for specifying them through the APIs from within the Dataflow code.
Code Snippet:
// Fully-qualified table spec: project:dataset.table
String tableSpec = new StringBuilder().append(PipelineConstants.PROJECT_ID).append(":")
        .append(dataValue.getKey().target_dataset).append(".").append(dataValue.getKey().target_table_name)
        .toString();
// Temporary GCS location used by BigQueryIO for its load jobs
ValueProvider<String> valueProvider = StaticValueProvider.of("gs://bucket/folder/");
// Side branch: count the rows for source auditing
dataValue.getValue().apply(Count.globally()).apply(ParDo.of(new RowCount(dataValue.getKey())))
        .apply(ParDo.of(new SourceAudit(runId)));
// Main branch: transform the records and write them to BigQuery,
// creating the table from the schema if it does not exist yet
dataValue.getValue().apply(ParDo.of(new PreProcessing(dataValue.getKey())))
        .apply(ParDo.of(new FixedToDelimited(dataValue.getKey())))
        .apply(ParDo.of(new CreateTableRow(dataValue.getKey(), runId, timeStamp)))
        .apply(BigQueryIO.writeTableRows().to(tableSpec)
                .withSchema(CreateTableRow.getSchema(dataValue.getKey()))
                .withCustomGcsTempLocation(valueProvider)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
Query: Could anybody let us know:
1. Is it possible to provide an encryption key through the Beam API?
2. If it is not possible with the current version, what could be a possible workaround?
Kindly let us know if additional information is required.
Customer-supplied encryption keys are a new feature; not all libraries have been updated to support them yet.
If you know the table name in advance, you can use the UI/CLI or the API to create the table, then run your normal flow to load data into that table. That might be a workaround for you.
https://cloud.google.com/bigquery/docs/customer-managed-encryption#create_table
API to create a table: https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/insert
You need to set this section on the table object:
"encryptionConfiguration": {
"kmsKeyName": string
}
More details on the table resource: https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#resource
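If the CLI route works for you, a minimal sketch with the bq tool (assuming a recent bq version; the project, dataset, table, key, and schema names are all placeholders):

# Create the table up front with a customer-managed KMS key; the Dataflow job
# can then append into the pre-created table instead of creating it
bq mk --table \
  --schema 'id:INTEGER,name:STRING' \
  --destination_kms_key projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key \
  my_dataset.my_table

Later Apache Beam SDK releases also added a withKmsKey(...) option on BigQueryIO.Write, so if upgrading the SDK is an option, that is worth checking as well.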

Finding out the public key of an ssh server

I have access to a student server and was interested in being able to view the public key of the SSH server. What Unix command can help me see the SSH key, since this is different from generating my own keys as on AWS, etc.?
It is as simple as:
# cat /root/.ssh/id*.pub
or, for a specific user:
# cat ~user/.ssh/id*.pub
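If what you are after is the SSH server's own host public key (rather than a user's key), a sketch (server.example.edu is a placeholder):

# Host public keys live under /etc/ssh on the server itself
cat /etc/ssh/ssh_host_*_key.pub

# Or fetch them remotely without logging in
ssh-keyscan -t rsa,ed25519 server.example.edu

# Show the fingerprint of a key file
ssh-keygen -lf /etc/ssh/ssh_host_ed25519_key.pub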
