How to export a PostgreSQL database from Google Cloud - google-cloud-datastore

I have a project in Google Cloud with a Postgres database. Can someone help me export this data to my PC?

Well, you can SSH into your Google Cloud instance and run the command: pg_dump db_name > db_name.sql. The pg_dump command exports the given database in SQL format. You can then download the dump file to your local computer.
See this link: https://www.postgresql.org/docs/9.4/static/app-pgdump.html
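For the download step, if the database runs on a Compute Engine VM managed with gcloud, something like gcloud compute scp should work (the instance name, zone, and paths below are placeholders):
gcloud compute scp my-instance:~/db_name.sql ./db_name.sql --zone=europe-west1-b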

From the terminal, I had to specify the database IP address, user, and password:
pg_dump -h [IP-ADDRESS] -U [DB-USER] [DB-NAME] > dump.sql
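If the database is actually a Cloud SQL instance rather than Postgres running on your own VM, you can instead export to a Cloud Storage bucket with gcloud and download the file from there (instance, bucket, and database names below are placeholders):
gcloud sql export sql my-instance gs://my-bucket/db_name.sql --database=db_name
gsutil cp gs://my-bucket/db_name.sql .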

Related

AWS amplify/dynamo/appsync - how to sync data locally

All I want to do is essentially take the exact DynamoDB tables, with their data, that exist in a remote instance (e.g. the Amplify staging environment/api) and import those locally.
I looked at datasync but that seemed to be frontend-only. I want to take the exact data from staging and sync it to my local Amplify instance - is this even possible? I can't find any information that is helping right now.
I'm very used to mongo/postgres etc., where I can literally take a DB dump and just import it... I may be missing something here?
How about using dynamodump?
Download the data from AWS to your local machine:
python dynamodump.py -m backup -r REGION_NAME -s TABLE_NAME
Then import to Local DynamoDB:
dynamodump -m restore -r local -s SOURCE_TABLE_NAME -d LOCAL_TABLE_NAME --host localhost --port 8000
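Note: dynamodump is also published on PyPI (pip install dynamodump), which provides the dynamodump command used in the restore step; the restore writes straight into the DynamoDB Local instance at the given host and port.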
You have to build a custom script that reads from the online DynamoDB and then populates the local DynamoDB. I found the official Docker image perfect for running an instance; make sure to override the image's default command (which runs the database in memory) so that your data actually persists.
Rough step-by-step instructions:
Download Docker Desktop (if you want)
Start Docker Desktop and, in a terminal, pull the official DynamoDB Local image:
https://hub.docker.com/r/amazon/dynamodb-local/
docker pull amazon/dynamodb-local
And then run the docker container:
docker run --name dynamodb -p 8000:8000 -d amazon/dynamodb-local -jar DynamoDBLocal.jar -sharedDb
Now you can run a Python script that gets the data from the online DB and copies it into the local DynamoDB, as in the official docs (a minimal sketch follows below):
https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/copy-amazon-dynamodb-tables-across-accounts-using-a-custom-implementation.html
Once the connection to the local container (localhost:8000) is working, you should be able to copy all the data.
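Here is a minimal sketch of such a copy script using boto3, assuming the destination table has already been created in the local instance; the table names and region are placeholders:
import boto3

# Placeholder names; the destination table must already exist in DynamoDB Local.
SOURCE_TABLE = "my-staging-table"
DEST_TABLE = "my-staging-table"

source = boto3.resource("dynamodb", region_name="us-east-1").Table(SOURCE_TABLE)
local = boto3.resource(
    "dynamodb",
    region_name="us-east-1",               # required by boto3, ignored by DynamoDB Local
    endpoint_url="http://localhost:8000",  # the Docker container started above
    aws_access_key_id="dummy",
    aws_secret_access_key="dummy",
).Table(DEST_TABLE)

# Scan the remote table page by page and batch-write each page locally.
scan_kwargs = {}
while True:
    page = source.scan(**scan_kwargs)
    with local.batch_writer() as batch:
        for item in page["Items"]:
            batch.put_item(Item=item)
    if "LastEvaluatedKey" not in page:
        break
    scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]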
I'm not too well versed in local Amplify instances, but for DynamoDB there is a product you can use locally called DynamoDB Local.
The full download instructions for DynamoDB Local are available here: Downloading And Running DynamoDB Local
If you have Docker installed on your local machine, you can easily download and start the DynamoDB Local service with a few commands:
Download
docker pull amazon/dynamodb-local
Run
docker run --name dynamodb -p 8000:8000 -d amazon/dynamodb-local -jar DynamoDBLocal.jar -sharedDb
This will give you roughly 90% of DynamoDB's features locally. However, migrating data from the DynamoDB web service to DynamoDB Local is not provided out of the box. For that, you would need to create a small script which you run locally to read data from your existing table and write it to your local instance.
An example of reading from one table and writing to a second can be found in the docs here: Copy Amazon Dynamodb Tables Across Accounts
One change you will have to make is manually setting the endpoint_url for DynamoDB Local:
# Note: endpoint_url is a client parameter, not a Session parameter.
dynamodb_client = boto3.Session(
    aws_access_key_id=args['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=args['AWS_SECRET_ACCESS_KEY'],
    aws_session_token=args['TEMPORARY_SESSION_TOKEN'],
).client('dynamodb', endpoint_url='YOUR_DDB_LOCAL_ENDPOINT_URL')
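With the Docker container above, the endpoint URL would typically be http://localhost:8000, and DynamoDB Local will accept any dummy credentials.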

Local WordPress website cannot connect to its database

I am trying to install a WordPress site on a local server (Ubuntu 16.04 in a Docker container).
XAMPP is installed and running, and I have created a database and a user with the proper rights:
mysql -uroot -e "CREATE USER 'localuser'#'localhost' IDENTIFIED BY 'localpassword';";
mysql -uroot -e 'CREATE DATABASE 'localdatabase';';
mysql -uroot -e "GRANT ALL ON localdatabase.* TO 'localuser'#'localhost';";
I've also updated my wp-config.php file with the credentials above.
Still, when I try to install WordPress from there (I use wp-cli), I get the message "Error: Error establishing a database connection. This either means that the username and password information in your wp-config.php file is incorrect or we can’t contact the database server at localhost. This could mean your host’s database server is down."
I've double-checked the credentials, and XAMPP is indeed running, so what should I check next? Could this come from a config file that is missing something?
What hostname are you using in wp-config.php? Also, did the commands finish successfully? For creating the DB you used apostrophes instead of quotes. You can try the mysql client directly with mysql -u USER -p -h HOST.
In wp-config.php, use 127.0.0.1 instead of localhost for the database hostname.
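For reference, the relevant wp-config.php lines would then look something like this (hostname per the answer above, credentials taken from the question's commands):
define( 'DB_NAME', 'localdatabase' );
define( 'DB_USER', 'localuser' );
define( 'DB_PASSWORD', 'localpassword' );
define( 'DB_HOST', '127.0.0.1' ); // TCP connection instead of the localhost socket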

Set password for RStudio Server with AWS EC2 instance

I managed to follow all the steps to create an EC2 instance and install RStudio Server on it.
When I go to the RStudio Server page to connect (which looks something like "ec2-[Public-IP].eu-west-3.compute.amazonaws.com:8787"), I am asked for a username and a password.
I figured out to set a username ("user1") this way:
$ sudo useradd user1
But then, when I try this command to set the password:
echo user1:password | chpasswd
I receive this message:
chpasswd: cannot lock /etc/passwd; try again later.
I looked at different solutions suggested here:
https://superuser.com/questions/296373/cannot-lock-etc-passwd-try-again-later
but I do not see a resolution to my problem.
Nor did I find any passwd.lock, shadow.lock, group.lock, or gshadow.lock files to remove.
Type sudo passwd your_username and you will be prompted to enter a new password.
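The chpasswd attempt above most likely failed because it was run without root privileges ("cannot lock /etc/passwd" usually means insufficient permissions); piping into sudo chpasswd should also work:
echo 'user1:password' | sudo chpasswd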

Copy a heroku postgres db to a local sqlite db

I want to copy my Heroku production DB (Postgres) to my development DB (SQLite).
Copying a Postgres DB into another Postgres DB is easy using heroku pg:pull. Does anyone know how to use this command to copy Postgres into SQLite?
The Heroku docs on pg:pull do not say how to use different types of DBs. This old article implied that it used to be possible. Setting up a local Postgres DB is something I'd like to avoid.
You will need to do a pg_restore locally, then dump the data using the -a option to dump data only.
It should look something like this:
Download a data dump.
heroku addons:add pgbackups
heroku pgbackups:capture
curl -o latest.dump `heroku pgbackups:url`
Create a temporary database.
sudo -u postgres createdb tempdb
Restore the dump to your temporary database.
sudo -u postgres pg_restore --verbose --clean --no-acl --no-owner -h localhost -d tempdb latest.dump
Dump the data in the correct format.
sudo -u postgres pg_dump --inserts -a -b tempdb > data.sql
Read dump in sqlite3.
sqlite3 development.sqlite3
sqlite> .read data.sql
This is an approximate solution. You will most likely need to make some small adjustments.
I agree with Craig Ringer that it might be worth getting postgres running locally. Hopefully this process will do the trick though!
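Two caveats: the pgbackups add-on shown above has since been retired; on the current Heroku CLI the equivalent commands are heroku pg:backups:capture and heroku pg:backups:download. Also, data.sql will still contain Postgres-specific statements (SET ..., SELECT pg_catalog.setval(...), and so on) that sqlite3 will complain about; stripping those lines is usually part of the small adjustments mentioned above.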

How to create a super admin in phpMyAdmin?

I just finished my website; I used ASP.NET & MySQL.
I uploaded all my website files fine.
But the problem is with my database. I created the database and all my tables, but I can't execute my stored procedure.
Is that because I don't have the privileges for this operation?
The error in phpMyAdmin is:
MySQL said:
#1227 - Access denied; you need the SUPER privilege for this operation
How can I fix this?
As MySQL root:
$ mysql -u root -p # ..or, if no password has been set..
$ mysql -u root
Run this command:
GRANT SUPER ON *.* TO 'user'@'localhost';
Further reading:
http://dev.mysql.com/doc/refman/5.1/en/grant.html
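Note that on shared hosting you generally cannot grant yourself SUPER at all. In that case, a common workaround for error #1227 when importing stored procedures is to remove the DEFINER=... clauses from the routine definitions in your SQL dump before importing it.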
