mysqldump not copying all data - drupal

I have a MySQL DB that is the backend for my Drupal web site. I am going through the Drupal upgrade, and before I do the upgrade I need to make a copy of the database.
mysqladmin create ts_prod_bak -u root --password=XXXXXX && \
mysqldump -u root --password=XXXXXX ts_prod | mysql -u root --password=XXXXXX -h localhost ts_prod_bak
This does create a new DB called ts_prod_bak and fills that DB with data from ts_prod, but it isn't copying all of the data. I see some tables (cache_*) that are created in the new DB but have a different size than in the source. Because of this I am not confident in my backup/copy.
How can I make an exact duplicate of my source database and verify that by restoring it to another DB?
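One way to get a consistent copy and then check it is sketched below: dump with a consistent snapshot (assuming the Drupal tables are InnoDB) so busy tables such as cache_* are not changing mid-dump, then spot-check tables in both databases with CHECKSUM TABLE. The node table is only an example; check whichever tables matter to you.
mysqldump -u root --password=XXXXXX --single-transaction --routines --triggers ts_prod | mysql -u root --password=XXXXXX -h localhost ts_prod_bak
mysql -u root --password=XXXXXX -e "CHECKSUM TABLE ts_prod.node, ts_prod_bak.node;"
Identical checksums for a table in both databases are a good sign the copy matches the source.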

Related

How to export and import mysql database to ignore Duplicate entries for key 'PRIMARY'?

I'm attempting to write a bash script that will dump a database and then import it to a staging database. I would like the staging database to match the 'master' database.
I have the following code; however, I receive:
ERROR 1062 (23000) at line 23: Duplicate entry '1' for key 'PRIMARY'
# Dump production master database, excluding school_hosts table
mysqldump -h $MYSQL_HOST -u $MYSQL_USERNAME -p$MYSQL_PASSWORD --no-create-info --ignore-table=hcl_master.school_hosts hcl_master > hcl_master.sql
# Dump hcl staging database, for backup.
mysqldump -h $MYSQL_HOST -u $MYSQL_USERNAME -p$MYSQL_PASSWORD hclstaging_master > hclstaging_master_backup.sql
# Import dump file into staging master database
mysql -h $MYSQL_HOST -u $MYSQL_USERNAME -p$MYSQL_PASSWORD hclstaging_master < hcl_master.sql
After searching, I found that I could add --replace to the mysql command that does the import; however, I receive an error stating:
mysql: unknown option '--replace'
Can anybody help with getting this script to work correctly? I'm unsure how I can drop the staging database before I import, or how to get it to overwrite the primary key records.
Any help would be much appreciated. I am using MariaDB.
--replace is a mysqldump option that you specify when creating the dump, not something you can tell mysql when importing the dump.
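For example, the dump and import from the script above could look something like this sketch, with --replace moved to the mysqldump side so the generated statements become REPLACE instead of INSERT and overwrite rows that already exist in staging:
mysqldump -h $MYSQL_HOST -u $MYSQL_USERNAME -p$MYSQL_PASSWORD --no-create-info --replace --ignore-table=hcl_master.school_hosts hcl_master > hcl_master.sql
mysql -h $MYSQL_HOST -u $MYSQL_USERNAME -p$MYSQL_PASSWORD hclstaging_master < hcl_master.sql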

AWS amplify/dynamo/appsync - how to sync data locally

All I want to do is essentially take the exact DynamoDB tables, with their data, that exist in a remote instance (e.g. the Amplify staging environment/API) and import those locally.
I looked at datasync but that seemed to be FE only. I want to take the exact data from staging and sync it to my local Amplify instance - is this even possible? I can't find any information that helps right now.
I'm very used to using Mongo/Postgres etc. and literally being able to take a DB dump and just import it... I may be missing something here?
How about using dynamodump?
Download the data from AWS to your local machine:
python dynamodump.py -m backup -r REGION_NAME -s TABLE_NAME
Then import to Local DynamoDB:
dynamodump -m restore -r local -s SOURCE_TABLE_NAME -d LOCAL_TABLE_NAME --host localhost --port 8000
You have to build a custom script that reads from the online DynamoDB and then populates the local DynamoDB. I found the Docker image to be just right for running an instance; make sure the jar is started with a database file (e.g. -sharedDb) rather than -inMemory, so the instance isn't ephemeral and your data persists.
Rough, high-level instructions:
Download Docker Desktop (if you want)
Start Docker Desktop and, in a terminal, pull the official DynamoDB Local image:
https://hub.docker.com/r/amazon/dynamodb-local/
docker pull amazon/dynamodb-local
Then run the Docker container:
docker run --name dynamodb -p 8000:8000 -d amazon/dynamodb-local -jar DynamoDBLocal.jar -sharedDb
Now you can run a Python script that gets the data from the online DB and copies it into the local DynamoDB, as in the official docs:
https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/copy-amazon-dynamodb-tables-across-accounts-using-a-custom-implementation.html
Once you have worked out the connection to the local container (localhost:8000), you should be able to copy all the data.
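As a rough sketch of that copy step using the AWS CLI and jq instead of a Python script (Todo-staging and Todo-local are placeholder table names, the local table must already exist, and the loop ignores scan pagination, so it only suits small tables):
aws dynamodb scan --table-name Todo-staging --output json \
  | jq -c '.Items[]' \
  | while read -r item; do
      aws dynamodb put-item --table-name Todo-local --item "$item" \
        --endpoint-url http://localhost:8000
    done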
I'm not too well versed in local Amplify instances, but for DynamoDB there is a product you can use locally called DynamoDB Local.
Full list of download instructions for DynamoDB Local are available here: Downloading And Running DynamoDB Local
If you have Docker installed on your local machine, you can easily download and start the DynamoDB Local service with a few commands:
Download
docker pull amazon/dynamodb-local
Run
docker run --name dynamodb -p 8000:8000 -d amazon/dynamodb-local -jar DynamoDBLocal.jar -sharedDb
This will let you use roughly 90% of DynamoDB's features locally. However, migrating data from the DynamoDB web service to DynamoDB Local is not something that is provided out of the box. For that, you would need to create a small script which you run locally to read data from your existing table and write it to your local instance.
An example of reading from one table and writing to a second can be found in the docs here: Copy Amazon Dynamodb Tables Across Accounts
One change you will have to make is manually setting the endpoint_url for DynamoDB Local:
import boto3

# endpoint_url is a client parameter, not a Session parameter
dynamodb_client = boto3.Session(
    aws_access_key_id=args['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=args['AWS_SECRET_ACCESS_KEY'],
    aws_session_token=args['TEMPORARY_SESSION_TOKEN'],
).client('dynamodb', endpoint_url='YOUR_DDB_LOCAL_ENDPOINT_URL')

Alfresco server startup gets hung

I am trying to start an Alfresco server but it hangs partway through startup. Please see the screenshot below. I have copied the Alfresco instance from one server to another server, and I have also made the necessary changes in alfresco-global.properties.
Please help with this.
To back up your database and alf_data, you can download and run the following script.
http://www.contcentric.com/alfresco-backup/
Note: you will have to manually back up the indexes from the solr4 folder and any other customizations (such as deployed AMPs and JARs).
Follow these Alfresco restore steps:
1. Install a new Alfresco instance. Do not start the server.
2. Start PostgreSQL using the following command:
./alfresco.sh start postgresql
3. Go to <ALF-HOME>/postgresql/bin.
4. Run the following command:
psql -U alfresco -h <hostname> -p <port>
e.g. psql -U alfresco -h localhost -p 5422
5. It will prompt you for the password; enter it and remember it.
6. Run the following command:
psql -U alfresco -h <host> -p <port> <dbname> < dumpFile
e.g. psql -U alfresco -h localhost -p 5422 alfresco < /opt/migration-backup/01-10-2018-15-54-47/database/alfresco_db_dump
7. You will see that the tables and indexes are created.
8. Start Tomcat using the following command:
./alfresco.sh start tomcat
9. Test your migration.
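Put together, the restore might look roughly like this sketch (it assumes a default install under /opt/alfresco, PostgreSQL on port 5422, and the dump path from the example above; adjust them to your environment):
cd /opt/alfresco
./alfresco.sh start postgresql
postgresql/bin/psql -U alfresco -h localhost -p 5422 alfresco < /opt/migration-backup/01-10-2018-15-54-47/database/alfresco_db_dump
./alfresco.sh start tomcat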

Copy a heroku postgres db to a local sqlite db

I want to copy my heroku production db (postgres) to my development (sqlite).
Copying a postgres db into another postgres db is easy using heroku pg:pull. Does anyone know how to use this command to copy postgres into sqlite?
Heroku docs on pg:pull do not say how to use different types of dbs. This old article implied that it used to be possible. Setting up a local postgres db is something I'd like to avoid.
You will need to do a pg_restore locally, then dump the data using the -a option to dump data only.
It should look something like this:
Download a data dump.
heroku addons:add pgbackups
heroku pgbackups:capture
curl -o latest.dump `heroku pgbackups:url`
Create a temporary database.
sudo -u postgres createdb tempdb
Restore the dump to your temporary database.
sudo -u postgres pg_restore --verbose --clean --no-acl --no-owner -h localhost -d tempdb latest.dump
Dump the data in the correct format.
sudo -u postgres pg_dump --inserts -a -b tempdb > data.sql
Read the dump into sqlite3.
sqlite3
> .read data.sql
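If you want the data to end up in your app's existing development database file rather than an in-memory one, you can point sqlite3 at that file instead (db/development.sqlite3 below is just an example path):
sqlite3 db/development.sqlite3 ".read data.sql"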
This is an approximate solution. You will most likely need to make some small adjustments.
I agree with Craig Ringer that it might be worth getting postgres running locally. Hopefully this process will do the trick though!

how to create super admin in my phpMyAdmin?

I just finished my website; I used ASP.NET & MySQL.
I uploaded my website files all right.
But the problem is with my database. I created my database and all my tables fine, but the problem is that I can't execute my stored procedure.
Is that because I don't have the privileges to do this operation?
The error in phpMyAdmin is:
MySQL said:
#1227 - Access denied; you need the SUPER privilege for this operation
How can I fix this?
As MySQL root:
$ mysql -u root -p # ..or, if no password has been set..
$ mysql -u root
Run this command:
GRANT SUPER ON *.* TO 'user'@'localhost';
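To confirm that the grant took effect, you can check from the same root session (replace 'user' with the account you granted):
mysql -u root -p -e "SHOW GRANTS FOR 'user'@'localhost';"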
Further reading:
http://dev.mysql.com/doc/refman/5.1/en/grant.html
