Dynamic update to Tableau - amazon-dynamodb

I am currently connecting DynamoDB to Redshift, and the data can be displayed in Tableau.
How can I automatically update the graph in Tableau when a record in DynamoDB is added or updated?
So far I have applied the following COPY command:
copy
from 'dynamodb://'
iam_role
readratio 40;
However, the new records do not seem to replace the previous ones; each load just duplicates them.
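For reference, Redshift's COPY always appends rows, so repeated loads will duplicate data. The usual pattern is to COPY into a staging table and merge from there. Below is a minimal sketch, assuming a hypothetical target table events keyed on id, a hypothetical role ARN, and psycopg2 for connectivity:

import psycopg2

# All names here are hypothetical: the target table `events`, its key `id`,
# the cluster endpoint, and the IAM role ARN.
conn = psycopg2.connect(host='my-cluster.example.com', port=5439,
                        dbname='dev', user='awsuser', password='...')
with conn:
    with conn.cursor() as cur:
        # COPY appends, so load into a temp staging table first.
        cur.execute("create temp table events_staging (like events)")
        cur.execute("""
            copy events_staging
            from 'dynamodb://events'
            iam_role 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
            readratio 40
        """)
        # Replace any rows that already exist, then bring in the rest.
        cur.execute("""
            delete from events
            using events_staging
            where events.id = events_staging.id
        """)
        cur.execute("insert into events select * from events_staging")
# leaving the `with conn:` block commits all four statements as one transaction

Tableau then picks up the merged data whenever its extract or data source refresh runs, so scheduling this merge just before the refresh keeps the graph current without duplicating rows.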

Related

MariaDB table missing but I can't recreate it

Something went wrong during a structure synchronization between two databases.
One of our production databases is now missing a key table, 'customers', which just about every other table has foreign keys to.
I'm trying to recreate the table from last night's backup. (I don't want to restore the entire DB; just recreate this table, as the data in it does not change much, and I don't want to lose today's transactional data.)
The hassle seems to be that all the foreign key data for this table still exists in INFORMATION_SCHEMA.KEY_COLUMN_USAGE, and I am getting errno 121 and 150 errors when I try to run the CREATE TABLE query.
I've manually deleted all foreign keys to the missing table and am still getting errno 150 when trying to recreate it. Any ideas where else there might be lost references to this table that are stopping me from creating it again?
This was eventually resolved by repeated consultation of SHOW ENGINE INNODB STATUS.
The missing table had various indexes; for example, on the customer name there was an index customer_name_idx. The CREATE TABLE query asked for this index to be created, and SHOW ENGINE INNODB STATUS reported: "could not create table because index customer_name_idx already exists."
There was no reference to this index, to any primary key, or to the table itself in any of the metadata tables. I checked:
INFORMATION_SCHEMA.INNODB_SYS_INDEXES
INFORMATION_SCHEMA.TABLE_SCHEMA
INFORMATION_SCHEMA.STATISTICS
INFORMATION_SCHEMA.TABLE
so I could not explain why this error was being thrown.
My guess, after the fact, is that MySQL holds a cached copy of the information_schema metadata in memory and was consulting that; maybe it only gets refreshed when you restart MySQL?
The solution was to give the indexes new names as a short-term fix, and to rename them during our next scheduled downtime.
Once these changes were made, the table could be created and the backup data reinstated.
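A minimal sketch of that workaround (hypothetical, simplified schema; PyMySQL used for connectivity):

import pymysql

conn = pymysql.connect(host='localhost', user='root',
                       password='...', database='mydb')
with conn.cursor() as cur:
    # Recreate the table, but give the index a new name so it does not
    # collide with the orphaned customer_name_idx that InnoDB still remembers.
    cur.execute("""
        CREATE TABLE customers (
            id INT PRIMARY KEY,
            name VARCHAR(255),
            INDEX customer_name_idx2 (name)
        ) ENGINE=InnoDB
    """)
conn.commit()

# During the next scheduled downtime, on servers that support RENAME INDEX:
# ALTER TABLE customers RENAME INDEX customer_name_idx2 TO customer_name_idx;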

How to test AWS 'DynamoDB' table immediately after restore from Backup

I am writing a Python script to automate testing of AWS DynamoDB table backup restores. Once the table is restored from backup, I cannot check (test) the table size or item count in the restored table immediately. As per AWS: "Storage size and item count are not updated in real-time. They are updated periodically, roughly every six hours."
I also tried using "scan" on the restored table to list sample items, but that also doesn't seem to be working.
Does anybody know what the workaround could be here? Suggestions would be appreciated.
Thanks!
I was able to achieve it by using a table scan:
import boto3

client = boto3.resource('dynamodb', region_name='us-east-1')
table = client.Table('ddb_test_table')
response = table.scan(Limit=10)  # sample a few items; the Limit value is illustrative
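If an exact item count is needed immediately after the restore, a paginated scan can compute it without waiting for the table's metadata to catch up. A sketch (reusing table from above; Select='COUNT' returns counts instead of the items themselves):

def count_items(table):
    """Count items by paginating a scan; unlike the ItemCount metadata,
    this works immediately after a restore."""
    total = 0
    kwargs = {'Select': 'COUNT'}  # return counts only, not the items
    while True:
        response = table.scan(**kwargs)
        total += response['Count']
        if 'LastEvaluatedKey' not in response:
            return total
        kwargs['ExclusiveStartKey'] = response['LastEvaluatedKey']

print(count_items(table))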

Change a large dynamodb table to use LSIs instead of GSIs

I have a live table in DynamoDB with about 28 million records in it.
The table has a number of GSIs that I'd like to change to LSIs; however, LSIs can only be created when the table is created.
I need to create a new table and migrate the data with minimum downtime. I was thinking I'd do the following:
Create the new table with the correct indexes.
Update the code to write records to both the old and new tables. When this starts, take note of the timestamp of the first record.
Write a simple process to sync existing data for anything with a create date prior to that first timestamp.
I'd have to add a lock field to the new table to prevent race conditions when an existing record is updated.
When it's all synced, we'd swap to using the new table.
I think that will work, but it's fairly complicated and feels prone to error. Has anyone found a better way to do this?
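For the lock field mentioned above, DynamoDB conditional writes can do the job: the backfill only inserts a record if the live dual-write path hasn't already written a newer copy. A sketch (table and key names are hypothetical):

import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.resource('dynamodb', region_name='us-east-1')
new_table = dynamodb.Table('newTable')

def backfill_item(item):
    """Copy an old record into the new table only if the dual-write
    path has not already written a (newer) copy of it."""
    try:
        new_table.put_item(
            Item=item,
            # 'pk' stands in for the table's real partition key name.
            ConditionExpression='attribute_not_exists(pk)',
        )
    except ClientError as e:
        if e.response['Error']['Code'] != 'ConditionalCheckFailedException':
            raise
        # The live dual-write already wrote this record; leave it alone.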
Here is an approach:
(Let's refer to the table with GSIs as oldTable and the new table with LSIs as newTable).
Create newTable with the required LSIs.
Create a DynamoDB trigger on the oldTable such that every new record coming into the oldTable is also inserted into the newTable. (This logic lives in an AWS Lambda function invoked from the table's stream.)
Make your application point to the newTable.
Migrate all the records from oldTable to newTable.
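A minimal sketch of the Lambda replication logic (assuming the oldTable stream is configured with NEW_IMAGE or NEW_AND_OLD_IMAGES):

import boto3
from boto3.dynamodb.types import TypeDeserializer

dynamodb = boto3.resource('dynamodb')
new_table = dynamodb.Table('newTable')
deserializer = TypeDeserializer()

def handler(event, context):
    """Replay inserts/updates from oldTable's stream onto newTable."""
    for record in event['Records']:
        if record['eventName'] in ('INSERT', 'MODIFY'):
            image = record['dynamodb']['NewImage']
            # Convert DynamoDB-JSON stream images into plain Python values.
            item = {k: deserializer.deserialize(v) for k, v in image.items()}
            new_table.put_item(Item=item)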

R - how to react to database inserts/updates/deletes?

I'm reading data from an SQLite database table into a data.frame with R's DBI. Often (as often as every 5 seconds), new records get added to the table externally, or existing ones are updated or deleted, at which point I need to propagate these changes to my data.frame.
So the question is: how can I hook into and respond to these database events in R? I don't want to have to keep querying the database every 5 seconds just to make sure nothing has changed. Is there some callback mechanism at my disposal?
If you have access to the C code that is writing your SQL data, then you can implement a callback:
http://www.sqlite.org/c3ref/update_hook.html
and then in your callback function you could update the timestamp of a file if the table being modified is one your R code cares about. Then your R code checks the timestamp of that file, and only if it's changed does it need to query the SQLite database.
Now, I don't know whether you could add a callback to the SQLite connection held by R and expect to get a callback when another SQLite connection/process changes the database table. I doubt it; I suspect the callbacks are only triggered if the connection they are registered with does the update, because otherwise all sorts of asynchronous things would have to happen, and there's no event handler.
Another idea is to use triggers to maintain a table of modification times. Define triggers on all the tables you care about so that they update a row in a "last modified" table. Then use the file modification time to check for any change to the database, and your R code only has to query the "last modified" table to see which specific table has changed since the last check.
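A sketch of that trigger idea, in Python rather than R since the work is really in the SQL (database and table names are hypothetical):

import sqlite3

conn = sqlite3.connect('mydata.db')
conn.executescript("""
    CREATE TABLE IF NOT EXISTS last_modified (
        table_name  TEXT PRIMARY KEY,
        modified_at TEXT
    );
    CREATE TRIGGER IF NOT EXISTS readings_ins AFTER INSERT ON readings
    BEGIN
        INSERT OR REPLACE INTO last_modified
        VALUES ('readings', datetime('now'));
    END;
    -- similar AFTER UPDATE and AFTER DELETE triggers go here
""")
conn.commit()

# The poller (R's dbGetQuery would do the same) only reads this tiny table:
rows = conn.execute(
    "SELECT table_name, modified_at FROM last_modified").fetchall()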

What methods are available to monitor SQL database records?

I would like to monitor 10 tables with 1000 records per table. I need to know when a record changes, and which record changed.
I have looked into SQL Dependencies; however, it appears that they would only be able to tell me that the table changed, not which record changed. I would then have to compare all the records in the table to find the modified one, which I suspect would be a problem for me as the records change constantly.
I have also looked into SQL triggers, but I am not sure whether triggers would work for monitoring which record changed.
Another thought I had is to create a "Monitoring" table which would have records added to it via the application code whenever a record is modified.
Do you know of any other methods?
EDIT:
I am using SQL Server 2008.
I have looked into Change Data Capture, which is available in SQL Server 2008 and was suggested by Martin Smith. Change Data Capture appears to be a robust, easy-to-implement, and very attractive solution. I am going to roll out CDC on my database.
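For reference, a sketch of enabling CDC and reading the captured per-row changes (table name and connection string are hypothetical; enabling CDC requires appropriate permissions and SQL Server Agent running):

import pyodbc

conn = pyodbc.connect('DSN=mydb')  # hypothetical connection string
cur = conn.cursor()

# Enable CDC on the database, then on each table to be monitored.
cur.execute("EXEC sys.sp_cdc_enable_db")
cur.execute("""
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'Customers',
         @role_name     = NULL
""")
conn.commit()

# Later: read the per-row change history CDC has captured.
cur.execute("""
    DECLARE @from binary(10) = sys.fn_cdc_get_min_lsn('dbo_Customers'),
            @to   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Customers(@from, @to, N'all')
""")
for row in cur.fetchall():
    print(row)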
You can add triggers and have them add rows to an audit table. They can record the primary key of the rows that changed, and even additional information about the changes. For instance, in the case of an UPDATE, they can record which columns changed.
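A sketch of such an audit trigger (table, audit table, and column names are hypothetical):

import pyodbc

conn = pyodbc.connect('DSN=mydb')  # hypothetical connection string
conn.execute("""
    CREATE TRIGGER trg_Customers_Audit
    ON dbo.Customers
    AFTER UPDATE
    AS
    BEGIN
        -- `inserted` and `deleted` hold the new and old row versions.
        INSERT INTO dbo.CustomersAudit (CustomerId, ChangedAt, OldName, NewName)
        SELECT d.CustomerId, SYSDATETIME(), d.Name, i.Name
        FROM inserted i
        JOIN deleted d ON d.CustomerId = i.CustomerId;
    END
""")
conn.commit()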
Before you write/implement your own, take a look at AutoAudit:
AutoAudit is a SQL Server (2005, 2008) code-gen utility that creates audit trail triggers with:
Created, CreatedBy, Modified, ModifiedBy, and RowVersion (incrementing INT) columns added to the table
Insert events logged to an Audit table
Updates: old and new values logged to the Audit table
Deletes: all final values logged to the Audit table
A view to reconstruct deleted rows
A UDF to reconstruct row history
A schema audit trigger to track schema changes
Re-code-gens the triggers when ALTER TABLE changes the table
What version and edition of SQL Server? Is Change Data Capture available? – Martin Smith
I am using SQL Server 2008, which supports Change Data Capture. Change Data Capture is a very robust method for tracking data changes, just as I would like. Thanks for the answer.
Here's an idea. You can have a flag column on each table that is filled with the current datetime every time a record is created or updated. When you have processed a change, set its flag back to null. Thus unchanged records have null in their flag field, and you can query for non-null values to see which records have been changed or created, and when (and then set their flags back to null).
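A sketch of polling that flag column (names hypothetical; clearing each flag only where it still holds the timestamp we read avoids losing writes that land mid-poll):

import pyodbc

conn = pyodbc.connect('DSN=mydb')  # hypothetical connection string
cur = conn.cursor()

# Fetch the rows whose flag is set...
cur.execute("SELECT Id, ChangedAt FROM dbo.Customers "
            "WHERE ChangedAt IS NOT NULL")
changed = cur.fetchall()

# ...then clear each flag only if it still holds the timestamp we saw,
# so a write that lands in between is not silently lost.
cur.executemany(
    "UPDATE dbo.Customers SET ChangedAt = NULL "
    "WHERE Id = ? AND ChangedAt = ?",
    [(row.Id, row.ChangedAt) for row in changed],
)
conn.commit()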
