Push notification from DB2

I want to implement push notification from DB2.
I have two database servers: Server #1 is DB2 and Server #2 is Oracle 11g. Whenever data is inserted in DB2, I need to insert that data into Oracle as well.
I want to know whether push notification from DB2 is the right option, or whether there is a more suitable approach.
Put more simply, I just need to know whether DB2 raises any event when a row is inserted, like the SqlDependency feature in SQL Server.
Thanks

DB2 offers several options for getting data to another server. Here are some I can think of:
SQL trigger to react on insert/update/delete and possibly send out a notification (a minimal sketch follows below)
SQL Replication feature - this is based on triggers
Q Replication, which would capture the changes and send them over
Event Publishing, which would notify you of change events - probably closest to what you are looking for
Start with the Q Replication and SQL Replication overview to read about the features and differences.
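Here is a minimal sketch of the trigger option, assuming a hypothetical source table ORDERS and a change-log table ORDERS_CHANGELOG that a separate process polls and forwards to Oracle; all names are illustrative, not part of any product:

    -- Change-log table a polling process reads and forwards to Oracle.
    CREATE TABLE ORDERS_CHANGELOG (
        ORDER_ID    INTEGER   NOT NULL,
        CHANGE_TYPE CHAR(1)   NOT NULL,                           -- 'I' = insert
        CHANGED_AT  TIMESTAMP NOT NULL DEFAULT CURRENT TIMESTAMP,
        PROCESSED   CHAR(1)   NOT NULL DEFAULT 'N'
    );

    -- Fires after every insert into the source table and records the new key.
    CREATE TRIGGER ORDERS_AFTER_INSERT
        AFTER INSERT ON ORDERS
        REFERENCING NEW AS N
        FOR EACH ROW
        INSERT INTO ORDERS_CHANGELOG (ORDER_ID, CHANGE_TYPE)
            VALUES (N.ORDER_ID, 'I');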

Related

What is a great Progress query tool

I am new to Progress. Previously I worked with Oracle, MSSQL, and MySQL. What is a great Progress query tool, free or paid? I want to be able to write simple SQL queries.
In Oracle I use Toad and PL/SQL Developer to connect. Is there anything similar for Progress 4GL?
Thank you
You can use any SQL query tool that you like, as long as the SQL engine has been exposed by the DBA. You need login credentials and permissions, just as you would for any SQL database.
Since you want to write SQL queries, you should be looking for SQL access, not 4GL tools. There is a very, very limited SQL subset embedded within the 4GL, but that was created in the 80s and is little more than a marketing gimmick. For anything but trivial SQL you want to use the SQL-92 engine, which means the DBA needs to have started a SQL broker, provided you with a port number, and granted the appropriate permissions.
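For instance, once the SQL broker is running, any ODBC/JDBC client can send plain SQL-92 to that port. The table and column names below are only illustrative (they are taken from the sports2000 demo database, where user tables sit in the PUB schema):

    SELECT custnum, name, city
      FROM PUB.customer
     WHERE country = 'USA'
     ORDER BY name;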

How do we make sure the record is being locked?

In Oracle EBS, when we are doing data conversions and interfaces, loading data into Oracle from another system, how do we make sure the record is being locked? How do we make sure no other person is updating our records?
The Oracle EBS seeded APIs take care of locking. We don't insert data into the EBS base tables directly:
we validate the data and insert it into interface tables, then run the Oracle standard programs to import the interface table data into the base tables.
These standard programs use the Oracle seeded APIs to insert data into multiple base tables.
How do we make sure no other person is updating our records?
Developers use their own custom staging tables to import data into EBS.
When data is uploaded from the staging tables to the interface tables, each interface keeps its own data source, so other people normally do not touch another interface's records. We can't track changes made from back-end database tools such as SQL Developer or TOAD, but if someone updates a record from the applications we can trace the transaction through the LAST_UPDATED_BY column.
If you have a specific locking issue, let us know.
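If you do need to hold your own rows during validation in a custom staging table, a pessimistic lock is the usual approach. A minimal sketch, assuming a hypothetical staging table xx_ar_invoices_stg with batch_id and status columns (names are illustrative):

    -- Lock this batch's staging rows for the duration of the validation
    -- transaction; any other session trying to lock them fails immediately
    -- with ORA-00054 instead of silently waiting.
    SELECT *
      FROM xx_ar_invoices_stg
     WHERE batch_id = :batch_id
       AND status   = 'NEW'
       FOR UPDATE NOWAIT;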

Synchronize Postgres server database to SQLite client database

I am trying to create an app that receives an SQLite database from a server for offline use, with cloud synchronization. The server has a Postgres database with information from many clients.
1) Is it better to delete the SQLite database and create a new one from a query, or to try to synchronize and update the existing separate SQLite files (or is there a better solution)? The refreshes will happen a few times a day per client.
2) If it is the latter, could you give me any leads to resources on how I could do this?
I am pretty new to database applications, so please excuse my ignorance and let me know if there is any way I could clarify.
There is no one size fits all approach here. You need to carefully consider exactly what needs to be done, what you are replicating, how much data is involved, and what your write models are, all before you build a solution. Along the way you have to decide how to handle write conflicts and more.
In general the one thing I would say is that such synchronization works best with append-only write models (i.e. inserts, no deletes, no updates), and one way to do it is to log changes that need to be made and replicate those changes.
However, master-master replication is difficult on the best of days and with the best of tools available. Jumping between databases with very different capabilities will introduce a number of additional problems. You are in for a big job.
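To make the change-log idea concrete, here is a minimal Postgres-side sketch, assuming a hypothetical append-only table named items; the table, trigger, and column names are illustrative, and a client would simply pull the change rows it has not yet seen and apply them to its SQLite copy:

    -- Change log populated by a trigger; each client remembers the last
    -- change_id it applied and pulls everything newer on the next refresh.
    CREATE TABLE item_changes (
        change_id   bigserial   PRIMARY KEY,
        item_id     integer     NOT NULL,
        op          text        NOT NULL,      -- 'insert' in an append-only model
        payload     jsonb       NOT NULL,      -- full row, easy to replay into SQLite
        changed_at  timestamptz NOT NULL DEFAULT now()
    );

    CREATE OR REPLACE FUNCTION log_item_change() RETURNS trigger AS $$
    BEGIN
        INSERT INTO item_changes (item_id, op, payload)
        VALUES (NEW.id, lower(TG_OP), to_jsonb(NEW));
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER items_change_log
        AFTER INSERT ON items
        FOR EACH ROW EXECUTE PROCEDURE log_item_change();

    -- A client that last applied change 41 asks for everything newer:
    -- SELECT * FROM item_changes WHERE change_id > 41 ORDER BY change_id;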
Here's an open source product that claims to solve this for many database types including Postgres. I have no affiliation or commercial interest in this company.
https://github.com/sqlite-sync/SQLite-sync.com
http://sqlite-sync.com/
If you're able and willing to step outside relational databases to use an object store, you might want to have a look at CouchDB and perhaps PouchDB, which use an MVCC-based replication protocol designed to support multi-master replication, including conflict resolution. Under the covers, PouchDB uses adapters for SQLite, IndexedDB, local storage or a remote CouchDB instance to persist client-side data. It auto-selects the best client-side storage option for the given desktop or mobile browser. The SQLite engine can be either WebSQL or a Cordova SQLite plugin.
http://couchdb.apache.org/
https://pouchdb.com/

Monitor updates and deletions in a database using ASP.NET

I am using grids in VB.NET to display database records stored in Microsoft Access; the tables allow editing and deleting through the grid fields.
Is there a way I can monitor whenever a user deletes or edits a record? I want to be able to view the details of every update or deletion of certain records, such as the date and the user who did it.
What you're speaking of is known as "auditing", and certain databases - such as MS SQL Server - have built-in support for it. MS Access does not include this feature. In the absence of built-in auditing, a common way to implement it is with update triggers; unfortunately, MS Access does not have triggers either. The only way you'll be able to do this is via an API you write yourself to interact with your tables, and the discipline to stick to that API.
What you want to do is hook into the save commands for your inserts, updates, and deletes. You could also hook into the grid events to capture the data. Either way, create an INSERT statement that dumps the log data into your log database.
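A minimal sketch of what that log could look like, assuming a hypothetical AuditLog table in the Access database and that your VB.NET save/delete handlers know the current application user; all names are illustrative:

    -- Hypothetical audit table (Jet/Access SQL); create it once.
    CREATE TABLE AuditLog (
        AuditID    COUNTER   PRIMARY KEY,
        TableName  TEXT(50),
        RecordID   LONG,
        Action     TEXT(10),       -- 'UPDATE' or 'DELETE'
        ChangedBy  TEXT(50),
        ChangedAt  DATETIME
    );

    -- Statement the save/delete handler issues alongside the real change,
    -- with the ? placeholders supplied as OleDb parameters.
    INSERT INTO AuditLog (TableName, RecordID, Action, ChangedBy, ChangedAt)
    VALUES (?, ?, ?, ?, Now());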

How to use system_user in audit trigger but still use connection pooling?

I would like to do both of the following things:
use audit triggers on my database tables to identify which user updated what;
use connection pooling to improve performance
For #1, I use 'system_user' in the database trigger to identify the user making the change, but this prevents me from doing #2, which requires a generic connection string.
Is there a way that I can get the best of both of these worlds?
ASP.NET/SQL Server 2005
Unfortunately, no. Identifying the user just from the database connection AND sharing database connections between users are mutually exclusive.
Store the user from your web application in the database and let your triggers go off that stored data. It might even be better to let the web app handle writing all logging information to the database.
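A minimal sketch of that pattern on SQL Server 2005, assuming a hypothetical Orders table to which the application adds a LastModifiedBy column and always sets it in its UPDATE statements; table and column names are illustrative:

    -- The web app sets LastModifiedBy on every write, e.g.:
    -- ALTER TABLE dbo.Orders ADD LastModifiedBy varchar(50) NOT NULL DEFAULT '';

    CREATE TABLE dbo.Orders_Audit (
        AuditId    int IDENTITY(1,1) PRIMARY KEY,
        OrderId    int         NOT NULL,
        ChangedBy  varchar(50) NOT NULL,   -- supplied by the app, not SYSTEM_USER
        ChangedAt  datetime    NOT NULL DEFAULT GETDATE()
    );
    GO

    CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
    AFTER UPDATE
    AS
    BEGIN
        -- SYSTEM_USER here would only ever show the pooled service account,
        -- so record the application-supplied user instead.
        INSERT INTO dbo.Orders_Audit (OrderId, ChangedBy)
        SELECT i.OrderId, i.LastModifiedBy
        FROM inserted i;
    END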

Resources