I have been asked to automate the tracking of changes to the structure of the database: any modification, addition, or removal of tables, fields, indexes, etc.
I have looked into auditing, but only found that it can track changes to the "Database schema", which is something else.
Do you know if it is possible to do that?
We use 11.6.3.
One wonders how those magical schema changes (I think you clarified that it was actually schema changes you wanted to track) occur in the first place. One option is to make it the responsibility of whoever makes the changes to also keep track of them. Usually (hopefully) the database is updated using "delta df-files"; if those df-files are kept, they form a changelog of the database.
Another option is to dump the data definitions daily/hourly/weekly:
/* Point the DICTDB alias at the database whose definitions you want to dump */
CREATE ALIAS DICTDB FOR DATABASE sports.
DISPLAY LDBNAME("DICTDB").
/* Dump the data definitions of all tables to a df-file */
RUN prodict/dump_df.p ("ALL",
                       "c:/temp/sports.df",
                       "").
DELETE ALIAS DICTDB. /* Optional */
Taken from this entry in the knowledge base: https://community.progress.com/s/article/15884
Then you can diff that df-file against the previous dump using your favorite tool, or simply keep the dumps as they are.
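If you want the comparison scripted as well, something like the following will do; this is just a sketch in Python (not ABL), with hypothetical file names, assuming you keep the previous dump next to the current one:

import difflib

# Hypothetical file names: the previous dump and the latest dump of the same database.
with open("sports_previous.df") as old, open("sports_current.df") as new:
    old_lines = old.readlines()
    new_lines = new.readlines()

# Print a unified diff showing added/removed/changed tables, fields and indexes.
for line in difflib.unified_diff(old_lines, new_lines,
                                 fromfile="sports_previous.df",
                                 tofile="sports_current.df"):
    print(line, end="")

The same comparison works for the st-file mentioned below.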
If you actually mean the physical structure (that is more about how the data is stored in different files on disk), you can use the prostrct command to save a new st-file to disk:
prostrct list sports
This will save a file called sports.st. Handle it as above and you will have a changelog of the database structure.
I used the data entity creation wizard and selected the Reqplan table as the main data source, then I manually added the ReqPlanVersion, ReqPO, and ReqTrans tables as additional data sources and created the relationships below.
As for the data entity's fields, I manually dragged a subset of fields from the three manually added tables.
However, when I try to import the data and add the file, I receive the following issue:
Q1. In the past, for some other entities, I have changed 'Allow Edit on Create' from 'Auto' to 'Yes' on these fields and it has worked, but I am not sure whether that is the only way or whether it follows best practice. Also, what is the determining factor for a field to be editable or not during import, since they are all on 'Auto'?
When I try to map source to staging manually by drawing the mapping lines, I get the issue below:
Q2. What is going on with the configuration key? Is it because I manually dragged the fields from the additional data sources instead of adding them through the data entity creation wizard?
Lastly, I have been getting the issue below:
Q3. Is there a way to find out which unique key it is referring to? Is it talking about the EntityKey on my data entity or the indexes on the staging table? In either case there is more than one, so I am not sure which it is referring to.
Thanks in advance.
Response from the community forum:
1) Check the AllowEdit property on the table itself: if it is "No" there, then "Auto" means "No". If you want to update the fields through the data entity you will have to force them to "Yes".
2) It is not connected to the manual addition; it just means that a table used in the entity has its configuration key disabled, so you cannot export or import data into it. These tables could have been added by the wizard or manually, it does not matter. Also, a configuration key can be set on fields as well, or on the EDTs those fields use, so check them too.
3) The entity has a Key node, and there you have the key the wizard generated for you. It is used by the framework to decide whether a record should be updated or created; if it does not work for you, change it on the data entity and regenerate the staging table. You need to refresh staging because the error you get is a SQL error: at this stage SSIS transfers data from the file into the staging table, and the data could not be copied because of an index violation, so check the staging table index and see whether your file contains any duplicates.
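A quick way to check the source file for duplicates before retrying the import is a small script like the one below; this is only a sketch in Python, with a hypothetical file name and key columns that you would need to replace with the columns of your staging table's unique index:

import csv
from collections import Counter

# Hypothetical names: adjust the path and the key columns to match
# the unique index on your staging table / the entity key.
SOURCE_FILE = "reqplan_import.csv"
KEY_COLUMNS = ["ReqPlanId", "ReqPlanVersion"]

with open(SOURCE_FILE, newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Count how often each key combination occurs in the file.
counts = Counter(tuple(row[col] for col in KEY_COLUMNS) for row in rows)
duplicates = {key: n for key, n in counts.items() if n > 1}

if duplicates:
    for key, n in duplicates.items():
        print(f"Duplicate key {key}: {n} occurrences")
else:
    print("No duplicates on", KEY_COLUMNS)

Any key that shows up more than once will violate the unique index on the staging table during the SSIS copy.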
I'm having some trouble getting relation deletion to work exactly how I would expect it to.
For example, I have two simple tables, users and permissions, with a one-to-many relation between them (it could just as well be many-to-many in this example).
I first tried deleting one of the related permissions using userDatasource.deleteItem() or userDatasource.item.permissions[index]._delete(), but either of those functions marks the record as deleted client-side, so you run into trouble when you need to insert again.
I then found a related question that said to use item.relation.splice(startIndex, 1) to just break the relation, and that worked as expected, but now I have a bunch of extra rows in my database with the user foreign key set to null. I would much rather have the same behavior as .splice but also have it delete those records from the database. Is there any way to do that, or is App Maker supposed to detect the broken relation and automatically delete the row from the table?
Just do a check after the splice like this:
// After breaking the relation, delete the record if it no longer
// belongs to any parent, so no orphaned rows are left behind.
if (item.relation.length === 0) {
  item._delete();
}
Somehow, in my Firebase account, two sets of data have had the field name 'DeviceID' changed to 'PeripheralID' and 'DeviceUUID'. Is there a way I can search for these and change them back to 'DeviceID'? I would prefer to do this from the command line, but will do whatever is easier.
I'm thinking of something along the lines of firebase update:database?
To answer the question, based on the comments under it:
There is no tool for it. You'll have to download the data from the database, edit the JSON with some kind of text editor, and upload it back to the database.
This operation OVERWRITES the information in the database with the information in the JSON. In other words: uploading the JSON changes the database.
If you're using the Firebase console and the data you want to change does not span the whole database, you can open the relevant tree level and do those operations just there.
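If you would rather script it than hand-edit JSON, the rename can also be done with the Firebase Admin SDK; the sketch below is in Python, and the path, field names, and credentials file are assumptions you would need to adapt:

import firebase_admin
from firebase_admin import credentials, db

# Hypothetical service account file and database URL.
cred = credentials.Certificate("serviceAccount.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://your-project.firebaseio.com"})

devices = db.reference("devices")  # hypothetical tree level holding the records
snapshot = devices.get() or {}

for key, record in snapshot.items():
    for wrong_name in ("PeripheralID", "DeviceUUID"):
        if isinstance(record, dict) and wrong_name in record:
            # Copy the value into DeviceID and drop the misnamed field;
            # setting a key to None in an update deletes it in the Realtime Database.
            devices.child(key).update({
                "DeviceID": record[wrong_name],
                wrong_name: None,
            })

Unlike a full import, update() only touches the fields you pass in, so the rest of each record is left alone.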
What is the best practice for checking if any references to a particular model record exist before deleting that record? Basically, I have a model that represents images, and all metadata associated with an image. Other models will have references to one or more images (depending on the model).
Let's say, for example, I have an "Item" that has a "MainImage" and an "AltImage", both of which are just references to the Image model. If I delete an Item record, I have to check whether the two images are referenced by any other Item, or any other table, and if not, delete the Image.
How would I go about this?
Since you are using a database, let it maintain referential integrity using foreign key constraints on the images. The database or EF will prevent you from deleting the image if it is still being referenced. You can catch this exception and continue processing the request without deleting the image.
I found a blog post, Inferring Foreign Key Constraints in EF, that may be of use in setting these up.
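The answer above is about Entity Framework; purely to illustrate the idea of letting the database enforce referential integrity and catching the failure, here is a small sketch using Python's built-in sqlite3 module (not EF), with a made-up Item/Image schema:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE Image (Id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE Item (
                    Id INTEGER PRIMARY KEY,
                    MainImageId INTEGER REFERENCES Image(Id),
                    AltImageId  INTEGER REFERENCES Image(Id))""")
conn.execute("INSERT INTO Image (Id) VALUES (1), (2)")
conn.execute("INSERT INTO Item (Id, MainImageId, AltImageId) VALUES (1, 1, 2)")

def try_delete_image(image_id):
    # Let the foreign key constraint decide: if any Item still references
    # the image, the delete fails and we simply keep the image.
    try:
        conn.execute("DELETE FROM Image WHERE Id = ?", (image_id,))
        conn.commit()
        print("Image", image_id, "deleted")
    except sqlite3.IntegrityError:
        conn.rollback()
        print("Image", image_id, "is still referenced; leaving it in place")

try_delete_image(1)  # still referenced by Item 1, so it is kept

The same pattern applies with EF: attempt the delete, catch the constraint violation, and carry on without removing the image.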
A couple of ideas, depending on what you prefer:
Create a trigger in your DB that deletes the alt image, so that every time you delete the main record row the other record is deleted too.
Another (based on NHibernate): make sure the alternative image relationship in your Entity Framework model is set up to cascade delete commands; in that case, if you delete one image the other one will be deleted too. Here is one example from Google of how to do this.
The clumsiest, but easiest, option: delete both records yourself when you delete the image.
I am having some difficulties with a module I am currently working on. As part of this module I have created a few fields that appear on a form. This form is based on a custom entity.
First I am using field_create_field($field); to create the row in the field_config table. I am then using field_create_instance($instance); to create the row in the instance table and also create the table whose name begins with field_data_field.
The problem I am running into is how to remove these tables correctly at the end. I have tried manual deletion (via hook_uninstall), I've tried field_delete_field, and I've tried the remove_instance hook that is built into the Commerce module. Either way, I end up with lots of field_deleted_data_xxx tables being created. They don't even have any data in them, as I ran a manual query to empty the main data tables before calling the function that seems to create them.
Has anyone else ever run into this problem? How do I stop Drupal from creating these tables??
You can't stop Drupal from creating them, but I believe you can rid yourself of them completely using field_purge_batch() and its related functions.
I really wish I knew the answer to your second question (in your comment above); my instinct is that if you re-attach the field to the bundle then that data would become available again automatically (otherwise it really wouldn't make sense to keep hold of the deleted tables), but I really can't be sure about that.