We are using Oracle Warehouse Builder in our project. Accidentally, some of the internal tables' data got deleted. The impact is that when I open the map in OWB, the canvas is completely blank; I cannot see the tables and the transformations applied. However, when I right-click on the map and execute it, it runs perfectly fine. But the code is not visible, and neither can I deploy that map. The table whose data deletion caused this was CMPSCOMAPCLASSES. We do not have a regular backup of the database, hence we cannot restore the data from a backup.
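One avenue that might be worth trying, if the deletion is recent, is Oracle Flashback Query: it reads rows as they existed at an earlier point in time, provided the undo data still covers that window. A minimal sketch only; the timestamp is illustrative and it assumes you run it as the OWB repository owner:

-- Hedged sketch: succeeds only if undo retention still covers the deletion.
-- Run as the OWB repository owner; adjust the timestamp to just before the delete.
INSERT INTO CMPSCOMAPCLASSES
SELECT *
FROM CMPSCOMAPCLASSES AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' DAY);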
Can anybody please help me get the data back somehow?
I appreciate your help.
I am working on a project using TypeORM and NestJS.
When a table is first created, I want to execute an SQL statement to put initial data into it.
I searched for a way to insert this seed data, but couldn't find any results.
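A minimal sketch of what such a seed could look like as raw SQL, for example executed from a TypeORM migration's up() method via queryRunner.query(); the users table and its columns are hypothetical, and ON CONFLICT is PostgreSQL syntax:

-- Hypothetical seed for a "users" table created by an earlier migration.
-- ON CONFLICT keeps the seed idempotent if the migration logic runs twice.
INSERT INTO users (id, name, role)
VALUES (1, 'admin', 'ADMIN'),
       (2, 'guest', 'READONLY')
ON CONFLICT (id) DO NOTHING;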
I need your help.
Thank you :)
Hope you have a great day and can help me with the problem.
I am trying to recreate AP aging through ODBC. Everything is working fine except the Journal transactions.
In NetSuite saved searches there is a field, remainingamount, which is not available in the connection schema for some reason. We have tried to contact NetSuite directly but still have not received any feedback from them.
There are fields foreignamountunpaid/foreignamountunused in the transactionline table that I am trying to use right now, and with bills and expense reports they work totally fine.
However, we have problems with some of the JEs for no apparent reason: in some of them the field is null even though the saved search shows a value for that line.
I tried to analyse why this is happening, but it looks totally random.
So, do you by any chance know whether there is a better field for the amount remaining that I could use through the ODBC connection? Or why some JEs have null values in the foreignamountunpaid/foreignamountunused fields?
Thank you in advance.
Found a way to make it work:
nvl(nvl(transactionline.foreignamountunpaid, -transactionline.foreignpaymentamountunused), -transactionline.foreignamount)
This expression in SQL gives the desired result.
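For context, a sketch of how that expression might sit in a full query; the join and filter are assumptions based on the standard transaction/transactionline tables in the Connect schema:

-- Sketch: amount remaining per journal line, using the nvl fallback chain above.
SELECT t.tranid,
       nvl(nvl(tl.foreignamountunpaid, -tl.foreignpaymentamountunused), -tl.foreignamount) AS amount_remaining
FROM transaction t
JOIN transactionline tl ON tl.transaction = t.id
WHERE t.type = 'Journal';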
I am able to connect to an existing cube from Excel. I am then trying to connect this same Excel file to a different Cube that has the same schema but a completely different set of dimensions and measures.
However, when I try to refresh the file against the new cube, Excel stays in the "Executing OLAP query" state indefinitely, and on the icCube server I see COMMON_GENERAL_ERROR with a null message.
It seems that the problem happens because icCube is trying to pull data based on old dimension values that are in my PivotTable filters, and since these values don't exist in the new cube, I'm getting an error. The dimensions in question have an All member, and I was hoping that in such cases the query would use the All member. Is this possible, and/or am I missing some settings in my schema definition?
I'm getting the following error when trying to update my DB:
net.rim.device.api.database.DatabaseOutOfMemoryException
It seems to happen at random points in the code (of course these are always points where I interact with the DB).
What does this error mean, and how can I prevent it from happening?
By the way, I use transactions when using INSERT and UPDATE to speed things up.
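For illustration, at the SQL level that batching looks like this (the device database is SQLite-based; the table and values are hypothetical):

-- One commit for the whole batch instead of one disk write per statement.
BEGIN TRANSACTION;
INSERT INTO readings (sensor_id, value) VALUES (1, 42);
UPDATE readings SET value = 43 WHERE sensor_id = 1;
COMMIT;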
Any help at all would be greatly appreciated!
Thanks!!
I've searched through the site and haven't found a question/answer that quite answers my question; the closest one I found was: Syncing objects between two disparate systems best approach.
Anyway, to begin: because there are no RSS feeds available, I'm screen scraping a webpage. It does a fetch, then goes through the page to scrape out all of the information I'm interested in and dumps it into a SQLite database, so that I can query the information at my leisure without repeatedly fetching from the website.
However, I'm also storing various metadata about the data itself in the SQLite DB, such as: whether I have looked at the data, whether the data is new or old, and a bookmark to a chunk of data (think of it as a collection of unrelated data, where the bookmark is just a pointer to where I am in processing/reading said data).
So my current problem is figuring out how to update the local SQLite database with new and/or changed data from the website in a way that is effective and straightforward.
Here's my current idea:
Download the page itself
Create a temporary table for the parsed data to go into
Do a comparison between the official and the temporary table and copy updates and/or new information to the official table
This process seems kind of complicated because I would have to figure out how to determine if the data in the temporary table is new, updated, or unchanged. So I am wondering if there isn't a better approach, or if anyone has any suggestions on how to architect/structure such a system?
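A sketch of the comparison in step 3, assuming each row has a natural key (a hypothetical url column) and a stored hash of the scraped content, which turns "new vs. changed vs. unchanged" into set operations:

-- New rows: present in the staging table but not in the official one.
INSERT INTO items (url, title, body, content_hash)
SELECT s.url, s.title, s.body, s.content_hash
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM items i WHERE i.url = s.url);

-- Changed rows: same key, different content hash.
UPDATE items
SET title        = (SELECT s.title FROM staging s WHERE s.url = items.url),
    body         = (SELECT s.body FROM staging s WHERE s.url = items.url),
    content_hash = (SELECT s.content_hash FROM staging s WHERE s.url = items.url)
WHERE EXISTS (SELECT 1 FROM staging s
              WHERE s.url = items.url AND s.content_hash <> items.content_hash);

-- Rows matching on both key and hash are unchanged and need no action.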
Edit 1:
I'm not sure where to put the additional information, in a comment or as an edit, so I'm going to add it here.
This expands a bit on the metadata with regard to bookmarking: basically, the data source can create new data or additions to the current data, so one reason I was considering the temporary-table idea was to be able to determine whether a data source that has been "bookmarked" has any new data or not.
Is it really important to determine if the data in the temporary table is new, updated, or unchanged? Do you really need to keep a history of the changes?
NO: don't use the temporary table; just mark your old records as old (with a timestamp), don't do updates, and simply insert your new data (see the sketch below).
YES: your idea seems correct to me, but it all depends on how much data you need to process each time; I don't think it is feasible with a large amount of data.
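A sketch of the "NO" branch under a hypothetical schema: each fetch only appends rows stamped with the fetch time, and the current state is simply the latest row per key:

-- Append-only: no updates; every fetch inserts fresh rows with a timestamp.
INSERT INTO items (url, title, body, fetched_at)
SELECT url, title, body, datetime('now') FROM staging;

-- The current version of each item is the most recent row for its key.
SELECT i.*
FROM items i
WHERE i.fetched_at = (SELECT MAX(fetched_at) FROM items x WHERE x.url = i.url);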