How to find whether a file has been imported into an Access database or created within it? - ms-access-2010

In my Access database there is a dataset for which I need to know how it was created. I tried to backtrack and reached a table for which I cannot find any source data. I am pretty sure it was imported from somewhere. I checked the "View" option and there is no "SQL" view for that table; it only has "Datasheet" view and "Design View".
In an Access database, is there any way to check whether a table has been imported or created with a SQL query inside the database? Is there any "flag" raised, or something like that?

No. Once data is persisted in a table, that's it.
If you need further info, you can have a Timestamp field with a default value of:
Now()
or, if a higher resolution than one second is needed:
Date() + Timer() / 86400
or another custom field where you record session info as you like during the import or creation of data.
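For instance, a minimal sketch in Access SQL, assuming a hypothetical table named tblImportedData; note that the DEFAULT clause is only accepted when the DDL is executed through ADO (ANSI-92 query mode), otherwise set the column's Default Value property in table Design View instead:
-- Sketch only: tblImportedData and ImportedAt are hypothetical names.
-- Adds a timestamp column that records when each row was created or imported.
-- The DEFAULT clause requires running this via ADO (ANSI-92 mode); with DAO or
-- the Design View, set the field's Default Value property to Now() instead.
ALTER TABLE tblImportedData
    ADD COLUMN ImportedAt DATETIME DEFAULT Now();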

Related

A few questions regarding importing a manually created data entity

I used the data entity creation wizard and selected the Reqplan table as the main data source, then manually added the ReqPlanVersion, ReqPO, and ReqTrans tables as additional data sources and created the relationships below.
As for the data entity fields, I manually dragged a subset of fields from the three manually added tables.
However, when I try to import the data and add the file, I receive the following issue:
Q1. In the past, for some other entities, I have changed ‘Allow Edit on Create’ from ‘Auto’ to ‘Yes’ on these fields and it has worked, but I am not sure whether that is the only way or whether it follows best practice. Also, what is the determining factor for a field being editable or not during import, since they are all set to ‘Auto’?
When I try to map source to staging manually by drawing the mapping lines, I get the issue below:
Q2. What is going on with the configuration key? Is it because I manually dragged the fields from the additional data sources instead of using the data entity creation wizard?
Lastly, I have been getting the issue below:
Q3. Is there a way to find out which unique key it is referring to? Is it talking about the EntityKey in my data entity or the indexes on the staging table? In either case there is more than one, so I am not sure which it refers to.
Thanks in advance.
Response from the community forum:
1) Check the allowEdit property on the table itself; if it is "No" there, then "Auto" means "No". If you want to update those fields through the data entity, you will have to force them to "Yes".
2) It is not connected to the manual addition; it just means that tables used in the entity have a configuration key disabled, so you cannot export or import data for them. Whether those tables were added by the wizard or manually does not matter. Also, a configuration key can be set on fields, or on the EDTs those fields use, so check those as well.
3) The entity has a Key node, and there you have the key generated for you by the wizard. The framework uses it to decide whether a record should be updated or created; if it does not work for you, change it on the data entity and regenerate staging. You need to refresh staging because the error you get is a SQL error: at this stage SSIS transfers data from the file into the staging table, and the data could not be copied because of an index violation, so check the staging table index and see whether your file has any duplicates (see the sketch below).
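For that last check, one way to spot offending rows is to group the staged data by the columns of the violated unique index. A minimal sketch, assuming a hypothetical staging table ReqPlanStaging whose unique index covers PlanId and DefinitionGroup:
-- Sketch only: ReqPlanStaging, PlanId and DefinitionGroup are hypothetical names;
-- substitute the real staging table and the columns of the violated unique index.
SELECT PlanId, DefinitionGroup, COUNT(*) AS DuplicateCount
FROM ReqPlanStaging
GROUP BY PlanId, DefinitionGroup
HAVING COUNT(*) > 1;
Any rows returned correspond to duplicates in the source file (or a key definition that is too narrow).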

BizTalk - Delete without a schema

I am importing a file with 200+ records into a master table.
The BizTalk package only services one source; other packages service other sources
I am using strongly typed stored procedures for all SQL CRUD
All records inside the file come from the same source
The file does not contain source name or source Id
I want to determine the source from a value hard-coded in the package
The Master table contains records from several sources
Before the import: delete existing records from that source inside the Master table
Unlike the file import, the delete statement happens once
DELETE FROM Master WHERE SourceID = #SourceID
The file import works, but how can I hard code the delete source ID?
In your delete transform (just above the send shape) you can set up a SourceID property for the outgoing message. You can then populate the message context with this SourceID. This SourceID can then be used in your delete statement.
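On the SQL side, a minimal sketch of the strongly typed delete procedure, assuming SQL Server; usp_DeleteMasterBySource is a hypothetical name, and @SourceID is the T-SQL parameter the send port maps the promoted SourceID property into:
-- Sketch only: usp_DeleteMasterBySource is a hypothetical name.
CREATE PROCEDURE dbo.usp_DeleteMasterBySource
    @SourceID INT
AS
BEGIN
    SET NOCOUNT ON;
    -- Remove all existing Master rows for this source before the new file is imported
    DELETE FROM dbo.Master
    WHERE SourceID = @SourceID;
END;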
If I understand correctly, you want to delete all existing records for the SourceID before inserting new ones?
If so, you need to have access to the SourceID value on the inbound message into the orchestration.
To do this, use property promotion.
You can either do this:
inside a pipeline component configured on the receive port, so that the property is available when the message arrives at the orchestration, or,
inside the orchestration, which will require moving the construct shape for the InsertCSV message above the delete construct shape and promoting the property within the construct shape.
Of these options, the first is probably the better one, as assigning properties should ideally be done during message disassembly.
Alternatively, you can use an xpath() call within an Expression shape to interrogate the message using xpath, and retrieve the value like that. This way you can avoid thinking about property promotion.
However, while quicker to implement, this approach is not best practice because it makes your orchestration very sensitive to changes in the message schema.

Why does a DLookup in a Before Change data macro sometimes return an old value?

I have a large relational Access 2010 database. It is normalized, and includes some union queries that are very slow. I therefore thought I could speed things up by creating some cached fields. For example in tblOrder I would create a CustomerName field. To maintain this cached field I created a Before Change data macro that would dLookup the customer's company name from tblCustomer. It worked great. Then I created an After Update data macro in tblCustomer so when the user changes the Company Name all the child records would automatically be updated. It worked, but then the Before Change data macro fired and the dLookup returned the old Company Name. Any help would be very appreciated.
I made a sample of my problem using the Northwind database. You can download a copy of it at http://www.thetechmentors.com/freestuff/exerciseFiles/msAccess/DlookupDatamacroProblem.zip
All you need to do is tweak the Before Change data macro on [tblOrder] so it does the name lookup only when [CustomerID] changes in that table. You can do that with the Updated() function: wrap the DLookup SetField action in an If Updated("CustomerID") block so the cached name is only refreshed when that field actually changes.
That way, when the macro fires as a result of the update performed by the After Update data macro on [tblCustomer], the [tblOrder].[CustomerID] value has not changed, so the name lookup is bypassed.

Razor MVC - Object Edit Field Change Log

Users modify a DB object in an edit form that I have; pretty straightforward.
I need to implement a 'change log' on this object: I need to record which fields were changed and what they were before and after. I'm using Razor MVC.
I've done this by writing triggers for the table on update/delete. On update/delete of a record, the trigger pushes the record to a History table, in a History database. This creates the change log. Then you would just need to display it; to identify the change would require evaluating each and every field.
There's nothing already built that would do this for you, that I know of.
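For illustration, a minimal sketch of such a trigger, assuming SQL Server and a hypothetical dbo.Widget table with an Id key plus a matching history table:
-- Sketch only: Widget, WidgetHistory and their columns are hypothetical names.
CREATE TABLE dbo.WidgetHistory (
    HistoryId INT IDENTITY(1,1) PRIMARY KEY,
    WidgetId  INT           NOT NULL,
    Name      NVARCHAR(100) NULL,
    Price     DECIMAL(10,2) NULL,
    ChangedAt DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    ChangedBy NVARCHAR(128) NOT NULL DEFAULT SUSER_SNAME()
);
GO
CREATE TRIGGER dbo.trg_Widget_Audit
ON dbo.Widget
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- The "deleted" pseudo-table holds the pre-change row image for both UPDATE
    -- and DELETE, so each change is archived with its old values.
    INSERT INTO dbo.WidgetHistory (WidgetId, Name, Price)
    SELECT d.Id, d.Name, d.Price
    FROM deleted AS d;
END;
Comparing a history row with the one before it (or with the current row) then yields the before/after value per field, which is the "evaluate each and every field" step mentioned above.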

updating batches of data

I am using a GridView in ASP.NET and editing data with the Edit command field property (after the edited row is updated, the database is updated automatically). I want to use transactions (BEGIN ... COMMIT, including ROLLBACK) to commit the update query to the database only after clicking some button (after some event, for example), not to insert or update the edited data from the grid directly into the DB automatically. So I want to save the edits somewhere temporarily (possibly many edited rows, not just one) and then confirm the transaction to update the real tables in the database.
Any suggestions are welcome.
I've already found some good and very helpful links, like:
http://www.asp.net/learn/data-access/tutorial-63-cs.aspx
http://www.asp.net/learn/data-access/tutorial-66-cs.aspx
etc...
Well, you can store your edited data in a DataTable in session and then pass this data table to the database in a single bulk operation. Two options are available for this:
If you are using SQL Server 2005, you can use OpenXML to achieve this, as I have stated here.
If you are using SQL Server 2008, you can use table variables, as I did here.
I hope it helps.
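On the SQL Server 2008 side, a minimal sketch of the batch update using a table-valued parameter inside a single transaction; dbo.Product, ProductEdit and usp_UpdateProducts are hypothetical names:
-- Sketch only: all object names are hypothetical; assumes SQL Server 2008 or later.
CREATE TYPE dbo.ProductEdit AS TABLE (
    ProductId INT           NOT NULL PRIMARY KEY,
    Name      NVARCHAR(100) NOT NULL,
    Price     DECIMAL(10,2) NOT NULL
);
GO
CREATE PROCEDURE dbo.usp_UpdateProducts
    @Edits dbo.ProductEdit READONLY   -- all edited grid rows arrive in one round trip
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;
        UPDATE p
        SET    p.Name  = e.Name,
               p.Price = e.Price
        FROM   dbo.Product AS p
        JOIN   @Edits      AS e ON e.ProductId = p.ProductId;
        COMMIT TRANSACTION;                        -- the whole batch commits together
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;   -- or none of it is applied
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);
    END CATCH
END;
The session-stored DataTable from the answer above can be sent as the @Edits parameter in one call, so either every edited row is applied or none of them are.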
First way:
Create a session variable that will contain your DB object (a DataTable or mapped objects).
The GridView should work with this instance instead of sending the data to the database.
Once editing is finished you may take the object from the session and save it in the way you normally do.
Second way:
I would use JavaScript to collect all changes on the client side while the user is editing, as an array of objects (each object is a separate row).
Once the editing is done, you can create a JSON string from the collection and pass it to the server.
If your JSON object structure matches the server-side class, you can use JavaScriptSerializer to deserialize the string into a collection of objects.
After that, you can save your objects in the way you normally do.
