I am new to Autosys and have a query. I want to detect the arrival of a file, so I am using a file watcher job with watch_file set to *. Now I want to get the file name, as I have to pass it as a parameter to the next job. How can I get the filename? Any help is appreciated.
There is no really easy way to get the filename. In the WCC database, the table dbo.MON_JOB has a column AGENT_STATUS that will contain the filename.
You would need to filter on the job name.
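A minimal sketch of such a query, built from Python so it can be run with any DB-API driver. The table and column dbo.MON_JOB / AGENT_STATUS come from the answer above; the JOB_NAME filter column and the DSN are assumptions you will need to verify against your WCC schema.

```python
# Sketch: look up the file name a file watcher job picked up, via the
# WCC database. dbo.MON_JOB and AGENT_STATUS are from the answer above;
# the JOB_NAME column and connection details are assumptions.

def filename_query(job_name: str) -> tuple:
    """Return a parameterized SQL statement and its bind values."""
    sql = (
        "SELECT AGENT_STATUS "
        "FROM dbo.MON_JOB "
        "WHERE JOB_NAME = ?"          # assumed filter column
    )
    return sql, (job_name,)

sql, params = filename_query("my_file_watcher_job")
# Run it with any DB-API driver, e.g.:
#   import pyodbc
#   conn = pyodbc.connect("DSN=wcc")   # placeholder DSN
#   row = conn.cursor().execute(sql, params).fetchone()
#   filename = row[0] if row else None
```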
Hope that helps.
Dave
I have a problem. I have a database whose entries are indexed at a specific time in an IDOL indexing server. The problem is that a specific entry has not been indexed for some reason. Is there any way to force-index it, ideally via a URL action call? I know that DREREGENERATE may be what I want, but I don't understand how to specify my entry ID.
Any ideas?
Thanks in advance.
Manually create the IDX for the entry and use the DREADD action to add the entry to the IDOL content database. This is easily achievable using a Python script and an HTTP POST.
The following document shows the various parameters that can be used in the DREADD action:
https://www.microfocus.com/documentation/idol/IDOL_12_2/Content_12.2_Documentation/Help/index.html#Index%20Actions/IndexData/_IX_DREADD.htm?Highlight=DREADD
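A rough sketch of the approach above. The host, index port, database name, and field names are placeholders; note that DREADD expects a file path on the IDOL server, whereas DREADDDATA (used here) accepts the IDX text directly in the request body, which fits the HTTP POST approach.

```python
# Sketch: build a minimal single-document IDX payload and send it to
# the IDOL index port. Host, port, database, and field names are
# placeholders; DREADDDATA accepts the IDX text in the POST body.
import urllib.request

def build_idx(reference: str, title: str, content: str) -> str:
    """Return a single-document IDX payload."""
    return (
        f"#DREREFERENCE {reference}\n"
        f'#DREFIELD TITLE="{title}"\n'
        f"#DRECONTENT\n"
        f"{content}\n"
        f"#DREENDDOC\n"
    )

def dreadd_data(idx: str, host: str = "localhost", index_port: int = 9001) -> None:
    # DREDbName and the port are assumptions; check your IDOL config.
    url = f"http://{host}:{index_port}/DREADDDATA?DREDbName=MyDatabase"
    req = urllib.request.Request(url, data=idx.encode("utf-8"), method="POST")
    urllib.request.urlopen(req)  # raises on HTTP errors

# Example payload (the POST itself needs a reachable IDOL server):
idx = build_idx("doc-42", "Example title", "Body text of the entry.")
```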
I am new to ADF. I am loading a bunch of CSV files into a table, and I would like to capture the name of each CSV file as a new column in the destination table.
Can someone please help with how I can achieve this? Thanks in advance.
If you use a Mapping Data Flow, there is an option under the source settings to capture the file name being used; later it can be mapped to a column in the sink.
If your destination is Azure Table storage, you could put the filename into the partition key column. Otherwise, I don't think there is a native way to do this with ADF; you may need a custom activity or a stored procedure.
One post says you could use Databricks to handle this:
Data Factory - append fields to JSON sink
Another post says they are using U-SQL to handle this:
use adf pipeline parameters as source to sink columns while mapping
For the stored procedure approach, please see this post: Azure Data Factory mapping 2 columns in one column
In my Access database there is a dataset, and I need to know how it was created. I tried to backtrack and reached a table for which I am not able to find any source data. I am pretty sure it was imported from somewhere. I checked the "View" option: there is no "SQL" view for that table, only "Datasheet" view and "Design View".
In an Access database, is there any way to check whether a table was imported from a file or created using a SQL query within the database? Is there some "flag" raised, or something like that?
No. Once data is persisted in a table, that's it.
If you need further info, you can have a Timestamp field with a default value of:
Now()
or, if a higher resolution than one second is needed:
Date() + Timer() / 86400
or another custom field where you record session info as you like during the import or creation of data.
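The Date() + Timer() / 86400 expression works because Access stores date/time values as fractional days, and Timer() returns the seconds elapsed since midnight, so dividing by 86400 (seconds per day) converts a time of day into a day fraction. A sketch of the arithmetic (shown in Python, since VBA isn't runnable here):

```python
# Sketch of the arithmetic behind Date() + Timer() / 86400:
# Access stores date/time values as fractional days, and Timer()
# returns seconds since midnight, so dividing by 86400 (seconds per
# day) turns a time of day into a fraction of a day.

def day_fraction(seconds_since_midnight: float) -> float:
    return seconds_since_midnight / 86400

print(day_fraction(0))      # midnight -> 0.0
print(day_fraction(43200))  # noon     -> 0.5
print(day_fraction(86399))  # 23:59:59 -> just under 1.0
```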
I have a Kettle job and a transformation.
The transformation writes the result set of a SELECT SQL statement into a CSV file.
The job takes the result file and mails it to the user.
I need to mail the result only if the file contains any data; otherwise it should not be mailed.
Alternatively, how can I find out whether the result of a transformation is empty or not (is there any file-size validator job entry available)?
I am not able to find any job entries for this kind of condition.
Thanks in advance.
You can use the Evaluate files metrics job entry in the Conditions branch. Set your condition on the Advanced tab.
You can set your transformation to generate the file only if there is data, and then use the File exists? entry in your main job.
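Outside Kettle, the condition both answers implement boils down to "is the result file non-empty". A minimal sketch of that check (the file path and header size are placeholders for your CSV layout):

```python
# Sketch: mail the result only when the CSV actually contains data.
# "Has data" here means the file exists and is larger than its header
# line alone; adjust header_bytes to match your CSV, if it has one.
import os
import tempfile

def should_mail(path: str, header_bytes: int = 0) -> bool:
    return os.path.exists(path) and os.path.getsize(path) > header_bytes

# Example: a file holding only a header row should not be mailed.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    f.write("id,name\n")  # header only, no data rows
print(should_mail(f.name, header_bytes=len("id,name\n")))  # False
os.remove(f.name)
```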
I have a folder with documents.
I wrote an application with ASP.NET.
I need to read the creation date and the date of the last change (and the title, etc.).
For that I used the FileSystemInfo API.
The documents in the folder are copies from a VSS server.
But when a document is copied, the creation date changes to the date of the copy.
However, I need the original creation date.
Any ideas?
Greetings
If the file has been overwritten, you will not be able to get the creation date that is stored in SourceSafe.
You may have more luck using the SourceSafe API, however this uses OLE automation, so may not be very simple.
I solved the problem with robocopy:
robocopy source destination [options]
By default it copies Data, Attributes, and Timestamps (equivalent to /COPY:DAT).
http://technet.microsoft.com/en-us/library/cc733145%28WS.10%29.aspx
http://www.microsoft.com/downloads/en/confirmation.aspx?familyId=9d467a69-57ff-4ae7-96ee-b18c4790cffd&displayLang=en
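For comparison, here is a rough cross-platform sketch of the timestamp-preserving copy in Python. Note that shutil.copy2 preserves the modification and access times but not the Windows creation time, which is exactly why a plain copy resets "date created" and a tool like robocopy is needed.

```python
# Sketch: copy a file while preserving its modification time.
# shutil.copy2 copies metadata (mtime/atime) along with the contents;
# it does NOT preserve the Windows creation time, which is why plain
# copies reset "date created" and robocopy is used above instead.
import os
import shutil
import tempfile

src = tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False)
src.write("document body")
src.close()
os.utime(src.name, (1_000_000_000, 1_000_000_000))  # set a known mtime

dst = src.name + ".copy"
shutil.copy2(src.name, dst)
print(os.path.getmtime(dst) == os.path.getmtime(src.name))  # True

os.remove(src.name)
os.remove(dst)
```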