Unix script in Informatica post-command task

I wrote a script to find a particular file name in a folder and copy the file to a backup directory after it has been loaded into the target table using Informatica.
I used this script in an Informatica post-command task, but my session failed: it did not load into the target tables, yet it still copied the files to the backup directory.
cd /etl_mbl/SrcFiles/MainFiles
for f in Test.csv
do
cp -v "$f" /etl_mbl/SrcFiles/Backup/"${f%.csv}"
done
I want to correct my script so that it copies into the backup directory only the source files that were actually loaded into the target by Informatica.

Do not use a separate Command task. Use Informatica's Post-session success command and Post-session failure command to achieve this. Put your Unix code in the Post-session success command so it is only triggered after the session completes successfully.
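For example, a minimal sketch of what the Post-session success command could contain, reusing the folders and file name from the question (whether the backup copy keeps its .csv extension is up to you):
cd /etl_mbl/SrcFiles/MainFiles
for f in Test.csv
do
# this only runs after a successful load, so only loaded files reach the backup
cp -v "$f" /etl_mbl/SrcFiles/Backup/"$f"
done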

Go with @Utsav's approach. Alternatively, you can use the condition $YourSessionName.Status = SUCCEEDED on the link between the Session and the Command task.
The benefit of this approach is that the command is clearly visible at first glance.

Informatica Flat File source name

I am working on a project where we need to load a flat file, e.g. Gemstone_20220325.csv. I have given the source name as Gemstone_*.csv in the script to search for the file in the path.
But it is failing with the error: No such file.
Is there anything I am missing? Any ideas on this are much appreciated.
You need to put either the exact file name, or use a file list containing the names of the files and then use the indirect file type in the session that is reading them.
You can use a pre-session shell command like this: ls -1 Gemstone_*.csv > /infa/home/tmp/Gemstone_filelist.txt. Or you can create a shell script with this command for better control (see the sketch below).
In the session that is reading this file, set the source file type property to indirect and mention /infa/home/tmp/Gemstone_filelist.txt as the file to be extracted.
Infa will pick the files one by one and process them.
Once a file gets processed, delete it using a post-session command such as rm -f Gemstone_*.csv.
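A minimal sketch of such a pre-session shell script, assuming the source files land in a directory such as /etl_mbl/SrcFiles (a hypothetical path) and the file list is written to /infa/home/tmp/Gemstone_filelist.txt as above:
#!/bin/sh
# build the indirect file list that the session will read
cd /etl_mbl/SrcFiles || exit 1
ls -1 Gemstone_*.csv > /infa/home/tmp/Gemstone_filelist.txt
# fail the pre-session command if no matching file was found
[ -s /infa/home/tmp/Gemstone_filelist.txt ] || exit 1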

How to copy new files from SFTP using WinSCP script

I want to download only new files from one SFTP server using WinSCP.
Suppose, I have 10 files in source and destination today.
Tomorrow one new file may be added to the source. In this scenario, I want to copy the new file only into destination.
I am using the following script:
open sftp_connection
cd /path
option transfer binary
get "*.txt" localpath
close
exit
Using the above, I am able to copy all files, but I want only the new files that are not already available in the destination.
The easiest solution is to add the -neweronly switch to your get command:
get -neweronly *.txt C:\local\path\*
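Applied to the script from the question, the whole thing would look something like this (sftp_connection stands for your stored connection, as in the original):
open sftp_connection
cd /path
option transfer binary
# -neweronly downloads only files that do not exist locally or are newer than the local copy
get -neweronly "*.txt" C:\local\path\*
close
exit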
For very similar results, you can also use the synchronize command (local directory first, then remote):
synchronize local C:\local\path /path -filemask=*.txt
See also WinSCP article on Downloading the most recent file.

How to save Robot framework test run logs in some folder with timestamp?

I am using Robot Framework to run 50 test cases. Every time, it creates the following three files, as expected:
c:\users\<user>\appdata\local\output.xml
c:\users\<user>\appdata\local\log.html
c:\users\<user>\appdata\local\report.html
But when I run the same robot file again, these files are removed and new log files are created.
I want to keep the logs of all previous runs to refer to in the future. The log files should be saved in a folder whose name contains a timestamp.
NOTE: I am running the robot file from the command prompt (pybot test.robot), NOT from RIDE.
Could anyone guide me on this?
Using the built-in features of robot
The robot framework user guide has a section titled Timestamping output files which describes how to do this.
From the documentation:
All output files listed in this section can be automatically timestamped with the option --timestampoutputs (-T). When this option is used, a timestamp in the format YYYYMMDD-hhmmss is placed between the extension and the base name of each file. The example below would, for example, create such output files as output-20080604-163225.xml and mylog-20080604-163225.html:
robot --timestampoutputs --log mylog.html --report NONE tests.robot
To specify a folder, this too is documented in the user guide, in the section Output Directory, under Different Output Files:
...The default output directory is the directory where the execution is started from, but it can be altered with the --outputdir (-d) option. The path set with this option is, again, relative to the execution directory, but can naturally be given also as an absolute path...
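Combining the two options, a command like the following would put timestamped output files into a separate directory (the results folder name is just an illustration):
robot --outputdir results --timestampoutputs tests.robot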
Using a helper script
You can write a script (in python, bash, powershell, etc) that performs two duties:
launches pybot with all the options you want
renames the output files
You then just use this helper script instead of calling pybot directly.
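As an illustration, a minimal bash sketch of such a helper (the results folder layout is just one possible choice):
#!/bin/bash
# 1. launch pybot with whatever options were passed to this script
pybot "$@"
# 2. move the default output files into a timestamped folder
stamp=$(date +%Y%m%d-%H%M%S)
mkdir -p "results/$stamp"
mv output.xml log.html report.html "results/$stamp"/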
I'm having trouble working out how to create a timestamped directory at the end of the execution. This is my script: it timestamps the files, but I don't really want that, just the default file names inside a timestamped directory after each execution.
CALL "C:\Python27\Scripts\robot.bat" --variable BROWSER:IE --outputdir C:\robot\ --timestampoutputs --name "Robot Execution" Tests\test1.robot
You may create the output directory with a timestamp in its name, like I explain in the RIDE FAQ.
This would be in your case:
-d ./%date:~-4,4%%date:~-10,2%%date:~-7,2%
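Applied to the command from the comment above, that would give something like this (note that the %date% substring offsets depend on the Windows locale's date format, so they may need adjusting):
CALL "C:\Python27\Scripts\robot.bat" --variable BROWSER:IE --outputdir C:\robot\%date:~-4,4%%date:~-10,2%%date:~-7,2% --name "Robot Execution" Tests\test1.robot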
You can also update Robot Framework's default output folder (for example, when running from the PyCharm IDE) by updating the value for the "OutputDir" key in the settings.py file present in the folder mentioned below.
..ProjectDirectory\venv\Lib\site-packages\robot\conf\settings.py
Update the 'OutputDir' entry in the _cli_opts dictionary of class _BaseSettings(object) to str(os.getcwd()) + "//Results//Report_" + datetime.datetime.now().strftime("%d%b%Y_%H%M%S"):
_cli_opts = {
    # Update the abspath('.') to the required folder path.
    # 'OutputDir' : ('outputdir', abspath('.')),
    'OutputDir' : ('outputdir', str(os.getcwd()) + "//Results//Report_" + datetime.datetime.now().strftime("%d%b%Y_%H%M%S") + "//"),
    'Report'    : ('report', 'report.html'),
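Note that os and datetime must be imported in settings.py for the expression above to work; add the imports at the top of the file if they are not already there.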

Filter command history by folder they were executed in?

I know shell history doesn't keep track of the folder the commands were executed in, but I think it would be really useful to be able to output the history for a particular folder, using a flag like history --local for example.
I often jump from project to project; they use very similar commands but have different destination hosts for ssh, different environment variables, and so on.
Is there any way to achieve that, preferably using zsh?
In bash, you can set PROMPT_COMMAND to something like the following:
PROMPT_COMMAND='history | tail -n1 >> .$USER.history'
It will save each command to a file in the current directory.
For an alternative approach (replacing cd with a command that changes where history is saved), see http://www.compbiome.com/2010/07/bash-per-directory-bash-history.html.
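Since the question asks about zsh specifically, here is a rough zsh sketch of the same idea using a precmd hook (not from the original answer; the file name simply mirrors the bash example):
# append the last command to a per-directory history file before each prompt
precmd() {
  print -r -- "$(fc -ln -1)" >> ".$USER.history"
}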

Add downloadable executable to website

I have a website project and an Outlook add-in that communicates with the same database via a web service. I'd like to add the Outlook add-in as a downloadable file to the website's interface.
How can I make the Outlook add-in installer end up in the website's "Download" folder at build time?
Is that possible?
Thanks in advance!
I am not sure this is really a good idea, because it may not be OK to upload every build (broken builds? untested bugs?), but anyway, the idea might be this:
find a way to mount the FTP site as disk Z: on the computer and keep it mounted
you probably want to zip the installer first, so find and install a command-line zip.exe
find a way to have an automated job start every few minutes (like a batch file)
The job (which might be a batch file) should do this:
check the file creation date of C:\build\folder\executable.exe and compare it with the file creation date of Z:\download\folder\executable.zip
only if it is newer, zip C:\build\folder\executable.exe to C:\build\folder\executable.zip and copy C:\build\folder\executable.zip to Z:\download\folder\executable.zip
The language you write the script in is your choice; a Windows batch file would do (the XCOPY command can copy only newer files). I know PHP and would probably use that, with a batch file calling "php my_php_task.php", but you can launch any language interpreter you like.
UPDATE
For zipping you can download this:
http://www.info-zip.org/Zip.html
For copying only newer files you can use XCOPY with the options /D (copy only newer files) and /Y (overwrite without prompting). Other options here:
http://www.computerhope.com/xcopyhlp.htm
So the batch file might look similar to these two lines:
zip -f C:\build\folder\executable.zip C:\build\folder\executable.exe
xcopy /D /Y C:\build\folder\executable.zip Z:\download\folder\executable.zip
Have it called every 30 seconds and the job is done. The -f option of zip and the /D option of xcopy make sure the script does nothing except compare dates if you have not recently rebuilt the file.
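To have it run periodically, one option is a Windows scheduled task; as a sketch, assuming the two lines above are saved as C:\build\publish.bat (a hypothetical path):
schtasks /create /tn "PublishAddin" /tr "C:\build\publish.bat" /sc minute /mo 5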
