TestCase for multiple files - tosca

I created one TestCase that I want to run against multiple files in one folder. The TestCase is the same for each file. Is there a way to do that in the Execution section?
If possible, I would like to see for each file whether it was successful or not.
Thank you and best regards

I have a similar use case where I have one test case executing several iterations from a file. Currently, I'm leveraging the Tricentis TDM solution to store the data rather than pulling it from files. You can create multiple repositories to store data, e.g. SQLite, MS SQL and other source systems. Check out https://www.tricentis.com/products/automate-continuous-testing-tosca/test-data-management/

Related

SSDT 2017 Get Data - flat file source (csv/txt)

There are many examples of using the modern Get Data feature while connecting to SQL Server. However, I can't find any examples of importing data from multiple flat files (csv/txt) located in one folder.
How should I make the initial connection to the data source? Should it be a connection to the folder or to one of the files? How should I build the query chain (Power Query M)?
It seems that the way I do it in Excel does not work.
I would be grateful for any tips.
Here are two good examples of how to import multiple text files with Power Query in Power BI.
Import all CSV files from a folder with their filenames in Power BI
https://powerpivotpro.com/2016/12/import-csv-files-folder-filenames-power-bi/
Power BI – Combine Multiple CSV files, Ignore Headers and Add FileName as a Column
http://analyticsavenue.com/power-bi-combine-multiple-csv-filesignore-headers-and-add-filename-as-a-column/
There are several ways to do this with standard SSIS tasks also, but I think the most flexible way is to use a Foreach Loop Container to read all the files in a folder. In the properties of the Foreach Loop Container you specify the folder and a file name pattern (e.g. *.csv) of the files you want to import.
You create a variable to hold the name of the current file, and use that variable to change the ConnectionString property of the Flat File Source on each iteration of the loop.
Here's a decent example of how to do this that covers most of the setup, and provides a downloadable example.
Loop through Flat Files in SQL Server Integration Services
https://www.mssqltips.com/sqlservertip/2874/loop-through-flat-files-in-sql-server-integration-services/
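If it helps to see the shape of that loop outside the SSIS designer, here is a minimal Python sketch with a made-up folder path: it enumerates every file matching the *.csv mask and processes the current file on each pass, which is the role the file-name variable and the ConnectionString expression play in the package.
import csv
from pathlib import Path

# Made-up folder; in SSIS this is the Foreach Loop Container's folder plus the *.csv mask.
source_folder = Path(r"C:\data\incoming")

for file_path in sorted(source_folder.glob("*.csv")):
    # In SSIS the current file name lands in a variable, and an expression on the
    # Flat File Connection Manager's ConnectionString picks it up each iteration.
    with open(file_path, newline="") as f:
        rows = list(csv.reader(f))
    if not rows:
        continue  # skip empty files
    header, data = rows[0], rows[1:]
    # The Data Flow would load `data` into the destination table here.
    print(f"{file_path.name}: {len(data)} data rows")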

Possible to use .zip file with multiple .csv files?

Is it possible, using U-SQL, to unzip a zip archive containing multiple .csv files and process them?
Each file has a different schema.
So you've got two problems here:
Extracting from a ZIP file.
Dealing with the varying contents inside it.
To answer your question: is it possible? Yes.
How? You'd need to write a user-defined extractor to do it.
First check out the MSDN extractors page:
https://msdn.microsoft.com/en-us/library/azure/mt621320.aspx
The class for the extractor needs to inherit from IExtractor with methods that iterate over the archive contents.
Then, to output each inner file in turn, pass a file name to the extractor so that you can define the columns for each dataset.
Source: https://ryansimpson.net/2016/10/15/query-zipfile-adla/
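For orientation only, here is a rough Python sketch of the unzip-and-iterate logic such an extractor has to perform; the archive name, inner file names and schemas are made up, and the real thing would be a C# class inheriting from IExtractor that emits rows instead of printing them.
import csv
import io
import zipfile

# Made-up schemas keyed by inner file name; each inner CSV has different columns.
schemas = {
    "customers.csv": ["id", "name", "email"],
    "orders.csv": ["order_id", "customer_id", "total"],
}

with zipfile.ZipFile("archive.zip") as archive:
    for inner_name in archive.namelist():
        columns = schemas.get(inner_name)
        if columns is None:
            continue  # no schema defined for this inner file, skip it
        with archive.open(inner_name) as inner_file:
            text = io.TextIOWrapper(inner_file, encoding="utf-8", newline="")
            for row in csv.reader(text):
                record = dict(zip(columns, row))
                print(inner_name, record)  # a real extractor would emit this as a row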
Another option would be to use Azure Data Factory to perform the UnZip operation in a custom activity and output the CSV contents to ADL Store. This would involve some more engineering though and an Azure Batch Service.
Hope this helps.

Fetch Test data from Excel using TOSCA

This is about the TOSCA automation tool.
I have an Excel file and have loaded all my test data into it.
All I need to do is get this data from Excel using TOSCA.
Please help me with this.
There are two kinds of scenarios that arise when you say you have your test data in Excel.
Scenario #1: If you have a single test scenario that needs to be tested against different datasets, then TOSCA provides you with a very good feature called TemplateInstances, which will not only create the desired number of test cases for you from your data set but will also import your Excel data and embed it in the test case steps. For more details, please see the TOSCA Commander documentation on TemplateInstances.
Scenario #2: If you have different test cases and just want to import data from an external source, then you can use the common modules provided by Tricentis with the standard TOSCA installation under the name Excel Engine. Alternatively, you can also write your own keywords using VBScript and import your Excel data into TOSCA.
TOSCA has its own Excel Engine add-in, which can be used to parameterize test data from an Excel workbook.
Other than that, I've also searched on this topic and found a predefined Worx.tce module, which needs to be imported as a subset in order to use Excel as a test data provider.
If you want to use Excel to fetch data for test scripts, you can use the Excel module in the Standard Modules section provided by Tosca and use a buffer as the input type to store data for later use. Alternatively, you can use the TestSheet module to store the data and reuse it later, which is the better approach as it removes the dependency on the Excel file in your test scripts.

Drupal - Attach files automatically by name to nodes

I need a better file attachment function. Ideally, if you upload files to FTP and they have a name similar to the name of a node (containing the same word), they would appear under that node, so you don't have to add each file separately when there are many nodes. Can you think of a solution? Alternatively, one that is not as tedious as always adding the files manually again.
Dan.
This would take a fair bit of coding. Basically you want to implement hook_cron() and run a function that loops through every file in your FTP folder. The function will look for names of files that have not already been added to any node and then decide which node to add them to.
Bear in mind that once you upload your files there will be a delay before they are attached to a node, until the next cron job runs.
This is not a good solution, and if I could give you any advice it would be not to do it. The reason you upload files through the Drupal interface is so that they are tracked in the files table and can be re-used.
Also the way you're proposing leaves massive amounts of ambiguity as to which file will go where. Consider this:
You have two nodes, one about cars and one about motorcycle sidecars. Your code will have to be extremely complex to make the decision of which node to add to if the file you've uploaded is called 'my-favourite-sidecar.jpg'.
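To make that concrete, here is a tiny illustrative sketch in Python (the node titles and upload folder are invented) of the kind of naive matching rule such a hook_cron() implementation might apply, and how the sidecar example defeats it; a real solution would be a Drupal module written in PHP.
from pathlib import Path

# Invented node titles and upload folder, purely to illustrate the matching problem.
node_titles = {1: "Cars", 2: "Motorcycle sidecars"}
upload_folder = Path("/var/ftp/uploads")

def matching_nodes(file_name: str) -> list[int]:
    # Naive rule: a node matches if any word of its title occurs in the file name.
    stem = file_name.lower()
    return [nid for nid, title in node_titles.items()
            if any(word.lower().rstrip("s") in stem for word in title.split())]

# 'my-favourite-sidecar.jpg' matches both nodes above ("car" and "sidecar"),
# so the code cannot decide where to attach it without much smarter logic.
for file_path in upload_folder.glob("*"):
    print(file_path.name, matching_nodes(file_path.name))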

Maintaining same piece of code in multiple files

I have an unusual environment in a project where I have many files that are each independent standalone scripts. All of the code required by the script must be in the one file and I can't reference outside files with includes etc.
There is a common function in all of these files that does authorization; it is the last function in each file. If this function changes at all (as it does now and then) it has to be changed in all of the files, and there are plenty of them.
Initially I was thinking of keeping the authorization function in a separate file and running a batch process that produced the final files by combining the auth file with each of the others. However, this is extremely cumbersome when debugging because the auth function needs to be in the main file for this purpose. So I'd always be testing and debugging in the folder with the combined file and then have to copy changes back to the uncombined files.
Can anyone think of a way to solve this problem? i.e. maintain an identical fragment of code in multiple files.
I'm not sure what you mean by "the auth function needs to be in the main file for this purpose", but a typical Unix solution might be to use make(1) and cpp(1) here.
Not sure what environment/editor you're using, but one thing you can do is use prebuild events. Create a start-tag/end-tag which defines the import region, and then in the prebuild event copy the common code between the tags and then compile...
//$start-tag-common-auth
..... code here .....
//$end-tag-common-auth
In your prebuild event, just find those tags, replace the code between them with the shared code, and then finish compiling.
VS supports pre- and post-build events which can call external processes (such as batch files or scripts), but these do not directly interact with the environment.
Instead of keeping the authentication code in a separate file, designate one of your existing scripts as the primary or master script. Use this one to edit/debug/work on the authentication code. Then add a build/batch process like you are talking about that copies the authentication code from the master script into all of the other scripts.
That way you can still debug and work with the master script at any time, you don't have to worry about one more file, and your build/deploy process keeps everything in sync.
You can use a technique like the one @Priyank Bolia suggested to make it easy to find/replace the required bit of code.
An ugly way I can think of: have the original code in all the files, and surround it with markers like:
///To be replaced automatically by the build process with the latest code
String str = "my code copy that can be old";
///Marker end.
This code block can be replaced automatically by the build process, from one common code file.
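The replace step in either of these marker-based approaches can be a very small script. Here is a minimal Python sketch, assuming a file auth_master.txt holding the current shared code, a scripts/ folder of *.js files, and the //$start-tag-common-auth and //$end-tag-common-auth markers from the earlier answer; a prebuild event or batch step would run something equivalent with the project's real paths.
import re
from pathlib import Path

# Assumed locations; adjust to the real project layout.
auth_code = Path("auth_master.txt").read_text().rstrip("\n")
script_files = Path("scripts").glob("*.js")

# Matches everything between the start and end markers, keeping the markers themselves.
pattern = re.compile(
    r"(//\$start-tag-common-auth\n).*?(\n//\$end-tag-common-auth)",
    re.DOTALL,
)

for script in script_files:
    original = script.read_text()
    updated = pattern.sub(lambda m: m.group(1) + auth_code + m.group(2), original)
    if updated != original:
        script.write_text(updated)
        print(f"refreshed auth block in {script.name}")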
