From Robot Framework file to Azure 'Priority' column of test report

Azure test results have a column called 'Priority'.
In the Robot Framework (.robot) files I have, I'd like to use a metadata tag that gets passed through to the report so that it shows up in the Azure pipeline results.
How can I do that? What should the format look like?
The relevant mapping is documented in the "Publish test results" docs for Azure, about halfway down the page:
( Priority /TestRun/TestDefinitions/UnitTest.Attributes["priority"].Value )
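For context, Robot Framework itself only carries free-form tags on a test; a hedged sketch of tagging (the priority:1 tag name, and any mapping of it into the TRX attribute above, are assumptions, not a built-in feature):

```robotframework
*** Test Cases ***
Login Works
    [Documentation]    Example test carrying a priority tag.
    [Tags]    priority:1
    Log    Logged in
```

A custom conversion step (for example a result listener or a post-processing script run before the publish task) would then have to copy that tag into the priority attribute of the published result file.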

Related

Show Operation Name on Application Insights with URLs in lowercase for dotnet core 3.1 application

Right now, Application Insights shows the Operation Name with its original casing, so if clients use different casing, I end up with multiple entries, like so:
POST /api/v1/myapi
POST /api/v1/myApi // Capital "A" in Api
I want all of them to appear under the lowercase Operation Name.
My app is a REST .NET Core 3.1 API without MVC.
I tried adding services.AddRouting(options => options.LowercaseUrls = true);, but this changed nothing.
One way is to use an ITelemetryInitializer and lowercase the Operation Name there.
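A minimal sketch of that approach, assuming the Microsoft.ApplicationInsights SDK (the class name is my own):

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.Extensibility;

// Lowercases the Operation Name on every telemetry item before it is sent.
public class LowercaseOperationNameInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        var name = telemetry.Context.Operation.Name;
        if (!string.IsNullOrEmpty(name))
        {
            telemetry.Context.Operation.Name = name.ToLowerInvariant();
        }
    }
}
```

Register it in ConfigureServices with services.AddSingleton&lt;ITelemetryInitializer, LowercaseOperationNameInitializer&gt;(); so it runs for all requests.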

YAML pipeline: how to swap connection strings in builds with a console app

I built a console app with .NET Core 3.1. I have it building using YAML, leaning heavily on the learn.microsoft.com documentation. The release is pushing to the correct box. But I have an appsettings.json file with a connection string variable that differs between my TEST, QA, and PROD regions. I knew how to do this with the XML file transforms in .NET and MVC, but I can't get this to work. Any help would be great, since I don't even know the term for what I am trying to do here.
How do you change the connection string in appsettings.json based on a variable, or do I have to create 3 branches, each with its own settings, and create 3 build and release pipelines?
Thank you.
In order to push to different environments you usually either:
- Have separate release pipelines that trigger from different branches, or
- Have one release pipeline with different stages that need pre-approval to move to the next stage: TEST -> QA -> PROD.
In both cases you will make use of stages.
There you need to add a task named "File Transform".
In the File Format field, select JSON.
Now, any variable found in the appsettings.json file will be replaced by the variables you set in the pipeline.
Be careful: nested variables like
{
  "SerilogSettings": {
    "BatchSize": 100
  }
}
need to be set with a "." separator instead, like
SerilogSettings.BatchSize
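A hedged YAML sketch of that setup (the drop folder path is a placeholder; FileTransform is the task id behind the "File Transform" task):

```yaml
# Define a pipeline/stage variable named exactly like the flattened JSON path,
# e.g. SerilogSettings.BatchSize = 500 for the QA stage.
steps:
- task: FileTransform@1
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)/drop'
    fileType: 'json'
    targetFiles: '**/appsettings.json'
```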

How can I use Nunit 3 to generate some test results after execution has finished?

I am using NUnit 3 with the NUnit 3 Test Adapter. How can I get a report XML file like Nunit-result.xml after executing tests via the Test Explorer? I am using a Unit Test Project (.NET Framework). My main goal is to get the NUnit report so I can configure and publish it in a Jenkins job.
It is probably possible using the Test Engine API to get the results from your tests.
For more info about the engine, see https://github.com/nunit/docs/wiki/Test-Engine-API
When you run your tests in the TestEngine you will get an XML result, which you can save like this:
// "services" here is the engine's service locator (engine.Services);
// "nodes" is the XmlNode result returned by ITestRunner.Run.
IResultService resultService = services.GetService<IResultService>();
var resultWriter = resultService.GetResultWriter("nunit2", null);
resultWriter.WriteResultFile(nodes, "yourpath/TestResults.xml");
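For completeness, a hedged end-to-end sketch using the NUnit.Engine package (the assembly path is a placeholder):

```csharp
using System.Xml;
using NUnit.Engine;

// Run a test assembly through the NUnit engine and write an NUnit2-format
// result file (the format the Jenkins NUnit plugin understands).
ITestEngine engine = TestEngineActivator.CreateInstance();
TestPackage package = new TestPackage(@"path\to\YourTests.dll");
using (ITestRunner runner = engine.GetRunner(package))
{
    XmlNode result = runner.Run(listener: null, filter: TestFilter.Empty);
    IResultService resultService = engine.Services.GetService<IResultService>();
    var resultWriter = resultService.GetResultWriter("nunit2", null);
    resultWriter.WriteResultFile(result, "TestResults.xml");
}
```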

How to recognize documents in Alfresco 5.0 and file them according to content?

I have the following use case:
An existing scanner scans documents and stores them in Alfresco via WebDAV or a shared network drive
Documents are separated by a barcode identifying the customer and the document type (e.g. bill)
When a document arrives in the shared drive, Alfresco should analyse it and move it (according to customer and document type) to the suitable internal folder structure.
Example of a folder structure:
/scans/
/customers/ExampleCustomer1/bills
/customers/ExampleCustomer1/emails
/customers/ExampleCustomer1/hr
/customers/ExampleCustomer2/bills
/customers/ExampleCustomer2/emails
/customers/ExampleCustomer2/hr
Question:
What do I need in Alfresco to process step 3) to automatically recognize documents and file them?
P.S. I know there exist applications like Ephesoft/Kofax, but I would like a module inside Alfresco that does the job for me without external dependencies.
I would suggest the following sequence:
1) Your scanner or other (OCR) software needs to interpret the barcode and save the customer and type somewhere in the document, for example in the docx metadata. (I am not aware of an Alfresco module doing OCR or barcode reading.)
2) After upload via WebDAV, you have to run the Alfresco metadata extraction action, which will extract the customer and type from the document's metadata into Alfresco metadata, using an Alfresco rule script or behaviour.
Using a rule, you can choose the action "Extract common metadata fields".
Using a Java behaviour, you can call the same action like this:
Action action = actionService.createAction("extract-metadata");
actionService.executeAction(action, node);
This extraction action is described here: https://wiki.alfresco.com/wiki/Metadata_Extraction . You may have to add custom code for your barcode requirement (https://wiki.alfresco.com/wiki/Content_Transformation_and_Metadata_Extraction_with_Apache_Tika).
3) An Alfresco rule script or behaviour is now able to move your document by reading this Alfresco metadata property.
This is a very good how-to about custom types, and it let me dive deep into Alfresco:
http://ecmarchitect.com/alfresco-developer-series-tutorials/content/tutorial/tutorial.html
Alfresco Developer Tutorials: http://ecmarchitect.com/alfresco-developer-series
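For step 3), a hedged sketch of an Alfresco JavaScript rule script (the property names my:customer and my:docType are made-up examples for your custom content model; this runs inside Alfresco's server-side JS API, not in a browser or Node.js):

```javascript
// Move the incoming document to /customers/<customer>/<type>s
// based on the metadata extracted in step 2).
var customer = document.properties["my:customer"];
var docType = document.properties["my:docType"];   // e.g. "bill"
if (customer !== null && docType !== null) {
    var target = companyhome.childByNamePath(
        "customers/" + customer + "/" + docType + "s");
    if (target !== null) {
        document.move(target);
    }
}
```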

Visual Studio 2015 IntelliTest bug on Azure API Apps?

When running IntelliTest on a default/blank ASP.NET application using the "Azure API App (Preview)" template, IntelliTest finds nothing to test. I am not sure if this is by design, a bug, or just not supported yet. Does anyone know a workaround?
The IntelliTest output window displays "monitored process exited with could not find any test to run (-1013 - 0xfffffc0b)". I have made sure the project targets x86.
If I use the "Web API" template, IntelliTest correctly produces test results (in step 4 below, choose Web API instead of Azure API App). I have now verified the above behaviour on two machines.
To replicate:
1. Open VS 2015 Enterprise
2. File -> New Project
3. Under Templates -> Visual C# -> Cloud, pick "ASP.NET Web Application"
4. Select a name and location and click OK; at the next screen choose "Azure API App (Preview)" and click OK
5. When the project loads, navigate to the "ValuesController"
6. Right-click inside either of the default Get() methods and select "Run IntelliTest"
7. Open the output window, select "IntelliTest" from the "show output from" dropdown, and observe the message above (...could not find any test to run)
I eventually tracked this problem down to some incompatibility between Swashbuckle and IntelliTest (Swashbuckle is used by API Apps to generate the Swagger doc for the API).
To solve it, open SwaggerConfig.cs in the App_Start folder of your API App project and remove the class below that inherits from IOperationFilter. The disadvantage of this is not having your params joined into the operation id in the Swagger doc, something I do not like much anyway (the default model is much nicer for reading a long list of params).
internal class IncludeParameterNamesInOperationIdFilter : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        if (operation.parameters != null)
        {
            // Select the capitalized parameter names
            var parameters = operation.parameters.Select(
                p => CultureInfo.InvariantCulture.TextInfo.ToTitleCase(p.name));

            // Set the operation id to match the format "OperationByParam1AndParam2"
            operation.operationId = $"{operation.operationId}By{string.Join("And", parameters)}";
        }
    }
}
