I defined new pipeline code based on the SageMaker abalone pipeline example, but I ended up with the same pipeline as the original abalone pipeline.
What should I do? Please help.
Please confirm that the JSON definition of the new pipeline is different from that of the original abalone pipeline.
When using the Python SDK, you can create the Pipeline object and call its .definition() method to check the pipeline before you run pipeline.upsert() to create/update it. If you've changed the code and updated the pipeline through upsert(), you should see the new pipeline.
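As a minimal sketch, assuming the SageMaker Python SDK is installed and an execution role is available (the pipeline name and role ARN below are placeholders):

```python
import json
from sagemaker.workflow.pipeline import Pipeline

my_parameters = []  # fill in with your ParameterString / ParameterInteger objects
my_steps = []       # fill in with your modified processing/training/register steps

# Build the pipeline object from your modified definitions.
pipeline = Pipeline(
    name="MyNewPipeline",  # placeholder name
    parameters=my_parameters,
    steps=my_steps,
)

# Inspect the JSON definition *before* touching anything in SageMaker,
# and diff it against the original abalone pipeline's definition.
print(json.dumps(json.loads(pipeline.definition()), indent=2))

# Create the pipeline if it doesn't exist, or update it if it does.
pipeline.upsert(role_arn="arn:aws:iam::111122223333:role/SageMakerExecutionRole")
```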
I work at AWS and my opinions are my own.
I have this working as part of Startup.cs to validate that there are no missing configuration settings, and it works in that the app will fail to run if a setting is missing. But I wondered if this can be added as a step or job in the pipeline before deploying to QA/UAT/Prod.
You could include a Unit Test that fails if any of your required configurations are missing.
Assuming you're running the Unit Tests as part of the build pipeline, if the test fails it should prevent deployment.
This question might help you achieve this.
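The idea is language-agnostic; as a rough sketch (shown here in Python/pytest, with a made-up required-keys list and assuming an appsettings.json at the repository root), a test like this fails the build, and therefore the pipeline, when a setting is missing:

```python
import json
from pathlib import Path

# Hypothetical list of settings the app cannot run without.
REQUIRED_KEYS = ["ConnectionStrings:Default", "ApiBaseUrl"]

def lookup(config, dotted_key):
    """Walk a nested dict using ':'-separated keys, .NET-configuration style."""
    node = config
    for part in dotted_key.split(":"):
        if not isinstance(node, dict) or part not in node:
            return None
        node = node[part]
    return node

def test_required_settings_present():
    config = json.loads(Path("appsettings.json").read_text())
    missing = [key for key in REQUIRED_KEYS if lookup(config, key) is None]
    assert not missing, f"Missing required settings: {missing}"
```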
I would like a recommendation on how to instrument each pipeline call using the Elastic APM API on Intershop 7.10.
I want to create a separate span as described here:
https://www.elastic.co/guide/en/apm/agent/java/master/public-api.html#api-span-start-span
(using a try/catch block with parent.startSpan())
So far I have tried looking in the ICM knowledge base for topics regarding the ELK stack (found none) and looked in the Component Framework section for a way to inject code around PipelineProcessorImpl.executePipeline, or to supply another pipeline processor implementation through the Component Framework, but couldn't find anything. It seems that pipeline processor implementations are not hooked in through the Component Framework.
The general answer is: you should not bother replacing the PipelineProcessor with your own implementation, even for such a seemingly small task as feeding your own monitoring solution.
I (may) have a better solution for you, though I haven't tested it. Have a look at the detailed answer to this Intershop question: Adding a servlet to run in Intershop 7.4 application server context
You don't want to add a new servlet; instead, you want to bind a new javax.servlet.Filter implementation that hooks into the application server request chain. You can do that the same way as described there, but invoke the method filter("/servlet/Beehive/*") instead of serve("/servlet/DEMO/*").
I built a console app with .NET Core 3.1 and have it building with YAML, leaning heavily on the learn.microsoft.com documentation. The release is pushing to the correct box, but I have an appsettings.json file with a connection string variable that differs between my TEST, QA, and PROD regions. I knew how to do this with XML file transforms in .NET Framework/MVC, but I can't get it to work here. Any help would be great, since I don't even know the term for what I'm trying to do.
How do you change the connection string in appsettings.json based on a variable, or do I have to create three branches, each with its own settings, and create three build and release pipelines?
Thank you.
In order to push to different environments, you usually either:
Have separate release pipelines that trigger from different branches, or
Have one release pipeline with different stages that need pre-approval to move to the next stage (TEST -> QA -> PROD).
In both cases you will make use of stages.
There you need to add a task named "File transformation".
In the File Format field, select JSON.
Now, any setting in appsettings.json whose name matches a pipeline variable will be replaced with that variable's value.
Be careful, because nested settings like

{
  "SerilogSettings": {
    "BatchSize": 100
  }
}

need the pipeline variable name to use a "." separator instead, i.e. SerilogSettings.BatchSize (see the sketch below).
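To make the naming convention concrete, here is a rough Python sketch of the substitution the task performs (the variable name and value below are examples, not values from the question):

```python
import json

def apply_transforms(config, variables, prefix=""):
    """Override nested JSON values from flat, dot-separated variable names."""
    for key, value in config.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            apply_transforms(value, variables, prefix=f"{path}.")
        elif path in variables:
            config[key] = variables[path]
    return config

settings = json.loads('{"SerilogSettings": {"BatchSize": 100}}')
pipeline_variables = {"SerilogSettings.BatchSize": 500}  # example variable
print(apply_transforms(settings, pipeline_variables))
# {'SerilogSettings': {'BatchSize': 500}}
```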
My .NET Core app has one appsettings.json per environment (appsettings.json and appsettings.Development.json, for example), and I would like to take advantage of this in my pipeline.
I see 2 options for the pipeline:
Build Artifact for Dev -> Deploy on Dev -> Build Artifact for Prod -> Deploy on Prod
or
Build Artifact -> Deploy on Dev -> Deploy on Prod
For the first option, I could set the environment as a parameter for the build.
For the second option, how could I build the app only once and set the environment according to the current deployment step, taking advantage of the multiple appsettings.json files I have?
And finally, are these approaches aligned with best practices? If not, what would be the best practice for pipelines with multiple environments?
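For reference, the mechanism behind the second option is that .NET picks the environment file at startup from the ASPNETCORE_ENVIRONMENT (or DOTNET_ENVIRONMENT) variable, layering appsettings.{Environment}.json over appsettings.json. A rough sketch of that layering, in Python for illustration:

```python
import json
import os
from pathlib import Path

def deep_merge(base, override):
    """Recursively overlay `override` on `base`, .NET-configuration style."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# The deployment stage only has to set this variable; the artifact is identical.
environment = os.environ.get("DOTNET_ENVIRONMENT", "Production")

config = json.loads(Path("appsettings.json").read_text())
env_file = Path(f"appsettings.{environment}.json")
if env_file.exists():
    config = deep_merge(config, json.loads(env_file.read_text()))
```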
Generally we can generate a single artifact, then deploy it to different environments and perform the different transformations within each environment's own stage of the release. That means we can change and override the settings defined in appsettings.json in each release environment.
Please refer to the File transforms and variable substitution reference for how to do the transformation with .json files.
Besides, we can install the Replace Tokens extension, then use the Replace Tokens task to load and change the settings defined in the appsettings.json file in each release environment/stage.
You can also transform the settings or use the File Creator task to create a new appsettings.json file to overwrite the existing one (see the sketch below).
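The Replace Tokens task looks for placeholders in the file (by default in the #{VariableName}# form) and swaps in the matching pipeline variable; a rough Python sketch of that behaviour (the token and connection string below are made up):

```python
import re

def replace_tokens(text, variables):
    """Replace #{Name}# placeholders with values from `variables`;
    unknown tokens are left untouched."""
    return re.sub(
        r"#\{(\w+)\}#",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        text,
    )

template = '{"ConnectionStrings": {"Default": "#{ConnString}#"}}'
print(replace_tokens(template, {"ConnString": "Server=qa-sql;Database=app"}))
```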
Below are some blogs for your reference:
Replace appsetting tokens in config files with Build & Release Management in VSTS (TFS)
Transform configurations in a .NET Core 2.2 Web API using Azure DevOps
Using custom appsettings.json with ASP.NET Core integration tests
You could go with Azure App Configuration and add it as an extra source for the configuration. This way your build/release process stays extremely simple.
See this documentation: https://learn.microsoft.com/en-us/azure/azure-app-configuration/enable-dynamic-configuration-dotnet-core
It's very powerful: you can select only part of the configuration (through filters), you can have feature flags, and you can have secrets (from linked key vaults).
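The linked document covers the .NET provider; as a small illustration of the same idea using the Python azure-appconfiguration client (the environment-variable name, key, and label below are made up):

```python
import os
from azure.appconfiguration import AzureAppConfigurationClient

# Connection string taken from the App Configuration "Access keys" blade.
connection_string = os.environ["APP_CONFIG_CONNECTION_STRING"]
client = AzureAppConfigurationClient.from_connection_string(connection_string)

# Fetch one setting, using a label to separate environments (the key and
# label scheme here is an assumption about how you might organise values).
setting = client.get_configuration_setting(
    key="App:ConnectionStrings:Default",
    label="QA",
)
print(setting.value)
```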
I have created a test application with a single-module setup and used PHPUnit as per the tutorial http://docs.phalconphp.com/en/latest/reference/unit-testing.html, and it works well.
Can anybody help me with how to set it up for a multi-module setup?
A unit test doesn't care whether the app is single- or multi-module; the example in the tutorial just sets up the DI container and config in the test suite. For example, if you want to test an addUser function in the User module, create the User object, mock its dependencies, and call it in the unit test:
// Instantiate the module's class by its namespace, not its file path.
$user = new \App\Module\User\User();
$this->assertTrue($user->addUser());