Bitbucket Pipeline / .Net Core - Project file does not exist - .net-core

.yml file:
image: microsoft/dotnet:sdk
pipelines:
  default:
    - step:
        caches:
          - dotnetcore
        script: # Modify the commands below to build your repository.
          - export SOLUTION_NAME=SSU.IS.Services
          - export PROJECT_NAME=SSU.IS.Identity
          - export TEST_NAME=SSU.IS.Module.StaffRecords.Test
          - dotnet restore $SOLUTION_NAME
          - dotnet build $PROJECT_NAME
          - dotnet test #$TEST_NAME
The bitbucket-pipelines.yml is in the root directory. There is a folder named SSU.IS.Services which contains the .sln, as well as other folders which contain further folders containing the projects.
The restore has no issues; however, the build step errors out.
I have attempted leaving it blank, specifying the project name, and giving what I believe to be the relative path to the project - all to no avail.
Path to the .csproj
identity-management-mvc/SSU.IS.Services/Sites/Identity/SSU.IS.Identity/SSU.IS.Identity.csproj

For those reading: typing the full path, including the file extension, worked:
image: microsoft/dotnet:sdk
pipelines:
  default:
    - step:
        caches:
          - dotnetcore
        script: # Modify the commands below to build your repository.
          - export SOLUTION_NAME=SSU.IS.Services
          - export PROJECT_NAME=SSU.IS.Services/Sites/Identity/SSU.IS.Identity/SSU.IS.Identity.csproj
          - export TEST_NAME=SSU.IS.Services/Tests/SSU.IS.Module.StaffRecords.Test/SSU.IS.Module.StaffRecords.Test.csproj
          - dotnet restore $SOLUTION_NAME
          - dotnet build $PROJECT_NAME
          - dotnet test $TEST_NAME

Only the absolute path worked for me:
/opt/atlassian/pipelines/agent/build/<ProjectSolutionName>/<ProjectName>.csproj
Enjoy =)
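A quick way to find the exact relative paths to plug into those variables is to list the project files from the repository root. The directory names below just mirror the layout from the question:

```shell
# Recreate a sample layout like the one in the question (illustrative names),
# then list every project/solution file with its repo-relative path.
mkdir -p demo/SSU.IS.Services/Sites/Identity/SSU.IS.Identity
touch demo/SSU.IS.Services/Sites/Identity/SSU.IS.Identity/SSU.IS.Identity.csproj
cd demo
find . -name '*.csproj' -o -name '*.sln'
```

The paths printed (minus the leading ./) are exactly what dotnet build expects when run from the clone root.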

Add git-ftp-include to wordpress project

I use git-ftp and Bitbucket Pipelines for CI/CD on my Wordpress project. I made wp-content the git repo. I use Webpack to bundle the files, and I also added the dist folder to .gitignore. Here is my setup:
wp-content/bitbucket-pipelines.yml:
image: node:12.13.1
pipelines:
  branches:
    develop:
      - step:
          name: Deploy to staging
          deployment: staging
          script:
            - cd themes/my-theme
            - npm install
            - npm run build
            - apt-get update
            - apt-get -qq install git-ftp
            - git ftp push --syncroot themes/my-theme/ --user $FTP_username --passwd $FTP_password $HOST/wp-content/themes/my-theme/
The pipeline works just fine. However, it does not upload the dist folder to the host. I think that's because I added dist to my .gitignore. How can I add the .git-ftp-include? Should I create the file and upload it to the host (under my-theme), or should I add it locally and push it to Bitbucket?
.git-ftp-include
themes/my-theme/dist/
.gitignore (under wp-content)
# ignore specific themes
themes/twenty*/
themes/my-theme/css/style.css
uploads
plugins
languages
cache
ai1wm-backups
dist
node_modules/
.idea/
.DS_Store
.log
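For what it's worth, .git-ftp-include is meant to be committed at the root of the repo that git-ftp runs from, not uploaded to the host. To force-upload an untracked or ignored directory, git-ftp's documented syntax prefixes the entry with an exclamation mark; the paths below are a sketch assuming the repo root is wp-content:

# .git-ftp-include (sketch) - always upload the ignored build output on every push
!themes/my-theme/dist/
# Alternatively, upload dist only when a tracked trigger path changes
# (the part after the colon is the trigger):
# themes/my-theme/dist/:themes/my-theme/src/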

Is it possible to generate artifacts without publishing with Semantic-release?

I'm currently using semantic-release for versioning of my react library.
https://github.com/semantic-release/semantic-release
Question:
Is it possible to generate the artifacts without publishing it?
For example, in my use case I would like to generate:
- Version release number (@semantic-release/commit-analyzer)
- tar file that will be published to npm (@semantic-release/npm)
- changelog (@semantic-release/release-notes-generator)
If you run the dry-run option, it will print the version number and the changelog to the console, but I want to store them in a file. One workaround is to pipe the output and parse it, but it would be nice if the plugins could write the data to a file during the dry run.
The dry run won't run the publish stage which is where the files get tar'ed up.
Any Advice appreciated,
Thanks,
Derek
You can use the npmPublish option of the @semantic-release/npm plugin. This will generate the tar file for the npm package, but it won't publish it to the npm registry.
// In your package.json file, add the following property, which ensures that npm will not publish:
"private": true
// In your GitHub Actions workflow (release.yml), use the following to store your tar file
- run: |
    mkdir -p ~/new/artifact
    echo ${{ contents_of_your_file }} > ~/new/artifact/yourtarfile
- uses: actions/upload-artifact@v2
  with:
    name: artifactname
    path: '~/new/**/*'
- name: download
  uses: actions/download-artifact@v1
  with:
    name: artifactname
    path: '~/new/**/*'
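Putting it together, a minimal release config might look like this - a sketch, not the full setup: npmPublish and tarballDir are documented options of @semantic-release/npm, and the plugin list assumes the defaults mentioned above.

// .releaserc (sketch)
{
  "plugins": [
    "@semantic-release/commit-analyzer",
    "@semantic-release/release-notes-generator",
    ["@semantic-release/npm", {
      "npmPublish": false,
      "tarballDir": "dist"
    }]
  ]
}

With npmPublish set to false the tarball still lands in dist/, ready to be picked up by upload-artifact.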

AWS CodeBuild artifact handling

So I have a React-with-SSR project for which I am trying to create a CI/CD pipeline. My project needs to deploy the following artifacts:
dist folder created by Webpack
appspec.yml file needed by CodeDeploy
script folder for scripts that are referenced in appspec.yml
When I tried to get the files one by one using this buildspec.yml:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install
  build:
    commands:
      - npm run build
artifacts:
  files:
    - 'dist/*'
    - 'appspec.yml'
    - 'deploy-scripts'
I got dist with only part of its content, plus the appspec.yml and the deploy-scripts folder.
Then I tried a different approach:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install
  build:
    commands:
      - npm run build
artifacts:
  files:
    - '**/*'
  base-directory: "dist"
  discard-paths: yes
The dist folder has the scripts and the appspec file inside it. Now it does deploy, but I lose the folder structure of dist, which is necessary for my deploy scripts.
I need to get all the contents of dist in their folder structure, as well as the scripts and the appspec.yml file. The scripts and the appspec.yml can't be inside dist, but dist needs to keep all of its content.
Can anyone help with this?
The solution was to use the first buildspec file and add "**/*" to the dist entry, so the dist line ends up being: "dist/**/*".
So if we apply this to the general context, anytime you want to get a directory to be sent along with single files in the build phase, you can add it like this:
"[directory_name]/**/*"
And that will get you both the directory and everything inside it in a recursive way.
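Applied to the buildspec above, the artifacts section would read as follows (a sketch: deploy-scripts is also given the recursive glob, on the assumption its contents should be included too):

artifacts:
  files:
    - 'dist/**/*'       # dist plus everything under it, structure preserved
    - 'appspec.yml'
    - 'deploy-scripts/**/*'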

Setting codebuild output source folder for SBT incremental compile

I am curious if you can control the output "src" folder in AWS CodeBuild.
Specifically, I see this when debugging the build in CodeBuild.
/codebuild/output/src473482839/src/github.....
I would love to be able to set/change/remove the src473482839 part of that path, because I have a feeling it is causing sbt to recompile my Scala source files. Although I am using CodeBuild's new local cache to cache my target folders between builds, the compiled classes' canonical paths change between builds, which is what I suspect is causing the problem.
After some more debugging I have managed to get my 6-minute builds down to 1:30.
Although you are not able to set or override CODEBUILD_SRC_DIR, I have found a workaround in my buildspec.
This is what my buildspec looks like now, with local caching enabled in codebuild.
version: 0.2
phases:
  pre_build:
    commands:
      - mkdir -p /my/build/folder/
      - cp -a ${CODEBUILD_SRC_DIR}/. /my/build/folder
  build:
    commands:
      - cd /my/build/folder
      - sbt compile test
cache:
  paths:
    - '/root/.ivy2/cache/**/*'
    - '/root/.cache/**/*'
    - 'target/**/*'
    - 'any other target folders you may need'
The key change I had to make was copying the source (with its cached target directories) over in the pre_build phase, then changing directory and compiling from the new, static path.
I hope this helps someone else down the road until CodeBuild lets you set/override the CODEBUILD_SRC_DIR folder.

Deploy ASP.Net Application to Window Server Core Container

I'm having an issue manually building and running a Windows Server Core container with a legacy ASP.NET web application. From Visual Studio I can run the container with the auto-generated Dockerfile/yml file.
I want to do a docker build and docker run powershell command using the dockerfile instead of with Visual Studio.
This is current yml file:
version: '3'
services:
  fulldotnetwebapplication:
    image: fulldotnetwebapplication
    build:
      context: .\FullDotNetWebApplication
      dockerfile: Dockerfile
This is the current dockerfile:
FROM microsoft/aspnet:4.7.1-windowsservercore-ltsc2016
ARG source
WORKDIR /inetpub/wwwroot
COPY ${source:-obj/Docker/publish} .
Let's say my ASP project is FullDotNetWebApplication, and it contains App_Data, Content, Controllers, etc. folders, plus Master/ASPX pages, along with the web/packages config files.
I tried this for my Dockerfile:
FROM microsoft/aspnet:4.7.1-windowsservercore-ltsc2016
WORKDIR /inetpub/wwwroot
COPY . .
COPY ./bin ./bin
and am getting this error:
docker : COPY failed: GetFileAttributesEx \\?\C:\Windows\TEMP\docker-builder977521850\bin: The system cannot find the file specified.
At line:1 char:1
+ docker build -t fulldotnetwebapplication .
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (COPY failed: Ge...file specified.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
What should my Dockerfile look like to deploy this application to IIS from PowerShell? I'm not understanding what magic VS is doing to make this work. Is it building the application, or is some sort of deployment file being generated? Any examples I could be pointed to, or sample Dockerfiles, would be great.
${source:-obj/Docker/publish}
That is why you can run it in VS but not from PowerShell/CMD, VS injects the source part of that path, to map it to the special docker publish folder for the current project.
If you replace that with COPY . ., that means "copy everything recursively from my current executing directory (which is usually the same as the Dockerfile location, if you didn't specify the file manually) to the working directory in the container". As such, your attempt COPY ./bin ./bin is unnecessary, as the preceding COPY . . will have already copied that directory (assuming it actually exists).
Is it building the application or some sort of deployment file being generated?
The container is the unit of deployment, you cannot deploy a container to IIS, it must be run on a container Host. Docker will not build your solution either, before building the container you should build your solution, and then copy just the required output (which is what the VS source path is attempting to do).
It is possible to get your container workflow to also build your solution using a multi-stage container, where by you use a container with VS installed to build the project, then copy the output to another container for running it, but that is a fairly advanced setup that I wouldn't recommend if you can't even get normal build copying working.
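As a rough sketch of that multi-stage approach (the SDK image tag, project path, and msbuild flags below are assumptions to adapt to your framework version and solution, not a drop-in file):

# escape=`
# Build stage: a .NET Framework SDK image with msbuild and nuget preinstalled.
FROM mcr.microsoft.com/dotnet/framework/sdk:4.7.2 AS build
WORKDIR /src
COPY . .
RUN nuget restore
# Publish the web app to a folder instead of deploying to a server.
RUN msbuild FullDotNetWebApplication\FullDotNetWebApplication.csproj `
    /p:Configuration=Release /p:DeployOnBuild=true `
    /p:WebPublishMethod=FileSystem /p:PublishUrl=C:\publish

# Runtime stage: the plain ASP.NET/IIS image; only the published output is copied in.
FROM microsoft/aspnet:4.7.1-windowsservercore-ltsc2016
WORKDIR /inetpub/wwwroot
COPY --from=build C:\publish .

The escape directive on the first line switches Docker's escape character to the backtick, a common move in Windows Dockerfiles so that backslash stays free for paths.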
