AWS CodeBuild artifact handling - aws-code-deploy

I have a React SSR project for which I am trying to set up a CI/CD pipeline. My project needs to deploy the following artifacts:
- the dist folder created by Webpack
- the appspec.yml file needed by CodeDeploy
- the scripts folder containing the scripts referenced in appspec.yml
When I tried to get the files one by one using this buildspec.yml:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install
  build:
    commands:
      - npm run build
artifacts:
  files:
    - 'dist/*'
    - 'appspec.yml'
    - 'deploy-scripts'
I got dist with only part of its contents, along with appspec.yml and the deploy-scripts folder.
Then I tried a different approach:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install
  build:
    commands:
      - npm run build
artifacts:
  files:
    - '**/*'
  base-directory: "dist"
  discard-paths: yes
The dist folder has the scripts and the appspec file inside it. Now it does deploy, but I lose the folder structure of dist, which my deploy scripts need.
I need all the contents of dist in their original folder structure, as well as the scripts and the appspec.yml file. The scripts and appspec.yml can't be inside dist, but dist needs to keep all of its content.
Can anyone help with this?

The solution was to use the first buildspec file and append "**/*" to the dist entry, so the dist line ends up as "dist/**/*".
Applying this to the general case: any time you want a directory to be shipped along with single files in the artifacts section, add it like this:
"[directory_name]/**/*"
That gets you the directory and everything inside it, recursively.
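Applied to the first buildspec, the artifacts section would look like this (a sketch; following the same pattern, the deploy-scripts folder also gets the recursive glob if its contents are needed):

```yaml
artifacts:
  files:
    # recursive glob keeps dist's internal folder structure
    - 'dist/**/*'
    - 'appspec.yml'
    # same pattern for the scripts folder referenced by appspec.yml
    - 'deploy-scripts/**/*'
```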

Related

Add git-ftp-include to wordpress project

I use git-ftp and Bitbucket Pipelines for my WordPress project's CI/CD. I made wp-content the git repo. I use Webpack to bundle the files, and I have also added the dist folder to .gitignore. Here is my setup:
wp-content/bitbucket-pipelines.yml:
image: node:12.13.1
pipelines:
  branches:
    develop:
      - step:
          name: Deploy to staging
          deployment: staging
          script:
            - cd themes/my-theme
            - npm install
            - npm run build
            - apt-get update
            - apt-get -qq install git-ftp
            - git ftp push --syncroot themes/my-theme/ --user $FTP_username --passwd $FTP_password $HOST/wp-content/themes/my-theme/
The pipeline works just fine. However, it does not upload the dist folder to the host. I think that's because I added dist to my .gitignore. How can I add the .git-ftp-include file? Should I create the file and upload it to the host (under my-theme), or should I add it locally and push it to Bitbucket?
.git-ftp-include
themes/my-theme/dist/
.gitignore (under wp-content)
# ignore specific themes
themes/twenty*/
themes/my-theme/css/style.css
uploads
plugins
languages
cache
ai1wm-backups
dist
node_modules/
.idea/
.DS_Store
.log
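On the placement question: git-ftp reads .git-ftp-include from the repository checkout during the push, so the file should be added locally, committed, and pushed to Bitbucket rather than uploaded to the host. If I read the git-ftp documentation correctly, an entry for a git-ignored directory needs either a `local-path:tracked-file` pair (upload when the tracked file changes) or a leading `!` (upload unconditionally); treat the exact syntax below as an assumption to verify against the git-ftp docs:

```
# .git-ftp-include — committed at the repo root (wp-content)
# "!" uploads the path on every push, even though git ignores it
!themes/my-theme/dist/
```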

Is it possible to generate artifacts without publishing with Semantic-release?

I'm currently using semantic-release for versioning of my react library.
https://github.com/semantic-release/semantic-release
Question:
Is it possible to generate the artifacts without publishing it?
For example, in my use case I would like to generate:
- Version release number (@semantic-release/commit-analyzer)
- tar file that will be published to npm (@semantic-release/npm)
- change log (@semantic-release/release-notes-generator)
If you run the dry-run option, it prints the version release number and the change log to the console, but I want to store them in a file. One workaround is to pipe the output and parse it, but it would be nice if the plugins could write the data to a file during the dry run.
The dry run won't run the publish stage which is where the files get tar'ed up.
Any Advice appreciated,
Thanks,
Derek
You can use the npmPublish option of the @semantic-release/npm plugin. This will generate the tar file for the npm package, but it won't publish it to the npm registry.
// In your package.json, add the following property, which ensures that npm will not publish the package
"private": true
# In your GitHub Actions workflow/release.yml file, use the following to store your tar file
- run: |
    mkdir -p ~/new/artifact
    echo ${{ contents_of_your_file }} > ~/new/artifact/yourtarfile
- uses: actions/upload-artifact@v2
  with:
    name: artifactname
    path: '~/new/**/*'
- name: download
  uses: actions/download-artifact@v1
  with:
    name: artifactname
    path: '~/new/**/*'
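Alternatively, if the goal is just to get the tarball as a build artifact, the @semantic-release/npm plugin also has a tarballDir option that writes the packed tarball to a directory during the release run; combined with npmPublish: false, nothing reaches the registry. A minimal .releaserc.yml sketch (the plugin list is assumed from the question):

```yaml
plugins:
  - "@semantic-release/commit-analyzer"
  - "@semantic-release/release-notes-generator"
  # pack the npm tarball into ./dist but skip the registry publish
  - ["@semantic-release/npm", { npmPublish: false, tarballDir: "dist" }]
```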

Setting codebuild output source folder for SBT incremental compile

I am curious if you can control the output "src" folder in AWS CodeBuild.
Specifically, I see this when debugging the build in CodeBuild.
/codebuild/output/src473482839/src/github.....
I would love to be able to set/change/remove the src473482839 part of that path, because I have a feeling it is causing sbt to recompile my Scala source files. Although I am using CodeBuild's new local cache to cache my target folders between builds, the compiled classes' canonical paths change between builds, which is what I suspect is causing the problem.
After some more debugging I have managed to get my 6-minute builds down to 1:30.
Although you are not able to set or override CODEBUILD_SRC_DIR, I have found a workaround in my buildspec.
This is what my buildspec looks like now, with local caching enabled in codebuild.
version: 0.2
phases:
  pre_build:
    commands:
      - mkdir -p /my/build/folder/
      - cp -a ${CODEBUILD_SRC_DIR}/. /my/build/folder
  build:
    commands:
      - cd /my/build/folder
      - sbt compile test
cache:
  paths:
    - '/root/.ivy2/cache/**/*'
    - '/root/.cache/**/*'
    - 'target/**/*'
    - 'any other target folders you may need'
The key change was copying the source (with its cached target directories) in the pre_build phase, then changing directory and compiling from the new, static directory.
I hope this helps someone else down the road until CodeBuild allows setting or overriding the CODEBUILD_SRC_DIR folder.

Bitbucket Pipeline / .Net Core - Project file does not exist

.yml file:
image: microsoft/dotnet:sdk
pipelines:
  default:
    - step:
        caches:
          - dotnetcore
        script: # Modify the commands below to build your repository.
          - export SOLUTION_NAME=SSU.IS.Services
          - export PROJECT_NAME=SSU.IS.Identity
          - export TEST_NAME=SSU.IS.Module.StaffRecords.Test
          - dotnet restore $SOLUTION_NAME
          - dotnet build $PROJECT_NAME
          - dotnet test #$TEST_NAME
The bitbucket-pipeline.yml is in the root directory, there is a folder named SSU.IS.Services which contains the .sln as well as other folders which contain further folders containing projects.
The restore has no issues, however the build step errors out.
I have attempted leaving it blank, specifying the project name, giving what I believe to be the relative path to the project - all to no avail.
Path to the .csproj
identity-management-mvc/SSU.IS.Services/Sites/Identity/SSU.IS.Identity/SSU.IS.Identity.csproj
For those reading: typing the full path, including the file extension, worked:
image: microsoft/dotnet:sdk
pipelines:
  default:
    - step:
        caches:
          - dotnetcore
        script: # Modify the commands below to build your repository.
          - export SOLUTION_NAME=SSU.IS.Services
          - export PROJECT_NAME=SSU.IS.Services/Sites/Identity/SSU.IS.Identity/SSU.IS.Identity.csproj
          - export TEST_NAME=SSU.IS.Services/Tests/SSU.IS.Module.StaffRecords.Test/SSU.IS.Module.StaffRecords.Test.csproj
          - dotnet restore $SOLUTION_NAME
          - dotnet build $PROJECT_NAME
          - dotnet test $TEST_NAME
/opt/atlassian/pipelines/agent/build/"ProjectSolutionName"/"ProjectName".csproj
Only this full path (inside the build container) worked for me. Enjoy =)

Setting Up Multiple Grunt files

I want to run Grunt on several projects simultaneously. However, I'm not sure how to set things up so it works. Here's what my setup looks like:
- Project 1 folder
  - [project files]
  - Gruntfile
- Project 2 folder
  - [project files]
  - Gruntfile
- Grunt folder
  - [node modules]
  - package.json
The idea is to have all Grunt dependencies (node modules) in a single centralised folder, while each project folder has its own Gruntfile.
The problem is that I don't know how to set up each Gruntfile so that it can use the node modules and package.json from the shared dependency folder.
Can anyone help me get this to work? Specifically with code examples.
