running Add-AzSqlElasticJobStep errors referencing credential - azure-sql

I am trying to use PowerShell to create an Elastic Job. I can create the job using T-SQL with no issues, but when I try to add a job step it gives the error: 'Elastic jobs management operation failed. Cannot reference the credential 'JobExecuteUser', because it does not exist or you do not have permission.' I have not been able to find any occurrences of this error by googling. I know the credential exists because I use the same values in my T-SQL code.
I am confident my parameter values are correct because I have used them to remove/create TargetGroups and TargetMembers and the job itself.
Here is the code:
Add-AzSqlElasticJobStep `
-ResourceGroupName $ResourceGroupName `
-ServerName $ServerName `
-AgentName $AgentName `
-JobName $JobName `
-TargetGroupName $TargetGroupName `
-CredentialName $CredentialName `
-Name "Deploy CommandLog" `
-CommandText "Do Nothing"

Related

Why does deno's exec package execute commands without outputting messages to the terminal?

import { exec } from "https://deno.land/x/exec/mod.ts";
await exec(`git clone https://github.com/vuejs/vue.git`)
When I run git clone https://github.com/vuejs/vue.git in a .sh file, it prints messages to the terminal,
but in Deno it does not.
First, I think it is important to echo what jsejcksn commented:
The exec module is not related to Deno. All modules at https://deno.land/x/... are third-party code. For working with your shell, see Creating a subprocess in the manual.
Deno's way of doing this without a 3rd party library is to use Deno.run.
With that said, if you take a look at exec's README you'll find documentation for what you're looking for under Capture the output of an external command:
Sometimes you need to capture the output of a command. For example, I do this to get git log checksums:
import { exec, OutputMode } from "https://deno.land/x/exec/mod.ts";
let response = await exec('git log -1 --format=%H', {output: OutputMode.Capture});
If you look at exec's code you'll find it uses Deno.run under the hood. If you like exec you can use it, but you might find you like using Deno.run directly instead.
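For reference, a minimal sketch of the same capture done with Deno.run directly, no third-party module (this uses the Deno.run API, which is available in pre-2.0 Deno):

// Spawn git with stdout piped so the output can be captured
const p = Deno.run({
  cmd: ["git", "log", "-1", "--format=%H"],
  stdout: "piped",
});
const rawOutput = await p.output(); // Uint8Array of everything written to stdout
p.close();
console.log(new TextDecoder().decode(rawOutput));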

How to make azure devops build fail when R linting issues occur

I am using the lintr library in R to find linting issues in the code. I put them into an xml format like this:
<lintsuites>
<lintissue filename="/home/.../blah.R" line_number="36" column_number="1" type="style" message="Trailing blank lines are superfluous."/>
<lintissue filename="/home/.../blahblah.R" line_number="1" column_number="8" type="style" message="Only use double-quotes."/>
</lintsuites>
Now, I would like to fail the Azure devops build when issues like this occur.
I was able to get my tests in a JUnit format like this:
<testsuite name="MB Unit Tests" timestamp="2020-01-22 22:34:07" hostname="0000" tests="29" skipped="0" failures="0" errors="0" time="0.05">
<testcase time="0.01" classname="1_Unit_Tests" name="1_calculates_correctly"/>
<testcase time="0.01" classname="1_Unit_Tests" name="2_absorbed_correctly"/>
...
</testsuite>
And when I add this step to the Azure pipeline, my build fails if any tests in the test suite fail:
- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFiles: '**/*.xml'
    searchFolder: '$(System.DefaultWorkingDirectory)/fe'
    mergeTestResults: true
    failTaskOnFailedTests: true
I would like something similar for failing the build when there are linting issues. I would also like the users to see what those linting issues are in the build output.
Thanks
It is not possible to achieve a similar result for the lintr xml with PublishTestResults@2.
The workaround you can try is to use a PowerShell task to check the content of your lintr xml file. If the content is not empty, fail the pipeline in the PowerShell task.
The PowerShell task below checks the content of lintr.xml (tested here with dummy content <car></car>), echoes the content to the task logs, and exits 1 to fail the task if the content is null.
- powershell: |
    [xml]$XmlDocument = Get-Content -Path "$(system.defaultworkingdirectory)/lintr.xml"
    if ($XmlDocument.OuterXml) {
      echo $XmlDocument.OuterXml
    } else { exit 1 }
  displayName: lintr result.
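A hedged variation that maps onto the actual goal (fail when lint issues exist, and surface them in the build output), assuming lintr.xml uses the <lintsuites>/<lintissue> shape from the question; ##vso[task.logissue] is a standard logging command:

- powershell: |
    [xml]$XmlDocument = Get-Content -Path "$(System.DefaultWorkingDirectory)/lintr.xml"
    $issues = $XmlDocument.SelectNodes("//lintissue")
    foreach ($issue in $issues) {
      # Surface each lint issue as an error in the build output
      Write-Host "##vso[task.logissue type=error]$($issue.filename):$($issue.line_number) $($issue.message)"
    }
    if ($issues.Count -gt 0) { exit 1 }  # fail the task when any issue exists
  displayName: 'Fail on lintr issues'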
You can also use the statement below in a PowerShell task to upload the lintr xml file to the build summary page, where you can download it:
echo "##vso[task.uploadsummary]$(system.defaultworkingdirectory)/lintr.xml"
You can check here for more logging commands.
Update:
A workaround to show the lintr results in a nicer way is to create a custom extension to display html results in the Azure DevOps pipeline.
You can try creating a custom extension and producing html lint results. Please refer to the answer in this thread for an example custom extension that displays html.
Other developers have already submitted requests to Microsoft to implement this feature. Please vote it up here or create a new one.

Deploy on Meteor galaxy server with bitbucket and deployment token as variable

Hello, I want to use the automatic deployment on Bitbucket to the Galaxy server with a deployment token.
For this reason I am creating a deployment token that is committed into the repository.
https://galaxy-guide.meteor.com/deploy-guide.html#deployment-token
To strengthen the security I would like to use repository variables in Bitbucket Pipelines:
https://confluence.atlassian.com/bitbucket/environment-variables-794502608.html
And to store the deployment token of Meteor in the variables instead of in a file.
For the deployment we use the following in the command:
METEOR_SESSION_FILE=deployment_token.json
And my question is: is there a way to use a variable (string) holding the token, like
METEOR_SESSION_DEPLOYMENT_TOKEN=$METEOR_TOKEN
instead of reading it from a file?
After having the same problem, some research brought me to this article, which solves the issue that you can't feed Meteor the JSON in an env var directly:
add the JSON file content as an env var, then echo it out into a file on deploy.
echo $METEOR_TOKEN_FILE > deploy_token.json
METEOR_SESSION_FILE=deploy_token.json
Thanks to this article I figured it out.
Save the json settings as an env variable and then, in the deployment process:
echo $DEPLOY_SESSION_FILE > deployment_token.json
METEOR_SESSION_FILE=deployment_token.json DEPLOY_HOSTNAME=galaxy.meteor.com meteor deploy --allow-superuser myApp-staging.meteorapp.com --settings config/staging/settings.json --owner username
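Put together in a bitbucket-pipelines.yml step, that could look roughly like the sketch below (a hedged example; DEPLOY_SESSION_FILE is the repository variable holding the session JSON, and the app name, settings path, and branch are the placeholders from the command above):

pipelines:
  branches:
    master:
      - step:
          script:
            # Recreate the session file from the repository variable
            - echo $DEPLOY_SESSION_FILE > deployment_token.json
            # Deploy using the recreated session file
            - METEOR_SESSION_FILE=deployment_token.json DEPLOY_HOSTNAME=galaxy.meteor.com meteor deploy myApp-staging.meteorapp.com --settings config/staging/settings.json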

.bashrc file not created in ldap client machine when using ldap authentication

My objective is to configure a machine as a client to use LDAP authentication from an LDAP server. I added the user on the LDAP server. I also registered the LDAP service to be used by the client.
But the problem is that when I try to log into the LDAP client machine, I do not get the "user@hostname" prompt; instead I get "-bash-4.1$".
I searched and found it has something to do with the ".bashrc" family of files, such as ".bash_history", ".bash_profile", ".bash_logout".
I can create these files manually, but I want them to be generated automatically and executed at login.
If anyone knows the cause of the problem and the solution please share.
Thanks,
Yogesh
You should create a script in /etc/profile.d for this purpose, like this:
cat /etc/profile.d/bash_create.sh
BASHRC="$HOME/.bashrc"
a='export PS1="\[$(tput bold)\]\[$(tput setaf 2)\][\u@\h \W]\\$ \[$(tput sgr0)\]"'
b='export VAR1=value1' # placeholder for other variables
c='export VAR2=value2' # placeholder for other variables
d='alias ll="ls -l"'   # placeholder alias
if [ ! -f "$BASHRC" ]; then # only if ~/.bashrc does not exist yet
  echo -e "$a\n$b\n$c\n$d" >> "$BASHRC"
  source "$BASHRC" # load it for the current session
fi
I had this problem too, and I solved it by deleting "/home/< ldapuser >"; when I logged in again the home directory was created. You can also check whether /etc/pam.d/system-auth has pam_oddjob_mkhomedir.so configured properly:
session optional pam_oddjob_mkhomedir.so umask=0077
To enable mkhomedir you can run
authconfig --enablemkhomedir --update

Powershell Get-WebSite name parameter is ignored

I want to retrieve information about a specific IIS 7 website using the PowerShell Get-Website cmdlet. Unfortunately, Get-Website returns information for all websites regardless of the -Name parameter I pass in. It appears that the -Name parameter is ignored.
For instance, if I use:
Import-Module WebAdministration
Get-Website -Name "Test Website"
I will receive information for all websites on my machine:
Name             ID State   Physical Path                 Bindings
----             -- -----   -------------                 --------
Default Web Site 1  Started %SystemDrive%\inetpub\wwwroot http *:80:
                                                          net.tcp 808:*
                                                          net.pipe *
                                                          net.msmq localhost
                                                          msmq.formatname localhost
Test Website     2  Started C:\websites\test              http *:80:test.mydomain.com
According to the documentation Get-Website should return information for the website specified in the -Name parameter. I must be misunderstanding the documentation or misusing the cmdlet, or both.
How should I use Get-Website to return information for a specific website?
According to this forum post, this is a bug in the Get-Website cmdlet. The workaround until this is addressed is to use Get-Item.
$website = "Test"
Get-Item "IIS:\sites\$website"
Be sure to use double quotes, variables are not expanded when single quotes are used.
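For illustration, using the question's own site name (the behavior of the two quote styles is standard PowerShell):

$website = "Test"
Get-Item "IIS:\sites\$website"   # double quotes: expands to IIS:\sites\Test
Get-Item 'IIS:\sites\$website'   # single quotes: literal $website, will not resolve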
I realize it's an older post, but I ran into this issue recently and found your question. I've had luck with the following syntax too:
get-website | where { $_.Name -eq 'foobar' }
Using wildcards will also get around this issue, as mentioned in the workaround in the Connect topic referenced by @Joey:
get-website -name "*Default Web Site*"