Azure Bicep - can't access output from external module - azure-resource-manager

I have a module called "privateendpoints.bicep" that creates a private endpoint as follows:
resource privateEndpoint_resource 'Microsoft.Network/privateEndpoints@2020-07-01' = {
  name: privateEndpointName
  location: resourceGroup().location
  properties: {
    subnet: {
      id: '${vnet_resource.id}/subnets/${subnetName}'
    }
    privateLinkServiceConnections: [
      {
        name: privateEndpointName
        properties: {
          privateLinkServiceId: resourceId
          groupIds: [
            pvtEndpointGroupName_var
          ]
        }
      }
    ]
  }
}
output privateEndpointIpAddress string = privateEndpoint_resource.properties.networkInterfaces[0].properties.ipConfigurations[0].properties.privateIPAddress
This is then referenced by a calling bicep file as follows:
module sqlPE '../../Azure.Modules/Microsoft.Network.PrivateEndpoints/1.0.0/privateendpoints.bicep' = {
  name: 'sqlPE'
  params: {
    privateEndpointName: 'pe-utrngen-sql-${env}-001'
    resourceId: sqlDeploy.outputs.sqlServerId
    serviceType: 'sql'
    subnetName: 'sub-${env}-utrngenerator01'
    vnetName: 'vnet-${env}-uksouth'
    vnetResourceGroup: 'rg-net-${env}-001'
  }
}
var sqlPrivateLinkIpAddress = sqlPE.outputs.privateEndpointIpAddress
My problem is, it won't build. In VS Code I get the error: The type "outputs" does not contain property "privateEndpointIpAddress".
This is the property I just added; prior to adding it, everything worked ok. I've made sure to save the updated external module, and I've even right-clicked it in VS Code and selected Build; it builds ok and creates a json file.
So it seems the calling bicep file is not picking up changes in the external module.
Any suggestions please?

The problem seemed to be caused by the fact that I had the external module open in a separate VS Code instance. Once I closed that instance and opened the file in the same instance as the calling bicep file, it worked ok.
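For reference, here is a minimal sketch of the same output wiring with hypothetical file and output names, assuming both files sit in the same VS Code workspace; once the module builds, the caller can consume its output like any other expression:

// mymodule.bicep (hypothetical module with a single output)
output exampleValue string = 'hello from the module'

// main.bicep (hypothetical caller)
module myMod './mymodule.bicep' = {
  name: 'myMod'
}
// the module output can be used in a variable or surfaced as an output of the caller
var fromModule = myMod.outputs.exampleValue
output forwarded string = myMod.outputs.exampleValue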

Related

Azure Functions .NET 5 fails to start after changing value in local settings file

This is a very strange problem and I want to see if anyone else can replicate the issue.
I start a brand new Azure Functions app targeting .NET 5. Mine is a timer function, but I don't think it matters what type of function it is.
I then add a value to my local.settings.json file:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "MY_CONNECTION_STRING_FOR_AZURE_STORAGE",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "MY_APP_ID": "1324"
  }
}
I modify the Program.cs to read this value:
var host = new HostBuilder()
.ConfigureAppConfiguration(c =>
{
c.AddEnvironmentVariables();
var config = c.Build();
var id = config.GetValue<string>("MY_APP_ID");
})
.ConfigureFunctionsWorkerDefaults()
.Build();
I then run the app and it seems to start up fine.
Then, I modify the local.settings.json file and add a section -- see below:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "MY_CONNECTION_STRING_FOR_AZURE_STORAGE",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "MY_APP_ID": "1324",
    "MyApp": {
      "MY_APP_ID": "1324"
    }
  }
}
When I try to run the app, I get the following error:
I then remove this section from my local.settings.json file and try running the app again and I get the same error.
It looks like, from this point on, there's nothing I can do to make this app run! Note that I didn't change any code in my Program.cs file. Simply adding a section to the local.settings.json file and then removing it seems to render the app useless!
Any idea what's going on here?

Running into AWS Elastic BeanStalk Event Error: Manifest file has schema validation errors

I am setting up pipelines to AWS Elastic Beanstalk via Bitbucket and I am running into:
Manifest file has schema validation errors:
Error Kind: ArrayItemNotValid, Path: #/aspNetCoreWeb.[0], Property: [0]
Error Kind: PropertyRequired, Path: #/parameters.appBundle, Property: appBundle
Error Kind: NoAdditionalPropertiesAllowed, Path: #/parameters, Property: parameters
It seems that I have a problem with my manifest file, but there is very little documentation on how to fix it, so I have not been able to resolve the issue. How do I solve this problem?
Here is my aws-windows-deployment-manifest file:
{
  "manifestVersion": 1,
  "deployments": {
    "aspNetCoreWeb": [
      {
        "name": "CareerDash",
        "parameters": {
          "archive": "site",
          "iisPath": "/"
        }
      }
    ]
  }
}
Looks like I figured it out. The issue is that the aws-windows-deployment-manifest.json file should look like the following:
{
  "manifestVersion": 1,
  "deployments": {
    "aspNetCoreWeb": [
      {
        "name": "CareerDash",
        "parameters": {
          "appBundle": "site.zip", /* This is the location of your web app files. The web app folder should be a .zip file. */
          "iisPath": "/" /* This is the path to where your web app files are located inside site.zip, specifically the path to the web.config file (which should be at the same level as your main web app files). */
        }
      }
    ]
  }
}
Overall, your app bundle should be a zip file that contains the site.zip file and the aws-windows-deployment-manifest.json file, in a hierarchy like so:
appBundleName.zip
    site.zip
    aws-windows-deployment-manifest.json

How to get outputs of one Rule item as inputs of another one?

I want to create an automatic cross-platform installation builder for my project. For this reason I made this file, myprojectpackage.qbs:
Product {
    type: "mypackage"
    Depends { name: "myproject" } // <- this one has type "application"
    Depends { name: "applicationpackage" }
}
applicationpackage.qbs uses some submodules and looks like:
Module {
    name: "applicationpackage"
    Depends { name: "qtlibsbinariespackage" }
    Depends { name: "3rdpartybinariespackage" }
    Depends { name: "resourcepackage" }
}
All these modules try to find something and copy it to the package directory. After they finish, I have a folder with a portable version of the application. Every module in this group has a typical structure:
Module {
    name: "somepackage"
    Rule {
        condition: qbs.targetOS.contains("windows")
        multiplex: true
        alwaysRun: true
        inputsFromDependencies: ["application"]
        Artifact {
            filePath: "Copied_files.txt"
            fileTags: "mypackage"
        }
        prepare: {
            var cmdQt = new JavaScriptCommand()
            // prepare paths
            cmdQt.sourceCode = function() {
                // copy some files and write to Copied_files.txt
            }
            return [cmdQt]
        }
    }
}
After the portable folder package is complete, I want to make a zip archive. So I need another Module, which will run after the package modules. I think the only way to do this is to take the .txt files that were created by the modules in applicationpackage as inputs for another Rule.
I have tried a lot of things (FileTaggers, outputFileTags etc.), but none of them worked properly. So, is there any way to make the modules work in a pipeline as I want?
Do I understand correctly that you want to "merge" the contents of the txt files tagged "mypackage" into the archive, i.e. everything listed in all the files is supposed to end up there?
If so, then you simply need a "top-level" rule that does the aggregation. Your existing rules would tag their outputs as e.g. "mypackage.part" and then a multiplex rule would take these as inputs and create a "mypackage" artifact.
Note that there is the archiver module (https://doc.qt.io/qbs/archiver-module.html) that can do the final step of creating the package for you from the aggregated txt file.
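A rough sketch of what that aggregating rule could look like (module name, tag names and file names are made up for illustration, and the code is untested):

import qbs.TextFile

Module {
    name: "packagearchive"
    Rule {
        multiplex: true
        inputs: ["mypackage.part"]  // the per-package txt files produced by the existing rules
        Artifact {
            filePath: "Package_contents.txt"
            fileTags: ["mypackage"]
        }
        prepare: {
            var cmd = new JavaScriptCommand()
            cmd.description = "aggregating package file lists"
            cmd.sourceCode = function() {
                // concatenate all per-package lists into one aggregated artifact
                var out = new TextFile(output.filePath, TextFile.WriteOnly)
                var parts = inputs["mypackage.part"]
                for (var i = 0; i < parts.length; ++i) {
                    var part = new TextFile(parts[i].filePath, TextFile.ReadOnly)
                    out.write(part.readAll())
                    part.close()
                }
                out.close()
            }
            return [cmd]
        }
    }
}

The existing per-package rules would then tag their Copied_files.txt outputs as "mypackage.part" instead of "mypackage", and the archiver module (or a further rule) can pick up the aggregated "mypackage" artifact.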

JSZip and cfs:collection in Meteor app

So, I'm using the udondan:jszip, cfs:collection, cfs:standard-packages and cfs:filesystem packages in my Meteor app. The problem is that I can't store my zip files in the FS.Collection. Here is some of the code:
// Defining the collection
Reports = new FS.Collection('reports', {
  stores: [new FS.Store.FileSystem('reports', { path: "~/public" })]
});
// Trying to add a file to the collection
var zip = new JSZip();
Reports.insert(zip);
After running the code I'm getting this error:
Error: DataMan constructor received data that it doesn't support
Is there any way to make those packages work with each other?
The JSZip object is not a file by itself. You can generate a file from it with the generateAsync function. The file type you'll want to create depends on whether you want this to run on the client or the server and on how you want to use the file. The file types supported by both libraries are (as per the documentation, I haven't tested all of these myself):
Blob object (client only): { type: 'blob' }
Uint8Array: { type: 'uint8array' }
ArrayBuffer: { type: 'arraybuffer' }
Buffer object (server only): { type: 'nodebuffer' }
So for example this should work:
zip.generateAsync({ type: 'arraybuffer' })
  .then(function (content) {
    Reports.insert(content);
  });
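For completeness, a fuller client-side flow might look something like this (the file name and contents are made up for illustration, and the insert callback is optional):

var zip = new JSZip();
zip.file('report.txt', 'report contents here');  // hypothetical file added to the archive
zip.generateAsync({ type: 'blob' })               // 'blob' on the client, 'nodebuffer' on the server
  .then(function (content) {
    Reports.insert(content, function (err, fileObj) {
      // fileObj is the stored FS.File on success, err on failure
    });
  });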

Gradle - Web Module in deployment descriptor

I am trying to add a web module to an EAR file. I put it in my customized deployment descriptor using webModule(":wars/myweb","/mywebapp"). It is not including the war file in the ear file; it is just adding an entry in the generated application.xml with these details.
Can you please help with including a web module in the ear using a customized deployment descriptor?
My ear task looks like this in build.gradle
ear {
    libDirName ''
    deploymentDescriptor {
        // custom entries for application.xml:
        // fileName = "application.xml" // same as the default value
        version = "1.4" // same as the default value
        applicationName = "myapp"
        initializeInOrder = true
        displayName = "myear" // defaults to project.name
        description = "EAR for the basic package" // defaults to project.description
        webModule(':wars/myweb', '/mywebapp')
    }
}
My settings.gradle in the same dir as build.gradle looks like this:
include "wars/myweb"
Appreciate your help.
I use this way to tie war dependencies to the webModules. The warMap provides a connection between the artifact id and the context path:
Map warMap = [
    'my-war': 'contextpath',
    'my2-war': 'contextpath2'
]
dependencies {
    warMap.each {
        deploy project(":$it.key")
    }
}
ear {
    deploymentDescriptor {
        warMap.each {
            webModule(it.key + '-' + project.version + ".war", it.value)
        }
    }
}
