How to create an API Management service programmatically? - azure-resource-manager

I am trying to programmatically (via Azure Resource Manager, if possible) create an APIM service in a specific resource group. The goal is that we can have a template or script that is parameterized and can be checked into source control, so that we can duplicate and recreate our environments.
This table in an ARM documentation page says Yes, Resource Manager is Enabled for APIM, but the Quickstart Templates link finds no sample templates for the Microsoft.ApiManagement resource type. However, this could mean merely that no one has yet contributed a template to the gallery, and that I would have to write my own.
As for writing my own ARM template, the article that walks you through authoring a Resource Manager template says:
To learn much of what you need to know about resource providers, see Resource Manager providers, regions, API versions and schemas
... which links back to the same page as my "This table" text above. The same section of the "authoring templates" article also says:
[Properties are] Resource specific configuration settings. The values for the properties are exactly the same as the values you provide in the request body for the REST API operation (PUT method) to create the resource. For links to resource schema documentation or REST API, see Resource Manager providers, regions, API versions and schemas.
... which again links back to the same page as above.
I have checked both the APIM REST API and the azure-resource-manager-schemas page for documentation on how to create an APIM instance.
The APIM REST API requires that you have created an APIM instance already. It's designed for manipulating APIM resources within an APIM instance, not for creating the APIM instance in the first place.
The ARM schemas project does not contain a schema for Microsoft.ApiManagement. The strings "api management" and "apim" do not occur within the schemas project.
Is it possible to create an API Management service programmatically, and if so, how?

Yes, it is possible to create an API Management service programmatically using the Azure PowerShell 1.0 cmdlets (https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/). Below is the cmdlet that will help you achieve it.
PS C:\WINDOWS\system32> get-help New-AzureRmApiManagement -example
NAME
New-AzureRmApiManagement
SYNOPSIS
Creates an API Management deployment.
SYNTAX
New-AzureRmApiManagement [-Capacity [<Int32>]] [-Sku [<PsApiManagementSku>]] [-Tags [<Dictionary[String,String]>]]
-AdminEmail <String> -Location {North Central US | South Central US | Central
US | West Europe | North Europe | West US | East US | East US 2 | Japan East | Japan West | Brazil South |
Southeast Asia | East Asia | Australia East | Australia Southeast} -Name <String> -Organization <String>
-ResourceGroupName <String> [<CommonParameters>]
Example 1: Create a Developer tier API Management service
PS C:\>New-AzureRmApiManagement -ResourceGroupName "ContosoGroup02" -Name "ContosoApi" -Location "Central US"
-Organization "Contoso" -AdminEmail "admin#contoso.com"
This command creates a Developer tier API Management service. The command specifies the organization and the
administrator address. The command does not specify the SKU parameter. Therefore, the cmdlet uses the default
value of Developer.
Example 2: Create a Standard tier service that has three units
PS C:\>New-AzureRmApiManagement -ResourceGroupName "ContosoGroup02 -Name "ContosoApi" -Location "Central US"
-Organization "Contoso" -AdminEmail "admin#contoso.com" -Sku Standard -Capacity 3
This command creates a Standard tier API Management service that has three units.
You can find additional cmdlets by using
Get-Help AzureRmApiManagement
Full documentation of the cmdlets can be found here:
https://msdn.microsoft.com/en-us/library/mt619282.aspx
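For example, one of those cmdlets, Get-AzureRmApiManagement, can be used to check on the service once provisioning completes. A quick sketch, reusing the resource group and service names from the examples above:
# Retrieve the APIM service created above and dump all of its properties
$apim = Get-AzureRmApiManagement -ResourceGroupName "ContosoGroup02" -Name "ContosoApi"
$apim | Format-List *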

You can now use the quickstart API Management Service ARM template from GitHub.
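If you go that route, a deployment sketch with the AzureRM cmdlets could look like the following, where azuredeploy.json and azuredeploy.parameters.json are placeholders for the template and parameters file downloaded from the quickstart repo:
# Sketch: create a resource group and deploy the downloaded quickstart template into it
New-AzureRmResourceGroup -Name "ContosoGroup02" -Location "Central US"

New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "ContosoGroup02" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json"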

New-AzureRmApiManagement can be used to create an APIM instance, but provisioning an APIM instance takes time, usually 20 to 30 minutes. If you have to create multiple instances, it's faster to do it with an Azure Automation runbook and run the command in parallel. Here is an example.
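A minimal sketch of such a runbook, assuming a PowerShell Workflow runbook (Azure Automation supports foreach -parallel there); the service names and resource group are illustrative, and authentication to the subscription is omitted:
workflow New-ApimInstancesInParallel
{
    # Illustrative list of APIM instances to provision; each one takes roughly
    # 20-30 minutes on its own, so kick them all off concurrently.
    $names = "contoso-apim-dev", "contoso-apim-test", "contoso-apim-prod"

    # NOTE: authenticate to the subscription first (for example with the
    # Automation Run As account) before calling the cmdlet below.
    foreach -parallel ($name in $names)
    {
        New-AzureRmApiManagement -ResourceGroupName "ContosoGroup02" `
            -Name $name `
            -Location "Central US" `
            -Organization "Contoso" `
            -AdminEmail "admin@contoso.com"
    }
}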

Related

When distributing a release bundle on Artifactory how do the edge nodes map to the jfrog CLI API?

I'm trying to distribute my release bundles using the JFrog CLI, and based on how we have set up our Artifactory instance I'm not sure how the web interface maps to the [CLI API][1]. On the web interface it asks me to select the edge nodes I want to distribute to; however, the API talks about sites, cities and country codes. My goal is to distribute to a single one of the edge nodes.
[1]: https://www.jfrog.com/confluence/display/CLI/CLI+for+JFrog+Distribution#CLIforJFrogDistribution-DistributingaReleaseBundle
The difference between the UI and the REST API/CLI is that:
Via the UI, Distribution receives the list of available Destinations from Mission Control. Then, once you select one, it automatically creates the pattern map that is submitted via the REST API (the /ui endpoint).
REST API: you need to provide the Destinations in the form of patterns.
JFrog CLI: wraps the REST API.
The pattern is built in the JSON (distribution_rules):
"site_name": Destination name or a wildcard (*)
Every Destination has a city and country that can be set in advance (when the Destination is registered); wildcards are supported there as well.
So, in your case site == edge node, and the city and country code are optional params.
See the Mission Control API for getting the list of Destinations: https://www.jfrog.com/confluence/display/JFROG/Mission+Control+REST+API#MissionControlRESTAPI-GetJPDList
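For example, to target a single edge node you could pass a distribution-rules JSON along these lines (the node name is illustrative, and this assumes the CLI's --dist-rules file uses the same distribution_rules structure as the REST API body):
{
  "distribution_rules": [
    {
      "site_name": "my-edge-node",
      "city_name": "*",
      "country_codes": ["*"]
    }
  ]
}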

SABRE Hotel Passive Segment API

I just started on a "webapp" in SABRE for the agent to convert an HK segment to a passive segment (GK). I am new to the APIs. Is there an API that does this in SABRE? I couldn't find one, but surely there's a generic method for sending a "valid Sabre command" ...
The agent entry would be like this...
0HHTAAGK1DFWIN27JUL-OUT2AUG/HI AIRPORT WEST/SGLB/75.00/SI-#5827 OCEAN DRIVE¥MIAMI FL 38834¥PHONE305-
555-1111#RQNEAR POOL/CF89732901
You are unable to convert an HK reservation to GK, as HK is an online sale with the hotel and GK is an offline sale with the hotel.
In order for you to sell a hotel segment in any status, you can use the services listed below according to your needs:
HotelResModifyLLSRQ: https://developer.sabre.com/docs/soap_apis/hotel/book/Modify_Hotel_Reservation
EnhancedHotelBookRQ: https://developer.sabre.com/docs/soap_apis/hotel/book/enhanced_hotel_book
OTA_HotelResLLSRQ: https://developer.sabre.com/docs/soap_apis/hotel/book/book_hotel_reservation
All of these services allow you to sell a hotel with HK or GK status.
I'm transitioning from Sabre script writing to Sabre Red Apps.
I found this example/sample; it has textboxes to collect information and then push it to Sabre. You should be able to rename some of the boxes and add some textboxes to collect all the hotel data and push it to Sabre.
Location: C:\SDK\red-app-sdk-3.0-20.11.6\samples
File: com.sabre.redapp.example3.web.customworkflow-1.0.5-SNAPSHOT-v20201016-1358.zip
This is the main processor file that moves the data from HTML, to TypeScript, and on to Sabre.
C:\SDK\workspace\com-sabre-test3-web-module\src\code
Main.ts
The modal shown in Sabre comes from the templates directory. These are HTML pages with text boxes, drop-downs, etc.
C:\SDK\workspace\com-sabre-test3-web-module\src\templates
PnrView.html
Transmit to Sabre:
C:\SDK\workspace\com-sabre-test3-web-module\src\code\view
Pnr.View.ts

Verify if data was returned from preferred region in cosmos

So I have set up my Cosmos DB account in 2 regions (say West US and South Central US). I also have my app services running in these two regions and connecting to Cosmos DB. For the app service running in each region I have configured my preferred region list: for the app service running in the WUS region, the preferred list is in the order [WUS, SCUS], and for the app service running in the SCUS region, the preferred list is in the order [SCUS, WUS].
I want to verify that this configuration is working and that my data is returned from the Cosmos DB regions in the order I have specified. For example, if accessed from the WUS app service, verify that the region chosen to execute the query was WUS, and vice versa.
Is there any way to verify this?
NOTE: I am using spring-data-cosmosdb-2.1.2 to connect to Cosmos DB.
Not sure if you can get that information from the response in code, but you can make use of Cosmos DB metrics in the Azure portal. There you can filter the metrics by region.
So, attempt a request through the app from region 1 and then verify in the portal that the expected Cosmos DB region served it. Test region 2 the same way.
Yes, just dump the diagnostics returned in the response object and look for the FQDN of the endpoint. It will have a regional subdomain prepended to the URI.

Roxford: how to change the endpoint and connect with the Azure Cognitive Services API?

I am trying to connect to Azure Cognitive Services using the Roxford package. I got an error, probably due to a wrong endpoint (after the Project Oxford APIs were folded into Azure Cognitive Services there are several region-specific endpoints).
I got the key from my personal account in the Azure Cognitive Services project:
library(Roxford)
library(plyr)
library(rjson)
facekey <- "xxx" #look it up on your subscription site
getFaceResponseURL("http://getwallpapers.com/wallpaper/full/5/6/4/1147292-new-women-faces-wallpaper-2880x1800-for-phone.jpg",key= facekey)
#I got error
# {"error":{"code":"Unspecified","message":"Access denied due to invalid subscription key. Make sure you are subscribed to an API you are trying to call and provide the right key."}}
How do I change the endpoint to "https://westcentralus.api.cognitive.microsoft.com/face/v1.0"?
If your Roxford lib is the one here: https://github.com/flovv/Roxford/blob/master/R/videoAnalysis_LIB.R#L182
Then you can add the region when you call the method. Cognitive Services keys are dedicated to an Azure region, so you should use the same region when you call the API. If you don't remember which region you chose when you generated the key, it's shown in the overview in the Azure portal.
Then when you use getFaceResponseURL:
getFaceResponseURL <- function(img.url, key, region="westus")
Pass the region:
getFaceResponseURL("http://getwallpapers.com/wallpaper/full/5/6/4/1147292-new-women-faces-wallpaper-2880x1800-for-phone.jpg", key=facekey, region="theAzureRegionOfYourKey")

BizTalk 2006 Tutorial 1: EDI-to-XML Document Translation

I cannot find the translated file after running the solution in BizTalk 2006 Tutorial Lesson 3: Run the EDI-to-XML Solution.
It should be placed in the c:\Program Files\Microsoft BizTalk Server 2006\EDI\Adapter\Getting Started with EDI\Northwind\In folder.
The Base EDI adapter picks up the file in the c:\Documents and Settings\All Users\Application Data\Microsoft\BizTalk Server 2006\EDI\Subsystem\Documents\PickupEDI folder, but I cannot find the translated file in the X-12 4010 850 document format.
I'm not immediately familiar with the tutorial you mention, but below are steps to find where any document has gone to in BizTalk.
First two places to check are in the event viewer and in the BizTalk Server Administration Console.
Check you have no errors in the event viewer.
In the admin console, click on the BizTalk Group in the left-hand window and you should see two columns in the right-hand pane, Work in Progress and Suspended Items. Click on Running service instances and Suspended service instances. Check that your message is not delayed for any reason (a Send Port being turned off, perhaps).
Next, from Start -> All Programs -> Microsoft BizTalk Server 2006 select the Health and Activity Tracking (HAT) tool.
In HAT, select Queries -> Most recent 100 service instances. Find the pipeline that will have written out your file, right-click the service instance and select Message Flow. In the message flow view you should see in the URL the disk location where your file was written.
(You can also look in the admin console to check where the send port is pointing)
Thanks for your suggestion on how to troubleshoot a BizTalk Server issue from a generic point of view. It did help. I resolved this problem by reading the error logs.
Here is the error:
Access denied. The client user must be a member of one of the following accounts to perform this function.
SSO Administrators: SSO Administrators
SSO Affiliate Administrators: SSO Affiliate Administrators
Application Administrators: BizTalk Server Administrators
Application Users: BizTalk Application Users
It works now after adding a service account to "SSO Administrators" and restarting all BizTalk-related services.
