Recover deleted ports in BizTalk

Is there any way to recover physical send and receive ports once they've been deleted, if you haven't exported the bindings?

Even if you export the bindings now, the deleted ports no longer exist and so will not be exported. I think Bilal is correct - this data resides in the management database, so you would need to restore from a backup.

Your best option is to always keep a binding file configuration for your application. Once your BizTalk applications are configured with the correct send and receive ports, orchestration hosts, etc., make sure you back up your binding configuration using the "Export Bindings" option in the BizTalk admin console.
In fact, in bigger projects the binding files are normally owned by the deployment teams for the various environments, and you never create ports using the admin console in real environments.
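The same export can also be scripted so it runs as part of a regular backup. A minimal sketch using BTSTask, assuming a default install path and an application named "MyApp" (both are examples):

$btstask = "C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\BTSTask.exe"

# Export the application's bindings to a file that can be checked in
# or restored later if ports are accidentally deleted.
& $btstask ExportBindings `
    /ApplicationName:"MyApp" `
    /Destination:"C:\Backups\MyApp.BindingInfo.xml"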
It would be overkill to restore your BizTalk databases just to get your ports back. I would rather create them manually again, even if the numbers are high. Restoring databases requires special attention; if you do it wrong you will end up reconfiguring the whole environment, which will be time consuming.

I agree with Saravana.
My 5c: you can import existing binding files, and that will not remove ports that are absent from the binding file. The import process works in "additive" mode.


nopCommerce 4.0 datasettings.json transform

This may seem a bit trivial... but how do you go about transforming the DB connection for a nopCommerce app as it is deployed to various environments?
The db connection is set in app_data\datasettings.json.
Normally this type of stuff is handled with web.config transforms.
How do you go about setting up build transforms for different environments (dev, test, prod)?
I am also looking into this topic.
In my humble opinion, the nopCommerce config is a pain, because it makes it really hard to do proper Continuous Integration/Continuous Delivery while keeping secrets safe.
At initial deployment you are greeted with the install page. The problem is that the installation process writes a bunch of files to the server, including datasettings.json, where the connection string to the DB is hard-coded.
This means that when I deploy nopCommerce to Azure App Service, for deployments after installation I have to make sure NOT to delete "additional files on the server", or the config will be deleted, since these config files written by the installer are not in source control.
It is really impractical not to be able to use standard ASP.NET connection strings, environment variables, or Key Vault.
To answer your question on how to do the transformation on the config file: one possibility is to use a PowerShell script to read, transform, and write the config file directly on the App Service instance. There is an API for that (a sketch follows the links below).
https://blogs.msdn.microsoft.com/gabeshapiro/2017/01/01/samples-for-using-the-azure-app-service-kudu-rest-api-to-programmatically-manage-files-in-your-site/
https://github.com/projectkudu/kudu/wiki/REST-API
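A minimal sketch against the Kudu VFS API, assuming an App Service named "my-nop-site" and its publishing-profile (deployment) credentials; the file path and the DataConnectionString property match nopCommerce 4.0, but verify them against your own App_Data\datasettings.json:

$site = "my-nop-site"
$user = '$my-nop-site'            # deployment user from the publish profile
$pass = "<deployment password>"

$pair    = [Text.Encoding]::ASCII.GetBytes("$($user):$($pass)")
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String($pair)
    "If-Match"    = "*"           # required by the Kudu VFS API to overwrite a file
}
$url = "https://$site.scm.azurewebsites.net/api/vfs/site/wwwroot/App_Data/dataSettings.json"

# Read, patch, and write the config back through the Kudu VFS API.
$settings = Invoke-RestMethod -Uri $url -Headers $headers
$settings.DataConnectionString = "Data Source=prod-sql;Initial Catalog=nop;..."   # example value
Invoke-RestMethod -Uri $url -Headers $headers -Method Put -Body ($settings | ConvertTo-Json)

You could run this as a release step after each deployment, so the installer-written file is re-pointed at the right database per environment.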
Alternatively, you can modify the source to read from Web.Config:
Change the connection string of nopCommerce?

QSqlDatabase (using SQLite) takes a long time to open a database

I have developed an application in Qt which uses an SQLite database. A copy of the database is located at each site.
At one site, let's say 'BOB1', it works perfectly without any problem. But when we try to use it at another site, let's say 'BOB2', it takes a long time to open a database connection (approx. 2000 milliseconds).
I thought that perhaps there was a network problem, so they tried to use the server of site 'BOB1' as their server, which worked fine. But when I tried to use the server of site 'BOB2' from site 'BOB1', I had the same problem. So I thought it may not be a network issue.
Another thing that came to mind was that perhaps there is a DNS resolution problem. But when I tried to ping the server using both IP and hostname, the response time was the same.
Any idea or pointer as to what the problem could be?
PS: The server + database file path is specified in the setDatabasePath() function using environment variables.
Consider copying the database to the local machine (e.g. a temp folder if transient, or another suitable location if permanent). You can safely use either a plain file copy, or the SQLite backup API to ensure that the transfer happens successfully (plus you get the option of progress feedback):
https://sqlite.org/backup.html
You could even "back up" the file from the remote server into an in-memory database, if the file is small and you're only reading from it?
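A minimal sketch of the plain file-copy variant (the share path and file name are placeholders for your environment); the application then opens the local copy instead of the remote one:

# Stage the remote SQLite file locally before the Qt app opens it.
$remote = "\\BOB2-server\appdata\app.sqlite"
$local  = Join-Path $env:TEMP "app.sqlite"

Copy-Item -Path $remote -Destination $local -Force

# Point the application at $local - e.g. via the environment variable
# that your setDatabasePath() function already reads - so every query
# runs against the local file instead of going over the network.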
You can see some sample code here on how to import an SQLite DB into a Qt QSqlDatabase. Note that when you do this, you want to make sure the version of the SQLite native API that you're using is the same as the one compiled into Qt, or you may get error messages from SQLite or Qt.

Deploying multiple MSIs into the same BizTalk Application

During our development of schemas, orchestrations, ports, etc., we've been exporting MSIs and binding files for deployment into our test and ultimately production environments.
So, for example, we set up a series of receive ports/locations in a single BizTalk app, for the purpose of receiving all HL7 v2 messages from our HCIS. We then exported that to a bindings file, and imported it into test.
Then, as we developed new schemas, we exported each schema into its own MSI file and deployed that into the same BizTalk application in our test environment. We did that because the schemas are specific to the inbound messages from our HCIS.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the Control Panel, I only see one application. So if I want to uninstall and re-install a particular schema, I'm not sure what will happen. For some reason, I half expected to see an entry for every MSI I installed, but I suppose that because they're all going into the same BizTalk application, they are all registered in Windows as the same application. I'm betting there is a better way to do this, any suggestions?
You can, and probably should, create different applications for each logical grouping of code. If you examine the 'Deploy' section of the project properties you'll see a text box to enter your application name. When you trigger a deploy, the artifacts will be placed into a separate application with the name you provide. You'll see it in the BizTalk management console.
We deploy to dev using the framework mentioned below. Then, to deploy to QA, right-click on the application and create an MSI from that point; this allows creating an MSI for only one application.
NOTE: the deploy setting is NOT saved globally. If another developer opens the project, it will not inherit the application name you've set.
We use the BizTalk Deployment Framework to help manage changes when we do development.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the control panel, I only see 1 application.
I can only think of two scenarios where you might observe this behaviour:
You have multiple different MSIs (one for each schema) which you are importing into BizTalk (and hence they are appearing in the BizTalk Admin Console), but you are not running the MSIs on the local machine (and so they are not appearing in 'Installed Programs'); or
Your MSIs are all named the same, in which case after the import into BizTalk and the local install, you only have a single program visible in 'Installed Programs'.
I'm betting there is a better way to do this, any suggestions?
With regards to approach, you are certainly along the correct lines. I tend to advise clients to group logical artifacts into a single logical bucket - either project or Application - that can be deployed (and redeployed) without affecting other parts of the system.
In an HL7 scenario, one logical bucket might be Patient artifacts (schemas and supporting maps) and a second may be Financial artifacts (schemas and supporting maps). These logical buckets can either be deployed to different BizTalk Applications, or to the same BizTalk Application, depending on your requirements. However, the main benefit here is that they are separate, and therefore all artifacts do not need to be redeployed if you need to make a small modification to the A19 - Patient Query/Response schema, for example.
How to deploy is another question entirely. I'm a massive fan of MSBuild and have written comprehensive build scripts that I tweak and reuse for each project I work on. These deployment scripts will tear down an existing environment and rebuild from the ground up, creating Applications, deploying Resources, importing Bindings, creating Hosts and Host Instances etc. before finally starting the application. This approach removes all human error from the process and tends to be favoured by clients who often have their infrastructure teams perform the deployment rather than their development teams.
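As a rough illustration of what such a script automates, here is a sketch of a scripted tear-down/redeploy using BTSTask (the application name, paths, and BizTalk version are all hypothetical; a real script would also create hosts and handlers and check for errors at each step):

$app     = "HL7.Patient"
$drop    = "C:\Drop\HL7.Patient"
$btstask = "C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\BTSTask.exe"

& $btstask RemoveApp /ApplicationName:$app       # tear down the old version (a real script checks it exists first)
& $btstask AddApp    /ApplicationName:$app       # recreate the application
& $btstask AddResource /ApplicationName:$app `
    /Type:System.BizTalk:BizTalkAssembly `
    /Source:"$drop\HL7.Patient.Schemas.dll" /Options:GacOnAdd
& $btstask ImportBindings /ApplicationName:$app `
    /Source:"$drop\HL7.Patient.BindingInfo.xml"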
I notice that Jay mentioned the use of the BizTalk Deployment Framework. I personally struggle with this tool, partly because I need to maintain my configuration in Excel which I can't check in to source control easily.

How can I have a file appearing on a WebDav server trigger a BizTalk event?

I have a legacy system which can create files visible in WebDAV as an output. I'd like to trigger a BizTalk orchestration receive port when a file matching a filter appears - so a lot like the standard File adapter, but for WebDAV.
I found the BizTalk Scheduled Task Adapter, which can pull in a file by HTTP, but it looks abandoned, poorly documented, and out of date.
So, how is it done? Can I use the standard HTTP adapter perhaps?
If you're able to access the WebDAV share via a UNC path from the BizTalk server, the File adapter should do the trick.
Have you tried assigning a drive letter to the WebDAV folder?
http://en.wikipedia.org/wiki/WebDAV
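For example (server name, path, and account below are placeholders), the Windows WebDAV redirector can expose the folder as a UNC path or a drive letter:

Start-Service WebClient          # the service that provides WebDAV-as-UNC

# Map a drive letter (the trailing * prompts for the password)...
net use W: "\\legacy-server@SSL\DavWWWRoot\exports" /user:DOMAIN\svc-bts *

# ...or skip the mapping and point the File adapter's receive location
# straight at the UNC form: \\legacy-server@SSL\DavWWWRoot\exports

Note that drive mappings are per-user, so for a BizTalk host instance running as a Windows service the UNC form is usually the safer choice.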
We've had to go with a workaround on this: we made a completely unrelated, separate process make a copy of the file from the legacy system appear in a Samba share, which we in turn attach to with an ordinary FILE adapter.

BizTalk Business Activity Monitor

I recently started with BAM in BizTalk.
I created a simple orchestration.
I configured BAM for BizTalk, of course.
I used Excel to create a simple schema with only text fields.
I deployed this XML schema to the BAM Primary Import using: bm deploy-all -DefinitionFile:myxml.xml.
Opened the TPE and opened the deployed schema.
Opened the orchestration, opened the schema it uses, and linked the schema fields to the BAM schema fields.
After this I applied the tracking profile.
I then put a file through BizTalk which uses the orchestration. The file was outputted.
If I now check the Primary Import database, I can see that the file is visible in the active messages. But the Completed field is set to false, and it doesn't change. Also, no data is filled in: only ActivityID and LastModified are populated, none of the columns which I specified myself are filled, and RecordID = null.
What am I doing wrong?
I thought I did all the necessary steps, I know it's all still pretty basic but I need to get this to work if I want to do more, right?
Getting BAM to work can be tricky sometimes. First, did you restart your BizTalk hosts after deploying everything? Not doing so can cause issues.
Almost the first thing I do when I run into any issues with BAM is to turn on BAM tracing and either redirect it to a file or use DbgView to check for any errors BAM might be running into.
One of the crappy things about BAM is that it will sometimes fail silently, with the only information about the error being dumped to the BAM trace, so getting familiar with it is important.
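For reference, the data being described lives in per-activity tables in BAMPrimaryImport. A quick way to check whether instances ever move from Active to Completed - the activity name here is just an example, and Invoke-Sqlcmd assumes the SqlServer PowerShell module is installed:

# BAM creates bam_<ActivityName>_Active and bam_<ActivityName>_Completed
# tables; "MyActivity" and the local server instance are example values.
$query = @"
SELECT TOP 10 * FROM dbo.bam_MyActivity_Active;
SELECT TOP 10 * FROM dbo.bam_MyActivity_Completed;
"@
Invoke-Sqlcmd -ServerInstance "." -Database "BAMPrimaryImport" -Query $query

If rows sit in the Active table with your own columns empty, that points at the tracking profile mappings rather than the deployment.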
