The EdiReceive and SendEdi pipelines are missing from my BizTalk 2006 R2 application

I am still learning BizTalk and EDI. When I originally started at my current company I inherited my predecessor's computer, so a lot of configuration was already in place. I recently got a new laptop, and have almost finished configuring the new development environment. There was no documentation on how to set up a dev environment before I got here (I have since created such a document, which has gotten rather lengthy).
The last piece I can't seem to figure out is the EdiReceive and SendEdi pipelines. They are on my old dev environment but they do not appear on my new one. From what I have been able to turn up by Google dumpster diving, they live in the Microsoft.BizTalk.Edi.EdiPipelines assembly but do not appear to be installed by default. My question is: how do I get these into the primary BizTalk application I use for development?
As a side note, there is a BizTalk EDI Application that is apparently installed by default and that does have the pipelines I am looking for. Do I need to reference that application somehow?
I tried GAC'ing the DLLs, but that doesn't seem to have worked.

You need to add a reference to the EDI application. To do this, follow the steps below (taken from MSDN: http://msdn.microsoft.com/en-us/library/bb226366(BTS.10).aspx):
1. In the BizTalk Server Administration Console, under the Applications node, right-click the application that you want to use for EDI, such as BizTalk Application 1. Point to Add, and then click References.
2. Select BizTalk EDI Application, and then click OK.
You can also use the EDI components that make up these pipelines (e.g. the EDI disassembler) in your own custom pipelines; you are not limited to the out-of-the-box pipelines. This is a very handy thing to know that took me a while to realise.
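If you'd rather script that reference than click through the Admin Console, the ExplorerOM management API can do the same thing. A minimal PowerShell sketch, assuming a default install path and a local management database (both placeholders for your environment):

    # Load the BizTalk management object model (the path is an assumption).
    [Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft BizTalk Server 2006\Developer Tools\Microsoft.BizTalk.ExplorerOM.dll") | Out-Null

    $catalog = New-Object Microsoft.BizTalk.ExplorerOM.BtsCatalogExplorer
    $catalog.ConnectionString = "Server=.;Database=BizTalkMgmtDb;Integrated Security=SSPI"

    # Reference the EDI application from your own application, then save.
    $myApp  = $catalog.Applications["BizTalk Application 1"]
    $ediApp = $catalog.Applications["BizTalk EDI Application"]
    $myApp.AddReference($ediApp)
    $catalog.SaveChanges()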

Related

Deploying multiple MSIs into the same BizTalk Application

During our development of schemas, orchestrations, ports, etc., we've been exporting MSIs and binding files for deployment into our test and, ultimately, production environments.
So, for example, we set up a series of receive ports/locations in a single BizTalk app for the purpose of receiving all HL7 v2 messages from our HCIS. We then exported that to a bindings file and imported it into test.
Then, as we developed new schemas, we exported each schema into its own MSI file and deployed that into the same BizTalk application in our test environment. We did that because the schemas are specific to the inbound messages from our HCIS.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the Control Panel, I only see one application. So if I want to uninstall and re-install a particular schema, I'm not sure what will happen. For some reason I half expected to see an entry for every MSI I installed, but I suppose that because they're all going into the same BizTalk application, they are all registered in Windows as the same application. I'm betting there is a better way to do this - any suggestions?
You can, and probably should, create different applications for each logical grouping of code. If you examine the 'Deploy' section of the project properties you'll see a text box to enter your application name. When you trigger a deploy, the artifacts will be placed into a separate application with the name you provide. You'll see it in the BizTalk management console.
We deploy to dev using the framework mentioned below. Then, to deploy to QA, right-click on the application and create an MSI from that point. It will allow creating an MSI for only one application.
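If you want that export to be repeatable rather than a manual right-click, BTSTask can do the same thing from a script; a one-line sketch (the application name and output path are placeholders):

    BTSTask ExportApp /ApplicationName:"MyBizTalkApp" /Package:"C:\Drops\MyBizTalkApp.msi"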
NOTE: the deploy setting is NOT saved globally. If another developer opens the project his project will not inherit the application name you've set.
We use the biztalk deployment framework to help manage changes when we do development.
So now, in test, we've ended up with a BizTalk application with the receive ports and schemas we need to receive messages from our HCIS. The issue I discovered is that, if I look at the installed programs list in the Control Panel, I only see one application.
I can only think of two scenarios where you might observe this behaviour:
You have multiple different MSIs (one for each schema) which you are importing into BizTalk (hence they appear in the BizTalk Admin Console), but you are not running the MSIs on the local machine (so they do not appear in 'Installed Programs'); or
Your MSIs are all named the same, in which case, after the import into BizTalk and the local install, you only have a single program visible in 'Installed Programs'.
I'm betting there is a better way to do this, any suggestions?
With regard to approach, you are certainly along the correct lines. I tend to advise clients to group related artifacts into a single logical bucket - either a project or an Application - that can be deployed (and redeployed) without affecting other parts of the system.
In an HL7 scenario, one logical bucket might be Patient artifacts (schemas and supporting maps) and a second may be Financial artifacts (schemas and supporting maps). These logical buckets can be deployed either to different BizTalk Applications or to the same BizTalk Application, depending on your requirements. The main benefit is that they are separate, so you do not need to redeploy every artifact just to make a small modification to, for example, the A19 Patient Query/Response schema.
How to deploy is another question entirely. I'm a massive fan of MSBuild and have written comprehensive build scripts that I tweak and reuse for each project I work on. These deployment scripts will tear down an existing environment and rebuild it from the ground up, creating Applications, deploying Resources, importing Bindings, creating Hosts and Host Instances, etc., before finally starting the application. This approach removes all human error from the process and tends to be favoured by clients, who often have their infrastructure teams perform the deployment rather than their development teams.
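To give a feel for the shape of such a script: the answer above uses MSBuild, but the same tear-down/re-build sequence can be sketched as BTSTask calls from PowerShell (the application and file names are made-up placeholders, and host/host instance creation is omitted since BTSTask doesn't handle those):

    # Tear down the existing application (this errors harmlessly on a first run).
    BTSTask RemoveApp /ApplicationName:"Patient"

    # Re-create it, deploy the assembly, and import the bindings.
    BTSTask AddApp /ApplicationName:"Patient"
    BTSTask AddResource /ApplicationName:"Patient" /Type:System.BizTalk:BizTalkAssembly /Overwrite /Source:"C:\Drops\Patient.Schemas.dll"
    BTSTask ImportBindings /ApplicationName:"Patient" /Source:"C:\Drops\Patient.Bindings.xml"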
I notice that Jay mentioned the use of the BizTalk Deployment Framework. I personally struggle with this tool, partly because I need to maintain my configuration in Excel, which I can't easily check in to source control.

How to move or copy a Dynamics AX 2009 Enterprise Portal instance?

I can't think of a better way to word the question than the subject line, but I'll try my best.
We're comfortable copying and handling the AX instances themselves, but how do I create a duplicate of an existing Enterprise Portal? For example, we have one that is working well, and we'd like to spin off a copy to point at a different copy of the AOS (for development purposes).
Is there a simple process? Is it as easy as copying the inetpub folder from IIS to another site instance or server? Or can you re-run the setup for Enterprise Portal on another server, copying over the files after the fact?
Links to any documentation on the process would be appreciated, although my Google searches yielded very few results.
Given that your site only contains Enterprise Portal content (pages and web parts), you should make sure that everything is persisted with the latest copy/version in the AOT.
Then it is a matter of making a project that contains everything, exporting it, and importing it on the second environment. You can use the AxUpdatePortal utility to deploy to the second environment.
This procedure implies installing EP from scratch on the new server.
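For the deployment step, the AxUpdatePortal call would look roughly like this (the site URL is a placeholder, and the exact switches are worth verifying against your AX 2009 installation):

    # Redeploy all EP content to the target site (URL is a placeholder).
    AxUpdatePortal -updateAll -websiteURL "http://newserver/sites/DynamicsAx" -verbose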
The trick with EP development is to make sure you have a solid project within AX that holds the most recent version of your custom EP applications.

How to keep deployed code on multiple BizTalk front ends in sync?

We have multiple BizTalk 2006 application servers, and I find it almost impossible to keep the versions of our projects in sync on them. It's a tedious process of deploying the MSI packages, importing them, matching up files in the GAC, and deploying some registry changes - and if one step is missed, or somebody deploys an updated copy of a DLL directly to one server and not another, there's no easy way to tell.
How do others ensure that copies of software between the two servers are the same version?
Some Background:
Our environment has two (non-clustered) BizTalk front-end servers and a separate database back end. Until recently, though we had both front ends configured, the host instances were stopped on the second server because of some troubleshooting. They've been disabled for a few months, and we've deployed some updated code in the meantime.
This morning, I did a folder diff on the GAC, as well as the folder that holds the local disk copy of the DLLs for our deployed project (C:\OurProject\ on both servers), and everything matched - same file sizes, same timestamps. However, once I turned on the second set of services, it became obvious that Server2 was using an old version of the project DLL - of the next three files processed, two had normal results and one was clearly out of date.
Please help me avoid an aneurysm.
One thing you may want to look into is the BizTalk Deployment Framework.
We are currently building up a new environment with BizTalk 2009, and I started out with a set of MSBuild scripts that handle exporting sources from Subversion, building, and deploying assemblies using BTSTask.
Of course, BTSTask lacks a lot of functionality (e.g. starting/stopping applications), but at least for BizTalk 2006 there is BTSControl.
We use an automated build script whose ultimate end is an MSI with binding files for dev/stage/prod. All released binding files are stored on a share and used to load the BizTalk server by hand. First the app is stopped, then the MSIs are executed on both servers, and then the MSI is imported. During the import we specify the environment for the bindings, and voila - we've had no issues with loss of sync.
So, I'd suggest taking all of your latest MSIs and re-executing them on the servers where you have differences. Otherwise, just try to put a process in place so that the load is repeatable, even if it's done by hand.
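As a quick way to spot that kind of drift before turning host instances back on, here is a rough PowerShell sketch that diffs assembly file versions (not just sizes and timestamps) across the two front ends; the server names and GAC path are placeholders:

    # Snapshot each server's GAC over the admin share and diff by name and
    # file version; any output means the front ends are out of sync.
    $gac = 'C$\WINDOWS\assembly\GAC_MSIL'
    $snap = { param($server)
        Get-ChildItem "\\$server\$gac" -Recurse -Filter *.dll |
            Select-Object Name, @{n='Version';e={$_.VersionInfo.FileVersion}}
    }
    Compare-Object (& $snap 'BizTalkFE1') (& $snap 'BizTalkFE2') -Property Name, Version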

Does anybody create installers to deploy internal ASP.NET web applications?

I've always deployed my web applications via FTP (sometimes even xcopy), and then manually run database scripts myself.
I started deploying this way in the '90s, but lately I've seen a few web apps with installers. I'm starting to question whether I'm locked into an outdated process. I'm a consultant and my apps are usually internal, so I don't worry about distributing them and having others install them.
But I'm curious; does anybody create installers to deploy internal asp.net web applications?
If so, why? (Voluntarily, mandated, or part of an automation process)
And have you had any problems doing it this way?
Absolutely. We use installers for all of our apps. That way we create the installer and run it on the QA and UAT environments to test, so we know exactly what is going to happen in production. There is no guessing as to what order someone might do something in, or whether they missed a step. It makes things a lot easier.
Ooh, I forgot about the automated process too. We have systems in place (AnthillPro) which automatically deploy to the proper environments. The QA people don't have to wait for something to be done, because it's all done at 2 am. If they need to rerun the build with updates, the devs check the code in, we push a button, and it's automatically deployed. No waiting for the build engineer because he's in a meeting or sick or whatever.
You always want an automated way to build and deploy - it greatly reduces the chance of a one-off error from forgetting a step. It also allows you to easily offload the deploy to someone else without having to teach them 100 customized steps. Whether the project is internal or not, all applications should follow best practices.
Personally, I'm a bit like the OP: generally I just deploy using FTP, but then again my applications are typically internal or, in the case of other projects, 100% managed by me.
I've been thinking about this lately as well, though, and have started to consider how proper deployment tooling might improve the process - having to document a detailed install process can be a real pain.
I use PowerShell and have found it really easy to automate lots of tasks. It will probably feel a bit different at the very beginning, but in the end you will see that it's all about the power of the .NET libraries!
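As an example of the kind of task this makes trivial (all paths, server and database names below are hypothetical):

    # Push the site files to the web server, then run the DB script.
    Copy-Item 'C:\Build\MyWebApp\*' '\\WebServer1\wwwroot$\MyWebApp' -Recurse -Force
    sqlcmd -S DbServer1 -d MyAppDb -i .\migrate.sql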
I have used the "Web Setup Project" to create an MSI that installed the output of a "Web Deployment Project" for an internal app. Our server admin wasn't up to the task of doing a 50-step manual install. For my current app, my server admin doesn't like the 'black box' feel of MSI installers and prefers getting a pile of files and a 50-step deployment manual. (See a pattern here? Ask your server admin what he wants.)
The Web Setup Project doesn't make it immediately obvious how to install to anything other than the "Default Website"; other than that, it made the installation process repeatable and created a built-in way to roll back (by just running the installer from one version ago).
This of course assumes that your virtual directory doesn't hold any user-modified content - I wouldn't trust an MSI to properly merge user-created and new files.
We use the "XCopy" deploy model here, since the Ops folks have their own method of setting up security on a new web application on the server.
However, we did need to use an installer when we had to install a web application that used a newer version of Crystal Reports, since it had to do something special with a key and we didn't have a full-blown version of CR on the server itself. So keep that in mind when working with third-party apps - they may need some kind of merge module, which an MSI handles easily.
Yep - we have an app that needs a lot of prerequisites set up: a web service, a Windows service, user accounts, security, folder creation, GAC bits, etc. I rolled it all up into a nice MSI with custom actions that can install and uninstall cleanly. It saved about an hour's worth of work deploying on a new box.
A lot of the other smaller apps are just deployed by doing Publish Website to a local folder and then FTPing the contents to the target.
It greatly depends upon the scale of your project, your environment, and your internal user base. I rarely deploy with an MSI because we are too small an operation to have multiple environments (except for SharePoint - that's different altogether). We develop and use VS to deploy web apps to a development box; assuming they are approved, we then use VS again to deploy to the live box.
The only proviso is that we have multiple copies of the web.config (suffixed with test, dev, and live), and we then delete the suffix off the relevant file depending upon where it's been deployed.
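That rename step is easy to script, too; a tiny sketch of the idea (the environment name and paths are placeholders):

    # Promote the variant for this environment, then remove the leftovers.
    $target = 'live'    # or 'dev' / 'test'
    Copy-Item "C:\Sites\MyApp\web.config.$target" 'C:\Sites\MyApp\web.config' -Force
    'dev','test','live' | ForEach-Object { Remove-Item "C:\Sites\MyApp\web.config.$_" -ErrorAction SilentlyContinue }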
It's probably not the best methodology (I know it's not), but it works and it aids rapid deployment of small to medium sized solutions in a small-scale user environment.
F5ToDebug...
You're saying it's OK to take shortcuts if you don't have time to do it properly?
"Who's going to test the code on the test environment?" You said it yourself that you have config files for _test - why would that not be a suitable test?

Step-By-Step ASP.NET Automated Build/Deploy

It seems like there are so many different ways of automating one's build/deployment that it becomes difficult to parse through all the different scenarios that people support in tutorials on the web. So I wanted to present the question to the Stack Overflow crowd: what would be the best way to set up an automated build and deployment system using the following configuration?
Visual Studio 2008
Web Application Project
CruiseControl.NET
One of the first things I tried was to have CCNet automatically zip the output and copy it to the server, but then that requires manual work to unzip at the destination. However, if we try to copy all the files individually, it could potentially take a long time for a large application (the build server lives outside of the datacenter, in our office ... I know).
Also of particular interest is how we would support multiple environments, as we have dev, QA, UAT, and then of course prod.
MSDeploy seems really interesting, but unless I'm interpreting the literature incorrectly, it doesn't help in the scenario of deploying from the output of a build server. If anything, it seems like it'll be useful for deploying one build across a web farm ... but even for deploying from one environment to another, one would have to manually change config settings, web service URLs, etc.
I recently spent a few days working on automating deployments at my company.
We use a combination of CruiseControl, NAnt, and MSBuild to generate a release version of the app. Then a separate script uses MSDeploy and XCopy to back up the live site and transfer the new files over.
Our solution is briefly described in an answer to this question: Automate Deployment for Web Applications?
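The backup-and-transfer step itself is small; a rough sketch of its shape (the paths and site names are placeholders, not the script from the linked answer):

    # Back up the live site with a timestamp, then sync the new build over it.
    $stamp = Get-Date -Format 'yyyyMMdd-HHmmss'
    Copy-Item '\\Web1\wwwroot$\MySite' ('\\Web1\Backups\MySite-' + $stamp) -Recurse
    & msdeploy.exe '-verb:sync' '-source:contentPath=C:\Drops\MySite' '-dest:contentPath=\\Web1\wwwroot$\MySite'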
You might be interested in MSDeploy. Here's a Scott Hanselman post on this. It's only available as a technical preview at the moment (September 2008) but is worth evaluating against your requirements.
There is another new build tool (a very intelligent wrapper) called NUBuild. It's lightweight, open source, and extremely easy to set up, and it provides almost no-touch maintenance. I really like this new tool, and we have made it the standard tool for our continuous build and integration process (we have about 400 projects across 75 developers). Try it out:
http://nubuild.codeplex.com/
Easy-to-use command line interface
Ability to target all .NET Framework versions, i.e. 1.1, 2.0, 3.0 and 3.5
Supports XML-based configuration
Supports both project and file references
Automatically generates the "complete ordered build list" for a given project - no-touch maintenance
Ability to detect and display circular dependencies
Performs parallel builds - automatically decides which of the projects in the generated build list can be built independently
Ability to handle proxy assemblies
Provides visual clues to the build process, e.g. showing "% completed", "current status", etc.
Generates a detailed execution log in both XML and text format
Easily integrated with the CruiseControl.NET continuous integration system
Can use a custom logger such as XMLLogger when targeting .NET 2.0+
Ability to parse error logs
Ability to deploy built assemblies to a user-specified location
Ability to synchronize source code with the source control system
Version management capability
Do you have the ability to run commands remotely? The PsExec utility from Sysinternals would let you run a command-line unzip program on the remote machine. If you have a script that copies the build as a .zip file to the remote site, you would just need one more line for the PsExec call to unzip the files.
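Concretely, the extra lines could look something like this (the server name, paths, and the presence of 7-Zip on the remote box are all assumptions):

    # Copy the zipped build out, then unzip it in place via PsExec.
    Copy-Item 'C:\Drops\build.zip' '\\Web1\D$\Drops\build.zip'
    psexec \\Web1 'C:\Program Files\7-Zip\7z.exe' x D:\Drops\build.zip -oD:\Sites\MySite -y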
I had a related question about getting a deployable set of files from an automated build. I found Web Deployment Projects (links and all in the old question) did what I needed - they're a VS and MSBuild add-on.
This is a common problem (and I wish I had read this question sooner) for all development, not just ASP.NET. As one of its developers, I can say my team naturally uses BuildMaster internally for the entire release process, and for most scenarios it's free. Within the tool, we are able to perform all the standard CI builds to create artifacts and then set up an automation process to deploy these artifacts to any one of the 40+ servers we have internally or externally hosted, depending on the specific application or environment.
Since you specifically mentioned deployment to different testing environments, this is a fundamental aspect of the tool. The idea is to model the environment workflow (e.g. Integration -> QA -> Production) you already have in place and essentially promote a build all the way from source control to production. Most times, it's as simple as adding a deployment action that deploys an artifact to the environment, other times it can be much more complex.
You also casually mentioned configuration file changes are part of deployment, which is another built-in component to BuildMaster. The idea we had was to use the tool itself as the central hub for all configuration files and deployments, thus ensuring the latest changes are applied automatically with a simple "deploy configuration files" action in your deployment plan.
One thing you didn't mention with regard to this process is the database deployment aspect. Most ASP.NET applications require an associated database, otherwise they could just be static HTML files. It is crucial that the database schema gets updated to the appropriate database version with every deployment. There is, not surprisingly, a module within BuildMaster that handles this for you as well. The idea is to store DDL-DML scripts within the tool itself, and by executing scripts only once per environment, it ensures that all of your databases across each environment are up-to-date as your builds are deployed through them. Other scripts (e.g. stored procedures, views, triggers, etc.) are essentially code files and therefore belong in source control. These DROP-CREATE-CONFIGURE type scripts can be run each and every time in most cases with a simple deployment action.
Another piece of the deployment puzzle that most developers do not think about is process automation. Many developers need to perform sign-offs or fill out change request forms in order to manually perform these processes. Again, this is all available as part of the automated workflow setup within BuildMaster. You can set up blockers that do not allow promotion to, say, the QA environment unless all unit tests have passed, or that block promotion to the Staging environment unless someone from the QA team approves the build and all issues in your issue tracking tool are resolved/closed for that particular release.
While I realize I left CC.NET out of the answer, our applications are all built and deployed through BuildMaster, so we no longer need it - though we could just as easily pick up the artifacts from a drop location and deploy them in later environments.
I see that many people use CC for their .NET projects, but why not use Jenkins and SonarQube? They have everything you need. I set all of this up in three days: a Windows Server 2008 R2 box with MSSQL, Jenkins, VisualSVN and SonarQube.
It all works great, and you get all the metrics on your project. SonarQube uses Gallio, Gendarme, FxCop, StyleCop, NDeps and PartCover to get your metrics, and all of this is pretty straightforward since SonarQube does it automatically without much configuration.
(I had posted some screenshots to give a feeling for it: one of Jenkins, which builds and gathers the Sonar metrics and has another job for deploying automatically to IIS, and one of SonarQube showing all the metrics for my project. This is a simple MVC4 app, but it works great!)
If you want more information I can be more specific, but I think you should at least consider Jenkins. If CC suits you better, at least you will have looked at a good alternative before you chose.
This whole setup uses MSBuild to build and deploy the apps.
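For reference, the Jenkins build step boils down to a single MSBuild invocation of roughly this shape (the solution name and publish profile are placeholders):

    # Run from a Jenkins build step; publishes the Release build to IIS.
    msbuild MySolution.sln /p:Configuration=Release /p:DeployOnBuild=true /p:PublishProfile=Deploy-IIS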
