I have created this blog to document some of the techniques/patterns/tricks that I have come across during some of my projects.
Tuesday, April 7, 2009
Another BizTalk Resource: Microsoft BizTalk Server Developer Center
Sunday, April 5, 2009
Azure Roadmap
One session of particular interest was A Lap around Microsoft .NET Services by John Shewchuk. Part of his presentation included the updated Azure roadmap, and I felt that this bit of info was worth sharing.
As you can see, the Microsoft team has come a long way in a short period of time. I find the next milestone to be very interesting: "Pricing & SLA Confirmation". Up until now, every time I have heard a question asked that involved either pricing or SLAs, it has been deflected. This is understandable, as it is no doubt a complex situation. It is very safe to say that people are impressed with the cloud technology that has been presented so far. Pricing and SLAs could very well make or break Azure. If the pricing is cost prohibitive, adoption will suffer; if the SLAs are not strong enough...adoption will suffer.
I am also interested in knowing whether the initial "Production" applications running on the platform will have to be "big" enough (read: profitable) to be accepted by Microsoft initially. Will smaller applications, which may not be as profitable for Microsoft, be included in the initial production release of Azure? I guess time will tell.
Private Azure Clouds?
- Microsoft Nixes Private Azure Clouds
- Steve Martin's blog - Windows Azure and Windows Server - Licensing Model
- Dot Net Rocks Podcast with Pablo Castro (around the 22 minute mark)
- Recent Regional Microsoft event - Energize IT
During the PDC time frame (Fall 2008), I started to pay attention to the Azure platform and its offerings. Initially, I thought that this stuff was pretty cool and that it would be something I would want to host in my enterprise. It wasn't until I started to better understand just what is involved in running a platform like this that my mind changed. At PDC, I attended a session called ".NET Services: Messaging Services - Protocols, Protection, and How We Scale" by Clemens Vasters. I believe it was around this time that my opinion about hosting an on-premise cloud started to change.
So I have put together a few reasons why I think it is a good idea to leave it to Microsoft to host the Azure platform.
Complexity - Part of the reason I would be interested in a cloud offering to begin with is to let someone else take care of the hard part. Windows patching, upgrades, and uptime are all challenges of operating a highly available solution. If someone can provide that for me, then I can focus on solving my core business problem.
Scale and Elasticity - The "pay as you go" model has many benefits, especially during the initial launch of your Cloud applications and services. If you were to build this on your own, how big, or small, do you make your cloud and how quickly can you allocate more resources to an application? Adding new processing capabilities on the Azure platform is as simple as "turning a dial". I suspect that this would be very problematic for many organizations to implement.
Cost - Perhaps if you had a stockpile of cash, you could build a larger-than-required data centre. However, in this climate there are not a whole lot of companies that can do this. One company with the capital to do so would be Microsoft. Generally, environments are planned based on expected requirements + x% for growth. How many companies would have the capital to invest in a platform that gives them the ability to scale out in the fashion that Azure allows you to?
Microsoft also has a Geo-Scale initiative where your applications/data may be stored in several different data centres around the world. The benefit is lower latency when you have customers accessing your application from all over the world. Avoiding natural disasters and widespread power outages is another reason to look for Geo-Scale. It was only a few years ago that most of Eastern Canada and the Eastern United States were shut down due to a widespread power outage. Yes, UPSs and generators will help, but many cannot withstand multi-day outages. Geo-Scale helps ensure availability, since regional events can be offset by data centres outside the affected areas.
Infancy - The Azure platform is still relatively immature. If Microsoft were to "productize" this offering, I believe it would slow down innovation. Microsoft would then have to worry about what other versions of the cloud a customer is running to ensure backwards compatibility and interoperability. Note, I am looking at this from a platform perspective. I am not letting Microsoft off the hook for ensuring that their APIs/services still function for people who have written applications on top of them.
To look at the other side of the coin, I do feel that there are some valid use cases where someone would want to be able to host an Azure cloud platform:
ISV/ASP - Microsoft has generally worked well with ISVs and ASPs in the past. I can see some ASPs (Application Service Providers) looking to get on board the next wave of Microsoft technology and start hosting these applications. Many ASPs have been hosting web applications for years, and evolving to the Azure platform is a natural progression. While I am sure there are many top-notch providers out there, at the present time I am just not sure how many of them would be able to host a platform as big and complex as Azure.
Government - Many government departments have privacy regulations, or laws, that prevent them from allowing their data to exist in 3rd-party data centres. Other constraints may include hosting data in foreign countries. For instance, data stored in the United States may be subject to the Patriot Act. If your company is outside the United States, you may not want your data subject to the Patriot Act. These types of constraints make hosting the Azure cloud on premise very appealing: you get to leverage the "building blocks" that Microsoft has provided in Azure, yet have total control within your environment.
Conclusion - For me, in my current situation, the Microsoft-hosted Azure platform is more advantageous. If I have requirements to expose, or exchange, data with people outside of my organization, I would prefer to let Microsoft handle the infrastructure-related challenges so that I can focus on solving my business requirements.
BizTalk 2009 - Failed to configure EDI/AS2 Status Reporting functionalities.
Everything went smoothly with configuration except for the configuration of the EDI/AS2 runtime. I ran into this error:
Failed to configure EDI/AS2 Status Reporting functionalities.
Failed to deploy BAM activity definitions. Please make sure that all BAM related Data Transformation Services (DTS) packages are removed along with the BAM databases.
Having not had a lot of exposure to the EDI/AS2 world this error message was rather foreign to me.
Like many solutions, I found what I was looking for with a Google search. This seems to be a problem that many people encountered with BizTalk 2006 R2, as there is a KB article.
I decided to take the "Method 2" route (from MSDN):
Method 2: Use the Bm.exe utility to remove the DTS packages
To resolve this problem, use the Bm.exe utility to remove the DTS packages. To do this, follow these steps:
1. Click Start, click Run, type cmd, and then click OK.
2. Use the CD command to locate the folder where the Bm.exe file is located.
Note: By default, the Bm.exe file is located in the following folder:
Microsoft BizTalk Server 2006\Tracking
3. At the command prompt, type the following command, and then press ENTER:
Bm.exe remove-all -DefinitionFile:folder\AS2ReportingActivityDefs.xml
Note: The folder placeholder is the folder where BizTalk Server 2006 R2 is installed. If you receive an error message that an activity could not be removed, go to step 4.
4. At the command prompt, type the following command, and then press ENTER:
Bm.exe remove-all -DefinitionFile:folder\EdiReportingActivityDefs.xml
Note: The folder placeholder represents the folder where BizTalk Server 2006 R2 is installed. If you receive an error message that an activity could not be removed, go to step 5.
5. At the command prompt, type the following command, and then press ENTER:
Bm.exe remove-all -DefinitionFile:folder\EdiReportingActivityIndexes.xml
Note: The folder placeholder represents the folder where BizTalk Server 2006 R2 is installed. If you receive an error message that an activity could not be removed, go to step 6.
6. Try to configure the BizTalk Server EDI/AS2 Runtime feature again.
Once I had completed these steps, I was able to finish the EDI/AS2 configuration.
Saturday, April 4, 2009
BizTalk 2009 RTM!
The links work and I was able to download software for my organization. So how does a person access the links? Well if you have purchased BizTalk licenses in the past and have also bought Software Assurance (SA) then you should be entitled to download and use this software. I would check with your Microsoft Sales rep if you are unsure.
I also decided to check MSDN and it looks like the RTM bits have been posted there as well.
Thursday, March 26, 2009
VANCOUVER, BC - DevTeach/SQLTeach June 9th - 11th 2009
Saturday, February 21, 2009
BizTalk vNext Wish list
Adapters
- An Email Adapter (POP3/Exchange) that supports regular expressions. For instance, if you are only interested in retrieving certain messages from a mailbox, you should be able to provide a regular expression against the subject field of the message. If the subject matches your regex, the message would be retrieved.
- A native Exchange adapter. I have personally run into this and have also seen several instances of it on the MSDN forums. For many organizations, leaving their servers open to POP3 connections poses a security risk. The way we solved this problem was by limiting POP3 access to certain IP addresses. This is "ok" for servers, but when I put in a request to have my IP address added, it was declined, as individual users should not have POP3 access. So yes, this is more "my problem" than Microsoft's, but if many others are in the same situation then perhaps adding an Exchange adapter would be beneficial.
- A FILE Adapter that supports regular expressions. The current adapter does support file masks using '*', '?', etc., but it would be even better if regular expressions were supported. I work in an industry where the names of files are built into the file exchange specification. Often this will include ranges of numbers that are considered valid. It is pretty tough to properly constrain these without regular expressions. Yes, I know that a custom adapter could be built to support this scenario, but it would be nice if this was included in the core product.
- An FTP Adapter that supports a temporary folder in ASCII mode. In order for the FTP adapter to support "once only guaranteed delivery", you are required to use a temporary folder when moving files with the FTP Adapter, as described here. The problem is that temporary files are only supported when the adapter is used in binary mode. The issue is that when you are exchanging data between heterogeneous environments like Windows and Unix, these environments use different symbols for carriage return/line feeds. Using ASCII mode takes care of converting these symbols, so as you can see we are in a bit of a Catch-22.
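The regex matching described in the email and FILE adapter items above could be sketched roughly as follows. This is just an illustration of the filtering logic, not an actual adapter API; the patterns, file naming, and function names are all hypothetical:

```python
import re

# Hypothetical patterns -- neither is part of any real BizTalk adapter API.
# File-exchange spec example: only sequence numbers 001-499 are valid.
FILENAME_PATTERN = re.compile(r"^ORDERS_(00[1-9]|0[1-9][0-9]|[1-4][0-9]{2})\.edi$")
# Mailbox filter example: only retrieve messages whose subject is "Invoice" + 6 digits.
SUBJECT_PATTERN = re.compile(r"^Invoice\s+\d{6}$")

def should_retrieve(filename):
    """Would this file be picked up by a regex-aware FILE adapter?"""
    return FILENAME_PATTERN.match(filename) is not None

def should_retrieve_message(subject):
    """Would this message be retrieved by a regex-aware email adapter?"""
    return SUBJECT_PATTERN.match(subject) is not None
```

Note how the numeric range (001-499) needs alternation to express correctly; a simple '*' or '?' file mask cannot constrain it at all, which is the point of the wish-list item.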
- Deployment has improved considerably since BizTalk 2004, but I still think some improvements could be made. The organization I work for has automated our build and deploy process, including deploying a BizTalk project to a multi-node BizTalk group. It would be nice if this type of functionality were part of the core product. For instance, a "Deploy to Group" option when importing an MSI in the BizTalk Admin Console would be a welcome feature: it would do the import into the BizTalkMgmt database once and GAC the DLLs on the remaining BizTalk nodes.
- Binding files - To be honest, I am not exactly sure what I am looking for here, but I think there just has to be a better way. Obviously the ability to separate code/assemblies from configuration is required; however, managing that configuration could be improved. If you have a project with only a few receive locations, then binding files are not too big a deal. However, once you reach 10+ receive locations, managing this data and the passwords becomes tedious. I personally 'love' having to export the bindings after a deployment, only to play the "Find and Replace" game with the password '*'s before checking the bindings into source control. I do this so that I don't have to manually set passwords during the next deployment. I recognize that having passwords in binding files in clear text is not a great option, but if we could somehow improve this scenario it would relieve a lot of headaches.
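The "Find and Replace" game above is easy enough to script. The sketch below is illustrative only: real exported bindings embed transport settings as escaped XML inside TransportTypeData, so the element name and structure will differ in practice, and the clear-text trade-off remains:

```python
import re

# Matches a masked password such as <Password>******</Password>.
# Illustrative element name: a real binding file nests this inside
# escaped TransportTypeData XML, so a real script must unescape first.
MASKED = re.compile(r"<Password>\*+</Password>")

def inject_password(binding_xml, password):
    """Replace every masked password placeholder with the real value."""
    return MASKED.sub(lambda m: "<Password>%s</Password>" % password, binding_xml)
```

Run per environment at deploy time, with the password pulled from a secure store rather than source control, this would at least take the manual editing out of the loop.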
Design Time
- The ability to update a generated schema automagically. For instance, if I have generated a schema for an SAP IDoc and the IDoc has changed, I need to run through the "Add Generated Items" wizard again. If I have renamed or modified the target namespace of the original schema, I then need to update this new version of the schema. As an added bonus, a blank orchestration is added to my solution even though I already have a working one.
- The ability to debug an orchestration from Visual Studio. When I first started using BizTalk, I really missed this feature. Prior to working with BizTalk, I was an ASP.NET developer and really enjoyed ASP.NET debugging, especially having worked with classic ASP. I am not sure how they could support this feature, but it would definitely be welcome.
Management/Operations
- The ability to go back and historically look at a message. Yes, this functionality does exist today, but there is another Catch-22. If you have message body tracking enabled, then a copy of the message body will be placed in the BizTalkDTADb database. However, in order to keep your BizTalk environment performing well, you need to archive and purge data from this database. The job that takes care of this is the DTA Purge and Archive job. If you neglect to enable this job, or keep your live window open for too long, you are bound to have performance problems. At one point, neglecting this job was one of the top BizTalk tickets to Microsoft support. One option is to take the extracts that this job outputs and aggregate them into your own long-term archive solution. This would alleviate any run-time issues, but it leaves you with a management problem, as you need to maintain the archive yourself. The requirement itself comes from a request like "What happened last month to order # 1234567?"
- If you have a complex business scenario and need to find all messages related to one instance of the business process, it would be great if you could link all of the related messages to the orchestration that managed that business process. Yes, there are IDs that link all of these interactions together, but it would be nice if you could easily view all of this information from a tool. This would let you quickly see what happened to a particular business process instance. Consider the following interaction: if you needed to view all of the messages for this particular business process, how could you easily achieve this?
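The long-term archive idea from the first item above could start as simply as a scheduled script that sweeps the extracts produced by the DTA Purge and Archive job into dated folders. The paths, the .bak extension, and the folder layout here are all hypothetical:

```python
import os
import shutil
from datetime import date

def sweep_archives(extract_dir, archive_root):
    """Move DTA archive extracts (.bak files, per the default backup naming)
    into a per-month folder so that old messages can still be located later
    (e.g. "what happened last month to order # 1234567?")."""
    target = os.path.join(archive_root, date.today().strftime("%Y-%m"))
    os.makedirs(target, exist_ok=True)
    moved = []
    for name in os.listdir(extract_dir):
        if name.lower().endswith(".bak"):
            shutil.move(os.path.join(extract_dir, name),
                        os.path.join(target, name))
            moved.append(name)
    return moved
```

Paired with a simple index (archive file name, date range, order numbers), this keeps the live BizTalkDTADb window small while still answering historical questions, at the cost of managing the archive yourself.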
So here are a few items that tend to show up frequently elsewhere:
- Support for low-latency scenarios. BizTalk's design includes built-in persistence, a feature that is used to support guaranteed delivery. While in many scenarios this feature is required, there is a cost associated with it, and for some scenarios that cost is just too expensive. Having the ability to bypass all of the persistence may satisfy some requirements that are not currently addressed.
- An expandable, or larger, expression editor. The argument has always been that if you need a larger expression editor, chances are you are doing something you shouldn't be doing. While I agree in principle, here is something else to consider: if you follow the popular namespace convention of OrganizationName.BusinessUnit.FunctionalName in a .NET assembly, it does not take long before you are scrolling to the right when calling a static method.
- Constructing a new message in an orchestration. We have all been there! You use some sort of questionable approach to create an instance of a message. This may include using a map that does not actually map any data, or loading an XML string that matches your schema's format and assigning it to a message. It would be nice to just create a new instance of a message without having to resort to these hacks.
If you have any Wish list items of your own, please post them using the comments feature.


