Tuesday, December 10, 2013

BizTalk 2013–Integration with Amazon S3 storage using the WebHttp Adapter

I recently encountered a requirement where we had to integrate a legacy Document Management system with Amazon in order to support a mobile field-worker application.  The core requirement is that when a document reaches a certain state within the Document Management system, we need to publish the file to an S3 bucket where it can be accessed from a mobile device.  We will do so using a RESTful PUT call.

Introduction to Amazon S3 SDK for .Net

Entering this solution I knew very little about Amazon S3.  I did know that it supported REST and therefore felt pretty confident that BizTalk 2013 could integrate with it using the WebHttp adapter.

The first thing that I needed to do was to create a Developer account on the Amazon platform. Once I created my account I then downloaded the Amazon S3 SDK for .Net. Since I will be using REST, this SDK is technically not required; however, it includes a beneficial tool called the AWS Toolkit for Microsoft Visual Studio.  Within this toolkit we can manage our various AWS services, including our S3 buckets.  We can create, read, update and delete documents using this tool.  We can also use it in our testing to verify that a message has reached S3 successfully.


Another benefit of downloading the SDK is that we can use the managed libraries to manipulate S3 objects and better understand some of the terminology and functionality that is available.  A side benefit is that we can fire up Fiddler while using the SDK and see how Amazon forms its REST calls, under the hood, when communicating with S3.

Amazon S3 Accounts

When you sign up for an S3 account you will receive an Amazon Key ID and a Secret Access Key. These are two pieces of data that you will need in order to access your S3 services.  You can think of these credentials much like the ones you use when accessing Windows Azure Services.


BizTalk Solution

To keep this solution as simple as possible for this Blog Post, I have stripped some of the original components of the solution so that we can strictly focus on what is involved in getting the WebHttp Adapter to communicate with Amazon S3.

For the purpose of this blog post the following events will take place:

  1. We will receive a message that will be of type: System.Xml.XmlDocument.  Don’t let this mislead you: we can receive pretty much any type of message using this message type, including text documents, images and PDF documents.
  2. We will then construct a new instance of the message that we just received in order to manipulate some Adapter Context properties. You may now be asking: why do I want to manipulate Adapter Context properties?  The reason is that we want to change some of our HTTP header properties at runtime, and we therefore need to use a Dynamic Send Port, as identified by Ricardo Marques.


    The most challenging part of this Message Assignment Shape was populating the WCF.HttpHeaders context property.  In C# if you want to populate headers you have a Header collection that you can populate in a very clean manner:

    headers.Add("x-amz-date", httpDate);

    However, when populating this property in BizTalk it isn’t as clean.  You need to construct a string and append all of the related properties together.  You also need to separate each header attribute onto a new line by appending “\n”.

    Tip: Don’t try to build this string in a Helper method.  The \n characters will be encoded, and the resulting values will not be accepted by Amazon; that is why I have built out this string inside an Expression Shape.
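    As a concrete sketch, the Message Assignment code can build the header string along these lines.  The variable names (strHttpDate, strHeaders) and the helper class name (AmazonHelper) are illustrative placeholders, not names from the actual solution:

    strHttpDate = AmazonHelper.SetHeaderDate();
    strHeaders = "x-amz-acl: " + AmazonHelper.SetAmzACL() + "\n" +
                 "x-amz-storage-class: " + AmazonHelper.SetStorageClass() + "\n" +
                 "x-amz-date: " + strHttpDate + "\n" +
                 "Authorization: " + AmazonHelper.SetHttpAuth(strHttpDate) + "\n" +
                 "Content-Type: application/x-pdf";
    msgS3Request(WCF.HttpHeaders) = strHeaders;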

    After I send a message (which I have tracked in BizTalk) I should see an HTTP header that looks like the following:

    <Property Name="HttpHeaders" Namespace="http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties" Value=

    "x-amz-acl: bucket-owner-full-control
    x-amz-storage-class: STANDARD
    x-amz-date: Tue, 10 Dec 2013 23:25:43 GMT
    Authorization: AWS <AmazonKeyID>:<EncryptedSignature>
    Content-Type: application/x-pdf
    Expect: 100-continue
    Connection: Keep-Alive"/>

    For the meaning of each of these headers I will refer you to the Amazon documentation.  However, the one header that does warrant some additional discussion here is the Authorization header.  This is how we authenticate with the S3 service.  Constructing this string requires some additional understanding.  To simplify the population of this value I have created the following helper method, which was adapted from this post on StackOverflow:

    public static string SetHttpAuth(string httpDate)
    {
        // Requires: using System.Text; using System.Security.Cryptography;
        string AWSAccessKeyId = "<your_keyId>";
        string AWSSecretKey = "<your_SecretKey>";

        string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";

        // Now encode the canonical string
        Encoding ae = new UTF8Encoding();
        // Create a hashing object; the secret key is the hash key
        HMACSHA1 signature = new HMACSHA1();
        signature.Key = ae.GetBytes(AWSSecretKey);
        byte[] bytes = ae.GetBytes(canonicalString);
        byte[] moreBytes = signature.ComputeHash(bytes);
        // Convert the hash byte array into a Base64 encoding
        string encodedCanonical = Convert.ToBase64String(moreBytes);
        // Finally, this is the Authorization header
        string AuthHeader = "AWS " + AWSAccessKeyId + ":" + encodedCanonical;

        return AuthHeader;
    }

    The most important part of this method is the following line of code:

    string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";
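    To make the layout of this one-liner easier to see, here is a small, self-contained breakdown of the same canonical string following the AWS signature version 2 layout (HTTP verb, Content-MD5, Content-Type, Date, canonicalized x-amz headers, canonicalized resource).  The date, bucket and file name below are placeholder values:

```csharp
// Breakdown of the canonical string (AWS signature v2 layout).
// The date, bucket name and file name are placeholders.
using System;

class CanonicalStringDemo
{
    static void Main()
    {
        string httpDate = "Tue, 10 Dec 2013 23:25:43 GMT";  // value produced by SetHeaderDate()

        string verb        = "PUT";
        string contentMd5  = "";                   // empty: no Content-MD5 header is sent
        string contentType = "application/x-pdf";
        string date        = "";                   // empty: the x-amz-date header is used instead
        string amzHeaders  = "x-amz-acl:bucket-owner-full-control\n"
                           + "x-amz-date:" + httpDate + "\n"
                           + "x-amz-storage-class:STANDARD\n";
        string resource    = "/<your_bucket>/310531500150800.PDF";

        // Concatenating the parts reproduces the single-line string above
        string canonicalString = verb + "\n" + contentMd5 + "\n" + contentType + "\n"
                               + date + "\n" + amzHeaders + resource;

        Console.WriteLine(canonicalString);
    }
}
```

    Note that the two empty parts still contribute their "\n" separators, which is where the back-to-back "\n\n" sequences in the one-liner come from.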

    The best way to describe what is occurring is to borrow the following from the Amazon documentation.

    The Signature element is the RFC 2104 HMAC-SHA1 of selected elements from the request, and so the Signature part of the Authorization header will vary from request to request. If the request signature calculated by the system matches the Signature included with the request, the requester will have demonstrated possession of the AWS secret access key. The request will then be processed under the identity, and with the authority, of the developer to whom the key was issued.

    Essentially we are going to build up a string that reflects the various aspects of our REST call (headers, date, resource) and then create a hash using our Amazon secret.  Since Amazon also knows our secret, they can compute the same hash and check whether it matches our actual REST call.  If it does, we are golden.  If not, we can expect an error like the following:

    A message sent to adapter "WCF-WebHttp" on send port "SendToS3" with URI http://<bucketname>.s3-us-west-2.amazonaws.com/ is suspended.
    Error details: System.Net.WebException: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><StringToSignBytes>50 55 54 0a 0a 61 70 70 6c 69 63 61 74 69 6f 6e 2f 78 2d 70 64 66 0a 0a 78 2d 61 6d 7a 2d 61 63 6c 3a 62 75 63 6b 65 74 2d 6f 77 6e 65 72 2d 66 75 6c 6c 2d 63 6f 6e 74 72 20 44 65 63 20 32 30 31 33 20 30 34 3a 35 37 3a 34 35 20 47 4d 54 0a 78 2d 61 6d 7a 2d 73 74 6f 72 61 67 65 2d 63 6c 61 73 73 3a 53 54 41 4e 44 41 52 44 0a 2f 74 72 61 6e 73 61 6c 74 61 70 6f 63 2f 33 31 30 35 33 31 35 30 30 31 35 30 38 30 30 2e 50 44 46</StringToSignBytes><RequestId>6A67D9A7EB007713</RequestId><HostId>BHkl1SCtSdgDUo/aCzmBpPmhSnrpghjA/L78WvpHbBX2f3xDW</HostId><SignatureProvided>SpCC3NpUkL0Z0hE9EI=</SignatureProvided><StringToSign>PUT


    x-amz-date:Thu, 05 Dec 2013 04:57:45 GMT

    Tip: Pay attention to these error messages as they really give you a hint as to what you need to include in your “canonicalString”.  I discounted these error messages early on and didn’t take the time to really understand what Amazon was looking for.

    For completeness I will include the other three helper methods that are being used in the Expression Shape.  In my actual solution I have included these values in a configuration store, but for the simplicity of this blog post I have hard-coded them.

    public static string SetAmzACL()
    {
        return "bucket-owner-full-control";
    }

    public static string SetStorageClass()
    {
        return "STANDARD";
    }

    public static string SetHeaderDate()
    {
        // Use GMT time and ensure that it is within 15 minutes of the time on Amazon’s servers
        return DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss ") + "GMT";
    }

  3. The next part of the Message Assignment shape is setting the standard context properties for the WebHttp Adapter.  Remember, since we are using a Dynamic Send Port we will not be able to manipulate these values through the BizTalk Admin Console.

    msgS3Request(WCF.HttpMethodAndUrl) = "PUT";  //Writing to Amazon S3 requires a PUT
    msgS3Request(WCF.OpenTimeout)= "00:10:00";
    msgS3Request(WCF.CloseTimeout)= "00:10:00";
    msgS3Request(WCF.SendTimeout)= "00:10:00";
    msgS3Request(WCF.MaxReceivedMessageSize)= 2147483647;

    Lastly we need to set the URI that we want to send our message to and also specify that we want to use the WCF-WebHttp adapter.


    Note: the last part of my URI, 310531500150800.PDF, represents my Resource.  In this case I have hardcoded a file name.  This is obviously something that you will want to make dynamic, perhaps using the FILE.ReceivedFileName context property.
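    As a sketch, this last step in the Expression Shape might look like the following; the port name (SendToS3Port), bucket name and region are illustrative placeholders:

    SendToS3Port(Microsoft.XLANGs.BaseTypes.Address) = "http://<bucketname>.s3-us-west-2.amazonaws.com/310531500150800.PDF";
    SendToS3Port(Microsoft.XLANGs.BaseTypes.TransportType) = "WCF-WebHttp";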

  4. Once we have assembled our S3 message we will go ahead and send it through our Dynamic Solicit-Response Port.  The message that we are going to send to Amazon and receive back is once again of type System.Xml.XmlDocument.
  5. One thing to note is that the response you receive back from Amazon won’t actually have a message body (this is in line with REST).  However, even though we receive an empty message body, we will still find some valuable Context Properties.  The two properties of interest are:





  6. The last step in the process is to just write our Amazon response to disk.  As we learned in the previous point, our message body will be empty, but it does give me an indicator that the process is working (in a Proof of Concept environment).

Overall the Orchestration is very simple.  The complexity really exists in the Message Assignment shape. 



Not that watching files move is super exciting, but I have created a quick Vine video that will demonstrate the message being consumed by the FILE Adapter and then sent off to Amazon S3.



This was a pretty fun and frustrating solution to put together.  The area that caused me the most grief was easily the Authorization Header.  There is some documentation out there related to Amazon “PUT”s but each call is different depending upon what type of data you are sending and the related headers.  For each header that you add, you really need to include the related value in your “canonicalString”.  You also need to include the complete path to your resource (/bucketname/resource) in this string even though the convention is a little different in the URI.

Also it is worth mentioning that /n Software has created a third party S3 Adapter that abstracts some of the complexity  in this solution.  While I have not used this particular /n Software Adapter, I have used others and have been happy with the experience. Michael Stephenson has blogged about his experiences with this adapter here.

Sunday, December 1, 2013

BizTalk Summit 2013 Wrap-up

On November 21st and 22nd I had the opportunity to spend a couple days at the 2nd annual BizTalk Summit held by Microsoft in Seattle.  At this summit there were approximately 300 Product Group members, MVPs, Partners and Customers.  It was great to see a lot of familiar faces from the BizTalk community and talk shop with people who live and breathe integration.

Windows Azure BizTalk Services reaches GA

The Summit started off with a bang when Scott Gu announced that Windows Azure BizTalk Services has reached General Availability (GA)!!!   What this means is that you can receive production level support from Microsoft with 99.9% uptime SLA. 



During the preview period, Microsoft was offering a 50% discount on Windows Azure BizTalk Services (WABS).  This preview pricing ends at the end of the year.  So if you have any Proof of Concept (POC) apps running in the cloud that you aren’t actively using, please be aware of any potential billing implications.

Release Cadence

The next exciting piece of news coming from Microsoft is the release cadence update for the BizTalk Server product line.  As you have likely realized, there is usually a BizTalk release shortly after the General Availability of platform updates.  So when a new version of Windows Server, SQL Server or Visual Studio is launched, a BizTalk Server release usually closely follows.  Something that is changing within the software industry is the accelerated release cadence of Microsoft and their competitors.  A recent example of this accelerated release cadence is Windows 8.1, Windows Server 2012 R2 and Visual Studio 2013.  These releases occurred much sooner than they have in the past.  As a result of these new accelerated timelines the BizTalk Product Group has stepped up, committing to a BizTalk release every year!  These releases will alternate between R2 releases and major releases.  For 2014, we can expect a BizTalk 2013 R2 and in 2015 we can expect a full release.

BizTalk Server 2013 R2

So what can we expect in the upcoming release?

  • Platform alignment (Windows, SQL Server, Visual Studio) and industry specification updates (SWIFT).
  • Adapter enhancements including support for JSON (Yay!), proxy support for SFTP and authorization enhancements for Windows Azure Service Bus.  One request I do have for the product team: please include support for Windows Server Service Bus as well.
  • Healthcare Accelerator improvements.  What is interesting about this vertical is that it is the fastest-growing vertical for BizTalk Server, which justifies the additional investment.



Hybrid Cloud Burst

There were a lot of good sessions but one that I found extremely interesting was the session put on by Manufacturing, Supply Chain, and Information Services (MSCIS).  This group builds solutions for the Manufacturing and Supply Chain business units within Microsoft. You may have heard of a “little” franchise in Microsoft called XBOX.  The XBOX franchise heavily relies upon Manufacturing and Supply chain processes and therefore MSCIS needs to provide solutions that address the business needs of these units.  As you are probably aware, Microsoft has recently launched XBOX One which is sold out pretty much everywhere.  As you can imagine building solutions to address the demands of a product such as XBOX would be pretty challenging.  Probably the biggest hurdle would be building a solution that supports the scale needed to satisfy the messaging requirements that many large Retailers, Manufacturers and online customers introduce.

In a traditional IT setting you throw more servers at the problem.  The issue with this is that it is horribly inefficient.  You essentially are building for the worst case (or most profitable) but when things slow down you have spent a lot of money and you have poor utilization of your resources.  This leads to a high total cost of ownership (TCO). 

Another challenge in this solution is that an ERP is involved in the overall solution.  In this case it is SAP (but this would apply to any ERP), and you cannot expect an ERP to provide the performance to support ‘cloud scale’.  At least not in a cost-competitive way. If you have built a system in an asynchronous manner, you can throttle your messaging and therefore not overwhelm your ERP system.

MSCIS has addressed both of these major concerns by building out a Hybrid solution. By leveraging Windows Azure BizTalk Services and Windows Azure Service Bus Queues/Topics in the cloud they can address the elasticity requirements that a high demand product like XBOX One creates. As demand increases, additional BizTalk Services Units can be deployed so that Manufacturers, Retailers and Customers are receiving proper messaging acknowledgements.  Then On-Premise you can keep your traditional capacity for tools and applications like BizTalk Server 2013 and SAP without introducing significant infrastructure that will not be fully utilized all the time.

Our good friend Mandi Ohlinger, who is a technical writer with the BizTalk team, worked with MSCIS to document the solution.  You can read more about the solution on the BizTalk Dev Center.  I have included a pic of the high-level architecture below.


While Microsoft is a large software company (ok, a Devices and Services company), what we often lose sight of is that Microsoft is a very large company (>100,000 employees) and they have enterprise problems just like any other company does.  It was great to see how Microsoft uses their own software to address real-world needs.  Sharing these types of experiences is something that I would really like to see more of.


(These are my own thoughts and do not necessarily reflect Microsoft’s exact roadmap)

If you have evaluated Windows Azure BizTalk Services you have likely realized that there is not currently symmetry between BizTalk Services and BizTalk Server.  BizTalk Server has had around 14 years (or more) of investment, whereas BizTalk Services, in comparison, is relatively new.  Within Services we are still without core EAI capabilities like Business Process Management (BPM)/Orchestration/Workflow, Business Activity Monitoring (BAM), the Business Rules Engine (BRE), a comprehensive set of adapters and a complete management solution.

With BizTalk Server we have a mature, stable, robust integration platform.  The current problem with this is that it was built well before people started thinking about cloud scale.  Technologies such as MSDTC and even the MessageBox have contributed to BizTalk being what it is today (a good platform), but they do not necessarily lend themselves to new cloud-based platforms.  If you look under the hood in BizTalk Services you will find neither of these technologies in place.  I don’t necessarily see this as a bad thing.

A goal of most, if not all, products that Microsoft is putting in the cloud is symmetry between on-premise and cloud-based offerings.  This puts the BizTalk team in a tough position.  Do they try to take a traditional architecture like BizTalk Server and push it into the cloud, or build an architecture on technologies that better lend themselves to the cloud and then push it back on-premise? The approach, going forward, is innovating in the cloud and then bringing those investments back on-premise in the future.

Every business has a budget and priorities have to be set.  I think Microsoft is doing the right thing by investing in the future instead of making a lot of investments in the On-Premise offering that we know will be replaced by the next evolution of BizTalk.  There were many discussions between the MVPs during this week in Seattle on this subject with mixed support across both approaches. With the explosion of Cloud and SaaS applications we need an integration platform that promotes greater agility, reduces complexity and addresses scale in a very efficient manner instead of fixing some of the deficiencies that exist in the current Server platform. I do think the strategy is sound, however it will not be trivial to execute and will likely take a few years as well.

Adapter Eco-system

Based upon some of the sessions at the BizTalk Summit, it looks like Microsoft will be looking to create a larger ISV eco-system around BizTalk Services.  More specifically in the Adapter space.  The reality is that the current adapter footprint in BizTalk Services is lacking compared to some other competing offerings.  One way to address this gap is to leverage trusted 3rd parties to build and make their adapters available through some sort of marketplace. I think this is a great idea provided there is some sort of rigor that is applied to the process of submitting adapters.  I would not be entirely comfortable running mission critical processes that relied upon an adapter that was built by a person who built it as a hobby.  However, I would not have an issue purchasing an adapter in this fashion from established BizTalk ISV partners like BizTalk360 or /nSoftware.


All in all it was a good summit.  It was encouraging to see the BizTalk team take BizTalk Services across the goal line and make it GA.  It was also great to see that they have identified the need for an accelerated release cadence and shared some news about the upcoming R2 release.  Lastly it was great to connect with so many familiar faces within the BizTalk community.  The BizTalk community is not a huge community but it is definitely international so it was great to chat with people who you are used to interacting with over Twitter, Blogs or LinkedIn.

In the event you still have doubts about the future of BizTalk, rest assured the platform is alive and well!

Saturday, October 26, 2013

BizTalk360 Product Specialist award


This post is long overdue but I felt it was necessary to write.  Back in April 2013, Saravana Kumar and the BizTalk360 team introduced the BizTalk360 Product Specialist award.  The primary objective of the program is to honour individuals who have gained adequate knowledge in installing, configuring and implementing the BizTalk360 solution at customer sites.

I have blogged (here and here) and even wrote a whitepaper about some of my experiences with BizTalk360 in the past, and I am a strong supporter of the product.  I have seen the benefits first hand while leading teams who are responsible for the operational support of busy BizTalk environments.  I have also witnessed the adoption by non-BizTalk experts and seen their productivity increase without being intimidated by larger, complex monitoring solutions.

Recently I introduced BizTalk to a new organization, and BizTalk360 was a tool that would provide immediate benefit.  Sure enough it did: we had a source system experience issues that led to some suspended messages.  The BizTalk team knew about the issues going on in that system before the system owners did.  The end result was that the issues in the source system could be identified and resolved quickly, limiting the disruption to the business.

While I was in Norway, Saravana had a bit of a surprise for me: some hardware to keep my MVP awards company. I just want to take this opportunity to thank Saravana and the rest of the BizTalk360 team for their recognition, and I am looking forward to working with Version 7.0 of the product. I got a sneak peek of the application while in Norway and it looks great.



Sunday, October 6, 2013

European Trip Recap


I recently returned from Europe where I had a chance to participate in two extraordinary events: Bouvet BizTalk Innovation Days in Stavanger, Norway and the 40th running of the Berlin Marathon.

Bouvet BizTalk Innovation Day–Norway Recap

This was a two day event hosted by Bouvet. For those of you who are not familiar with Bouvet, Bouvet provides services in the fields of information technology, digital communication and enterprise management. Bouvet has about 900 employees divided between 14 offices in Norway and Sweden. - See more at: http://www.bouvet.no/en/About-Bouvet/

On day one each of the speakers had the opportunity to present their topic to a crowd of around 70 BizTalk professionals from all over Scandinavia.  The topics ranged from newer technologies like Windows Azure BizTalk Services, Windows Azure Mobile Services and Windows Azure Service Bus to more universal topics like being proactive when monitoring the health of BizTalk solutions, BizTalk mapping patterns, identifying and rescuing a BizTalk hostage project and seeing a preview of the next version of BizTalk360.  There was also a special keynote by Paolo Salvatori, who works for Microsoft Italy and is recognized worldwide for his abilities. All presentations were very well received, as indicated by the attendee surveys.

My presentation focused on Enterprise Mobility.  This is a topic that I have been dealing with at my day job so I had an opportunity to demonstrate some of the areas of enterprise mobility that I have been thinking about lately.  It was also an opportunity to demonstrate a ‘reference’ application that I have been collaborating on with Mikael Hakansson.

Some of the core principles that I have taken into consideration when dealing with Enterprise Mobility include:

  • Active Directory Federation: When a person leaves the company and their AD account has been disabled, this “tap” should be turned off for other mobile/cloud based services.
  • Leverage a Mobility platform to reduce the diversity required in supporting multiple platforms.  Windows Azure Mobile Services helps us address this by providing APIs for the popular platforms that allow us to centralize activities like Data access, Authentication, Identity Providers, Custom APIs and Scheduled tasks.
  • Most, if not all, Enterprise Mobile apps need to consume Line of Business (LOB) System data.  Windows Azure BizTalk Services (and the BizTalk Adapter Service) allow us a secure way in and out of our enterprise without poking holes in firewalls.  I should note that these capabilities are also available with BizTalk Server 2013.
  • Accessing On-Premise LOB systems isn’t possible (in my scenarios) without the underpinnings of the Windows Azure Service Bus.  Using this technology to span network layers never gets old. The BizTalk Adapter Service has a strong dependency on these services.
  • Data Storage:  Even though I am leveraging SAP master data in this scenario, I do need to maintain the state of the business process.  In this case I am using SQL Azure to host our data.  We can leverage Windows Azure Mobile Services’ APIs that make getting data in and out of the database a breeze.
  • Finally, we can’t forget about Toast Notifications.  We want the ability to send notifications out to users (in this case approvers) and Windows Azure Mobile Services helps us deal with sending Toast Notifications to a variety of platforms. 

Here is one of the scenarios from my demo that illustrates many of the principles that were previously mentioned.


A few screenshots of the application running in the Windows Phone Emulator:


This was one of the more challenging demos that I have ever been involved in.  I had a lot of fun working on this reference app with Mikael and learned a lot in the process.  My complete slide deck can be found here.


Many people work on bringing events alive, but two people who I would like to recognize are Tord Glad Nordahl and Anders Stensland.  They, in addition to the support Bouvet provided, pulled off a fantastic event.  I have had the opportunity to present in Sweden in 2010 and 2011 and I continue to be amazed by the amount of BizTalk interest in Scandinavia.  If you do have the opportunity to attend the Bouvet BizTalk Innovation conference in the future, I highly recommend it.  They did an amazing job.

40th Berlin Marathon

One of my hobbies is running.  I am a pretty average runner but I enjoy the challenges of running and also try to reap the health benefits of staying active.  I have run more than 12 half marathons over the past 6 years and finished my first marathon last year in Chicago.  Whenever I have gone to Europe to speak in the past I have always tried to make a side trip within Europe to experience another culture.  In speaking with one of the other presenters (Steef-Jan Wiggers) we decided that we would head to Berlin after the conference in Norway.  He recommended going to Berlin to experience its rich history.  Having never been to Germany, my wife and I made plans to join him in Berlin.

I knew that the Berlin Marathon was held in late September.  The Berlin Marathon is one of the 6 major Marathons in the world.  The others include New York, Boston, Chicago, London and Tokyo.  So when I found out that I would be in Berlin on the same day of this historic event, I couldn’t resist the temptation of participating in this event.

The registration deadline had passed but I was able to find a travel agent from Boston who would sell us packages.  With this information, I presented the opportunity to Steef-Jan and he obliged.  He has recently gotten back into running and this would provide a great opportunity to run his first marathon.

The event itself was rather amazing.  Over 42 000 runners participated in the event with an estimated 1 million spectators.  It was an awesome experience and one that I will never forget.  I finished the marathon in 4 hours 34 minutes and 56 seconds which was 4 minutes faster than my Chicago time.


A few pictures:

The prize



Before the race.  The garbage bags helped keep us warm while we waited for our turn.


Steef-Jan before the race



After the run



Celebrating – German style



After the race the Adidas store would engrave your time into a running band that was provided as part of your registration.




MVP Profile

One of the best parts of the MVP program is the people you meet and the friendships that you develop.  Without being in the MVP program, this trip would have never happened.  Being part of the program is truly an honor.

Thanks Tord for your hospitality in Norway.  It was a great opportunity to experience my Norwegian heritage and I thoroughly enjoyed your beautiful country. 

Thanks Steef for being an amazing tour guide while in Germany.  Your German came in handy many times and I learned a lot about German history while I was there.  Running the marathon with you was also a great experience.  Next time we won’t do as much sightseeing the day before the race.

I also would like to thank the other MVPs (Sandro, Nino, Saravana) and Paolo for a great experience as well.  Talking shop whenever we get together is a lot of fun and always interesting. 

Monday, July 8, 2013

BizTalk360 - Monitoring Service High Availability Feature


I recently went through my second implementation of BizTalk360 and ran into a feature that I wasn’t previously aware of. Typically I have installed BizTalk360 on a BizTalk Server itself, which poses a bit of a risk if you only install it on one BizTalk Server and that BizTalk Server happens to be offline.

My current environment consists of a multi-node cluster (an actual cluster with Failover Cluster Services).  I recently asked Saravana Kumar if this was the way to go when looking for a redundant monitoring solution.  He indicated that my idea would work and is completely supported; however, I may want to look into a new feature called Monitoring Service High Availability.  When using this feature, BizTalk360 maintains its own state by storing it in the database.  In my case, one node will be active and the second node will be passive, much like a service being managed by Windows Failover Clustering.

To access this feature click on the Settings link in the top right hand corner of the screen.


Next, click on the Monitoring Service High Availability link.


Even though the BizTalk360 Service is actively running on both Servers (in my case), BizTalk360 is designating one of the servers as being the primary.


We have the ability to change the primary server by selecting it and then clicking on the Bring Server Active button.


Instantly, our primary switches to become the secondary and vice versa.  This was very quick; much quicker than I have experienced failing over a service using Windows Failover Clustering.


The next test is to take our primary service (or server) offline.  To do this I will just stop the BizTalk360 service, simulating what would occur if our service stopped or we lost our entire primary server.  To make this test even more realistic, I am going to enable a test alert, make sure I receive the first alert and then stop the BizTalk360 Service.  My expectation is that my second node will become primary and I should receive another test alert.  This time the alert will be generated from the newly activated node.


Below I have configured an existing alarm to function in TEST MODE.


I have received my alert as expected.


I will now stop the BizTalk360 Service on Node 1.


If I navigate back to the Monitoring Services High Availability screen I find that my “Node 2” is now the active server and my “Node 1” is no longer participating as it is offline.


If I check my inbox, I find that I continue to receive these “TEST Alerts” from BizTalk360.  This time the alerts are coming from my 2nd Node.


If we now go back to our 1st Node and start the BizTalk360 Service, we will discover that BizTalk360 has recognized that the service is back online, but it remains in a passive state.



I have been around Windows Failover Clustering for quite some time and am comfortable working within that environment.  The BizTalk environments that I have worked with in the past also tend to leverage Windows Failover Clustering in order to support clustered Host Instances for adapters such as S/FTP, POP3 and Database Polling.  Using Windows Failover Clustering is an option for ensuring BizTalk360 is online and redundant, but it is not a pre-requisite.  As I have demonstrated in this post, BizTalk360 provides this functionality out of the box.  This is great news, especially for those who have multi-node BizTalk environments but do not have (or need) Windows Failover Clustering.  It gives you peace of mind that, in the event one of your BizTalk Servers goes offline, having BizTalk360 installed on another node means your monitoring coverage will not be interrupted.  Kudos to the BizTalk360 team for building such an important feature and making it very easy to use!

Sunday, June 16, 2013

Dynamics AX 2012 R2 File based integration with BizTalk Server 2013


I am currently involved in a project where I need to integrate Dynamics AX with a 3rd party payroll system.  Dynamics AX 2012 provides a rich integration framework called the Application Integration Framework (AIF).  One of the core strengths of this framework is the ability to generate a typed schema through a configuration wizard.  Combine that with AX’s ability to create inbound and outbound ports and you now have the ability to generate export (and import, if needed) files rather quickly.

When I mentioned a “typed schema” in the previous paragraph, I meant that AX will generate XSD schemas that we can include in our BizTalk projects.  This is a breath of fresh air compared to some other ERP systems, where you get handed a CSV flat file that you then have to build a flat file schema for.

In my scenario I was receiving a list of Work Orders, so a colleague working on the AX side was able to provide me with the Work Order schema and an imported schema that includes AX types.  At this point I added the schemas to my solution, built my map and wired everything up.  I went to run an initial test and was presented with the following error:

Details:The published message could not be routed because no subscribers were found. This error occurs if the subscribing orchestration or send port has not been enlisted, or if some of the message properties necessary for subscription evaluation have not been promoted.

This is a pretty standard error message that basically comes down to BizTalk receiving a message that it was not expecting.  The reason BizTalk was not expecting it is that AX wraps outbound messages in a SOAP Envelope, as you can see in the image below.



SOAP Envelopes are certainly nothing new, but I didn’t expect AX to use them when writing a file to the file system.  When receiving messages through Web/WCF services, BizTalk automatically takes care of extracting the message body from the incoming SOAP message for us.  With the FILE Adapter, that facility just does not exist.

You will notice in screenshot below that there is a namespace that is specific to AX.  This got me thinking that AX probably has an XSD for this message type as well.


After digging around a bit, I found the AX schemas in the Program files\Microsoft Dynamics AX\60\Server\MicrosoftDynamicsAX\bin\Application\Share\Include folder.  The schema that I was looking for is called Message.xsd.


Just adding this schema to the BizTalk project was not enough.  I needed to make a few small tweaks to the schema:

  • Click the “Schema” icon of the schema and then set the Envelope property to True.  This instructs BizTalk that it is an envelope schema and that when BizTalk sees this message it needs to strip out the envelope, which in this case is a SOAP Envelope.


  • Set the Body XPath property by selecting the root node of the schema and then populating the appropriate value, which in this case is:

/*[local-name()='Envelope' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']/*[local-name()='Body' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']/*[local-name()='MessageParts' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']
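To see what that Body XPath accomplishes, here is a small sketch in Python that pulls the payload out of a sample AX envelope using the equivalent namespace-qualified path.  The sample envelope content and the WorkOrders payload element are made up for illustration; in BizTalk this extraction is performed for you by the XML Disassembler.

```python
# Illustrative sketch: extract the message body from an AX-style SOAP
# envelope, equivalent to the Envelope/Body/MessageParts XPath above.
# The payload element and its namespace are hypothetical.
import xml.etree.ElementTree as ET

AX_NS = "http://schemas.microsoft.com/dynamics/2011/01/documents/Message"

SAMPLE = """<Envelope xmlns="{ns}">
  <Header><MessageId>1</MessageId></Header>
  <Body>
    <MessageParts>
      <WorkOrders xmlns="http://example.com/ax/workorders"><WorkOrder/></WorkOrders>
    </MessageParts>
  </Body>
</Envelope>""".format(ns=AX_NS)


def extract_body(envelope_xml):
    """Return the payload element nested under Envelope/Body/MessageParts."""
    root = ET.fromstring(envelope_xml)
    parts = root.find("m:Body/m:MessageParts", {"m": AX_NS})
    return list(parts)[0]  # the first (and only) message part


payload = extract_body(SAMPLE)
print(payload.tag)  # → {http://example.com/ax/workorders}WorkOrders
```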


We can now deploy our application.  For the Receive Location that will be picking up this message, we want to ensure that we are using the XMLReceive pipeline.  Within this pipeline, the XML Disassembler stage will take care of the SOAP envelope so that when the message body is presented to the MessageBox, any subscribers will receive the expected message body.


When I first discovered that I was receiving a SOAP-wrapped message, my instinct was that maybe AX could just use a WCF port instead of a FILE port.  This just wasn’t the case; there are only two options when it comes to configuring an outbound port: FILE and MSMQ.  Using MSMQ would not have helped me here, as the same issue would have existed.

AX certainly does provide the ability to call a WCF service, but it is a more custom-based approach.  I would have had to expose this schema as a WCF service, and then my AX colleagues would have had to write code against the proxy to populate all of the different data elements.  This would have defeated the purpose of using the AIF framework to expedite the delivery of a solution under very tight timelines.  Luckily, with a little tinkering we were able to come up with a reasonable solution without writing custom code.

I have to think that AX is wrapping these messages in a SOAP Envelope for a reason.  Perhaps a WCF outbound port is coming in an upcoming release?

Sunday, June 2, 2013

Windows Azure BizTalk Services Preview (Part 2) –BizTalk Adapter Services SAP Integration


Over the past 7 years I have spent a lot of time integrating BizTalk and SAP systems.  I have probably built more interfaces for SAP than for any other system.  When new technology such as Windows Azure BizTalk Services surfaces, I am always interested to understand what the experience looks like.

If you read my prior post in this series, you are probably wondering how it is possible to use Windows Azure BizTalk Services in the Windows Azure Cloud and connect to your On-Premise LOB Systems like SAP?  Enter BizTalk Adapter Services.

BizTalk Adapter Services allow us to use the BizTalk Adapter Pack in order to connect to LOB systems.  The BizTalk Adapter Services are hosted in IIS and leverage the Service Bus Relay (under the hood) in order to traverse firewalls and NATs.  Using the BizTalk Adapter Service is just another tool in our toolbox for achieving hybrid integration.


Adapter Services Installation

I am now going to quickly install the BizTalk Adapter Services and then provide a walkthrough of how we can integrate with SAP.  The scenario that we are going to walk through is part of a larger process that I won’t get into in much detail within this blog: we have an SAP process where we need to feed sub-systems within the organization with Equipment data from SAP.  We will get this equipment data by calling a custom SAP RFC (BAPI).

The BizTalk Adapter Service is a component that we want to install on a Server within our Network.  This machine does not require BizTalk Server to be present.





This is an important step as the account that we specify here will require access to SQL Server where configuration will be stored about our Relay Endpoints.


In this case I am using a local SQL Server instance.  You can use SQL Express if you so desire but I have leveraged SQL 2008 R2 (SQL Server 2012 is also supported).


This step kind of reminds me of the BizTalk SSO Service password, but they are not related or linked.


The BizTalk Adapter Service installation will be installing a Management Web Service.  Since I am just using this for demo purposes I did not enable SSL. 


Once we have finished installing the BizTalk Adapter Service we will discover a new database called BAService.


We will also discover a new entry in Server Explorer within Visual Studio called BizTalk Adapter Services


If we right mouse click on BizTalk Adapter Services we have the ability to add a BizTalk Adapter Service by specifying the URL of the Management Service that was created as part of the installation process.


Once this Adapter Service has been added we can expand it and discover our LOB Targets.


We can add a new SAP target by right mouse clicking on SAP and selecting Add SAP Target.


We now need to specify our SAP connection information, much like we would if we were adding Generated Items within a BizTalk Server 2013 solution.  We need to specify the Application Server, the System Number, the Client and the Language.  I recently blogged about connecting to SAP Messaging Servers; not to worry, these more advanced scenarios are supported by clicking on the Advanced button.

Next we need to provide our SAP credentials if we are not using Windows Credentials.


Much like the BizTalk Server experience we can search operations including IDOCs, BAPIs and RFCs.  Do note that at this point we are only able to push messages to SAP.  This means we can send an IDOC and call a BAPI/RFC and get the response back.  SAP cannot push IDOCs or call an RFC that is hosted by the BizTalk Adapter Service. 

Once we have found the artifact that we are looking for we can add it to the Selected operations list and then click the Next button.


Once again, we need to provide credentials that the BizTalk Adapter Service will use to connect to SAP.


This gets a little interesting.  Much like in my previous blog post where we connected to a Service Bus Topic, we will use the production Service Bus connectivity information.  So in this case I am going to leverage an existing Service Bus Namespace and credentials to enable Service Bus Relay capabilities.

We also have the ability to specify a Target sub-path that just allows us to organize our endpoints.


We now have a confirmation page that we can just click the Create button in order to finish the wizard.


With our SAP LOB Target created we can now drag that Target onto our canvas:


At this point we have created an SAP target and added it to our canvas, but we have not actually generated the schemas that will be used within our solution.  In order to generate these schemas we need to right mouse click on our SAP LOB Target and select Add Schemas to Project.


We will once again be prompted for SAP credentials.


Within our solution, we will now find two schemas have been added.  One is the core schema and the other is a schema for data types leveraged by the core schema.


A Web Request Schema, that we will expose externally to calling applications, is also needed.  This schema is very simple with only two fields: Plant and EquipmentName.
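A request instance conforming to this schema would look something like the following.  The root element name and namespace here are hypothetical, but the two fields come straight from the schema described above.

```xml
<!-- Hypothetical instance; everything other than the Plant and
     EquipmentName fields is an assumption for illustration -->
<EquipmentRequest xmlns="http://example.com/equipment/request">
  <Plant>1000</Plant>
  <EquipmentName>PUMP*</EquipmentName>
</EquipmentRequest>
```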


Next we need to add a Map that will allow us to transform our Web Request message into our SAP Request message.


The map is pretty straightforward, as we only need to map a couple of fields.


A Map that will allow us to transform our SAP Response into our Web Response is also required.


Next, we need to add an Xml Request-Reply Bridge by dragging it onto our canvas.


We now want to modify the Property page for this artifact, as it will become part of our overall URI.


By double clicking on our Bridge, we have the ability to configure it.  Within this configuration we will want to specify the Message Type for the message that we will be receiving from the calling application and the Message Type for the type of message we will be sending back to the calling application.

We also want to specify our two Transformations.  The first one will take our Web Request and map it to our SAP Request.  The second Transformation will take our SAP Response and map it to our Web Response.


Once our Bridge has been configured, we need to connect our Request Response Bridge to our SAP LOB Target.  We can do so by dragging a Connector from our Toolbox and connecting our Bridge to our SAP LOB Target.



Routing messages is a core functionality within BizTalk Services. In order to route messages, we need to provide a filter condition that will allow us to specify messages to flow between the Bridge and the SAP LOB Target. In order to set this filter we need to click on the line that connects these two artifacts and then click on the Filter Condition.


Since we want all messages to flow to this LOB system we will choose Match All and click the OK button.


We also need to specify a SOAP Action much like we need to do with BizTalk Server.  In order to do this we need to highlight the connection between the Bridge and LOB target and then click on Route Action.


In this case we want to specify a SOAP Action, using the action from our Request Schema.  Much like in the EAI/EDI Labs solutions, we want to wrap our expression in single quotes (' ').
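For example, a Route Action expression for a hypothetical custom RFC might look like the following.  The RFC name Z_GET_EQUIPMENT is made up; the actual action value comes from the generated request schema, and note the surrounding single quotes:

```
'http://Microsoft.LobServices.Sap/2007/03/Rfc/Z_GET_EQUIPMENT'
```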


We are now almost ready to deploy our solution but before we can do that we need to specify our Deployment Endpoint.  We can do so by clicking on any blank space on our canvas and then specifying our Deployment Endpoint address in the Property page.


Deploying our solution is as simple as right mouse clicking on our Visual Studio Solution file and selecting Deploy Solution.


In order to deploy our solution we will need to provide the Deployment Endpoint, ACS Namespace(for BizTalk Service environment), Issuer Name and Issuer Shared Secret.


Within a few seconds our solution will be deployed and ready for testing.  Once again I could use the MessageSender application that is part of the BizTalk Services SDK, but since looking at a black screen with white XML text isn’t very appealing, I created an ASP.NET web application and borrowed the MessageSender class from the SDK project.  In this case I have my web application running on a laptop that isn’t on the same network as the machine that is hosting the BizTalk Adapter Service.
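Under the hood, the MessageSender helper does two things: it acquires an ACS token over the WRAP protocol, then POSTs the request XML to the bridge’s runtime endpoint with that token in the Authorization header.  The sketch below shows the shape of those two requests in Python; the namespace, issuer, secret and bridge URL are all placeholders, and no network call is actually made here.

```python
# Hedged sketch of the ACS/WRAP handshake the SDK's MessageSender performs.
# All credential and endpoint values below are placeholders.
import urllib.parse

ACS_NAMESPACE = "mybiztalkservice"         # placeholder ACS namespace
ISSUER, SECRET = "owner", "base64secret"   # placeholder issuer credentials
BRIDGE_URL = ("https://mybiztalkservice.biztalk.windows.net"
              "/default/EquipmentBridge")  # placeholder runtime endpoint


def build_token_request():
    """WRAP v0.9 token request that would be POSTed to the ACS endpoint."""
    url = "https://%s.accesscontrol.windows.net/WRAPv0.9/" % ACS_NAMESPACE
    body = urllib.parse.urlencode({
        "wrap_name": ISSUER,
        "wrap_password": SECRET,
        "wrap_scope": BRIDGE_URL,
    })
    return url, body


def build_bridge_headers(wrap_token):
    """Headers for POSTing the request XML to the bridge runtime endpoint."""
    return {
        "Authorization": 'WRAP access_token="%s"'
                         % urllib.parse.unquote(wrap_token),
        "Content-Type": "application/xml",
    }


url, body = build_token_request()
print(url)
print(build_bridge_headers("abc")["Authorization"])
```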



Within the web application, we have the ability to provide a Plant and a wild card string representing a piece of equipment in order to get more details about where that piece of equipment has been deployed, the Manufacturer and Model Number.




Hopefully this walkthrough has provided you with some valuable insight into what it takes to integrate a Line of Business system like SAP with Windows Azure BizTalk Services.  I can certainly envision scenarios where an organization uses an LOB system like SAP but does not have a dedicated integration platform to perform this type of integration.  Windows Azure BizTalk Services may fill this void and enable business scenarios that just may not have been possible for that organization before.  Consider another situation where you have a SalesForce system that requires SAP data (maybe Equipment data); this type of functionality in BizTalk Services really becomes a business catalyst.

I also think that the BizTalk Adapter Service may enable some interesting mobile use cases as well, since we can now expose SAP data through the cloud in a secure manner.

While the scenario is a bit dated, I also wrote a similar article on how to submit IDOCs to SAP using the Windows Azure EAI/EDI Labs CTP which has now been superseded by Windows Azure BizTalk Services.