Sunday, April 13, 2014

Learning Mule ESB

 

I recently joined MuleSoft and have had quite a few people ask me how they can get started with the platform.  These people typically have some integration experience on other technology stacks and are curious about the buzz that MuleSoft is creating in the industry.  Instead of copying and pasting from email to email I figured that I would put together a blog post that identifies some beneficial resources for learning Mule ESB.  I will try to keep this post up to date as new material emerges.

Before getting started with any of the learning resources, you will need to download the Mule ESB platform.  MuleSoft provides a free Community Edition that allows you to build and run Mule applications.

In addition to a Community Edition (CE), a commercial product called Enterprise Edition (EE) also exists that provides some additional Enterprise features.

The Bits

Mule ESB Community Edition
– Free download of the community edition of the software, including Anypoint Studio, the IDE for developing integrations on the Mule platform.  These tools run on Windows, Mac and Linux.
 
Tutorials

First 30 Minutes with Mule
– An introduction to the platform and simple walkthrough of your first Mule Application.

First Day with Mule
– Some more concepts are introduced including Message States, Global Elements, Content Based Routing and Connector Tutorials.

First Week with Mule
– Some more advanced concepts are introduced including Application Architecture, Security and Extending Mule.
 
Video Clips/Webcasts

Mule 101: Rapidly Connect Anything, Anywhere – Discover MuleSoft’s DataMapper, 120+ out-of-the-box connectors, development tools and deployment.

Mule 201: Develop and Manage a Hybrid Integration Application – Learn about Legacy Modernization, Service Orchestration and Connectors.  Also learn to deploy your Mule applications and manage/monitor them through the Mule Management Console.

MuleSoft’s YouTube Channel – Find a lot of short demonstrations and promotional material.  Demonstrations include SAP, Salesforce, Marketo, LinkedIn, Amazon S3, Hadoop, NetSuite, Twitter and many more.
 
Books 

These are the books that I have read.  I can confidently say that I learned something from each of them.

Mule ESB Cookbook – I started with this one and found some easy-to-follow walkthroughs of common integration scenarios.

Getting Started with Mule Cloud Connect – This book focuses more on the Cloud and SaaS connectors.  It is a good read, but I would suggest covering the fundamentals first and then digging into these topics.

Mule in Action, Second Edition – This is the most comprehensive book of the three.  It gets into a greater level of detail than the cookbook and walks you through some rich examples.
 
Blog Posts

Here are a few walkthroughs that I put together as I began my Mule journey.

Exposing Simple REST Service – As the title suggests, a simple REST Service.

Exposing SQL Server Data as HTTP Endpoint – This post will demonstrate how to expose SQL Operations through HTTP and return responses in JSON format.

Exploring Mule ESB SFTP Adapter – Since I have used the SFTP adapter on other platforms I was curious to take a peek at MuleSoft’s solution.

Twitter Integration – A quick look at the MuleSoft Twitter connector that allows you to interact with the Twitter API in a very elegant way.  In this example I update my Twitter status via Mule ESB.
 
.Net Resources

On this blog you will, without a doubt, find a lot of Microsoft-related content.  MuleSoft is a company that is driven to connect any system to any device on any platform, and there are some activities in the pipeline to better support .Net and other Microsoft products/services like SharePoint, Dynamics, Azure, etc.  With this in mind, I figured I would include a few links that may be of interest to people who are interested in integrating Microsoft technologies.

Connect .NET to anything, anywhere – Whitepaper

.NET Connectivity – Article 

In addition to what you will find in those articles, here are some of the ways that Mule ESB integrates with Microsoft technologies.

Mule ESB Anypoint Connectors for Microsoft platforms

  • MSMQ
  • AMQP
  • Active Directory
  • SOAP/WS-* (WCF interoperability)
  • REST (ASP.NET WebAPI interoperability)
  • SharePoint
  • SQL Server
  • Microsoft Dynamics GP
  • Dynamics CRM
  • Dynamics Online
  • Excel/CSV
  • Yammer

 

 

Lastly, I wanted to mention an upcoming event in San Francisco where you will be able to learn more about the .Net investments and other areas of focus for MuleSoft.  Click on the image below for more details.

image

Saturday, February 15, 2014

European Tour 2014

As I look at the calendar and see some important dates quickly approaching, I thought I had better put together a quick blog post to highlight some of the events that I will be speaking at in early March.

I will be using the same content at all events but am happy to talk offline about anything that you have seen in this blog or my presentation from Norway this past September.

The title of my session this time around is Exposing Operational Data to Mobile Devices using Windows Azure, and here is the session’s abstract:

In this session Kent will take a real world business scenario from the Power Generation industry. The scenario involves real time data collection, power generation commitments made to market stakeholders and current energy prices. A Power Generation company needs to monitor all of these data points to ensure it is maintaining its commitments to the marketplace. When things do not go as planned, there are often significant penalties at stake. Having real time visibility into these business measures and being notified when the business becomes non-compliant becomes extremely important.
Learn how Windows Azure and many of its building blocks (Azure Service Bus, Azure Mobile Services) and BizTalk Server 2013 can address these requirements and provide Operations people with real time visibility into the state of their business processes.

London – March 3rd and March 4th

The first stop on the tour is London, where I will be speaking at BizTalk360’s BizTalk Summit 2014.  This is a two-day paid conference, which has allowed BizTalk360 to bring in experts from all over the world.  This includes speakers from Canada (me), my neighbor the United States, Italy, Norway, Portugal, Belgium, the Netherlands and India.  These experts include many Integration MVPs and the product group from Microsoft.

There are still a few tickets available for this event so I would encourage you to act quickly to avoid being disappointed.  This will easily be the biggest Microsoft Integration event in Europe this year with a lot of new content.

londonbanner

Stockholm – March 5th

After the London event, Steef-Jan Wiggers and I will be jumping on a plane and heading to Stockholm to visit our good friend Johan Hedberg and the Swedish BizTalk User Group.  This will be my third time speaking in Stockholm and fourth time speaking in Scandinavia.  I really enjoy speaking in Stockholm and am very much looking forward to returning to Sweden.  I just really hope that they don’t win the Gold Medal in Men’s Hockey at the Olympics, otherwise I won’t hear the end of it.

I am also not aware of any Triathlons going on in Sweden at this time so I should be safe from participating in any adventure sports.

At this point an Eventbrite page is not available, but watch the BizTalk User Group Sweden site or my Twitter handle (@wearsy) for more details.

icy-harbour-stockholm

Netherlands – March 6th

The third and final stop on the tour is the Netherlands, where I will be speaking at the Dutch BizTalk User Group.  Steef-Jan Wiggers will also be speaking, as will René Brauwers.  This will be my second trip to the Netherlands but my first time speaking here.  I am very much looking forward to coming back to the region to talk about integration with the community and sample Dutch pancakes, stroopwafels and perhaps a Heineken (or two).

The Eventbrite page is available here and there is no cost for this event.

amsterdam

See you in Europe!

Wednesday, January 1, 2014

2013–Year in Review and looking ahead to 2014

With 2014 now upon us I wanted to take some time to reflect on the past year.  It was an incredible and chaotic year but it was also a lot of fun!  Here are some of the things that I was involved in this past year.

MVP Summits

This year there were two MVP summits: one in February and another at the end of November.  MVP Summits are great opportunities on a few different levels.  First off, you get to hear what is in the pipeline from the product groups, but you also get to network with your industry peers.  I find these conversations incredibly valuable, and the friendships that develop are pretty special.  Over time I have built a worldwide network of so many quality individuals that it is actually mind-blowing.

(Pictures from February MVP Summit)

MVPSummit1b

 

At the attendee party at CenturyLink Field

MVPSummit1

Dinner with Product Group and other MVPs

PGand MVPs

(Pictures from November Summit)

At Lowell’s in Pike Place Market in Seattle for our annual Integration breakfast prior to the Seahawks game.

Breakfast

A portion of the Berlin Wall with Steef-Jan at Microsoft Campus

Kent_Steef_BerlinWall

Dinner at our favourite Indian restaurant in Bellevue called Moksha.

Dinner

At Steef-Jan’s favourite donut shop in Seattle prior to the BizTalk Summit.

Donuts

Speaking

This year I had a lot of good opportunities to speak and share some of the things that I have learned.  My first stop was in Phoenix at the Phoenix Connected Systems Group in early May.

The next stop was in Charlotte, North Carolina where I presented two sessions at the BizTalk Bootcamp event.  This conference was held at the Microsoft Campus in Charlotte.  Special thanks to Mandi Ohlinger for putting it together and getting me involved.

KentCharlotte

Soon after the Charlotte event I headed to New York City, where I had the opportunity to present at the Microsoft Technology Center (MTC) alongside the product group and some MVPs, in front of some of Microsoft’s most influential customers.

New York City

The next stop on the “circuit” was heading over to Norway to participate in the Bouvet BizTalk Innovation Days conference.  This was my favourite event for a few reasons:

  • I do have some Norwegian heritage so it was a tremendous opportunity to learn about my ancestors.
  • Another opportunity to hang with my MVP buddies from Europe
  • I don’t think there is a more passionate place on the planet about integration than Scandinavia (Sweden included).  Every time I have spoken there I am completely overwhelmed by the interest in integration in that part of the world.

Special thanks to Tord Glad Nordahl for including me in this event.

NorwaySpeakers2

After the Norway event I had the opportunity to participate in the 40th annual Berlin Marathon with my good friend Steef-Jan Wiggers.  This was the second marathon that I have run, and it was a tremendous cultural experience to run in that city.  I also shaved 4 minutes off my previous time from the Chicago Marathon, so it was a win-win experience.

Celebrating

The last speaking engagement was in Calgary in November.  I had the opportunity to speak about Windows Azure Mobile Services, Windows Azure BizTalk Services and SAP integration at the Microsoft Alberta Architect forum.  It was a great opportunity to demonstrate some of these capabilities in Windows Azure to the Calgary community.

Grad School

2013 also saw me returning to school! I completed my undergrad degree around 12 years ago and felt I was ready for some higher education.  I have had many good opportunities for growth in my career, but always felt that it was my technical capabilities that created those leadership and management opportunities.  At times I felt like I didn’t have a solid foundation when it came to running parts of an IT organization, and that I could benefit from additional education.  I don’t ever foresee a time when I am not involved in technology; it is my job, but it is also my hobby.  With this in mind I set out to find a program that focused on the “Management of Technology”.  I didn’t want a really technical Master’s program, and I also didn’t want a full-blown business Master’s program; I really wanted a blend of the two.  After some investigation I found a program that really suited my needs: Arizona State University’s MSIM (Master of Science in Information Management) through the W.P. Carey School of Business.

In August 2013, I headed down to Tempe, Arizona for student orientation.  During this orientation, I and 57 other students received detailed information about the program.  We were also assigned into groups of 4 or 5 people with whom we will be working closely over the course of the 16-month program.  There are two flavors of the program: you can either attend in person at the ASU campus or participate in the online version.  With me living in Calgary, I obviously chose the remote program.

One thing that surprised me was the number of people from all over the United States that are in this program.  There are people from Washington State, Washington DC, Oregon, California, Colorado, New Mexico, Texas, Indiana, New York, Georgia, Vermont, Alabama, Utah and, of course, Arizona.  When establishing groups, the school tries to place you with people in the same time zone; my group consists of people from Arizona, which has worked out great so far.  This is really a benefit of the program, as everyone brings a unique experience, which has been really insightful.

I just finished my 3rd course (of 10) and am very pleased with my choice of program.  Don’t get me wrong, it is a lot of work, but I am learning a lot and really enjoying the content of the courses.  The three courses that I have taken so far are The Strategic Value of IT, Business Process Design, and Data and Information Management.  My upcoming course is on Managing Enterprise Systems, which I am sure will be very interesting.

If you have any questions about the program feel free to leave your email address in the comments as I am happy to answer any questions that you have.

388644_10151308828386207_698535131_n

 

Books

Unfortunately this list is going to be quite sparse compared to the list that Richard has compiled here, but I did want to point out a few books that I had the opportunity to read this past year.

Microsoft BizTalk ESB Toolkit 2.1

2013 was a slow year for new BizTalk books, in part due to the spike in books published in 2012 and also the nature of the BizTalk release cycle.  However, we did see the Microsoft BizTalk ESB Toolkit 2.1 book released by Andres Del Rio Benito and Howard Edidin.

This book comes in Packt Publishing’s new shorter format.  Part of the challenge with writing books is that it takes a really long time to get the product out; in recent years Packt has tried to shorten this release cycle, and this book falls into the new category.  The book is approximately 130 pages long and is the most comprehensive guide to the ESB Toolkit available.  I have not seen another resource with as much detailed information about the toolkit.

Within this book you can expect to find 6 chapters that discuss:

  • ESB Toolkit Architecture
  • Itinerary Services
  • ESB Exception Handling
  • ESB Toolkit Web Services
  • ESB Management Portal
  • ESB Toolkit Version 2.2 (BizTalk 2013) sneak peek

If you are doing some work with the ESB Toolkit and are looking for a good resource, then this is a good place to start. (Amazon)

ESB Book

 

The Phoenix Project: A Novel about IT, DevOps and Helping your Business Win

I was made aware of this book via a Scott Gu tweet, and boy, was it worth picking up.  This book reads like a novel, but there are a lot of very valuable lessons embedded within it.  It was so relevant to me that I could have sworn I had worked with the author before, because I had experienced so much of what is in this book.  If you are new to a leadership role, or are struggling in one, this book will be very beneficial to you. (Amazon)

The Phoenix Project

 

Adventures of an IT Leader

This is a book that I read as part of my ASU Strategic Value of IT course.  It is similar in nature to The Phoenix Project and also reads like a novel.  In this case a business leader has transitioned into a CIO position.  The book takes you through his trials and tribulations and really raises the question: is IT management just about management? (Amazon)

IT Leadership

The Opinionated Software Developer: What Twenty-Five Years of Slinging Code Has Taught Me

This was an interesting read, as it describes Shawn Wildermuth’s experiences as a software developer.  It was a quick read, and I love learning about what other people have experienced in their careers; this book provided excellent insight into Shawn’s. (Amazon)

Shawn

Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-based Management

Another book from my ASU studies, and an interesting one.  It does read more like a textbook, but the authors are very well recognized for their work in the business re-engineering space.  The biggest thing that I got out of this book was to not lose sight of evidence-based management.  All too often, technical folks use their previous experiences to dictate future decisions: a particular method worked at a previous company or client, but taking the same approach to a new company or client gives you no guarantee that it will work again.  This book was a good reminder that a person needs to stick to the facts when making decisions and not rely (too much) on what has or hasn’t worked in the past. (Amazon)

Hard Facts

 

2014

Looking ahead, I expect 2014 to be as chaotic and exciting as 2013.  It has already gotten off to a good start, with Microsoft awarding me my seventh consecutive MVP award in the Integration discipline.  I want to thank all of the people working in the product group, the support group and the community teams for their support.  I also want to thank my MVP buddies, who are an amazing bunch of people that I really enjoy learning from.

MVP_FullColor_ForScreen

Also, look for a refresh of the (MCTS): Microsoft BizTalk Server 2010 (70-595) Certification Guide book.  No, the exam has not changed, but the book has been updated to include BizTalk 2013 content that is related to the Microsoft BizTalk 2013 partner competency exam.  I must stress that this book is a refresh, so do not expect 100% (or anywhere near that) new content.

Tuesday, December 10, 2013

BizTalk 2013–Integration with Amazon S3 storage using the WebHttp Adapter

I have recently encountered a requirement where we had to integrate a legacy Document Management system with Amazon in order to support a Mobile-Field Worker application.  The core requirement is that when a document reaches a certain state within the Document Management System, we need to publish this file to an S3 instance where it can be accessed from a mobile device.  We will do so using a RESTful PUT call.

Introduction to Amazon S3 SDK for .Net

Entering this solution I knew very little about Amazon S3.  I did know that it supported REST and therefore felt pretty confident that BizTalk 2013 could integrate with it using the WebHttp adapter.

The first thing that I needed to do was create a developer account on the Amazon platform.  Once I created my account, I downloaded the Amazon S3 SDK for .Net.  Since I will be using REST, this SDK is technically not required; however, it includes a beneficial tool called the AWS Toolkit for Microsoft Visual Studio.  With this toolkit we can manage our various AWS services, including our S3 instance.  We can create, read, update and delete documents using this tool.  We can also use it in our testing to verify that a message has reached S3 successfully.

image

Another benefit of downloading the SDK is that we can use the managed libraries to manipulate S3 objects and better understand some of the terminology and functionality that is available.  A side benefit is that we can fire up Fiddler while using the SDK and see how Amazon forms its REST calls, under the hood, when communicating with S3.
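
If you would rather experiment with the managed libraries before dropping down to raw REST, a minimal upload looks something like the sketch below.  This is only an illustration against the SDK’s Amazon.S3 namespace; the bucket name, key and file path are placeholders.

    // A minimal sketch of uploading a document with the AWS SDK for .NET
    // (Amazon.S3 / Amazon.S3.Model). Bucket name, key and file path are placeholders.
    using Amazon.S3;
    using Amazon.S3.Model;

    class S3UploadSketch
    {
        static void Main()
        {
            using (var client = new AmazonS3Client("<your_keyId>", "<your_SecretKey>", Amazon.RegionEndpoint.USWest2))
            {
                var request = new PutObjectRequest
                {
                    BucketName = "<your_bucket>",
                    Key = "310531500150800.PDF",           // the resource name used later in this post
                    FilePath = @"C:\Temp\310531500150800.PDF",
                    ContentType = "application/x-pdf"
                };
                client.PutObject(request);                 // synchronous PUT to the bucket
            }
        }
    }

Running something like this with Fiddler open is a quick way to see the Authorization and x-amz-* headers that we will have to build by hand in the BizTalk solution below.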

Amazon S3 Accounts

When you sign up for an S3 account you will receive an Amazon Key ID and a Secret Access Key. These are two pieces of data that you will need in order to access your S3 services.  You can think of these credentials much like the ones you use when accessing Windows Azure Services.

image

BizTalk Solution

To keep this solution as simple as possible for this Blog Post, I have stripped some of the original components of the solution so that we can strictly focus on what is involved in getting the WebHttp Adapter to communicate with Amazon S3.

For the purpose of this blog post the following events will take place:

  1. We will receive a message of type System.Xml.XmlDocument.  Don’t let this mislead you: we can receive pretty much any type of message using this message type, including text documents, images and PDF documents.
  2. We will then construct a new instance of the message that we just received in order to manipulate some adapter context properties.  You may now be asking: why do I want to manipulate adapter context properties?  The reason is that since we want to change some of our HTTP header properties at runtime, we need to use a Dynamic Send Port, as identified by Ricardo Marques.

    image

    The most challenging part of this Message Assignment shape was populating the WCF.HttpHeaders context property.  In C#, if you want to populate headers, you have a header collection that you can populate in a very clean manner:

    headers.Add("x-amz-date", httpDate);

    However, when populating this property in BizTalk it isn’t as clean.  You need to construct a single string, appending all of the related headers together and separating each header attribute with a new line (“\n”).  A sketch of the full expression appears at the end of this walkthrough.

    Tip: Don’t try to build this string in a helper method; the \n characters will be encoded and the equivalent values will not be accepted by Amazon, which is why I have built this string inside an Expression Shape.

    After I send a message (tracked by BizTalk), I should see an HTTP header that looks like the following:

    <Property Name="HttpHeaders" Namespace="http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties" Value=

    "x-amz-acl: bucket-owner-full-control
    x-amz-storage-class: STANDARD
    x-amz-date: Tue, 10 Dec 2013 23:25:43 GMT
    Authorization: AWS <AmazonKeyID>:<EncryptedSignature>
    Content-Type: application/x-pdf
    Expect: 100-continue
    Connection: Keep-Alive"/>

    For the meaning of each of these headers I will refer you to the Amazon documentation.  However, the one header that does warrant some additional discussion here is the Authorization header, as this is how we authenticate with the S3 service.  To simplify the population of this value I have created the following helper method, which was adapted from a post on Stack Overflow:

    public static string SetHttpAuth(string httpDate)
    {
        // Requires: using System.Text; and using System.Security.Cryptography;
        string AWSAccessKeyId = "<your_keyId>";
        string AWSSecretKey = "<your_SecretKey>";

        // The canonical string reflects the verb, content type, amz headers and resource of the request
        string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";

        // now encode the canonical string
        Encoding ae = new UTF8Encoding();
        // create a hashing object; the secret key is the hash key
        HMACSHA1 signature = new HMACSHA1();
        signature.Key = ae.GetBytes(AWSSecretKey);
        byte[] bytes = ae.GetBytes(canonicalString);
        byte[] moreBytes = signature.ComputeHash(bytes);
        // convert the hash byte array into a base64 encoding
        string encodedCanonical = Convert.ToBase64String(moreBytes);
        // finally, this is the Authorization header
        string AuthHeader = "AWS " + AWSAccessKeyId + ":" + encodedCanonical;

        return AuthHeader;
    }

    The most important part of this method is the following line of code:

    string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";

    The best way to describe what is occurring is to borrow the following from the Amazon documentation:

    The Signature element is the RFC 2104 HMAC-SHA1 of selected elements from the request, and so the Signature part of the Authorization header will vary from request to request. If the request signature calculated by the system matches the Signature included with the request, the requester will have demonstrated possession of the AWS secret access key. The request will then be processed under the identity, and with the authority, of the developer to whom the key was issued.

    Essentially, we build up a string that reflects the various aspects of our REST call (headers, date, resource) and then create a hash using our Amazon secret key.  Since Amazon also knows our secret, it can compute the same signature and check whether it matches the one on our actual REST call.  If it does – we are golden.  If not, we can expect an error like the following:

    A message sent to adapter "WCF-WebHttp" on send port "SendToS3" with URI http://<bucketname>.s3-us-west-2.amazonaws.com/ is suspended.
    Error details: System.Net.WebException: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><StringToSignBytes>50 55 54 0a 0a 61 70 70 6c 69 63 61 74 69 6f 6e 2f 78 2d 70 64 66 0a 0a 78 2d 61 6d 7a 2d 61 63 6c 3a 62 75 63 6b 65 74 2d 6f 77 6e 65 72 2d 66 75 6c 6c 2d 63 6f 6e 74 72 20 44 65 63 20 32 30 31 33 20 30 34 3a 35 37 3a 34 35 20 47 4d 54 0a 78 2d 61 6d 7a 2d 73 74 6f 72 61 67 65 2d 63 6c 61 73 73 3a 53 54 41 4e 44 41 52 44 0a 2f 74 72 61 6e 73 61 6c 74 61 70 6f 63 2f 33 31 30 35 33 31 35 30 30 31 35 30 38 30 30 2e 50 44 46</StringToSignBytes><RequestId>6A67D9A7EB007713</RequestId><HostId>BHkl1SCtSdgDUo/aCzmBpPmhSnrpghjA/L78WvpHbBX2f3xDW</HostId><SignatureProvided>SpCC3NpUkL0Z0hE9EI=</SignatureProvided><StringToSign>PUT

    application/x-pdf

    x-amz-acl:bucket-owner-full-control
    x-amz-date:Thu, 05 Dec 2013 04:57:45 GMT
    x-amz-storage-class:STANDARD
    /<bucketname>/310531500150800.PDF</StringToSign><AWSAccessKeyId><your_key></AWSAccessKeyId></Error>

    Tip: Pay attention to these error messages, as they really give you a hint as to what you need to include in your “canonicalString”.  I discounted these error messages early on and didn’t take the time to really understand what Amazon was looking for.

    For completeness I will include the other three helper methods that are being used in the Expression Shape.  In my actual solution these values come from a configuration store, but for the simplicity of this blog post I have hard-coded them.

    public static string SetAmzACL()
    {
        return "bucket-owner-full-control";
    }

    public static string SetStorageClass()
    {
        return "STANDARD";
    }

    public static string SetHeaderDate()
    {
        // Use GMT time and ensure that it is within 15 minutes of the time on Amazon's servers.
        // InvariantCulture (System.Globalization) keeps the day/month names in English regardless of the server's locale.
        return DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss ", CultureInfo.InvariantCulture) + "GMT";
    }

  3. The next part of the Message Assignment shape is setting the standard context properties for the WebHttp adapter.  Remember, since we are using a Dynamic Send Port we will not be able to manipulate these values through the BizTalk Admin Console.

    msgS3Request(WCF.BindingType)="WCF-WebHttp";
    msgS3Request(WCF.SecurityMode)="None";
    msgS3Request(WCF.HttpMethodAndUrl) = "PUT";  //Writing to Amazon S3 requires a PUT
    msgS3Request(WCF.OpenTimeout)= "00:10:00";
    msgS3Request(WCF.CloseTimeout)= "00:10:00";
    msgS3Request(WCF.SendTimeout)= "00:10:00";
    msgS3Request(WCF.MaxReceivedMessageSize)= 2147483647;

    Lastly we need to set the URI that we want to send our message to and also specify that we want to use the WCF-WebHttp adapter.

    Port_SendToS3(Microsoft.XLANGs.BaseTypes.Address)="http://<bucketname>.s3-us-west-2.amazonaws.com/310531500150800.PDF";
    Port_SendToS3(Microsoft.XLANGs.BaseTypes.TransportType)="WCF-WebHttp";

    Note: the last part of my URI, 310531500150800.PDF, represents my resource.  In this case I have hardcoded a file name.  This is obviously something that you would want to make dynamic, perhaps using the FILE.ReceivedFileName context property.

  4. Once we have assembled our S3 message we will go ahead and send it through our dynamic solicit-response port.  The message that we send to Amazon and receive back is once again of type System.Xml.XmlDocument.
  5. One thing to note is that the response you receive back from Amazon won’t actually have a message body (this is in line with REST).  However, even though the message body is empty, we will still find some valuable context properties.  The two properties of interest are:

    InboundHttpStatusCode

    InboundHttpStatusDescription

    image

     

  6. The last step in the process is simply to write our Amazon response to disk.  As we learned in the previous point, the message body will be empty, but it still gives me an indicator that the process is working (in a Proof of Concept environment).
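
To tie the walkthrough together, here is a rough sketch of the header-building portion of the Message Assignment shape referenced in step 2.  The helper class name (S3Helpers) and the httpDate variable are illustrative; the context property and helper methods are the ones shown above.

    // Sketch of the header-building expression; S3Helpers is an illustrative name
    // for the class holding the helper methods listed earlier.
    httpDate = S3Helpers.SetHeaderDate();

    msgS3Request(WCF.HttpHeaders) =
        "x-amz-acl: " + S3Helpers.SetAmzACL() + "\n" +
        "x-amz-storage-class: " + S3Helpers.SetStorageClass() + "\n" +
        "x-amz-date: " + httpDate + "\n" +
        "Authorization: " + S3Helpers.SetHttpAuth(httpDate) + "\n" +
        "Content-Type: application/x-pdf";

Note how each header lands on its own line because of the literal “\n” separators, which is exactly what the tracked HttpHeaders property shown earlier reflects.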

Overall the Orchestration is very simple.  The complexity really exists in the Message Assignment shape. 

image

Testing

Not that watching files move is super exciting, but I have created a quick Vine video that will demonstrate the message being consumed by the FILE Adapter and then sent off to Amazon S3.

 https://vine.co/v/hQ2WpxgLXhJ

Conclusion

This was a pretty fun and frustrating solution to put together.  The area that caused me the most grief was easily the Authorization Header.  There is some documentation out there related to Amazon “PUT”s but each call is different depending upon what type of data you are sending and the related headers.  For each header that you add, you really need to include the related value in your “canonicalString”.  You also need to include the complete path to your resource (/bucketname/resource) in this string even though the convention is a little different in the URI.

Also, it is worth mentioning that /n Software has created a third-party S3 adapter that abstracts some of the complexity in this solution.  While I have not used this particular /n Software adapter, I have used others and have been happy with the experience.  Michael Stephenson has blogged about his experiences with this adapter here.

Sunday, December 1, 2013

BizTalk Summit 2013 Wrap-up

On November 21st and 22nd I had the opportunity to spend a couple days at the 2nd annual BizTalk Summit held by Microsoft in Seattle.  At this summit there were approximately 300 Product Group members, MVPs, Partners and Customers.  It was great to see a lot of familiar faces from the BizTalk community and talk shop with people who live and breathe integration.

Windows Azure BizTalk Services reaches GA

The Summit started off with a bang when Scott Gu announced that Windows Azure BizTalk Services has reached General Availability (GA)!!!   What this means is that you can now receive production-level support from Microsoft with a 99.9% uptime SLA.

 

image

During the preview period, Microsoft was offering a 50% discount on Windows Azure BizTalk Services (WABS).  This preview pricing ends at the end of the year.  So if you have any Proof of Concept (POC) apps running in the cloud that you aren’t actively using, please be aware of any potential billing implications.

Release Cadence

The next exciting piece of news coming from Microsoft is the updated release cadence for the BizTalk Server product line.  As you have likely noticed, there is usually a BizTalk release shortly after the General Availability of platform updates: when a new version of Windows Server, SQL Server or Visual Studio is launched, a BizTalk Server release usually closely follows.  Something that is changing within the software industry is the accelerated release cadence from Microsoft and its competitors; a recent example is Windows 8.1, Windows Server 2012 R2 and Visual Studio 2013, which were released much sooner than they have been in the past.  As a result of these accelerated timelines, the BizTalk product group has stepped up, committing to a BizTalk release every year!  These releases will alternate between R2 releases and major releases: for 2014 we can expect a BizTalk 2013 R2, and in 2015 we can expect a full release.

BizTalk Server 2013 R2

So what can we expect in the upcoming release?

  • Platform alignment (Windows, SQL Server, Visual Studio) and industry specification updates (SWIFT).
  • Adapter enhancements, including support for JSON (yay!), proxy support for SFTP and authorization enhancements for Windows Azure Service Bus.  One request I do have for the product team: please include support for Service Bus for Windows Server as well.
  • Healthcare Accelerator improvements.  Interestingly, healthcare is the fastest-growing vertical for BizTalk Server, which justifies the additional investment.

image

 

Hybrid Cloud Burst

There were a lot of good sessions, but one that I found extremely interesting was put on by Manufacturing, Supply Chain, and Information Services (MSCIS), the group that builds solutions for the Manufacturing and Supply Chain business units within Microsoft.  You may have heard of a “little” franchise at Microsoft called XBOX.  The XBOX franchise relies heavily upon manufacturing and supply chain processes, and MSCIS therefore needs to provide solutions that address the business needs of these units.  As you are probably aware, Microsoft has recently launched XBOX One, which is sold out pretty much everywhere.  As you can imagine, building solutions to address the demands of such a product is challenging; probably the biggest hurdle is supporting the scale needed to satisfy the messaging requirements that many large retailers, manufacturers and online customers introduce.

In a traditional IT setting you throw more servers at the problem.  The issue with this is that it is horribly inefficient: you are essentially building for the worst case (or most profitable) scenario, but when things slow down you have spent a lot of money and your resources are poorly utilized.  This leads to a high total cost of ownership (TCO).

Another challenge in this solution is that an ERP is involved; in this case it is SAP (but this would apply to any ERP), and you cannot expect an ERP to provide the performance to support “cloud scale”, at least not in a cost-competitive way.  If you have built the system in an asynchronous manner, you can throttle your messaging and therefore not overwhelm your ERP system.
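
To make the asynchronous throttling idea concrete, here is a minimal sketch (my own illustration, not the MSCIS implementation) using the Service Bus SDK of that era.  The queue name and connection string are placeholders; the cloud side enqueues as fast as demand requires, while the on-premise worker drains the queue at a pace the ERP can tolerate.

    // Illustrative only: a queue decouples bursty cloud traffic from a rate-limited ERP.
    // Requires the WindowsAzure.ServiceBus package (Microsoft.ServiceBus.Messaging).
    using System;
    using System.Threading;
    using Microsoft.ServiceBus.Messaging;

    class ThrottledErpFeed
    {
        static void Main()
        {
            string connStr = "<your_ServiceBus_connection_string>";
            QueueClient client = QueueClient.CreateFromConnectionString(connStr, "erp-orders");

            // Cloud side: accept bursts of messages without touching the ERP.
            client.Send(new BrokeredMessage("<order payload>"));

            // On-premise side: pull messages at a fixed, ERP-friendly rate.
            while (true)
            {
                BrokeredMessage msg = client.Receive(TimeSpan.FromSeconds(30));
                if (msg == null) break;        // queue drained
                // ... hand the payload to the ERP here ...
                msg.Complete();                // remove from the queue once processed
                Thread.Sleep(500);             // crude throttle: roughly two messages per second
            }
        }
    }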

MSCIS has addressed both of these major concerns by building out a hybrid solution.  By leveraging Windows Azure BizTalk Services and Windows Azure Service Bus Queues/Topics in the cloud, they can address the elasticity requirements that a high-demand product like XBOX One creates.  As demand increases, additional BizTalk Services units can be deployed so that manufacturers, retailers and customers receive proper messaging acknowledgements.  On-premise, you can then keep your traditional capacity for tools and applications like BizTalk Server 2013 and SAP without introducing significant infrastructure that will not be fully utilized all the time.

Our good friend Mandi Ohlinger, who is a technical writer with the BizTalk team, worked with MSCIS to document the solution.  You can read more about it on the BizTalk Dev Center.  I have included a picture of the high-level architecture below.

image

While Microsoft is a large software company (OK, a devices and services company), what we often lose sight of is that Microsoft is also a very large enterprise (more than 100,000 employees) with enterprise problems just like any other company.  It was great to see how Microsoft uses its own software to address real-world needs.  Sharing these types of experiences is something that I would really like to see more of.

Symmetry

(These are my own thoughts and do not necessarily reflect Microsoft’s exact roadmap)

If you have evaluated Windows Azure BizTalk Services, you have likely realized that there is currently no symmetry between BizTalk Services and BizTalk Server.  BizTalk Server has had around 14 years (or more) of investment, whereas BizTalk Services is comparatively new.  Within Services we are still without core EAI capabilities like Business Process Management (BPM)/Orchestration/Workflow, Business Activity Monitoring (BAM), a Business Rules Engine (BRE), a comprehensive set of adapters and a complete management solution.

With BizTalk Server we have a mature, stable, robust integration platform.  The problem is that it was built long before people started thinking about cloud scale.  Characteristics such as MSDTC and even the MessageBox have contributed to BizTalk being what it is today (a good platform), but they do not necessarily lend themselves to new cloud-based platforms.  If you look under the hood of BizTalk Services you will find neither of these technologies, and I don’t necessarily see this as a bad thing.

A goal of most, if not all, products that Microsoft is putting in the cloud is symmetry between on-premise and cloud-based offerings.  This puts the BizTalk team in a tough position: do they take a traditional architecture like BizTalk Server and push it into the cloud, or build an architecture on technologies that better lend themselves to the cloud and then bring it back on-premise?  The approach going forward is innovating in the cloud and then bringing those investments back on-premise in the future.

Every business has a budget, and priorities have to be set.  I think Microsoft is doing the right thing by investing in the future instead of making a lot of investments in the on-premise offering that we know will be replaced by the next evolution of BizTalk.  There were many discussions among the MVPs during the week in Seattle on this subject, with mixed support for both approaches.  With the explosion of cloud and SaaS applications, we need an integration platform that promotes greater agility, reduces complexity and addresses scale in a very efficient manner, rather than one that fixes the deficiencies in the current Server platform.  I do think the strategy is sound; however, it will not be trivial to execute and will likely take a few years.

Adapter Eco-system

Based upon some of the sessions at the BizTalk Summit, it looks like Microsoft will be looking to create a larger ISV ecosystem around BizTalk Services, more specifically in the adapter space.  The reality is that the current adapter footprint in BizTalk Services is lacking compared to some competing offerings.  One way to address this gap is to let trusted third parties build adapters and make them available through some sort of marketplace.  I think this is a great idea, provided there is some rigor applied to the process of submitting adapters.  I would not be entirely comfortable running mission-critical processes on an adapter that someone built as a hobby; however, I would have no issue purchasing an adapter in this fashion from established BizTalk ISV partners like BizTalk360 or /n Software.

Conclusion

All in all it was a good summit.  It was encouraging to see the BizTalk team take BizTalk Services across the goal line and make it GA.  It was also great to see that they have identified the need for an accelerated release cadence and shared some news about the upcoming R2 release.  Lastly, it was great to connect with so many familiar faces within the BizTalk community.  The BizTalk community is not huge, but it is definitely international, so it was great to chat in person with people you are used to interacting with over Twitter, blogs or LinkedIn.

In the event you still have doubts about the future of BizTalk, rest assured the platform is alive and well!

Saturday, October 26, 2013

BizTalk360 Product Specialist award

 

This post is long overdue, but I felt it was necessary.  Back in April 2013, Saravana Kumar and the BizTalk360 team introduced the BizTalk360 Product Specialist award.  The primary objective of the program is to honour individuals who have gained deep knowledge in installing, configuring and implementing the BizTalk360 solution at customer sites.

I have blogged about some of my experiences with BizTalk360 in the past (here and here) and even wrote a whitepaper, and I am a strong supporter of the product.  I have seen the benefits first-hand while leading teams responsible for the operational support of busy BizTalk environments.  I have also witnessed its adoption by non-BizTalk experts and seen their productivity increase without their being intimidated by larger, more complex monitoring solutions.

Recently I introduced BizTalk to a new organization, and BizTalk360 was a tool that provided immediate benefit.  Sure enough, we had a source system experience issues that led to some suspended messages, and the BizTalk team knew about the problems in that system before the system owners did.  The end result was that the issues could be identified and resolved quickly, limiting the disruption to the business.

While I was in Norway, Saravana had a bit of a surprise for me: some hardware to keep my MVP awards company.  I want to take this opportunity to thank Saravana and the rest of the BizTalk360 team for the recognition, and I am looking forward to working with version 7.0 of the product.  I got a sneak peek of the application while in Norway and it looks great.

 

BizTalk360Award

Sunday, October 6, 2013

European Trip Recap

 

I recently returned from Europe, where I had a chance to participate in two extraordinary events: Bouvet BizTalk Innovation Days in Stavanger, Norway and the 40th running of the Berlin Marathon.

Bouvet BizTalk Innovation Day–Norway Recap

This was a two-day event hosted by Bouvet.  For those of you who are not familiar with Bouvet, they provide services in the fields of information technology, digital communication and enterprise management, with about 900 employees divided between 14 offices in Norway and Sweden.  (See more at http://www.bouvet.no/en/About-Bouvet/)

On day one, each of the speakers had the opportunity to present their topic to a crowd of around 70 BizTalk professionals from all over Scandinavia.  The topics ranged from newer technologies like Windows Azure BizTalk Services, Windows Azure Mobile Services and Windows Azure Service Bus to more universal topics like proactively monitoring the health of BizTalk solutions, BizTalk mapping patterns, identifying and rescuing a BizTalk “hostage” project, and a preview of the next version of BizTalk360.  There was also a special keynote by Paolo Salvatori, who works for Microsoft Italy and is recognized worldwide for his abilities.  All presentations were very well received, as indicated by the attendee surveys.

My presentation focused on Enterprise Mobility, a topic that I have been dealing with at my day job, so I had a chance to demonstrate some of the areas of enterprise mobility that I have been thinking about lately.  It was also an opportunity to show a ‘reference’ application that I have been collaborating on with Mikael Hakansson.

Some of the core principles that I have taken into consideration when dealing with Enterprise Mobility include:

  • Active Directory Federation: When a person leaves the company and their AD account has been disabled, this “tap” should be turned off for other mobile/cloud based services.
  • Leverage a Mobility platform to reduce the diversity required in supporting multiple platforms.  Windows Azure Mobile Services helps us address this by providing APIs for the popular platforms that allow us to centralize activities like Data access, Authentication, Identity Providers, Custom APIs and Scheduled tasks.
  • Most, if not all, Enterprise Mobile apps need to consume Line of Business (LOB) System data.  Windows Azure BizTalk Services (and the BizTalk Adapter Service) allow us a secure way in and out of our enterprise without poking holes in firewalls.  I should note that these capabilities are also available with BizTalk Server 2013.
  • Accessing On-Premise LOB systems isn’t possible (in my scenarios) without the underpinnings of the Windows Azure Service Bus.  Using this technology to span network layers never gets old. The BizTalk Adapter Service has a strong dependency on these services.
  • Data Storage:  Even though I am leveraging SAP master data in this scenario, I do need to maintain the state of the business process.  In this case I am using SQL Azure to host our data.  We can leverage Windows Azure Mobile Services’ APIs that make getting data in and out of the database a breeze.
  • Finally, we can’t forget about Toast Notifications.  We want the ability to send notifications out to users (in this case approvers) and Windows Azure Mobile Services helps us deal with sending Toast Notifications to a variety of platforms. 
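
As an illustration of the data-storage point above, here is a minimal sketch of the kind of Mobile Services call involved.  The service URL, application key and the Approval type are illustrative names, not the actual reference application.

    // Illustrative sketch of Windows Azure Mobile Services data access
    // using the Microsoft.WindowsAzure.MobileServices client library.
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.MobileServices;

    public class Approval
    {
        public string Id { get; set; }
        public string DocumentName { get; set; }
        public bool Approved { get; set; }
    }

    public class ApprovalService
    {
        private readonly MobileServiceClient _client =
            new MobileServiceClient("https://yourapp.azure-mobile.net/", "<your_application_key>");

        public async Task SubmitAsync(Approval approval)
        {
            // Inserts a row into the SQL Azure table that backs the Approval endpoint.
            await _client.GetTable<Approval>().InsertAsync(approval);
        }
    }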

Here is one of the scenarios from my demo that illustrates many of the principles that were previously mentioned.

image     

A few screenshots of the application running in the Windows Phone Emulator:

image image image image

This was one of the more challenging demos that I have ever been involved in.  I had a lot of fun working on this reference app with Mikael and learned a lot in the process.  My complete slide deck can be found here.

Conclusion

Many people work on bringing events alive, but two people whom I would like to recognize are Tord Glad Nordahl and Anders Stensland.  They, in addition to the support Bouvet provided, pulled off a fantastic event.  I had the opportunity to present in Sweden in 2010 and 2011, and I continue to be amazed by the amount of BizTalk interest in Scandinavia.  If you have the opportunity to attend the Bouvet BizTalk Innovation conference in the future, I highly recommend it; they did an amazing job.

40th Berlin Marathon

One of my hobbies is running.  I am a pretty average runner, but I enjoy the challenges of running and try to reap the health benefits of staying active.  I have run over 12 half marathons over the past 6 years and finished my first marathon last year in Chicago.  Whenever I have gone to Europe to speak in the past, I have always tried to make a side trip to experience another culture.  In speaking with one of the other presenters, Steef-Jan Wiggers, we decided that we would head to Berlin after the conference in Norway; he recommended Berlin for its rich history.  Having never been to Germany, my wife and I made plans to join him there.

I knew that the Berlin Marathon was held in late September.  The Berlin Marathon is one of the six major marathons in the world; the others are New York, Boston, Chicago, London and Tokyo.  So when I found out that I would be in Berlin on the same day as this historic event, I couldn’t resist the temptation to participate.

The registration deadline had passed, but I was able to find a travel agent from Boston who would sell us packages.  With this information, I presented the opportunity to Steef-Jan and he obliged.  He had recently gotten back into running, and this would provide a great opportunity to run his first marathon.

The event itself was rather amazing.  Over 42,000 runners participated, with an estimated 1 million spectators.  It was an awesome experience and one that I will never forget.  I finished the marathon in 4 hours, 34 minutes and 56 seconds, which was 4 minutes faster than my Chicago time.

 

A few pictures:

The prize

Medal

 

Before the race.  The garbage bags helped keep us warm while we waited for our turn.

KentBefore

Steef-Jan before the race

SteefBefore

 

After the run

Kent_SteefAfter

 

Celebrating – German style

Celebrating

 

After the race the Adidas store would engrave your time into a running band that was provided as part of your registration.

Timeband

 

 

MVP Profile

One of the best parts of the MVP program is the people you meet and the friendships that you develop.  Without the MVP program, this trip would never have happened.  Being part of the program is truly an honor.

Thanks Tord for your hospitality in Norway.  It was a great opportunity to experience my Norwegian heritage and I thoroughly enjoyed your beautiful country. 

Thanks Steef for being an amazing tour guide while in Germany.  Your German came in handy many times and I learned a lot about German history while I was there.  Running the marathon with you was also a great experience.  Next time we won’t do as much sightseeing the day before the race ;-)

I would also like to thank the other MVPs (Sandro, Nino, Saravana) and Paolo for a great experience.  Talking shop whenever we get together is a lot of fun and always interesting.