Sunday, August 11, 2019

Serverless Tips - Exception Handling and Scopes


In late 2018 I started contributing to the #ServerlessTips on https://www.serverlessnotes.com, a community knowledge base hosted by Serverless360. I have written more than 30 tips, mostly on Azure Logic Apps, on that site and I wanted to highlight some of my favorites. Now, I won’t go ahead and cross-post these entire topics, but I wanted to elevate their visibility as I think they are important.


In a world of connected services running across data centers and public clouds worldwide, errors are bound to happen. These errors can be related to underlying technical infrastructure issues or to missing or unexpected data.


Regardless of the reasons, Logic Apps developers need to plan and react when these exception events occur.


Java and .NET developers are very comfortable using try-catch-finally semantics when it comes to error handling. However, in Azure Logic Apps, developers use a different approach to achieve similar behavior. Read More




The Configure Run After settings allow you to decide what a subsequent action should do in the event of a previous action succeeding or failing. But what happens when you have a more complex logic app with logically related actions that must succeed or fail as a group? Within Azure Logic Apps, we don’t have the luxury of using distributed transaction coordinators, but we do have other capabilities that manage this type of scenario. Read More
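For readers curious what this looks like under the covers, here is a minimal, hypothetical sketch of the workflow-definition JSON that backs these settings (the action names and URI are made up for illustration). A try/catch-style pattern is built by pointing one scope's runAfter at the Failed and TimedOut statuses of another:

```json
"actions": {
  "Try_Scope": {
    "type": "Scope",
    "actions": {
      "Call_Backend": {
        "type": "Http",
        "inputs": { "method": "GET", "uri": "https://example.com/api" },
        "runAfter": {}
      }
    },
    "runAfter": {}
  },
  "Catch_Scope": {
    "type": "Scope",
    "actions": {
      "Log_Failure": {
        "type": "Compose",
        "inputs": "Something in Try_Scope failed",
        "runAfter": {}
      }
    },
    "runAfter": {
      "Try_Scope": [ "Failed", "TimedOut" ]
    }
  }
}
```

Because every action inside a scope succeeds or fails as part of that scope, the scope plays the role of the try block and the run-after condition plays the role of the catch.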

Friday, August 9, 2019

Udemy Course: Microsoft Flow vs Azure Logic Apps...which tool should I use?

Introducing my new course on Udemy on Microsoft Flow and Azure Logic Apps, two of my favorite cloud services.
**Discount Code available at end of blog**

Why did I write this course?

I wrote this course out of frustration! You can’t go to a Microsoft conference these days without the question of which tool to use coming up. I have no problem with people asking the question, but I grew tired of the same watered-down answers coming from the respective Product Groups about why their tool is better. Now, I have friends on both the Flow and Logic Apps teams, so I can’t blame them. After all, if you work for Coke, are you going to recommend Pepsi? Not likely.
I thought with my experience as a Program Manager on the Flow team and having been an Azure Integration MVP (and BizTalk MVP before that), I was in a good position to build a course on this subject that provides objective guidance on each tool’s strengths and the opportunities that come with it.


What will I learn in this course?
This course contains over 5.5 hours of content. I never intended it to be this long but I wanted to get deep into details and really demonstrate some of the unique capabilities of each tool. More specifically, the agenda looks like this:
  • Course Introduction
  • Microsoft Flow Features
    • PowerApps and CDS Integration
    • Flow Maker Portal (Templates, Connectors)
    • Microsoft Flow Approvals
    • Microsoft Flow Maker Analytics
    • Microsoft Flow Buttons
    • Sharing
    • Application Lifecycle Management (ALM)
    • Admin and Governance
  • Azure Logic Apps Features
    • Azure Portal (Templates, Connectors)
    • Editing Experiences
    • Enterprise Integration Pack
    • Inline Code
    • Integration Service Environment
    • Azure Integration Services
    • API Management
    • Azure Monitoring
    • Governance
  • Organizational Fit and how the design of your organization impacts which tool(s) you use
  • Microsoft Flow Solution Demos (2 deep dives)
  • Azure Logic Apps Solution Demos (2 deep dives)
  • Declaring the Winner
  • Course Wrap-up and Additional resources
One area that I feel is really important in this course is the Governance models that exist for each tool. This becomes a very important consideration when leveraging one tool over another, in part due to the significant differences in approaches.
I spent a lot of time in demos providing you with deep exposure to the features. I will also compare and contrast the different approaches used by each team as they try to solve that specific problem.
So whether you are in the Microsoft Flow camp or the Azure Logic Apps camp, I am confident that you will learn something new.
In appreciation of you checking out my blog, I have a special code for the first 5 people to buy my course at a discounted price of $19.99 by using this link.
Thanks for checking this out and if you have taken the course, I would love to hear from you.
Thanks,
Kent

Friday, January 1, 2016

2015 – Year in Review

 

Feels like I was writing my 2014 Year in Review just last week. Astonishing how time flies when you are busy having fun.

Learning

In the technology industry, if you are not learning you are dying. Having spent a lot of time in the Architecture space, there is not much that frustrates me more than Ivory Tower Architects.  For me, I need to touch a technology to really understand it.  Reading about it doesn’t give me enough insight to set direction for its usage in a company.

In 2015, one of my goals was to really dive into API Management platforms and more specifically Azure API Management.  I started hearing about API Management platforms when I was working at Mulesoft.  While I never had any engagements that required their APIM platform, I knew enough about it to know that API Management, as a domain, will be big.

When trying to balance work and speaking opportunities/obligations, I try to ‘kill two birds with one stone’.  As you will find in the next section of this blog, I had the opportunity to speak about Azure API Management on several occasions.  It was signing up for these sessions that motivated me to do a good job researching the technology.

Probably one of the most rewarding moments was taking all of this research and speaking and turning it into a tangible solution at work. We had a requirement come up in a project where we needed to do some trading partner integration using a RESTful API.  As a result of all the ‘homework’ I was doing, I was able to spin up an API Management instance and supporting APIs, all within two weeks, that addressed a project requirement and gave the organization flexibility.  We have had approximately 15 million calls to this API in the last 6 months, which has been very rewarding.

Another area of learning for me was around SaaS connectivity and more specifically ServiceNow.  ServiceNow is an IT Service Management tool.  This was a tool that our organization was implementing, and I was given some warning that integration with this tool was bound to happen.

Since there was no Azure API App (connector) available for ServiceNow, I got to create my own.  This provided me with another learning opportunity where I dove into all of the recent investments that Microsoft was making in Integration Platform as a Service (IPaaS).

Speaking

Being in the MVP program has created many opportunities for me to speak all over the world.  For that I am grateful to fellow MVPs, BizTalk360 and Microsoft for creating those opportunities.

This year was another busy year speaking.  I spent more time speaking in my home town (Calgary) than ever before which is encouraging to see as it shows there is more appetite for cloud integration.

.NET User Group – I was a last-minute addition to the MVPs putting on an Azure Cloud day.  In this session I was able to talk about Azure Service Bus messaging.  Many in attendance were familiar with MSMQ but had never heard of Azure Service Bus.  It is always fun to demo Service Bus, as people feel there is a little bit of magic whenever you start showing the Relay Service.

#IntegrationMonday – The brainchild of Michael Stephenson and Saravana Kumar has brought together a world-wide Microsoft Integration community on a weekly basis.  I had the opportunity to speak twice (link, link).  Thanks Mike and Saravana for giving me the opportunity.

BizTalk Summit (London) – This was my second time speaking in London and I have to thank Saravana and his team for the opportunity.  This was probably the largest audience that I have spoken in front of, with more than 350 people in attendance.  In my session I gave an Introduction to Azure API Management.  I think this is an untapped discipline amongst BizTalk resources, so it was a good opportunity to introduce many people to the subject.

Following this event my wife and I went to Portugal to visit Sandro and Steef-Jan.  Sandro took great care of us and showed us all around his hometown of Porto.  It was an amazing trip so thanks Sandro!

BizTalk Bootcamp (Charlotte) – Mandi Ohlinger, from Microsoft, was hosting another edition of the BizTalk Bootcamp.  I had the opportunity to speak at this event in 2013 and was happy to return.  I had two sessions at this event.  The first was a replay of my BizTalk Summit API Management session, and the second was a live lab walkthrough.  I received some tremendous feedback after this event.  People who had never heard of API Management provisioned their own API Management instance, managed a set of APIs, and called them from Postman, all within an hour.  They could not believe how far they were able to go in that time.  While I appreciated the feedback, it is also a testament to the Azure API Management platform, which is a simple but powerful tool.

MVPDays – I was approached by a local MVP, Dave Kawula, to speak at his upcoming MVPDays event in Calgary.  It was more of a Cloud Infrastructure event, but I appreciated the opportunity to introduce API Management and SaaS connectivity to a new audience.

Azure Hybrid Integration Day – This time it was my turn to host some of my European MVP buddies and put on an event in Calgary.  With the help and support of my Canadian MVP Lead Sim Chaudhry, support from Microsoft Canada employees such as Darren King, and BizTalk360, we were able to pull off an entire day focused on Microsoft Integration.  My session focused on Azure App Service and SaaS Connectivity using Microsoft’s latest bits.

After the event was over we had the opportunity to take in a football game (with tailgate) and cheer for the Saskatchewan Roughriders (even though they lost).

MVP Summit Videos – For the second straight year Microsoft arranged for Integration MVPs to enter the Channel 9 studios to record some short sessions. I want to thank Jon Fancey and Mark Mortimore for co-ordinating this.  My session focused on some of my demos from the Azure Hybrid Integration Day.

InfoQ

Around the August timeframe I had the opportunity to start writing for InfoQ.  For those of you who are not familiar with InfoQ, it is an online media outlet that focuses on Technology News and also hosts many conferences called QCon.  The organization is pretty impressive.  They have assembled a distributed team of technologists who also have a passion for writing.  Their goal is not necessarily to break news but to provide some technical substance to the happenings in the industry.

I am part of Richard Seroter’s Cloud Editorial team.  Richard and I co-authored a book several years ago and we continue to be good friends.  Richard is also one of those people that make me regularly ask “how does he do it”, as he always has a million things on the go and the quality never suffers.  The opportunity to work alongside him in this domain was too good to pass up, and I appreciate the opportunity he gave me.

The best part about writing for InfoQ is all of the ‘forced learning’ that occurs. While I pride myself on staying up to date it can become difficult especially when you consider all of the platforms out there.  As you probably know, I spend a lot of time in the Microsoft eco-system which is obviously one I enjoy.  Previously I was not very focused on what some of the big cloud players like Amazon, Salesforce and Google were up to.  As a result of covering these companies I now have a new perspective about what these companies are doing right and where Microsoft may have room for improvement.  Ultimately, I think this helps me do my day job better as I have a good appreciation of where the industry is headed.

Since September, I have had the opportunity to write approximately 17 articles. I figured it would be fun to list my 5 favourite articles (in no particular order).

  • Salesforce Enters IoT Platform – This provided me with one of those ‘aha’ moments.  I think Salesforce is onto something with this platform.  If you think about tying customer events into a customer engagement platform, I think Salesforce will have a lot of opportunities in this space.
  • PowerApps – I was sitting beside Richard at the MVP Summit while the team was talking about PowerApps.  Richard gave me a nudge and said – “hey you should break this story when it is no longer NDA”.  After the session I reached out to Wade Wagner from the product group, who put me in touch with some marketing folks at Microsoft who ensured I had all of the information I required in order to launch a detailed article as soon as the embargo was lifted. It was neat to be part of launching a story like this.
  • Microsoft’s Integration Roadmap – While I did write about this on my blog, I was deliberate to provide my personal opinions on the matter.  The goal of the InfoQ article was to remain objective and speak to the facts.  Regardless, it was fun to write about this topic from that perspective and in that outlet.
  • Amazon IoT Beta – Once again, having not been familiar with what Amazon was doing with IoT, this gave me the opportunity to compare and contrast Amazon’s vision against Salesforce and Microsoft.
  • Event Hubs surpasses 1 Trillion messages in a month – This was my very first article and also gave me an opportunity to interview Dan Rosanova.  You can always get a good sound bite out of Dan.  It was really neat to see where Dan, Clemens and the rest of the team have been able to take this service.

Looking ahead…

2016 should be another very interesting year in the area of Microsoft Integration.  We will see a new version of BizTalk Server, Logic Apps Updates, PowerApps Updates and also another Integrate event in Q2.

It is also off to a good start with my MVP award being renewed.  All Integration MVPs have been moved into the Azure discipline. I believe this is my 9th year in the program.  I know someday it will end, but until that time I am happy to continue to contribute to this excellent community.

Thursday, December 24, 2015

My Point of View: Microsoft Releases Integration Roadmap

On December 24th, 2015 Microsoft provided a Christmas gift to its customers, partners and broader eco-system in the form of a highly sought after roadmap.  For several years, customers and partners have been awaiting an “official” statement from Microsoft with clear direction on where they are headed.  Competitors have used the lack of a roadmap against them in compete situations. That has all changed as of this writing.

You can find the roadmap here and BizTalk360 founder and Microsoft MVP Saravana Kumar has provided his thoughts here.

My POV:

BizTalk is not dead

The BizTalk ‘franchise’ will continue to exist.  We will see a BizTalk Server 2016 next year that will include the following features:

    • Platform Alignment (Windows Server, SQL Server, Visual Studio)
    • SQL Server 2016 support, including AlwaysOn Availability Groups, which will simplify BizTalk Disaster Recovery.
    • Full support for High Availability in Azure IaaS
    • Better support for cloud based integration and SaaS connectivity.  Today we have a lot of SaaS connectivity through API Apps.  I suspect we will see BizTalk Server tap into these API Apps rather seamlessly.
    • Bug fixes and Adapter enhancements.

We will also continue to see ‘BizTalk’ capabilities being leveraged in Logic Apps in the form of API Apps such as BizTalk Transformations, encoding/decoding, Business Rules etc.

A unified vision from Microsoft

For some outsiders this may not be abundantly clear, but the BizTalk team lives within the Azure App Service team.  Consequently, the Logic Apps and BizTalk teams are one and the same. This roadmap accounts for this and represents a single vision for Integration at Microsoft.

For people familiar with both BizTalk and Logic Apps, it is probably evident that BizTalk and Logic Apps tend to operate at different ends of the integration spectrum. With BizTalk, customers get a solid on-premises integration broker that is very robust.  It is also very feature rich with support for BAM, Business Rules, EDI, ESB, Exception Portal, Pub/Sub messaging and much more.  However, with all of these capabilities there is a price to pay in terms of complexity and technical dependencies for it all to work.  As a result, agility can become a concern for some customers.  For teams with concerns about BizTalk’s agility, those concerns can often be resolved in Logic Apps.  In Logic Apps we have IPaaS capabilities with loads of SaaS connectivity and (soon) direct integration with API Management.

I think the following image (from the roadmap) does a great job of illustrating where Microsoft is headed.  The goal is clearly symmetric capabilities, but provided in a modern platform. This modern platform is not BizTalk Server, but rather building out Logic Apps to address outstanding enterprise features and deliver them in the cloud and on-premises.  A key enabler of this story is Azure Stack.  Without it we will not see the new Logic App assets running in your own data center.  Microsoft is targeting an “IPaaS” preview in Q2 2016 and GA by the end of 2016.

image_thumb[1]

Bringing more people to the party

Let’s be honest, BizTalk developers have a very niche skill set.  I have been working with BizTalk since 2004 at several different organizations.  I have seen some amazing BizTalk solutions being built that literally run a company and have enabled many business opportunities for organizations.  I have also seen what happens when you don’t have good BizTalk people working in BizTalk.  It gets messy quickly.  This was especially a problem in the BizTalk 2004 and 2006 days, when there was little documentation and guidance out there.  Today, there are so many resources provided by MVPs and the community that this is becoming less of an issue. (What other ecosystem can brag about a weekly international user group meeting run by the community?) However, and I am confident in saying this, there are not a lot of BizTalk experts out there, and it is a steep learning curve to get people to a place where they will be productive and not ‘paint an organization into a corner’.

While this may not be a popular statement with people who have invested a significant amount of time in BizTalk, it needs to get simpler.  Microsoft has around 10,000 BizTalk customers (give or take).  With the introduction of SaaS and mobile, and subsequently more demand for integration, how can you scale both a technology and a resource pool to meet that demand?  In my opinion, you can’t do that with BizTalk, nor is it designed to excel in these ‘new’ use cases.

As a result, we will continue to hear messages of ‘democratization of integration’ or ‘citizen developers’.  While many will scoff, the need is real.  If I need to connect a SaaS application, such as Salesforce, with an on-premises application, this should take hours and not days (if you need to set up BizTalk environments).  For organizations with an existing broker or ESB, they can turn this around quicker than an organization without one, but not as quickly as an IPaaS platform.  At the end of the day, organizations don’t do integration for the sake of integration but rather for a business opportunity, and the old adage of ‘time is money’ could not be truer in today’s economy.

The biggest challenge, and a common rebuttal, in a simplification scenario is that integration can be complex.  This is true and this will not go away.  Recently, my team has been involved in a complex energy trading implementation with many complex, large interfaces carrying critical data.  I am very confident in saying that this was not a good use case for IPaaS, at least not at this time.

However, I also run into scenarios such as SaaS connectivity where I don’t need the heavy broker.  So clearly I can relate to both ends of the integration spectrum.  For customers, lowering the barrier of entry for building interfaces is a good thing.  Expert integrators will continue to be required to address more complex scenarios and develop the right patterns and architectures, but we will also see integration tools being made available to mobile and web developers to build interfaces in a timely and cost efficient manner.  Ultimately this will allow Microsoft to grow both the platform and the ecosystem.  A healthy ecosystem is good for all parties.

image_thumb[3]

Conclusion

I am sure everyone reading the roadmap would love a magical, cohesive platform that combines both BizTalk Server with Azure App Service yesterday.  BizTalk was not built overnight, and similarly it will take time for the convergence of these two platforms to happen.  The good news is that we have official confirmation from Microsoft on where they are headed which is a great step while we await the bits to arrive.

Take this for what it is worth, but here is how I am acting on this roadmap.

  • Continue to use BizTalk for its strengths.  If you have complex integration needs that deal with on-premises systems, or complex messaging patterns,  continue to use BizTalk for those purposes. 
  • The SQL Server AlwaysOn feature may be worth the price of an upgrade alone from a Disaster Recovery perspective.  Let’s be honest, DR with the current BizTalk version is not ideal.
  • Where you have trading partner, API, mobility or SaaS connectivity requirements, look to the modern IPaaS platform.  I am a big fan of keeping this type of integration on the edge of my enterprise.  I don’t want to open up firewalls and manage those configurations using legacy approaches.  It is very easy to do so using Azure API Management and API Apps.
  • Azure Service Bus is a great way to bridge on-premises workloads with IPaaS connectivity.  It also enables Pub-Sub for Logic Apps.
  • Vote early and vote often! The Azure App Service team is very interested in feedback.  If you think something is missing, add it on User Voice here. Back in May I created a topic for allowing BizTalk to talk to API Apps in order to allow for SaaS connectivity in BizTalk.  While I cannot take credit for this feature being included in BizTalk 2016, it does show that the team is listening.

Tuesday, September 29, 2015

Azure Hybrid Integration Day coming to Calgary

Every year some of the brightest minds in Microsoft Integration descend upon Redmond, Washington for the Microsoft MVP Summit. This year 3 MVPs (Saravana Kumar, Steef-Jan Wiggers and Michael Stephenson) from Europe will be stopping by Calgary on their way to the Summit and will be giving some presentations.  A local Microsoft employee, Darren King, and I will also be presenting.

I have shared the stage with these MVPs before and can vouch that attendees are in for a unique experience as they discuss their experiences with Microsoft Azure and BizTalk Server.

During this full day of sessions you will learn about how BizTalk and Microsoft Azure can address integration challenges. Session topics include SaaS connectivity, IoT, Hybrid SQL Server, BizTalk administration & operations and Two Speed IT using Microsoft Azure. Also bring your burning questions for our interactive Ask the Experts Q & A.

The free event takes place on October 30th, 2015  at the Calgary Microsoft office.  You can find more details here.

Monday, May 11, 2015

/nSoftware Powershell Adapter for BizTalk Server

 

In the past I have blogged about /n Software and their SFTP Adapter here and here.  Hard to believe one of those posts goes back to 2007. One thing that /nSoftware continues to do is add new adapters to their suite.  In this case it is the PowerShell Adapter.

I can’t say that a PowerShell adapter was previously on my radar until a scenario was brought to me.  We have a very specialized piece of software that does “analysis” (I will leave it at that for now).  This software is essentially a library that has been wrapped in an exe.  This exe receives a series of parameters, including a path to a file on which it will perform its analysis.

A suggestion was brought up about calling this exe using PowerShell.  While I am sure we could call it from .NET, the PowerShell approach warranted some investigation.  Sure enough, a web search turned up an /nSoftware offering, and sure enough we already had it installed in all of our environments.

Since BizTalk is going to deliver the flat file used as an input to this process, I decided to check out the PowerShell Adapter and allow BizTalk to orchestrate the entire process.  For the purpose of this blog post I will over-simplify the process and focus more on a POC than the original use case.

As part of the POC I am going to receive an XML file that represents our parameter data.  We will then send this same message out through a Send Port that is bound to the /nSoftware PowerShell adapter.

In order to help illustrate this POC, I have a console application that will simply receive 3 parameters and then write the contents to a file in my c:\temp folder.  The reason I am writing to a file is that when I call this exe from PowerShell I don’t see the console window being displayed.  Perhaps there is a way to do that, but I didn’t look for a solution.

using System;

namespace PowerShellPOC
{
    class Program
    {
        static void Main(string[] args)
        {
            // Capture the three expected command-line parameters.
            string[] lines = { args[0], args[1], args[2] };

            // WriteAllLines creates a file, writes a collection of strings to the file,
            // and then closes the file.
            // "ddMMyyyyHHmmss" uses 24-hour hours (HH) and minutes (mm) in the right order.
            string filename = DateTime.Now.ToString("ddMMyyyyHHmmss") + ".txt";
            System.IO.File.WriteAllLines(@"C:\temp\" + filename, lines);
        }
    }
}

 

In hindsight, I should have just built a send port subscription but here is my orchestration.

image

Using a standard FILE – receive location

image

On the send side, things start to get a little more interesting. We will create a Static One-Way port and select the nSoftware.PowerShell.v4 Adapter.

image

Within our configuration we need to provide a Port Name (which can be anything) and our script.

image

If we click on the Script ellipsis, we can write our PowerShell script.  In this case we are able to retrieve the message that is moving through our Send Port and pass it into our executable.

image

If we only want specific data elements, we can also use $param3 = $xml.arguments.ReturnType

In this case “arguments” is our root node of our BizTalk Message and “ReturnType” is a child node in our XML Document.
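To make the idea concrete outside of the adapter, here is a hedged sketch of what a script along these lines could look like.  The exe path, parameter values and payload are made up for illustration; inside the adapter the message body is supplied by the Send Port rather than a literal string.

```powershell
# Hypothetical payload shaped like the POC's BizTalk message
[xml]$xml = '<arguments><ReturnType>Summary</ReturnType></arguments>'

# Two hard-coded parameters plus one pulled from the XML payload
$param1 = 'ModeA'
$param2 = 'Verbose'
$param3 = $xml.arguments.ReturnType

# Invoke the console application with the three parameters
& 'C:\Tools\PowerShellPOC.exe' $param1 $param2 $param3
```

Casting the string to [xml] is what makes the dot-notation ($xml.arguments.ReturnType) possible, since PowerShell exposes XML nodes as properties.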

When we go to process a file, we will find that our text file has been created and it contains our 3 parameters: 2 that are hard-coded and our BizTalk Message payload.

image

Conclusion

When I think of BizTalk, I don’t necessarily think of PowerShell.  But there will always be times when you need to perform some function that is a little bit off the mainstream.  What I do like about this approach is that there was no additional custom development required to support the solution, and we can use the actual BizTalk message in our PowerShell script.

I am still exploring the capabilities of the adapter, but after a dialog with the /nSoftware team I understand that remote PowerShell scripts can be run, and we can also use Dynamic ports and Solicit-Response ports if we want to return messages from our PowerShell script to BizTalk.

For more details please check out the /nSoftware website.

Saturday, May 9, 2015

BizTalk 2013 SharePoint Adapter not respecting SharePoint 2013 View Name

 

I have done a lot of BizTalk-SharePoint Integration in the past and ran into a situation recently that surprised me. There wasn’t an easily identifiable resolution online so I have decided to document this for the benefit of others.

Background

We have a process that requires a user to approve a financial summary document in SharePoint.  Once the document has been approved, BizTalk will then fetch the details behind those financial transactions, from another source system, and send them to SAP.

In the past I have leveraged SharePoint views as a way for BizTalk to pick up messages from a SharePoint document library. The way to achieve this is to rely upon metadata that can be populated in a SharePoint document library column.

Adding a custom column to a document library is very simple.  Under the Library tab we will discover the Create Column button. We can simply click this button and then add a column and related properties as required.

image

With our custom column created, we can now create a view for BizTalk to “watch”.  In our example we were dealing with an approval workflow.  We can create our custom column called Status, and when BizTalk initially publishes this financial summary document (for users to approve), we can use the SharePoint adapter to populate this column with a value of Pending.  After a user has reviewed the document, that Status value can be changed to Approved.

Since we don’t want BizTalk to move Pending documents, we will create a view that will only show Approved documents.  To create a custom View we can once again click on the Library tab and then click on Create View.

 

image

For our purposes a Standard View will be sufficient.

image

We need to provide a View Name and can also indicate where we want this column to be positioned.

Tip – In my experience, SharePoint entities with spaces in their names can behave oddly.  My advice is to avoid spaces in names where possible.

image

Lastly, since we only want Approved documents to show up in this view, we need to add a filter.

Within our filter we want to Show items only when the following is true:

Status is equal to Approved

image

We can now save our view and test it. To test it we will upload two documents.  One will have the Status of Approved and the other will have a Status of Pending.  When we click on All Documents we should see both documents.

image

When we click on our view for BizTalk, which in this case is called BizTalkMoveView we will only see our Approved document.

image

From a SharePoint perspective we are good and we can now create our SharePoint Receive Location in BizTalk.  For the purposes of this blog post I am using a Send Port Subscription; I will receive the message from SharePoint and then send it to a File folder.

In our BizTalk Receive Location configuration we are going to use the Client OM, which in this case is the SharePoint Client Object API.  This allows us to communicate with SharePoint without having to install any agents on a SharePoint server.
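As an aside, the kind of call the Client OM makes on BizTalk's behalf can be sketched in a few lines of CSOM code.  This is an illustration rather than what the adapter literally executes, and the site URL, library name and view name below are assumptions based on this walkthrough.

```csharp
using Microsoft.SharePoint.Client;

// Connect to the site remotely; no agent is required on the SharePoint server.
var ctx = new ClientContext("http://sharepoint/sites/finance");
var list = ctx.Web.Lists.GetByTitle("Financial Summaries");

// Pull the view's CAML filter (e.g. Status = Approved) so we can query with it.
var view = list.Views.GetByTitle("BizTalkMoveView");
ctx.Load(view, v => v.ViewQuery);
ctx.ExecuteQuery();

// Query the library using the view's filter; only matching documents return.
var query = new CamlQuery { ViewXml = "<View><Query>" + view.ViewQuery + "</Query></View>" };
var items = list.GetItems(query);
ctx.Load(items);
ctx.ExecuteQuery();
```

The bug described below is equivalent to the adapter skipping the view's filter and querying the whole library.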

We also need to configure our SharePoint Site URL, Source Document Library URL and View Name.

image

When we enable our Send Port and Receive Location, we should receive 1 file in our file folder, right? WRONG! Both files were picked up and moved to our file folder even though we have a View configured.

image

If we go back to SharePoint we will discover both documents are gone.

image

Issue

The issue is that for some reason, BizTalk 2013 is not using/respecting the View Name property that is available in the Receive Location Configuration.

Resolution

The resolution is to install BizTalk 2013 CU 2. The download and more details about CU2 can be found here.

Before you install, the recommended approach from Microsoft is:

  • Stop all host instances
  • Stop SQL Server Agent which is responsible for running the internal BizTalk jobs
  • Perform a Database Backup

Running the CU2 exe is pretty straightforward and only takes a couple of minutes.  I wasn't prompted for a reboot but decided to go that route regardless.

After applying the CU, I uploaded two documents again.  One had a Status of Approved while the other had a Status of Pending.

image

Our BizTalkMoveView is also displaying our file correctly

image

When we enable our Receive Location we will discover that only our Approved file has moved.

image

image

Our document that was in a Pending state remains in SharePoint as expected.

image

Conclusion

BizTalk 2013 was the first version that had support for the SharePoint Client Object Model.  So I am not sure if this bug only occurs when you are using the Client OM within the BizTalk Receive Location.  I do know that in previous versions of BizTalk this was not an issue.  However, those versions of BizTalk relied upon the SharePoint Adapter Service being installed on the remote SharePoint server.  Using the Client OM is the way to go, as it also allows you to communicate with SharePoint Online/Office 365.