Thursday, December 31, 2009

BizTalk Adapter Pack 2.0 Migration Wizard – SAP IDOCs

I have had a couple of readers ask me questions about the BizTalk Adapter Pack Migration Wizard and how well it migrates projects that leveraged the .Net Connector based SAP Adapter.  This is something that I have been interested in exploring for quite some time but just hadn’t gotten around to.  When we initially went live with our BizTalk 2009 environment we decided to reduce complexity and not migrate the SAP projects to the new WCF based adapter, knowing that we had a little time to take care of this task.  We have now started to migrate some of these apps to use the WCF based adapter, so I figured I would re-create a use case from an existing integration scenario to see how well the wizard works.

In order to re-create this use case I decided to start with a clean slate by creating a new BizTalk 2009 project and then adding an extended CONF32 IDoc called ZConf32.  If you are unfamiliar with the term “extended IDoc” please see one of my previous posts.  Some of this post may be redundant, as it discusses the old adapter, but it could also help someone trying to integrate with SAP using the legacy adapter.

Note:  From here on in I will refer to the .Net Connector SAP adapter as the legacy adapter and the BizTalk Adapter Pack 2.0 SAP adapter as the WCF adapter.  Nothing in this post covers the BizTalk Adapter Pack 1.0 SAP adapter; however, I would assume that most scenarios that apply to the 2.0 adapter also apply to the 1.0 adapter.

Creating the BizTalk project

  • Adding a legacy Generated Schema to the project

image

 

  • With the legacy adapter you need to “Add Adapter Metadata”, but with the WCF adapter you would want to “Consume Adapter Services”

image

  • Select the “SAP” adapter and then select a configured port that will be used to connect to an SAP instance.  This is a feature I miss in the WCF adapter: when schema generation fails, you need to re-establish the connection to SAP by providing the URI, username and password again.  I usually copy this information into Notepad, but the solution is not as clean as it is with the legacy adapter.

image

  • Since this IDoc has been extended, it is referred to as “CONF32-ZCONF32”.

image

  • The latest released version will show up by default

image

  • The Wizard working…

image

  • Success

image 

  • Two artifacts are added to my project: An Orchestration and my CONF32.xsd schema.  I have gone ahead and renamed these two artifacts accordingly.

image

  • The ZCONF32 IDoc will be sent to SAP, so we need a source document that we will transform into the ZCONF32 IDoc.  The input message captures the time a fleet vehicle spends working in the field; I work in the Energy industry, and we need to record the amount of time that our fleet vehicles are in the field.  There is some additional information I need to provide SAP, called a control record, which describes the message type, IDoc type and partner profile, to name a few of the data elements.  I have captured this information inside String Concatenation functoids.

image

  • Here is where the “core” data is being mapped into the IDoc.

image

  • The legacy adapter treats an IDoc as a flat file, so in order for BizTalk to assemble a flat file we need to create a Send Pipeline and assign the IDoc schema to the Flat File Assembler.

image

image

  • The end solution looks like the following:

Note: This scenario could easily be accomplished as a messaging only scenario, but I have decided to provide an orchestration for illustrative purposes.

image

  • My receive location is your standard local file drop

image

  • In the Send Port, I want to choose the legacy SAP adapter and make sure to select the Flat File Pipeline that will assemble a flat file message for SAP.

image

  • If you navigate to transaction “WE02” in the SAP GUI (assuming you have permissions) you can see all of the IDocs that SAP has sent/received.  The status of each IDoc is also captured here (green light) and indicates whether processing succeeded, failed or is still in progress.

image

  • You can drill into a specific IDoc to validate that the data BizTalk sent is complete and accurate.

image

So that completes creating a fairly simple integration scenario with SAP using the legacy adapter.   Next we will take a detailed look at what is involved in migrating this project to use the new WCF Adapter.

  • First, you will need to download the BizTalk Adapter Pack 2.0 Migration Wizard, which you can find here. Unzip the file to a location on a BizTalk Server that has a functional installation of the WCF-SAP adapter. You will find the following three files, but don’t run “BizTalkAdapterPackMigrationTool.exe” just yet.

clip_image002

  • Ensure that you have the following prerequisites already installed and then copy Microsoft.WizardFramework.dll into this folder.

Ensure the following are installed:

    • Microsoft BizTalk Adapter Pack 2.0
    • WCF LOB Adapter SDK SP2
    • BizTalk Server 2009 or BizTalk Server 2006 R2
    • Visual Studio 2008 or Visual Studio 2005

To setup the migration directory:

    • Copy %Microsoft Visual Studio Install Dir%\Common7\IDE\Microsoft.WizardFramework.dll to the same folder.

clip_image004

 

  • Now you should be able to run the “BizTalkAdapterPackMigrationTool.exe”

clip_image002[5]

  • I am now going to load the project that I previously demonstrated into this tool

    clip_image002[7]

     

  • Provide the URI for the SAP instance that you would like to connect to.  This would be a good place to leverage a reusable connection, such as a configured send port, since you need to provide these details every time you run the tool.

image

 

  • Next

image

 

  • The Wizard has determined that I am using the ZConf32 IDoc in my project and offers to generate a new schema using the WCF adapter.

image

  • You have the option to specify a release version of an IDoc.  In the legacy project I generated a schema based upon version 700, so I am going to select 700 for this IDoc.  As mentioned, BizTalk will be sending this IDoc to SAP, so I will select the “Send” operation.

image

  • Gotcha!!! I previously mentioned that I am using an extended IDoc called ZConf32.  The base IDoc is called “CONF32”, so I need to type this information into the “BaseType” drop down.  This drop down is not populated with any data, so you need to provide it yourself.

image

  • If you don’t provide this information you will probably be prompted with an error similar to the following:

image

 

  • You have two options when running the migration wizard:
    • The first option is intended to be less intrusive to your existing BizTalk solution and takes advantage of outbound or inbound maps, depending upon your scenario, that are applied at your physical ports.  You are essentially modifying your solution at the edges, which has some advantages.  I will touch on this option a little more at the end of this blog post.
    • The second option is to update the existing solution with the new artifacts, including maps and orchestrations.  This option is more intrusive as you are making changes inside your solution that will impact the way your application functions.

So which option is better?  Like most things in IT, it depends.  If you have a complex solution with very “busy” maps I would opt for the first option – making changes at the edges.  This option poses less risk as you are limiting the amount of change you are introducing.  It does still require some updates to the core application due to the synchronous nature of the WCF based adapter.  Previously IDocs were sent asynchronously (one way), whereas now they are processed using a Request-Response (two way) exchange.  So if you had an Orchestration with a logical Send Port and Send Shape, you would now need to account for the response message coming back into the Orchestration.

If you have a simple integration like the one I am describing, I would opt for option #2.  Adding additional maps will decrease performance (most likely marginally) and also makes the application more complex to support operationally, as you are now maintaining additional maps.  If the same developer who built the BizTalk solution is migrating it, they would probably also feel more comfortable with option #2.  If you have brought someone in to perform the migration, they may feel more comfortable with the first option.

 

image

 

  • Success

image

  • Conversion report

image

 

  • When I open up the migrated solution, I notice a few new artifacts, including my IDoc Request message (SAPBinding3.xsd) and my IDoc Response message (SAPBinding4.xsd).  I also have a folder called “NewSchemaTypes” that includes some of the complex types that the Request message uses.  Like any schema generated by the BizTalk Adapter Pack, I also have a binding file called “SAPMigration_Binding.xml”.

image

image

  • As advertised, the migration option that I chose has updated my orchestration, including the message type for a message called “msgZConf32”.

image

  • It also updated the Logical Port Type to use the message type of the WCF based schema.

image

  • What it failed to do is update the map that transforms the incoming request document into the ZConf32 IDoc.  What I decided to do was replace the legacy schema with the new WCF based schema.

image

image

  • Any existing functoids remained in the map; however, I had to reconnect all of the links.

image

  • Although the Migration Wizard updated my logical port, I needed to add a new port type since the old port type was a one-way send port.

image

image

image

  • I also had to create a new response message, which I called “msgZConf32Response”, and add a Receive shape that is responsible for handling the response message from SAP.

image

  • At this point I was ready to build and deploy.  The build process was successful but the deployment process was not.
Error 1 Failed to add resource(s). Resource (-Type="System.BizTalk:BizTalkAssembly" -Luid="SAPMigration, Version=1.0.0.0, Culture=neutral, PublicKeyToken=bb0707bffbca2a29") is already in store and is either associated with another application or with another type.

Having seen this error before, I figured that the name of the application had probably been removed from the project file.  So I just added the Application Name back and then I was able to deploy.
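For reference, a hedged sketch of the kind of fragment involved — in my experience the deployment Application Name is stored as an MSBuild property in the project settings, though the exact file and element placement may vary by setup:

```xml
<!-- Sketch only: assumes ApplicationName is an MSBuild property in the
     BizTalk project settings; placement may differ in your solution. -->
<PropertyGroup>
  <ApplicationName>SAPMigration</ApplicationName>
</PropertyGroup>
```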

image

  • I still had the legacy Send Port configured in my application so I needed to add a WCF based Send Port.

image

  • In order to add this send port, I needed to import the binding file that was generated by the migration wizard

image

image

image

Let’s explore what is included in the binding file that was generated by the wizard:

  • Notice that BizTalk uses the WCF-Custom adapter with an SAP binding; you no longer need to build flat file pipelines when communicating with SAP using this adapter.

image

  • You will need to update the action, as the one included in the binding file will result in an error being generated:

The adapter failed to transmit message going to send port "processVehicleTime_PortType_SendZConf32" with URL "sap://CLIENT=XXX;LANG=EN;@a/SAPSERVER/XX". It will be retransmitted after the retry interval specified for this Send Port. Details:"Microsoft.ServiceModel.Channels.Common.UnsupportedOperationException: Incorrect Action <BtsActionMapping xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">

<Operation Name="Operation_SendZConf32" Action="http://Microsoft.LobServices.Sap/2007/03/Idoc/3/CONF32/ZCONF32/700/Send" />

</BtsActionMapping>. Correct the specified Action, or refer to the documentation on the allowed formats for the Actions.

Check out this link for more details.
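To make the fix concrete, here is a hedged sketch of what a corrected action mapping might look like on the WCF-Custom send port.  The operation name (“SendZConf32” here) is my own placeholder; the key point is that it must match the operation name defined on the orchestration’s logical port, or you can bypass the mapping entirely by entering the single action URI as the SOAP action:

```xml
<!-- Sketch only: the Operation Name below is hypothetical and must match
     the operation on the orchestration's logical send port. -->
<BtsActionMapping xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Operation Name="SendZConf32"
             Action="http://Microsoft.LobServices.Sap/2007/03/Idoc/3/CONF32/ZCONF32/700/Send" />
</BtsActionMapping>
```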

image

  • Here I am using the default values including enabling BizTalk compatibility and Safe Typing.

image

  • Don’t forget to include your credentials, as they are not brought in with the binding file.

image

  • One last thing that I need to do before starting the application is to bind the Orchestration to use this new Send Port.

image

  • At this point I have started the application and processed a message.  The message was successfully submitted to SAP as confirmed in the WE02 transaction.

image

image

  • From a BizTalk tracking perspective, this is what the old and new processes look like:

Old process

clip_image002[9]

New Process – since we are using a Request-Response port type, we now receive a message back from SAP (XMLReceive)

clip_image004[5]

 

  • As previously mentioned, I decided to use the intrusive approach when migrating this application.  I am now going to quickly walk through the less intrusive approach, which involves mapping at the edges to deal with the differences between the legacy and WCF based schemas.  Unfortunately this did not migrate as I would have expected, so I don’t have a whole lot to show you.

image

 

image

  • The project was not updated as advertised.  Only the new schemas were created; no new maps were created.

image

image

  • The orchestration remained the same, as advertised

image

image

 

Summary

So as you can see, the migration wizard had mixed results. I am not quite sure why the map, in either scenario, was not created/modified.  The map itself was not overly complex, although it did include some functoids.  Another thing I did not like was that you don’t have the ability to apply a naming convention to the generated schemas.  If you were to add these schemas to a BizTalk solution from Visual Studio you do have the ability to provide a prefix, so I am not sure why the same feature could not be used in the migration tool.

If you use option #2 (the intrusive approach) it will update your ports and messages to use the new schema, so I do think that there is some value in using the migration tool. Also, if you want to update several IDocs within one project, you should be able to perform the migration by running the tool once. I definitely do not think that the tool is a silver bullet, and you will need someone who is knowledgeable with BizTalk to perform these migrations.  In the tool’s defense, it is not officially supported but is there to act as an aid, and I do believe that it accomplishes this.  All in all, it took me about 20 minutes to migrate my fairly trivial application, so the effort was not monumental.

If you are interested in seeing how this tool works with the new WCF based SQL Server adapter please check out Richard Seroter’s post.

Tuesday, December 8, 2009

BizTalk Server Futures and Roadmap

I realize that I am a little late to this party, but figured I would provide my thoughts on the BizTalk Server Futures and Roadmap presentation by Balasubramanian Sriram that occurred at PDC 09 this past November. To view a replay of his presentation check out the following link. Also note that some of the content from this post has been 'borrowed' from his presentation.


Current State
  • BizTalk adoption continues to be strong. Currently over 10,000 global customers are benefiting from BizTalk Server. If my memory serves me correctly, they have added approximately 1,500 customers over the past two years.
  • Major industries using BizTalk include Electronic Parts companies, Telecommunications, Aerospace and Defense, Chemical Companies, Railroads and Insurance. In Canada I am aware of a few other industries that are also benefiting from BizTalk Server including Utilities, Oil and Gas, Healthcare and Government.


Short-term Road Map

Here is a list of some of the Customer "wants" that are being addressed:

Platform support

  • Platform Alignment (Windows Server 2008 R2, SQL Server 2008 R2, Visual Studio 2010)
  • Tighter integration with Windows Workflow
  • Common Application Model (scaling up .net apps into an integration server)

Productivity

  • Ease of use in transformation scenarios
  • Trading Partner Management
  • Out of box connectivity (more adapters)

Enterprise Capabilities

  • ESB
  • Enterprise manageability (consolidated view of integration assets)
  • Low latency scenarios
How are some of these wants going to be addressed in the short term?
BizTalk Server 2009 R2


Platform support
  • VS 2010, Windows Server 2008 R2, SQL Server 2008 R2
Productivity Improvements
  • Single dashboard to apply and manage performance parameters
  • Out of box support for event filtering and delivering (RFID)
  • Powershell access to management tasks
  • New SCOM object model to better reflect BizTalk Artifacts

B2B Scenarios made easy

  • Mapper enhancements to make complex mapping easier to create and maintain
  • FTPS to provide secure transactions between businesses
  • Updated B2B accelerators for latest protocol versions
I am extremely encouraged to see these added features in an R2 release, especially since it seems like BizTalk 2009 shipped just yesterday. In particular, I am looking forward to the following features:
  • Platform support - It is nice to see that we should not have to wait as long as we previously did to be able to leverage some of these new technologies. It is always a bit of a downer when your organization is trying to stay current with technology only to have your application "slow the train down" because your application is not supported on the latest version of OS/DB/IDE etc.

  • Single Dashboard - It would be nice to get a composite view of your BizTalk environment from a single source. This Dashboard is also supposed to provide performance metrics and server health from one location. When working in an enterprise environment, multi-node groups are the norm, not the exception, so it will be nice to have improved visibility.

  • New SCOM object model - I haven't been overly impressed with the SCOM management pack for BizTalk 2009. I find that some of the terminology used in the management pack does not align with the terminology in the BizTalk Administration Console. For instance, SCOM may notify you that you have an "Adapter" that is currently offline. Nowhere in the BizTalk Admin Console can you turn on/off, enable/disable an adapter. You can certainly add or delete an adapter, but I am pretty sure that is not what SCOM is referring to. You can, however, enable/disable a receive location or stop a host instance, which will impact an adapter's ability to function, but SCOM doesn't explicitly describe it this way. I also feel that the SCOM management pack doesn't correctly understand clustered host instances. I have received many alerts indicating that clustered host instance 'XYZ' is not running on Server 'b'. The reason for this is that the host instance is currently active on Server 'a', and the nature of a clustered resource is that it runs in an active/passive state by design.

  • Mapper Enhancements - Balasubramanian was correct when he mentioned the mapper as a "value add" for BizTalk. Since this is the case, then more should be done to improve a developer's productivity when using this tool. I have shown this tool to SAP resources describing how great the tool is only to find them cringing as we try to follow the connector lines when mapping an IDOC. I am extremely encouraged by what I saw and can definitely benefit by some of the new features described including only showing nodes that are involved in a map, auto scrolling, moving sections of maps to a new page and searching for nodes.




  • FTPS Adapter - This is a welcome change for me. The reality is that not every single interface is going to use the latest version of WCF. Providing an updated FTP adapter that supports security will be beneficial for BizTalk customers. I am still waiting for an update to the POP3 and SMTP adapters so that we can communicate with Exchange over MAPI. There is perceived risk associated with POP3 that makes Exchange administrators cringe when you ask them to open up POP3 connectivity. Yes, 3rd party adapters do exist, but after several years I do have expectations that something like this could be included in the box.

Future



Fear not, Microsoft will continue to support the BizTalk investments that have already been made to date. However, BizTalk will eventually run on top of AppFabric.

Here are some of the features that are envisioned for the future:

  • Use data contracts in maps and have the ability to transform to a schema and vice versa. One of the limitations of AppFabric (Dublin), in its current state, is the lack of a mapper tool to allow for transformation of messages. This new capability should fill in this current gap and also allow you to mix the use of data contracts with schemas in maps.
  • Create workflow activity based on map
  • Low-latency scenarios versus durable messaging. This is a feature that many have been asking for. It is really intriguing in the sense that you can choose which path you would like to take when designing your message interactions. In Balasubramanian's scenario he wanted to book a trip. For read requests, durable messaging is probably not required, so why take on the additional performance hit that using the MessageBox incurs? However, when performing insert or update operations you probably want to use durable messaging to ensure that people's trips are actually booked. Using this hybrid approach allows for the best of both worlds within the same technology stack. Cool!
  • BizTalk will use innovations in AppFabric while preserving your investments
  • The current demo included Workflow running in a BizTalk host – the long term vision is for BizTalk to run in an AppFabric host

A question that I have is: currently BizTalk is a product and has a licensing cost associated with it. AppFabric is really a framework that has no licensing costs associated with it. As these two platforms converge, which "components" will continue to have a licensing cost versus no licensing costs?

Overall it appears to me that the future for both of these platforms is extremely bright and Microsoft continues to make key investments into integration.

Sunday, November 8, 2009

BizTalk 2009: Windows Server 2008 and SAMBA shares

Another adventure that we experienced during a recent BizTalk 2009 cutover was the behaviour that Windows 2008 has on SAMBA shares.  For those of you who are unfamiliar with SAMBA shares they basically provide you the ability to access *nix shares from Windows based computers.

In our previous setup (BizTalk 2006 running on Windows 2003) we had no issues communicating with SAMBA shares. Our infrastructure team recently updated the test system we connect to, to SAMBA version 3.x or greater, and there we had no issues communicating from our Windows 2008/BizTalk 2009 servers.  When we went live with this cutover in Prod, the SAMBA version had not yet been updated in that environment and we were running an older version of SAMBA (2.2.x).  The result was the following error, which led to Host Instances going offline.

 

Event Type:        Error
Event Source:    BizTalk Server 2009
Event Category:                (1)
Event ID:              6913
Date:                     11/7/2009
Time:                     7:00:50 PM
User:                     N/A
Computer:          Server
Description:

An attempt to connect to "BizTalkMgmtDb" SQL Server database on server "SQLServer\SQLInstance" failed.

Error: "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication."

For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

The underlying cause of these errors was that the BizTalk Host Instance account, which is used to connect to these shares, was being locked out due to the issues in connecting with SAMBA.  Pretty much a BizTalk Developer/Admin/Architect’s worst nightmare.

After performing some online searches we ran into the following article. The article simply states that “Windows Vista and Server 2008 have a default version requirement of MS-LAN Manager communication that prohibits communication to older Linux-based Samba installations. This can be fixed via group policy or the local security policy.”

You can read the article for more details, but what helped us was setting the LAN Manager authentication level to “Send LM & NTLM responses”.  After forcing a Group Policy update we were back in business, thanks to the help of a few members of our infrastructure team.
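For reference, a sketch of the setting we changed.  The policy lives under Local Policies in the security policy editor; the equivalent registry value is the well-known LmCompatibilityLevel (to my knowledge, level 0 corresponds to “Send LM & NTLM responses” — verify against your own policy documentation before rolling this out via GPO):

```
Security Settings > Local Policies > Security Options
  "Network security: LAN Manager authentication level"
    = "Send LM & NTLM responses"

Registry equivalent:
  HKLM\SYSTEM\CurrentControlSet\Control\Lsa
    LmCompatibilityLevel (REG_DWORD) = 0
```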

We also looked into providing authentication credentials on the receive location, but that wouldn't help since those are only used when the Host Instance account does not have access.  The BizTalk user did have access, so its credentials are sent before the configured credentials are ever used.

image

BizTalk 2009 – NSoftware FTP Adapter FTP protocol error: 550 rename: Cross-device link

During a recent Production implementation of BizTalk 2009 we ran into an issue with the temporary folder option in the NSoftware FTP adapter.  The error itself is not an NSoftware bug, but rather a feature/limitation of the FTP Protocol.

Event Type:    Error
Event Source:    nsoftware BizTalk FTP Transmit Adapter
Event Category:    None
Event ID:    0
Date:        11/7/2009
Time:        2:50:22 PM
User:        N/A
Computer:    Server
Description:
Source: nsoftware BizTalk FTP Transmit Adapter (3.5.3488.0)
Source URI:
FTP://%User%@%FTPServer%:21/%RemotePath%/%RemoteFile%
Message Type: Error
Current Thread: 71
Transmission failed for message "019ef16e-0b57-4309-8a37-b5c05a61e7f9": Error uploading FTP data: FTP protocol error: 550 rename: Cross-device link

If this error persists, you may enable Warning or Verbose logging modes to enable the adapter to report more information regarding the progression of this error.

 

Our scenario involved writing files in a specific sequence using delivery notification, and using a temporary folder to support some reliability requirements.  The idea behind using a temp folder is that you get the data to a “staging” area, and once the adapter can confirm that all data was written it performs a rename operation on that file, which is extremely quick.  This also reduces the risk of a consuming application retrieving the file while it is being written.  That is especially important in *nix environments, since file-level locking is not implemented the same way as it is in Windows environments.

Here is how our destination folders have been configured:

/home/work

/home/arch

/home/temp

Since we are using delivery notification, the “work” file has to be written first.  Once that operation has completed, the arch (archive) file can be written.  Both of these messages are sent through a dynamic send port, and we have a routing rules repository where we store the specifics of where each message is supposed to be delivered.  Our configuration called for both operations to use the same temp location. (Hindsight is 20/20.)

In Test, everything worked perfectly, but when we ran this process in Prod we ended up with the aforementioned error.  The difference between the environments is that in Prod the “arch” folder exists on a separate disk from “work” and “temp”.  Essentially, the FTP protocol cannot perform the rename operation from the “temp” folder to a folder on a different disk that has been mounted to the same server.

I was able to validate this by opening a command prompt and establishing an FTP session.  Once connected, I tried to rename a file and place it on the “arch” disk.  I got a similar error:

ftp> rename list.txt /home/arch

350 File exists, ready for destination name

550 rename: Cross-device link

ftp>
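The 550 response is the FTP server surfacing the same limitation a local rename has: a rename cannot cross filesystems (EXDEV).  Outside of BizTalk, the usual fallback is a copy-then-delete.  Here is a small Python sketch of that pattern (illustrative only — the function name is mine, and this is not how the NSoftware adapter is implemented):

```python
import errno
import os
import shutil
import tempfile

def safe_publish(src, dest):
    """Move src to dest.  os.rename is atomic on the same filesystem,
    but raises EXDEV across devices, so fall back to copy + delete."""
    try:
        os.rename(src, dest)
    except OSError as exc:
        if exc.errno != errno.EXDEV:
            raise
        shutil.move(src, dest)  # copies the data, then removes src

# usage sketch: stage a file under a temp name, then "rename" it into place
work = tempfile.mkdtemp()
src = os.path.join(work, "list.txt.tmp")
with open(src, "w") as fh:
    fh.write("payload")
dest = os.path.join(work, "list.txt")
safe_publish(src, dest)
```

In the FTP scenario the actual fix was different — keep the temp location on the same disk as the destination — but the underlying failure mode is the same.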

 

In a previous blog post I discussed that NSoftware used to have a property called temporary file extension.  This property is no longer exposed in the Send Port configuration but is still supported when you dive into their documentation.  To use this property, click the “…” button beside the “Other” property.

 

image

In this property dialog box, populate TemporaryUploadExtension=.tmp

image

This feature acts similarly to the temp folder; the difference is that the “temp” file is written to the same folder as the file you are transferring.  A consuming system that respects file extensions will still be “OK” with this situation, since the file name will have “.tmp” appended until the entire file has been written, at which point the rename operation removes the temporary extension.

This situation just highlights how important matching your QA/Test and Production environments really is.  The other lesson is that not everything is exactly as it appears.  In our case the folder structures were exactly the same; the difference was that one of the folders was mounted to a different disk. Finding this out is not a lot of fun when you are in the middle of a go-live.