PS: This is a continuation of this great blog post and aims to address a few of its drawbacks.
One of the most frequent requirements in integration is delivering aggregated data to downstream systems for further processing. BizTalk facilitates this, and there are several aggregation patterns available.
For large message sets, however, many of these patterns fall short. During a recent project we had a similar requirement, and I came across this blog post that solved the problem for us. We introduced a few tweaks because of some specific problems: in particular, the fixed delay before the batch gets written was not safe for us to assume, since the batches were big and the load on the server was high. From a performance and reliability point of view we had to adjust it.
We also tweaked how the header and trailer are written to make it more generic.
Guaranteed delivery and writing of the batch message was achieved by using DeliveryNotification in BizTalk.
Because a delivery failure now raises a notification, the complete batch is reliably written to file; this enabled us to remove the delay while writing the batches and thus made the aggregation reliable.
If you prefer to use a static port, the file adapter should have the "Append to existing file" copy mode selected; for a dynamic port, the same can be achieved by setting the FILE.CopyMode context property to 0 (Append).
Instead of writing the header and trailer while writing the file, we tweaked the solution to write them while reading the file. While reading the file in the orchestration, we can construct the final message as below.
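A minimal sketch of the idea (a hypothetical Python analogue, not the actual orchestration code): while reading the batch file, the aggregated body is wrapped with the header and trailer, streaming in chunks so the body is never loaded into memory at once.

```python
import io

# Hypothetical sketch: wrap the aggregated batch body with a header and trailer
# at read time, copying the body in fixed-size chunks to keep memory usage low.
def wrap_with_envelope(body, header: bytes, trailer: bytes, out, chunk_size: int = 64 * 1024) -> None:
    out.write(header)
    while True:
        block = body.read(chunk_size)
        if not block:
            break
        out.write(block)
    out.write(trailer)

src = io.BytesIO(b"<Rec>1</Rec><Rec>2</Rec>")  # stands in for the batch file stream
dst = io.BytesIO()
wrap_with_envelope(src, b"<Batch>", b"</Batch>", dst)
print(dst.getvalue().decode())  # <Batch><Rec>1</Rec><Rec>2</Rec></Batch>
```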
This can also be implemented at the pipeline level to make it more generic. Once the file is written to disk, a receive location can be used to read the file and send it to the destination systems. The header and trailer are then added by a component defined on the receive pipeline.
With this implementation it was possible to process 2800-3000 batches quickly and with low memory usage.
Transformation is one of the key elements in integration, and BizTalk's support for it is among the best.
Transformations in BizTalk can be done within orchestrations, or at the port level by specifying inbound or outbound maps.
Transformations can also be executed in pipelines, so as to avoid persisting huge source messages to the MessageBox database.
For example, suppose you receive a big message, say 100 MB, containing all the extracted records, but your business process within BizTalk needs only the subset that meets some condition. One way of achieving this is to apply a filtering map at the port level and then feed the business process orchestration with the smaller set. However, only one map can be executed on a port, and in scenarios where multiple maps must be executed on the source, the drawback is that the whole 100-150 MB message will be persisted to the MessageBox before the further transforms apply; this becomes an overhead if there are many such processes.
To overcome this, executing the map at the pipeline level, before persistence, goes a long way in improving performance.
For this we can define a pipeline component with the following properties.
MapFQN is the fully qualified name for the map to be executed.
(The same can be extended to include XSLT transformations as well, e.g. a path to an XSLT file to be executed for the transformation.)
A VirtualStream takes care of offloading big messages to disk during transformation, greatly reducing the memory load on the server.
Quite recently I was working on a project that needed Salesforce data brought from the cloud back into an on-premises SQL database.
The Salesforce REST APIs support queries: a SOQL query executed under the Query resource returns all the results in a single response or, if needed, returns part of the results plus an identifier used to retrieve the remainder.
REST API call: curl https://yourInstance.salesforce.com/services/data/v20.0/query/?q=SELECT+name+from+Account -H "Authorization: Bearer token"
As we can see, the SOQL query must list the field names we want; SELECT * is not supported by SOQL.
The requirement was to fetch all the data. That could have been achieved by specifying all the field names in the query, but then the solution would be dependent on Salesforce updates and modifications.
The next approach is to query the available fields from Salesforce and dynamically create the query to execute.
REST API: https://yourInstance.salesforce.com/services/data/v20.0/sobjects/Account/describe/
The resultant XML contains a <fields> element for each field on the object.
Now we can iterate over all the <fields> elements and collect the field names for our SOQL query.
And then build the query: "SELECT " + GetAllFieldsQuery(describeXml) + " FROM Account".
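The steps above can be sketched as follows (a hypothetical illustration; the XML shape is an assumption based on the <fields> elements mentioned above, and the real describe response contains many more elements):

```python
import xml.etree.ElementTree as ET

# Hypothetical, trimmed-down describe response for the Account object.
describe_xml = """
<Account>
  <fields><name>Id</name></fields>
  <fields><name>Name</name></fields>
  <fields><name>BillingCity</name></fields>
</Account>
"""

def get_all_fields_query(describe: str, sobject: str) -> str:
    """Build a SOQL SELECT listing every field found in the describe XML."""
    root = ET.fromstring(describe)
    names = [f.findtext("name") for f in root.iter("fields")]
    return "SELECT {} FROM {}".format(", ".join(names), sobject)

print(get_all_fields_query(describe_xml, "Account"))
# SELECT Id, Name, BillingCity FROM Account
```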
This way, for any changes to the Salesforce object, the interface does not need to change and is not prone to breaking or producing unexpected results, because the fields always stay in sync between the Salesforce cloud and the on-premises SQL.
The REST API calls are easily made via the BizTalk 2013 R2 WCF-WebHttp adapter. The default response format for the API calls is JSON, and BizTalk 2013 R2 is fully equipped with tools to handle JSON messages. If, however, we want the response as XML, this can be achieved by setting the HTTP header Accept: application/xml.
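For illustration, here is how such a request would be built with the Accept header set (a hypothetical sketch; the URL and token are placeholders):

```python
import urllib.request

# Hypothetical sketch: ask Salesforce for XML instead of the default JSON
# by setting the Accept header on the request.
req = urllib.request.Request(
    "https://yourInstance.salesforce.com/services/data/v20.0/query/?q=SELECT+name+from+Account",
    headers={
        "Authorization": "Bearer <token>",  # placeholder token
        "Accept": "application/xml",        # request XML instead of JSON
    },
)
print(req.get_header("Accept"))  # application/xml
```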
Quite recently we encountered an error in BizTalk WCF send ports when large messages were sent over the wire.
The error reported in BizTalk and the event viewer is a fairly generic one:
We noticed that the error happened whenever the message being sent was big, say 100-150 MB.
To check whether message size was in fact the issue, we increased maxRequestLength="2147483647", which did not help.
For IIS 7 and above there are more settings to modify: a security setting in the config must allow the service to accept bigger messages.
<requestLimits maxAllowedContentLength="1048576000" /> <!-- 1000 MB -->
The setting can also be applied at the IIS level, in which case it applies to all the services hosted on the IIS server.
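Putting both settings together, the service's web.config fragment might look like this (a sketch using the values above; note that maxRequestLength is in kilobytes while maxAllowedContentLength is in bytes):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is measured in KB -->
    <httpRuntime maxRequestLength="2147483647" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is measured in bytes -->
        <requestLimits maxAllowedContentLength="1048576000" /> <!-- 1000 MB -->
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

Depending on the binding in use, the WCF maxReceivedMessageSize on the service binding may also need raising.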
Salesforce will soon be disabling TLS 1.0 support.
Starting in June 2016, Salesforce will begin disabling the TLS 1.0 encryption protocol in a phased approach across impacted Salesforce services. The disablement of TLS 1.0 will prevent it from being used to access the Salesforce service within inbound and outbound connections.
In continuation of the great post on how to call Salesforce APIs via BizTalk, the WCF behaviour can be extended to enforce a specific TLS protocol.
The ApplyClientBehavior method can be modified to apply the security protocol to outgoing messages; by default, older protocols such as SSL 3.0 and TLS 1.0 may be negotiated.
You can pass the protocol in from the configuration like the other parameters.
The WCF behaviour extension registration would then look something like this.
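A sketch of the configuration (all names here are hypothetical; substitute the actual type and assembly of your behavior extension element, and the property name your extension exposes):

```xml
<system.serviceModel>
  <extensions>
    <behaviorExtensions>
      <!-- Hypothetical names: point this at your own extension element class/assembly -->
      <add name="salesforceTls"
           type="MyCompany.Salesforce.TlsBehaviorExtensionElement, MyCompany.Salesforce" />
    </behaviorExtensions>
  </extensions>
  <behaviors>
    <endpointBehaviors>
      <behavior name="SalesforceEndpointBehavior">
        <!-- securityProtocol is a hypothetical property passed through to ApplyClientBehavior -->
        <salesforceTls securityProtocol="Tls12" />
      </behavior>
    </endpointBehaviors>
  </behaviors>
</system.serviceModel>
```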
The tool is built for automating your BizTalk application deployments across multiple servers using the native BizTalk MSIs generated from the BizTalk Admin console.
The deployment tool frees you from the hassles of scripting and logging on to multiple servers to run scripts. You get a single UI from which all your actions can be executed, giving you a complete picture of the deployment steps.
The tool uses the WindowsInstaller library from http://wix.codeplex.com/ to read MSI info.
The tool is intuitive and its various features are easy to use. Please try the tool and post your comments with any questions, wishes, etc.
The BizTalk Application Deployment tool has the following main features:
- Single click multi server deployment of Msi
- Complete view of all the actions being performed during the deployment process on screen
- Use the standard BizTalk generated Msi for deployments
- Has a bunch of productivity tools to assist during deployment and administration
- HostInstances feature assists you in quick host instance operations
- Deployment Health checks
- Artifact Inspector gives you a deeper view of the application’s artifacts such as schemas, maps etc
- MSI Inspector helps you quickly inspect your MSI for resource info
- IIS deployment features are also included in the tool that would help you automate IIS related operations across multiple servers in the farm
- Manage Application pools : Create, Recycle, Delete
- Manage Applications : Create, Delete, Change application pool
- Perform health checks of your BizTalk web services
- BRE deployment tool that saves you from cumbersome BRE deployments
- Assembly GAC feature that facilitates GACing multiple assemblies across multiple farm servers from the UI with a single click
- Easy to use and is intuitive
The other day I was working on an application that uploads files to SharePoint 2013 using BizTalk 2013.
With BizTalk 2013, Microsoft introduced CSOM (the Client Side Object Model), and as a result there is no need to install the SharePoint adapter web service that was required up to BizTalk 2010.
Everything seemed to be working fine until I tried uploading a file of 1.8 MB, and BizTalk failed to upload it with the below exception.
So the error is clearly a server-side exception: SharePoint is not allowing files bigger than 2097152 bytes, i.e. approximately 2 MB.
There is one interesting point to check here: the file being uploaded is 1.8 MB and the allowed size is 2 MB, so why is it failing?
After digging into the SharePoint logs I could see the content of the file, and voilà: the content is a Base64 string. That explains it, as a Base64 string is larger than the original data, so 1.8 MB of data can easily grow beyond 2 MB after Base64 conversion.
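Base64 encodes every 3 input bytes as 4 output characters, i.e. roughly 33% growth, which is exactly why a 1.8 MB payload crosses the 2 MB limit. A quick illustration:

```python
import base64

# Base64 emits 4 output bytes for every 3 input bytes (~33% inflation),
# so 1.8 MB of raw data exceeds SharePoint's default 2,097,152-byte limit.
original_size = int(1.8 * 1024 * 1024)                      # 1,887,436 bytes
encoded_size = len(base64.b64encode(b"x" * original_size))  # 2,516,584 bytes
print(encoded_size > 2097152)  # True
```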
Still, a 2 MB limit is too restrictive, and it does not have to stay that way.
There is a setting in SharePoint to increase this limit; it can be changed with a PowerShell script.
$service = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$service.ClientRequestServiceSettings.MaxParseMessageSize = 2147483647
$service.ClientRequestServiceSettings.MaxReceivedMessageSize = 2147483647
$service.Update()
Using the above script the limit can be raised to 2 GB. After running it, everything went fine, and files of 200-250 MB were uploaded successfully.
Note: it’s the MaxParseMessageSize that actually does the trick 🙂