Campbell 2006-08-01 06:32:13
I’d suggest you use a receive pipeline with an envelope schema to split
this message up into multiple messages.
I would imagine the hardware required to process that size of file
would be prohibitively expensive.
There are good examples of how to do this. You can find them if you
google for “The Bloggers Guide to BizTalk”.
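A minimal sketch of what such an envelope schema might look like, assuming an XML batch file (the element names and namespace here are made up, not from any real project): the is_envelope annotation marks the schema as an envelope, and body_xpath tells the XML disassembler where the repeating records live, so the receive pipeline emits one message per record.

```xml
<!-- Sketch of a BizTalk envelope schema (hypothetical names).
     is_envelope="yes" marks the schema as an envelope; body_xpath
     points the XML disassembler at the container of the repeating
     records, which are then split into individual messages. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:b="http://schemas.microsoft.com/BizTalk/2003"
           targetNamespace="http://example.com/batch"
           xmlns="http://example.com/batch"
           elementFormDefault="qualified">
  <xs:annotation>
    <xs:appinfo>
      <b:schemaInfo is_envelope="yes" />
    </xs:appinfo>
  </xs:annotation>
  <xs:element name="Batch">
    <xs:annotation>
      <xs:appinfo>
        <b:recordInfo body_xpath="/*[local-name()='Batch']" />
      </xs:appinfo>
    </xs:annotation>
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Record" type="xs:anyType" maxOccurs="unbounded" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```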
Alan Smith 2006-08-01 06:38:02
There’s an article on large messages from the BizTalk Core Engine team here:
A 500 MB file with 300,000 records sounds pretty big, so you need to take
care how you handle it. I’ve not tested anything like this, but here are a few
suggestions:
Make sure you have SQL Server Agent running, and the Backup BizTalk Server
task configured and running before you run any tests. Leave the interval at
15 minutes, and clean out the backup directories regularly (you’re gonna be
creating a LOT of log data from the message box database!).
Create a pipeline to split the messages; BizTalk handles a lot of small
messages better than one large one. Receiving a very large message in an
orchestration and looping through it can be a performance killer.
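To illustrate the idea behind debatching (not BizTalk code — a hedged Python sketch, where split_records is a hypothetical helper): stream a large record file and yield small batches, so no single message ever holds all 300,000 records in memory.

```python
# Sketch of what a debatching pipeline does conceptually.
# split_records streams a large record source line by line and yields
# batches of at most batch_size records; each batch would become one
# small message instead of one huge one.

from typing import Iterable, Iterator, List

def split_records(source: Iterable[str], batch_size: int) -> Iterator[List[str]]:
    """Yield successive batches of records from a large line-oriented source."""
    batch: List[str] = []
    for line in source:
        batch.append(line.rstrip("\n"))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:                      # flush the final partial batch
        yield batch
```

Because the source is consumed lazily, memory stays proportional to the batch size, not the file size.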
Use the temporary folder option on the FTP adapter for reliable handling of
large transfers.
Run some tests with increasing batch sizes (100, 1,000, 10,000, etc.) and see
if your scenario holds up under the load.
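The batch-size testing above can be sketched as a simple harness (an assumption, not BizTalk tooling): generate test batches of increasing size, time a processing step for each, and compare. In a real test, process_batch would be your end-to-end submit-and-wait step.

```python
# Hedged sketch of the suggested load test: time a processing function
# against increasing batch sizes. process_batch is a stand-in for the
# real end-to-end processing step.

import time
from typing import Callable, Dict, List

def run_load_tests(batch_sizes: List[int],
                   process_batch: Callable[[List[str]], object]) -> Dict[int, float]:
    """Return elapsed seconds for each batch size tried."""
    results: Dict[int, float] = {}
    for size in batch_sizes:
        records = [f"record-{i}" for i in range(size)]
        start = time.perf_counter()
        process_batch(records)
        results[size] = time.perf_counter() - start
    return results
```

Plotting the timings against batch size shows quickly whether throughput degrades gracefully or falls off a cliff at some size.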
You may be better off with a pure messaging solution rather than using
orchestrations and rules. Could you place the logic in the stored procedure
instead of using the rules engine?
You may get a lot of transaction deadlocks using the SQL adapter. Try
reducing the transaction isolation level in the stored procedure if this
occurs (the default is serializable).
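The isolation-level change might look like this at the top of the stored procedure (a hedged T-SQL fragment; the procedure name is hypothetical, and READ COMMITTED is just one less aggressive level to try — test carefully, since lowering isolation changes locking behavior):

```sql
-- Sketch: lower the isolation level inside the stored procedure.
-- Serializable transactions take range locks that can deadlock under
-- concurrent batches; READ COMMITTED is one alternative to try.
-- usp_InsertRecords is a hypothetical name.
CREATE PROCEDURE usp_InsertRecords
AS
BEGIN
    SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
    SET NOCOUNT ON;
    -- ... insert/update logic for the batch goes here ...
END
```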
Post again if you need any more help.
The Bloggers Guide to BizTalk