How to handle large data transfers
I have huge SCCM tables being transferred to another server/database for reporting purposes (e.g. Add_Remove_Programs_DATA). They arrive as heaps (no indexes) in a "staging" schema, and I need to copy them over to other tables that have indexes, etc.
But I am getting transaction-log-full errors and other problems. I have tried SSIS and am familiar with it, but I don't know how to configure it efficiently for these huge tables, often with 200,000,000+ rows. I understand there are some minimally logged operations,
but I am still not sure of the best way to do this. I figured someone on this forum has done this regularly and has a system worked out. I have heard of some good methods along the way, but they only work for smaller datasets; they take too long or fail with large
ones. Thank you for any input.
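For what it's worth, here is a sketch of the minimally logged pattern I've been reading about (assuming SQL Server 2008+, a staging heap named staging.Add_Remove_Programs_DATA, a database in SIMPLE or BULK_LOGGED recovery, and an example key column MachineID -- the names are just placeholders for my actual tables):

```sql
-- Load into a brand-new heap with SELECT ... INTO, which is
-- minimally logged under SIMPLE or BULK_LOGGED recovery.
SELECT *
INTO   dbo.Add_Remove_Programs_DATA
FROM   staging.Add_Remove_Programs_DATA;

-- Build the clustered index AFTER the load; index creation is
-- also minimally logged under those recovery models.
-- (MachineID is a placeholder for whatever the real key is.)
CREATE CLUSTERED INDEX CIX_ARP
    ON dbo.Add_Remove_Programs_DATA (MachineID);
```

Is this roughly the right idea, or is there a better-performing approach (e.g. INSERT ... WITH (TABLOCK) into an existing empty heap) for tables this size?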
November 19th, 2010 10:15am