I am currently running SQL Server 2000 with two-way merge replication. My problem: I need to delete about 3 million rows from a table that is constantly being accessed by an e-commerce site. I have thought about setting up a filter to only replicate new or altered rows beyond a certain indexed number, then deleting the unneeded rows from both sites. This should work, but can I delete the filter after I am done?
If this is not an option, any other suggestions? I cannot take either site offline, and I cannot do anything that will take more than a few seconds to replicate. By deleting these rows, my table size will be reduced by almost 8 GB.
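For the deletion step itself, one common approach on SQL Server 2000 is to delete in small batches so each transaction stays short and replication can keep up. A minimal sketch, using `SET ROWCOUNT` since `DELETE TOP (n)` is not available until SQL Server 2005 (the table name, column name, and cutoff value are all placeholders):

```sql
-- Hypothetical example: purge old rows in small batches so each
-- transaction stays short and the live site is not blocked for long.
SET ROWCOUNT 5000;                -- limit each DELETE to 5000 rows (SQL 2000 idiom)

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.OrderHistory  -- placeholder table name
    WHERE OrderID < 1000000;      -- placeholder cutoff on an indexed column

    IF @@ROWCOUNT = 0 BREAK;      -- nothing left to delete

    WAITFOR DELAY '00:00:02';     -- brief pause to let other activity through
END

SET ROWCOUNT 0;                   -- reset so later statements are unaffected
```

Note that with merge replication still tracking the table, each batch of deletes is itself replicated, so this smooths the load rather than bypassing replication; the filtering approach is what avoids replicating the deletes at all.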
thanks
Hi
There are several options for filtering data on merge publications; here is some more information:
http://msdn2.microsoft.com/en-us/library/ms151775.aspx
These options were developed for scenarios like yours, where some of the subscribers only need a subset of the data.
As the filter is part of the article, you can change the article to remove the filter when you no longer need it.
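For example, assuming you used a static row filter on the article, it can later be removed by clearing the article's `subset_filterclause` property with `sp_changemergearticle`, run at the Publisher on the publication database (the publication and article names below are placeholders):

```sql
-- Hypothetical names; clearing subset_filterclause removes the row filter.
EXEC sp_changemergearticle
    @publication = N'MyMergePub',          -- placeholder publication name
    @article     = N'OrderHistory',        -- placeholder article name
    @property    = N'subset_filterclause',
    @value       = N'';                    -- empty string = no row filter
```

Be aware that adding or changing a filter on an existing article can mark subscriptions for reinitialization, so test the whole sequence in a non-production environment first.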
As your e-commerce site cannot afford any downtime, I'd suggest you review the docs to be sure you select the correct filtering solution.
When merge replication is set up correctly, there should be minimal downtime. There are also additional performance enhancements in SQL Server 2005 for partition evaluation.
Thanks