sql - Very large "delete from" statement with "where" clause, should I optimize? -


I have a classic "sales" database that contains millions of rows in its tables. Each of these large tables has an associated "delete" trigger and a "backup" table.

The backup table keeps the "deleted" rows for the last 7 days: the trigger starts by copying the deleted rows into the backup table, then performs a delete in the backup table, in this fashion:

create trigger dbo.trigger_example_data_delete
on dbo.example_data
for delete
as
begin
    insert into example_backup
    select getdate(), * from deleted

    delete from example_backup
    where modified < dateadd(dd, -7, getdate())
end

The structure of the backup table is similar to the original data table (same keys and values). The only difference is that the backup tables have an additional "modified" field, which is integrated into the key.

A colleague of mine told me I should use "a loop", because the delete statement will cause timeouts/issues once the backup table contains several million rows. Will the delete blow up at some point? Should I do this in a different manner?

It looks like Sybase 12.5 supports table partitioning; if you can design things such that data is retained for exactly 7 days (using a hard breakpoint), you could partition the table on the day of the year, and construct a view to represent the current data. As the clock ticks past each day, you truncate the older partitions explicitly.

Just a thought.

http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.dc20020_1251/html/databases/x15880.htm

Otherwise, deleting in a loop is a reasonable method for deleting large subsets of data without blowing up the transaction log. Here's one method using SQL Server:

http://sqlserverperformance.wordpress.com/2011/08/13/gradually-deleting-data-in-sql-server/
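A minimal sketch of what such a loop could look like on Sybase ASE 12.5 (or older SQL Server), using `set rowcount` to cap each delete at a fixed batch size so each transaction stays small. The table and column names match the question; the batch size of 10,000 is an assumption you should tune against your own log and lock behavior:

    -- Delete expired backup rows in small batches instead of one huge statement.
    declare @rows int
    set @rows = 1

    -- Limit every subsequent delete to at most 10,000 rows per pass.
    set rowcount 10000

    while @rows > 0
    begin
        delete from example_backup
        where modified < dateadd(dd, -7, getdate())

        -- @@rowcount is the number of rows the last delete actually removed;
        -- the loop ends once a pass deletes nothing.
        set @rows = @@rowcount
    end

    -- Reset the limit so later statements are unaffected.
    set rowcount 0

Each iteration commits (or can be wrapped in its own small transaction), so the log never has to hold millions of row deletions at once. On newer SQL Server versions the same idea is written with `DELETE TOP (10000)` instead of `set rowcount`.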

