Performance tuning for data quality in SQL Server
I am working in SQL Server 2008. I have a database we use to run data quality validations on test files received from our source system before we load the data into our production data warehouse instance. In the past, these test files have been small, but we are now receiving large amounts of data (on the order of 3 GB), and I have been asked to performance-tune the database to handle this.
Because we run data quality validations, I designed the tables in this database to handle the trashiest data possible. For instance, even though the equivalent tables in production have appropriate data types, I had to create every column as varchar(255) so that importing the test data into these tables doesn't run into data type conflicts. There are also duplicates. So the tables are simple: they are just a bunch of varchar(255) columns with no constraints, indexes, or key relationships defined.
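To make the layout concrete, here is a minimal sketch of what one of these staging tables might look like; the table and column names are hypothetical, not taken from the actual schema:

    -- Hypothetical staging table: every column is a nullable varchar(255),
    -- with no keys, constraints, or indexes, so any row from a test file
    -- can be loaded without data type conflicts.
    CREATE TABLE dbo.CustomerStaging
    (
        CustomerId   varchar(255) NULL,
        CustomerName varchar(255) NULL,
        BirthDate    varchar(255) NULL, -- held as text; validated later
        AccountTotal varchar(255) NULL  -- held as text; validated later
    );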
The tables also get truncated. The general process is as follows (a rough sketch of the cycle appears after the list):

- A user loads test files into the tables.
- The user executes stored procedures against the tables, which run our data quality tests.
- The next user loads test files into the tables and runs the data quality tests.
- ...
- The tables eventually get large, so we need to truncate them and re-start.
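A minimal sketch of that cycle in T-SQL, assuming a comma-delimited test file and a hypothetical validation procedure (the file path, procedure name, and parameter are illustrative only, not part of the actual system):

    -- 1. Load a test file into the staging table.
    BULK INSERT dbo.CustomerStaging
    FROM 'C:\TestFiles\customers.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    -- 2. Run the data quality validations against the loaded rows.
    EXEC dbo.usp_RunDataQualityTests @SourceTable = N'CustomerStaging';

    -- 3. When the table has grown too large, clear it out and start over.
    TRUNCATE TABLE dbo.CustomerStaging;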
Given these parameters, how can I performance-tune this database?