Hi,
I have two tables as follows
Table Cat
{
ID
Update Datetime
}
Table Master
{
UniqueId
Cat Datetime
}
I want to insert millions of rows from the Cat table into the Master table, which can be achieved with the following query
Code Snippet
INSERT INTO Master (UniqueId, Cat)
SELECT ID, [Update] FROM Cat
Question: Is there any better way to achieve the same operation with better performance? That is to say, the execution should take less time than the above statement.
Thanks for your time
~Mohan
There are some elaborate things you might do (looping, BCP, etc.), but the time to develop and test those will MORE than eat up the time it takes to run a probably-quite-efficient insert like the one you've created above.
You might experiment with disabling the indexes on the target table, and then rebuilding them after the insert is complete (if your application will tolerate the indexes being offline for a period). Indexes generally slow down inserts.
Code Snippet
-- Caution: this also disables any clustered index, making the table
-- inaccessible; in that case disable only the nonclustered indexes by name.
ALTER INDEX ALL ON Master DISABLE;
INSERT INTO Master (UniqueId, Cat)
SELECT ID, [Update] FROM Cat;
ALTER INDEX ALL ON Master REBUILD;
|||Hello Mohan,
An 'INSERT INTO' statement writes to the log and will take a long time to insert millions of rows. I would use the following to make it quicker (see the sketch after these steps)...
1. Drop any existing second table. In your case it is the 'Master' table.
2. Execute this query: SELECT * INTO Master FROM Cat
3. This will create the Master table and populate the rows into it. Note that 'Master' will have the same column names as in Cat.
4. You can rename the columns using 'sp_rename'. Look it up in BOL. Check if you can rename the columns in the designer.
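A minimal sketch of steps 2-4, assuming Cat's columns are named ID and Update as in the original post (note that SELECT INTO is only minimally logged under the simple or bulk-logged recovery model):
Code Snippet
-- Step 2: create and populate Master in one statement
SELECT ID, [Update]
INTO Master
FROM Cat;
-- Step 4: rename the copied columns to the intended names
EXEC sp_rename 'Master.ID', 'UniqueId', 'COLUMN';
EXEC sp_rename 'Master.[Update]', 'Cat', 'COLUMN';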
Regards....
|||Hi,
If we do not have indexes on the table, then the query will be the same as the one I mentioned. In that case it will still be time consuming to insert millions of rows..right?
~Mohan
|||Hello AsianIndian,
I can't drop the table, and the column names are different. I cannot change the column names either, as the Master table has a different intention.
Can you please let me know if there is any other way of doing it?
~Mohan
|||Hello Mohan,
1. Do you have any rows in the Master table? If yes, approximately how many?
2. Did you try BCP OUT and BCP IN?
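For reference, a hypothetical sketch of that round trip; MyServer, MyDb, and the file path are placeholders, -n uses native format, and -b commits in batches:
Code Snippet
rem export Cat to a native-format file, then bulk load the file into Master
bcp MyDb.dbo.Cat out C:\temp\cat.dat -n -S MyServer -T
bcp MyDb.dbo.Master in C:\temp\cat.dat -n -S MyServer -T -b 50000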
regards......
|||Hello,
BCP is a command-line utility that reads data into or from a file. I don't think we can use it to dump data directly from one table to another table.
~Mohan
|||Disabling the indexes is a very good idea. You might also see if you can partition the rows you are inserting into separate batches of some length (based on testing).
The problem is that each insert is written to the transaction log. So inserting one row at a time, as in a normal RDBMS operation, can take longer than inserting multiples. However, if you try to insert too many rows in a single transaction, it can take a lot more time, because nothing can be committed until the end, which bloats the log and uses more memory. A sketch of such batching follows.
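A minimal batching sketch, assuming ID is unique in Cat so that already-copied rows can be skipped; the batch size is only illustrative and should be tuned by testing:
Code Snippet
DECLARE @batch INT;
SET @batch = 50000;  -- tune by testing

WHILE 1 = 1
BEGIN
    -- each INSERT commits on its own, so the log can clear between batches
    INSERT INTO Master (UniqueId, Cat)
    SELECT TOP (@batch) c.ID, c.[Update]
    FROM Cat AS c
    WHERE NOT EXISTS (SELECT 1 FROM Master AS m WHERE m.UniqueId = c.ID);

    IF @@ROWCOUNT < @batch BREAK;  -- last partial batch copied
END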
However, along the lines the other poster mentioned, I would consider writing a simple SSIS (2005 and up) or DTS (2000 and earlier) package to bulk insert the data (the option may be labeled "fast load"), along with disabling/dropping the indexes. I have used this technique on several occasions to reduce a process of inserting rows from over an hour to just a few minutes.
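If you need to stay in plain T-SQL, a related sketch: on SQL Server 2008 and later, a TABLOCK hint on the insert can get similar minimal logging when the target is a heap and the database is in simple or bulk-logged recovery (those conditions are assumptions, not something from this thread):
Code Snippet
-- assumes SQL Server 2008+, a heap target, and simple or bulk-logged recovery
INSERT INTO Master WITH (TABLOCK) (UniqueId, Cat)
SELECT ID, [Update] FROM Cat;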
|||Hi,
I agree with rusag2 and Louis Davidson.
I would also suggest using SSIS or DTS to transfer the large volume of data, and running the package on the server through a scheduled job (instead of on the client), which makes it perform much faster than running it in Management Studio or Query Analyzer.