Wednesday, March 7, 2012

Fast Way To "Insert Into" a million records?

See the SQL below. On our SQL server this takes about
10 minutes for 50,000 records, and about 3 hours for a million
records. Is there ANYTHING I can do to speed this up?
-Can I allocate DB space ahead of time?
-Can I put a table into some type of lock mode?
-Is there something better than INSERT INTO?
INSERT INTO MaintHist (DebtorID, AssignCollector, ChangeCollector,
    DateChanged, TableName, FieldName, OldValue, NewValue)
SELECT TempUpdate.RecordUniqueValue,
    CAST(TempUpdate.CurrentCollectorID AS VarChar(4)),
    'BATC', GetDate(), TempUpdate.TableName, TempUpdate.FieldName,
    TempUpdate.OldValue, TempUpdate.NewValue
FROM TempUpdate
WHERE TempUpdate.BatchID = 505
Thanks!
Jason
You can try to bulk copy it in, since bulk copy is minimally
logged. However, that may NOT be the right solution for you.
It's also fairly common to drop indexes before you do huge inserts
and then rebuild them when the inserts are complete.
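That pattern might look something like this (the index name here is made up - script out your real indexes on MaintHist first):

```sql
-- Drop a nonclustered index before the big load
-- (IX_MaintHist_DebtorID is a hypothetical name)
DROP INDEX MaintHist.IX_MaintHist_DebtorID

-- ... run the big INSERT INTO ... SELECT here ...

-- Rebuild the index once the load is done
CREATE NONCLUSTERED INDEX IX_MaintHist_DebtorID ON MaintHist (DebtorID)
```

Don't drop the clustered index, though - dropping and recreating that rebuilds the whole table.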
just food for thought
Greg Jackson
PDX, Oregon
|||Comments Inline
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
I support the Professional Association for SQL Server
www.sqlpass.org
"Jason Roozee" <jason@.camcoinc.net> wrote in message
news:16e3c01c448b6$d7c55ca0$a601280a@.phx.gbl...
> See the SQL below, on our SQL server this takes about
> 10min for 50,000 records, and about 3 hours for a million
> records. Is there ANYTHING I can do to speed this up?
>
Note that your example indicates an insert speed of 5000 records/minute.
That means 200 minutes for 1M rows. 200 minutes = 3.33 hours, so you have
better than linear scaling. This is good.

> -Can I allocate DB space ahead of time?
Yes. Expand the database before adding the records. This will improve
performance some.
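In practice that's a one-time ALTER DATABASE before the load (the database and logical file names below are hypothetical - yours will differ):

```sql
-- Pre-grow the data file so the insert doesn't stall on
-- autogrow events partway through the load
ALTER DATABASE Collections
MODIFY FILE (NAME = Collections_Data, SIZE = 2000MB)
```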
> -Can I put a table in to some type of lock mode?
SQL handles this automagically. I doubt you could improve performance with
a locking hint.
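For what it's worth, if you want to experiment anyway, a table-level lock hint on the target looks like this (your same statement, with TABLOCK added):

```sql
-- Take a table lock up front instead of letting SQL escalate to it
INSERT INTO MaintHist WITH (TABLOCK)
    (DebtorID, AssignCollector, ChangeCollector, DateChanged,
     TableName, FieldName, OldValue, NewValue)
SELECT RecordUniqueValue, CAST(CurrentCollectorID AS VarChar(4)),
    'BATC', GetDate(), TableName, FieldName, OldValue, NewValue
FROM TempUpdate
WHERE BatchID = 505
```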
> -Is there something better than insert into?
You could try a DTS package to do the transfer, but I am not sure if that
will help.
There is always faster hardware.
>
> INSERT INTO MaintHist (DebtorID, AssignCollector,
> ChangeCollector, DateChanged, TableName, FieldName,
> OldValue, NewValue)
> SELECT TempUpdate.RecordUniqueValue, CAST
> (TempUpdate.CurrentCollectorID AS VarChar(4)), 'BATC',
> GetDate(), TempUpdate.TableName, TempUpdate.FieldName,
> TempUpdate.OldValue, TempUpdate.NewValue FROM TempUpdate
> WHERE TempUpdate.BatchID = 505
> Thanks!
> Jason
|||How would a BULK COPY be done?
Jason Roozee

|||To add some to Geoff's comments: you might try inserting the rows in
smaller batches if at all possible. When inserting into an existing table
it is usually faster to insert them in batches of, say, 10,000 rows than
all of them in one transaction. If you have a clustered index on the table
being inserted into, you should try to insert them in that order as well.
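A sketch of the batching idea, assuming TempUpdate has a sequential key column (RowID here is a hypothetical name):

```sql
DECLARE @lo int, @hi int
SELECT @lo = MIN(RowID), @hi = MAX(RowID)
FROM TempUpdate WHERE BatchID = 505

WHILE @lo <= @hi
BEGIN
    -- copy one 10,000-row slice per statement
    INSERT INTO MaintHist (DebtorID, AssignCollector, ChangeCollector,
        DateChanged, TableName, FieldName, OldValue, NewValue)
    SELECT RecordUniqueValue, CAST(CurrentCollectorID AS VarChar(4)),
        'BATC', GetDate(), TableName, FieldName, OldValue, NewValue
    FROM TempUpdate
    WHERE BatchID = 505 AND RowID BETWEEN @lo AND @lo + 9999

    SET @lo = @lo + 10000
END
```

Each INSERT commits on its own, so the transaction log stays small.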
Andrew J. Kelly
SQL Server MVP
"Geoff N. Hiten" <SRDBA@.Careerbuilder.com> wrote in message
news:O7ZfMlLSEHA.3056@.TK2MSFTNGP11.phx.gbl...
|||Look up BCP in Books Online for all the details.
It's a commonly used utility for blasting lots of data into SQL Server.
GAJ
|||I'm already using BULK INSERT to get the new data into the
server, but now I need to update data from one table to
another table.
Jason
|||To do massive updates, you'll probably want to batch the updates into
groups as others have suggested.
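On SQL 2000 the usual pattern is SET ROWCOUNT in a loop - something like this (the target table and column names below are hypothetical):

```sql
SET ROWCOUNT 10000          -- cap each pass at 10,000 rows
WHILE 1 = 1
BEGIN
    UPDATE t
    SET t.SomeColumn = s.NewValue
    FROM TargetTable t
    JOIN TempUpdate s ON t.KeyCol = s.RecordUniqueValue
    WHERE t.SomeColumn <> s.NewValue   -- only rows still needing the change
    IF @@ROWCOUNT = 0 BREAK            -- done when a pass touches nothing
END
SET ROWCOUNT 0              -- restore normal behavior
```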
Cheers,
GAJ
