I have successfully moved 10 million rows into a new table in 1 ms, and the I/O statistics showed no I/O; some I/O still has to be incurred to update the metadata, although it should be minimal. This method has limited use, but it can be extremely advantageous.
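The excerpt doesn't name the command, but the behavior described here (every row moved as a metadata-only change) matches SQL Server's ALTER TABLE ... SWITCH. A minimal sketch, assuming two identically structured tables with hypothetical names on the same filegroup:

    -- Both tables must have identical structure, constraints, and filegroup,
    -- and the target must be empty. SWITCH reassigns the data pages to the
    -- target table instead of copying rows, so it finishes in milliseconds
    -- with essentially no data I/O; only the metadata is updated.
    ALTER TABLE dbo.StagingOrders SWITCH TO dbo.Orders;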
Updating millions of records in Oracle
There are many ways to create a column with a default value or to update an existing column.
Common options for speeding up heavy DML jobs include parallel processing, the export/import utility, partitioning, and simply breaking the activity into multiple smaller jobs.
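To illustrate the parallel-processing option: Oracle can spread a single UPDATE across multiple parallel execution servers once parallel DML is enabled for the session. A sketch, using a hypothetical orders table:

    -- Parallel DML is disabled by default and must be enabled per session.
    ALTER SESSION ENABLE PARALLEL DML;

    -- The PARALLEL hint asks the optimizer to run the update
    -- with 8 parallel execution servers.
    UPDATE /*+ PARALLEL(o, 8) */ orders o
       SET o.status = 'ARCHIVED'
     WHERE o.order_date < DATE '2009-01-01';

    COMMIT;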
Databases are often taxed by SQL statements that operate on enormous tables.
One such activity is adding a new NOT NULL column with a default value to a huge transaction table. I will specifically discuss the addition of a new column with a default value here; however, the methods discussed below can be applied to other kinds of batch processing as well.
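For concreteness, the single-statement form of this change looks like the sketch below; on releases prior to Oracle 11g it physically updates every existing row to the default value, which is exactly what makes it so heavy on a huge table (table and column names are hypothetical):

    -- Adds the column and backfills every existing row with the default,
    -- generating undo and redo for each row it touches.
    ALTER TABLE transactions
      ADD (region_code VARCHAR2(10) DEFAULT 'UNKNOWN' NOT NULL);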
This blog post is more of a tip that I picked up while at PASS 2009.
Have you ever had the need to copy the contents of an entire table into another table?
However, if the table in question holds records numbering in the millions, then the update strategy needs to be revisited, and database settings such as undo segment and temporary tablespace sizes need to be considered.
A huge DML/DDL operation can take a lot of time and can cause space-usage problems and heavy resource utilization, which is why such batch processing is normally scheduled during off-peak hours.
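One common way to break the activity into multiple jobs is to update in bounded batches with interim commits, so that no single transaction exhausts the undo segments. A PL/SQL sketch, reusing the hypothetical transactions table from above:

    BEGIN
      LOOP
        -- Touch at most 50,000 rows per pass to bound undo usage.
        UPDATE transactions
           SET region_code = 'UNKNOWN'
         WHERE region_code IS NULL
           AND ROWNUM <= 50000;

        -- Stop once a pass finds nothing left to update.
        EXIT WHEN SQL%ROWCOUNT = 0;

        COMMIT;  -- release the undo held by this batch
      END LOOP;
      COMMIT;
    END;
    /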
I reasoned that, because the index data are physically stored in a way that depends on the values, changing the values could involve a lot of physical restructuring on disk.
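If the updated column is indexed, one way to sidestep that per-row restructuring is to take the index out of play for the duration of the job and rebuild it once at the end. A sketch, assuming a hypothetical non-unique index (marking a unique index UNUSABLE would instead block the DML):

    -- Stop maintaining the index during the mass update.
    ALTER INDEX transactions_region_idx UNUSABLE;
    ALTER SESSION SET skip_unusable_indexes = TRUE;

    -- ... run the batched update here ...

    -- Rebuild the index once at the end, optionally in parallel.
    ALTER INDEX transactions_region_idx REBUILD PARALLEL 8;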