We have developed an application for a telecom service provider with a subscriber base of 15 million. The application uses the LogMiner utility to capture data from the archive logs, which requires enabling supplemental logging to capture information on updates.
As the data required from the archive logs is very selective (around 5% of the tables, and only a few columns per table), we are creating supplemental log groups (with the ALWAYS option) on these tables only.
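For reference, this is the kind of statement we are using on each of those tables — the table, group, and column names below are illustrative only:

```sql
-- Table-level supplemental log group: with ALWAYS, Oracle writes the
-- listed columns into the redo for every update of a row in this table,
-- whether or not one of these columns actually changed.
ALTER TABLE billing.subscriber
  ADD SUPPLEMENTAL LOG GROUP subscriber_lg
  (subscriber_id, status, plan_id) ALWAYS;
```

(Minimal database-level supplemental logging is enabled as well, as LogMiner requires it.)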
In all, only 5% of transactions affect the tables on which we create the supplemental log groups.
Can anyone give an indication of the performance impact this is going to have on the database under these conditions?
What did you see when you tested it?
Are you using DataMirror or GoldenGate?