
Optimize creation of UNIQUE constraints which can be determined to be already Unique based on other constraints. [CORE4956] #5247

Closed
firebird-automations opened this issue Oct 9, 2015 · 9 comments

Comments

@firebird-automations
Collaborator

Submitted by: Tim Kelly (m00bh000)

Optimization request:

I have a large database and regularly have to create/modify UNIQUE constraints. Most of my unique constraints contain two fields where the first field is already the primary key; I create them so that foreign keys can reference them. I notice that it can sometimes take a long time to create the unique constraint. Presumably a check is being performed that the existing data will not violate the unique constraint. If so, this step should not be required when it can be determined that the UNIQUE constraint is already unique by definition. In my case:

ALTER TABLE MASTER ADD CONSTRAINT C UNIQUE (A,B);

Here A is the primary key of MASTER, so we know with 100% certainty that the existing data will not violate this constraint.

Since A is already known to be a primary key, it should not be necessary to check the data before creating the constraint.

If I then created another constraint D:

ALTER TABLE MASTER ADD CONSTRAINT D UNIQUE (A,B,C);

Again, no checks should need to be carried out, since we already know that (A,B) is unique, so (A,B,C) will always be unique too.
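
For context, here is a minimal sketch of the schema implied by the example above; the column types and the foreign-key comment at the end are assumptions for illustration, not part of the original report:

CREATE TABLE MASTER (
    A INTEGER NOT NULL,
    B INTEGER,
    C INTEGER,
    CONSTRAINT PK_MASTER PRIMARY KEY (A)
);

/* A alone is the primary key, so (A,B) cannot contain duplicates... */
ALTER TABLE MASTER ADD CONSTRAINT C UNIQUE (A, B);

/* ...and any superset of (A,B), such as (A,B,C), is necessarily unique too. */
ALTER TABLE MASTER ADD CONSTRAINT D UNIQUE (A, B, C);

/* The wider unique keys exist so that other tables can reference them, e.g.:
   CONSTRAINT FK_X FOREIGN KEY (A, B) REFERENCES MASTER (A, B) */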

@firebird-automations
Collaborator Author

Commented by: Omacht András (aomacht)

Tim,

What should happen if you drop the primary key? Should the engine drop the unique constraints C and D too, or should it recreate C as a "real" unique key?

@firebird-automations
Collaborator Author

Commented by: Sean Leyne (seanleyne)

Tim's comment was not that the indexes/constraints should be linked, but rather that if a unique constraint contains the PK, it is not necessary to check the uniqueness of the existing data.

@firebird-automations
Collaborator Author

Commented by: Sean Leyne (seanleyne)

Tim,

I am not sure that testing for uniqueness is as expensive as you believe: the significant cost is the system creating an index, which will then be used to validate any new values. As was pointed out, it would be possible to drop the PK, and the new constraint would still need to be validated on its own.

Separately, why are you "regularly" creating new constraints?

@firebird-automations
Collaborator Author

Commented by: @hvlad

There is no extra overhead when the engine builds a unique index compared with a non-unique index on the same fields.
You can easily check this yourself.
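
For anyone who wants to verify this, a rough sketch of such a comparison in isql follows; the table, index, and constraint names are made up for the test, and SET STATS prints per-statement statistics (including elapsed time) so the two index builds can be compared directly:

CREATE TABLE MASTER_TEST (
    A INTEGER NOT NULL PRIMARY KEY,
    B INTEGER
);
/* ...populate MASTER_TEST with a large number of rows... */

SET STATS ON;

/* Plain (non-unique) index on the two fields. */
CREATE INDEX IDX_TEST_AB ON MASTER_TEST (A, B);

/* Unique constraint on the same fields; building its backing index
   should take roughly the same time as the plain index above. */
ALTER TABLE MASTER_TEST ADD CONSTRAINT UQ_TEST_AB UNIQUE (A, B);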

@firebird-automations
Collaborator Author

Commented by: Sean Leyne (seanleyne)

Vlad, who was the comment directed to?

@firebird-automations
Collaborator Author

Commented by: @hvlad

Sean,

to Tim of course.

@firebird-automations
Collaborator Author

Commented by: Tim Kelly (m00bh000)

Thanks for your comments.

Fair enough - if there is little performance benefit compared with creating a non-unique key, then I suggest closing this thread. Sorry, I should have spent the time to look into this before posting.

The constraints are regularly rebuilt because the database is generated by an object modelling system.

@firebird-automations
Collaborator Author

Modified by: Sean Leyne (seanleyne)

status: Open [ 1 ] => Resolved [ 5 ]

resolution: Won't Fix [ 2 ]

@firebird-automations
Collaborator Author

Modified by: @pcisar

status: Resolved [ 5 ] => Closed [ 6 ]
