Currently, when the table cardinality (number of records) is estimated, all records are assumed to be compressed to half their size (a hardcoded compression ratio of 0.5). It would be better to have at least a rough estimation of the actual compression ratio, because in practice it can differ significantly from the expected value. This becomes more important starting with versions 2.1.4 and 2.5.0, where the join cost depends more heavily on proper cardinality estimations than in prior versions.
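The effect of the hardcoded ratio can be sketched as follows. This is a hypothetical illustration, not Firebird's actual code: the function name, parameters, and the simple pages-times-records-per-page formula are assumptions chosen to show how the compression ratio scales the estimate.

```cpp
// Hypothetical sketch of cardinality estimation (illustrative names,
// not Firebird's real implementation).
//
// More records fit per data page when they compress well, so the
// estimated cardinality is inversely proportional to the compressed
// record size.
double estimateCardinality(double dataPages,
                           double usablePageSpace,
                           double uncompressedRecordSize,
                           double compressionRatio)
{
    // Effective on-disk record size after compression.
    const double compressedSize = uncompressedRecordSize * compressionRatio;
    // Records per page times number of data pages.
    return dataPages * (usablePageSpace / compressedSize);
}
```

With the hardcoded ratio of 0.5, a table whose records actually compress poorly (say a real ratio near 0.9) would have its cardinality overestimated by almost a factor of two, which in turn skews the join cost calculations mentioned above.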
Submitted by: @dyemanov
Commits: 11297a0 b5dc562 f19f33a d81d678