Can anyone please explain why the following scripts behave as shown below:
script-1:
######
set names utf8; ----------------------------- [ 1 ]
create database 'localhost/3333:C:\MIX\firebird\QA\fbt-repo\tmp\c4881.fdb'
    default character set utf8;
set planonly;
set sqlda_display on;
select _octets ' . . . literal string containing ONLY ascii characters with total length = 65535 bytes ...' as ascii_only
from rdb$database;

-- produces:
==========
OUTPUT message field count: 1
01: sqltype: 452 TEXT scale: 0 subtype: 0 len: 32767 charset: 1 OCTETS
  : name: CONSTANT alias: ASCII_ONLY
  : table: owner:

And this:

script-2:
######
set names NONE; --------------------------- [ 2 ]
create database 'localhost/3333:C:\MIX\firebird\QA\fbt-repo\tmp\c4881.fdb'
    default character set utf8;
-- ... the rest is the same as in script-1 ...

-- produces:
==========
OUTPUT message field count: 1
01: sqltype: 452 TEXT scale: 0 subtype: 0 len: 65533 charset: 1 OCTETS
  : name: CONSTANT alias: ASCII_ONLY
  : table: owner:

Where do the values 32767 and 65533 come from?

1) If the connection charset is UTF8, then it is possible to create literals that use MORE than two bytes per character (in practice one can easily create literals with 3 bytes per character; I could not create one with 4 bytes, but that seems to be due to my limited Windows setup). Note that 32767 = floor(65535 / 2), i.e. the reported length assumes at most two bytes per character. So the value 32767 is wrong; shouldn't it be floor(65535 / 4)?

2) Where does the value 65533 come from in the second test, rather than 65535?
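(A quick way to verify the bytes-per-character point from 1) is to compare CHAR_LENGTH and OCTET_LENGTH, both Firebird built-ins; a minimal sketch, assuming a UTF8 connection, using the Euro sign, which occupies 3 bytes in UTF-8:)

-- minimal check, assuming set names utf8:
-- char_length counts characters, octet_length counts bytes,
-- so the 3-byte Euro sign should yield 1 and 3
select char_length('€')  as n_chars,
       octet_length('€') as n_bytes
from rdb$database;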
Answer:

select _octets ' . . . ' as ascii_only
from rdb$database;

produces sqllen = 65533 (not the value you said), because internally the literal is converted to a varchar, and that is the maximum for a varchar: the internal 16-bit length field allows 65535 bytes, of which 2 are used by the varchar's length counter, leaving 65533 for the data. That will cause a runtime error.
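(For comparison, the same SQLDA inspection can be done with a short literal; a minimal sketch, assuming the same isql session, where len should simply match the literal's byte count — the alias SHORT_CONST is just an illustrative name:)

set planonly;
set sqlda_display on;
-- a 3-byte octets literal; the SQLDA line should read roughly:
-- 01: sqltype: 452 TEXT scale: 0 subtype: 0 len: 3 charset: 1 OCTETS
select _octets 'abc' as short_const from rdb$database;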