Are you using MySQL Cluster and crazy enough to digest the NDB API? Sick of SQL? Here's a treat: a function to make C/C++ strings ready for inserting into a VARCHAR field. The special thing about NDB VARCHAR values is that the length is prefixed to the data: 1 byte for columns holding up to 255 bytes, 2 bytes (little-endian) for longer ones.
void make_ndb_varchar(char *buffer, const char *str)
{
    size_t len = strlen(str);
    size_t hlen = (len > 255) ? 2 : 1;   /* 1- or 2-byte length prefix */
    buffer[0] = (char)(len & 0xff);      /* low byte of the length */
    if (len > 255)
        buffer[1] = (char)(len >> 8);    /* high byte (little-endian) */
    memcpy(buffer + hlen, str, len);     /* the NUL terminator is not stored */
}
Yes, memcpy is a better fit than strcpy here: the column does not store a trailing NUL, and copying one would overrun a buffer holding a maximum-length string by one byte. Whatever floats your boat.
Let's use this function for a table t1, defined as follows (note the latin1 character set!):
CREATE TABLE t1 (
id INT UNSIGNED NOT NULL,
vc VARCHAR(128),
vclong VARCHAR(1280),
PRIMARY KEY (id)
) ENGINE=NDB DEFAULT CHARSET=latin1;
Here is part of the code, simplified for this post:
char vc[128+1]; // Size of 'vc', +1 for length info
char vclong[1280+2]; // Size of 'vclong', +2 for length info
..
make_ndb_varchar(vc, "NDB API kicks ass");
operation->setValue("vc", vc);
..
The above example uses latin1. You could use Unicode, but that would probably mean converting from one encoding to the other using [iconv](http://www.gnu.org/software/libiconv/). That's another story.
This post complements Johan Andersson’s blog entry. Thanks to my colleagues Mats and Roger who helped me with a silly problem today regarding this function.
Comments
One little note: make_ndb_varchar() should not look at the length of the string passed to it but at the type of the column, using Column::getType() (see http://dev.mysql.com/doc/ndbapi/en/ndb-column-methods.html#ndb-column-gettype) to check whether it is Varchar, Longvarchar, or something else listed in http://dev.mysql.com/doc/ndbapi/en/ndb-column-types.html#ndb-column-type.
I suspect make_ndb_varchar(vclong, “Easter egg”) will not work as expected (depending on expectations).
/Gustaf
And if someone is interested in NdbScanFilter :) - http://johanandersson.blogspot.com/2006/10/ndbscanfilter.html