ERROR: Got a packet bigger than 'max_allowed_packet' bytes

Greetings,

I am trying to set up a script to import 10,000 items into the MySQL database through a loop. Instead of doing 10,000 separate INSERTs, I’ve come up with something like this:

[php]for ($c = 0; $c < 10000; $c++) {
    // each entry becomes one "(…)" value group for the multi-row INSERT
    $inserts[] = sprintf("(…)", $variables);
}
$query = implode(",", $inserts);
mysql_query("INSERT INTO table (…) VALUES $query") or die(mysql_error());[/php]

So all of the items get inserted into the database with just 1 query… During the insert, I now get this error: "Got a packet bigger than 'max_allowed_packet' bytes." Note that the size of my test upload file is about 16.5 MB.

Does anyone know how to fix this? I’ve set my maximum file upload size to 20M with a phprc file, so people can upload a healthy maximum of 10,000 items onto my site. Eventually, I would like to at least double that maximum in case people want to import a little more. So it’s not like I’m allowing members to upload 100M to gigabyte-sized movies and files, and importing probably won’t be done very often.
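(For reference, the phprc settings involved are just the standard php.ini directives, something along these lines; post_max_size has to be at least as large as upload_max_filesize or the form POST gets truncated before the file limit even matters:

upload_max_filesize = 20M
post_max_size = 21M)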

Please let me know how to go about this.

Thanks
Kind regards

It sounds as though you’re trying to insert all 10,000 items in a single SQL query. You’ll need to break it up into reasonably sized chunks, each one in a separate query.

Thanks for the response Andrew F. I was just thinking maybe I would have to limit each insert to 2500 items. Is there some built-in function to do this, or do I have to do something more manual like this:

[php]for ($i = 0; $i < ceil($num_inserts / 2500); $i++) {
    // handle one chunk of up to 2500 items per outer pass
    for ($j = $i * 2500; $j < min(($i + 1) * 2500, $num_inserts); $j++) {
        // build this chunk's VALUES list here
    }
    // run one INSERT for this chunk here
}[/php]

Do you know what the max_allowed_packet size actually is? This will help me determine the right size for each chunk.

Thanks
Kind regards

max_allowed_packet is set to 16 MB (16777216 bytes) on most of our servers. You can get the exact value by running this SQL query (how circular!):
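SHOW VARIABLES LIKE 'max_allowed_packet';

From PHP (using the same old mysql_* functions as the rest of this thread), reading that value could look roughly like this:

[php]
// ask the server for its actual limit rather than assuming 16 MB
$row = mysql_fetch_assoc(mysql_query("SHOW VARIABLES LIKE 'max_allowed_packet'"));
$max_packet = (int) $row['Value'];
[/php]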

As far as code goes, my recommendation would be to flush the query out whenever you start getting close to the maximum packet size:

[php]
$sql_prefix = "INSERT INTO table (…) VALUES ";
$values = "";
foreach ($item_to_insert as $item) {

    if ($values) $values .= ",";
    $values .= sprintf(…);

    if (strlen($values) > 16e6) { // a bit under the 16 MB hard limit
        mysql_query($sql_prefix . $values);
        $values = "";
    }

}

if (strlen($values)) {
    mysql_query($sql_prefix . $values);
}
[/php]
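If you’d rather split by item count than by byte size (to answer the earlier question about a built-in function), PHP’s array_chunk() will do the splitting for you. A rough sketch, assuming $inserts already holds the formatted "(…)" value strings from the loop in the first post:

[php]
// split the 10,000 value strings into batches of 2,500 and run one INSERT per batch
foreach (array_chunk($inserts, 2500) as $chunk) {
    mysql_query("INSERT INTO table (…) VALUES " . implode(",", $chunk))
        or die(mysql_error());
}
[/php]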

It is more efficient to use LOAD DATA to insert a large number of records into MySQL.

A bit, but it’s considerably harder to use from PHP. You can get most of the performance benefits of LOAD DATA by running the import in a transaction (if you’re using InnoDB — which you should be), using INSERT DELAYED, and disabling keys during the insert.
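A sketch of what that might look like with the same mysql_* calls used above (the table name items is just a placeholder):

[php]
// InnoDB: wrap the whole import in one transaction instead of committing per statement
mysql_query("SET autocommit = 0");
// ... run the chunked INSERTs from above ...
mysql_query("COMMIT");

// MyISAM: defer non-unique index updates until the load is finished
mysql_query("ALTER TABLE items DISABLE KEYS");
// ... run the chunked INSERTs ...
mysql_query("ALTER TABLE items ENABLE KEYS");
[/php]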

Thanks for the responses. It seems that as the table grows larger and larger, the inserts take longer and longer. I’ve been adding 10,000 items at a time and the inserts get slower each time…

I am currently using the MyISAM storage engine because it has the FULLTEXT search feature, which is required so people can search for items on my site. I don’t know what else I would use for search except Sphinx, which requires a private server.

Apparently Dreamhost would have to update to MySQL 5.6.4 to get FULLTEXT support on InnoDB? https://blogs.oracle.com/MySQL/entry/mysql_5_6_4_development

InnoDB, importing, and dealing with large tables and large inserts are all new to me.

Also, when someone uploads a 16MB text file, is it normal for it to take 60 seconds just to upload the file (I have high-speed cable internet)? Anything I can do to speed that up?

I am splitting the import into 3 parts. Part 1 takes 60 seconds to upload the file, verify the file type, rename it, and save it in a directory. The user then clicks a link to start Part 2, which opens the saved file, verifies and sanitizes each row, and writes a new sanitized txt file into a different directory (about 30 seconds). Part 3 simply opens this new txt file and inserts the items into the database using the looping method above (it takes 20 seconds to insert 10,000 items starting from an empty table, and about 120 seconds once the table has grown to 100,000+ items). Dividing all of this into 3 parts was to help prevent timing out.
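In case it helps, Part 2 currently works roughly like this (streaming line by line so the whole 16 MB file never has to sit in memory); the file paths are simplified placeholders:

[php]
// read the raw upload one line at a time and write a sanitized copy
$in  = fopen("uploads/raw_import.txt", "r");       // placeholder paths
$out = fopen("sanitized/clean_import.txt", "w");
while (($line = fgets($in)) !== false) {
    $line = trim($line);
    if ($line === "") continue;                     // skip blank rows
    // ... per-row validation/sanitizing goes here ...
    fwrite($out, $line . "\n");
}
fclose($in);
fclose($out);
[/php]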