Wikipedia:Database dump import problems

Note: This page discusses the SQL dump format, which is now obsolete. New Wikipedia dumps are in XML format.

This page attempts to resolve problems that can occur while importing one of the downloadable database dumps into a local database.

Working my.cnf

sample my.cnf — Here is a working (as of 8 July 2005) my.cnf configuration for Linux. This should allow you to simply cat the uncompressed dump to the mysql client without any other command line arguments or stream edits.
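With a configuration like that in place, the import can be a single pipeline. A minimal sketch (the dump file name, user, and database name here are placeholders, not the actual dump names):

cat dump.sql | mysql -u root -p wikidb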

"MySQL server has gone away"[edit]

This error message usually occurs when trying to import the page Wikipedia:Upload log archive/May 2004 (1) because it is bigger than 1 MB (more precisely: the SQL statement required to describe it is bigger than 1 MB). The (uninformative) error message occurs because this exceeds the default value for max_allowed_packet, one of MySQL's configuration variables.

It is harmless to increase the size of that variable to 32 MB or even more. You can do so in /etc/my.cnf under Linux (make sure to do it for both the server and the client), or in the winmysqladmin application under Windows (in the latter, go to the "my.ini Setup" tab and add the line set-variable=max_allowed_packet=32M). Please note that the exact syntax may vary depending on your version of MySQL. See [1] for more information regarding the error and how to fix it.
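For example, on Linux a my.cnf along these lines should work (a sketch: [mysqld] covers the server and [client] covers the mysql client program; very old MySQL versions need the set-variable= prefix instead of the bare assignment):

[mysqld]
max_allowed_packet=32M

[client]
max_allowed_packet=32M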

Dealing with file size problems

If you are trying to import the database but are having problems downloading or uncompressing the dump file, you might try something along the lines of

wget -q -O- http://download.wikimedia.org/whatever | bunzip2 | mysql {some import options}

I haven't had time to try this yet. Please report back with any successes or problems. Mr. Jones 20:33, 21 Feb 2004 (UTC)


This worked fine for me (for {some import options} I just put the database name). Wmahan. 21:44, 2004 May 1 (UTC)
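In other words, the pipeline reduces to something like the following (the URL path and database name are placeholders):

wget -q -O- http://download.wikimedia.org/whatever | bunzip2 | mysql wikidb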


The (amended) suggestion did not work for me, as wget does not support files larger than 2 GB (see [2]) and MySQL tables do not by default support sizes over 4 GB (see [3]). I'm currently trying

lynx -dump http://download.wikimedia.org/archives/en/20041126_old_table.sql.bz2 | bunzip2 | sed "s/InnoDB PACK_KEYS=1/InnoDB MAX_ROWS=50000 PACK_KEYS=1/g" | mysql wp

Mr. Jones 17:50, 6 Dec 2004 (UTC)

It seems to have worked. There might be a better way to increase the size of the table here (the database won't import with the default limit of 4 GB). Mr. Jones 13:56, 16 Dec 2004 (UTC)

  • Indeed, setting myisam_data_pointer_size to 7 works as well, which seems better (see the sketch after this list). Brighterorange 8 July 2005 16:25 (UTC)
  • curl appears to have the same 2 GB download limit.

66.87.1.3 02:39, 1 August 2005 (UTC)
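If you want to go the myisam_data_pointer_size route instead of editing the CREATE TABLE statements in the dump, a my.cnf sketch would be (the variable exists in MySQL 4.1.2 and later; 7 bytes is the maximum pointer size):

[mysqld]
myisam_data_pointer_size=7

A larger data pointer lets MyISAM address far more than 4 GB per table, at the cost of a few extra bytes of overhead per row.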

Got error 27 from table handler

As of the start of 2005, the current article dumps for the English Wikipedia will produce a table in MySQL which is larger than 2 gigabytes (2,147,483,648 bytes). If this happens, and your operating system does not support files larger than 2 gigabytes, then MySQL will abort the import with an error like this: "Error 1030 at line 2214 - Got error 27 from table handler". For example, this problem is known to happen on versions of Windows using FAT32 and on Linux 2.2 (which is, for example, still the default kernel in Debian's current stable "woody" release). To resolve this problem, you need to move to a kernel and file system that support files larger than 2 gigabytes (such as Linux 2.4 or later).
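Error 27 here is the operating system's "File too large" error (EFBIG). You can confirm the meaning of such numeric codes with MySQL's perror utility:

perror 27

which should print something like "OS error code 27: File too large".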