Optimizing the import of a large SQL dump

Categories: optimization

I recently wanted to import a large dump of GitHub data provided by GHTorrent (http://ghtorrent.org/).

The dump was 99 GB, and I wanted to be able to track the progress of the import.

There is a CLI tool called pv (pipe viewer) that reports progress as the dump file is read.

So, one can run:

pv sqlfile.sql | mysql -uxxx -pxxxx dbname
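As a side note, the same approach should also work when the dump is compressed, since pv simply measures how much of the file has been piped through; the file name below is only illustrative:

pv sqlfile.sql.gz | gunzip -c | mysql -uxxx -pxxxx dbname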

Another tip I learned was that the default MySQL options are not optimized for your system's configuration.

This site provides a wizard that generates optimal MySQL configuration options for your system. The tool is maintained by Percona, a service provider built on top of MySQL.
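For a one-off bulk import like this, the settings that tend to matter most are the InnoDB buffer pool size, the redo log size, and how aggressively commits are flushed. The values below are only a rough sketch of the kind of my.cnf output such a wizard produces, not its actual recommendation; size them to your own machine (for example, the buffer pool to roughly half of available RAM):

[mysqld]
innodb_buffer_pool_size = 8G         # main InnoDB cache; the single biggest win for imports
innodb_log_file_size = 1G            # larger redo logs mean fewer flushes during the load
innodb_flush_log_at_trx_commit = 2   # relax per-commit flushing while importing
max_allowed_packet = 256M            # mysqldump files can contain very large INSERT statements

After the import finishes, it is worth setting innodb_flush_log_at_trx_commit back to 1 if you care about full durability.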
