This is another handy little utility I’ve found myself using recently.
If you’ve ever worked with phpMyAdmin, you know that it often places a restriction on the size of the .sql file you can upload (the limit typically depends on your hosting). This can be a burden when you want to import a particularly large database. That is when Sql Dump Splitter (new link edited) comes to the rescue.
Just specify the maximum file size that you can upload, and Sql Dump Splitter will take care of the rest. It will split your large .sql file into smaller, ordered, numbered files. You can then easily import them one by one using phpMyAdmin.
So large .sql file imports are no longer an issue! Thanks, SQL Dump Splitter!
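If you’re curious what the splitting step actually amounts to, here is a rough Python sketch (not the tool’s actual code; the dump.sql name and the 2 MB limit are just placeholders). It copies the dump line by line and starts a new numbered file once the size limit has been passed and the current line closes a statement with a “;”, so each piece can be imported on its own.

MAX_BYTES = 2 * 1024 * 1024  # e.g. a 2 MB phpMyAdmin upload limit; adjust to your host

def split_dump(src="dump.sql", max_bytes=MAX_BYTES):
    part = 1
    written = 0
    out = open(f"dump.part{part:03d}.sql", "w", encoding="utf-8")
    with open(src, encoding="utf-8") as dump:
        for line in dump:
            out.write(line)
            written += len(line.encode("utf-8"))
            # Roll over to the next numbered file once we are past the limit
            # and the current statement has been closed off with ";".
            # NOTE: this naive check can be fooled by ";" inside string
            # literals; a real splitter needs a proper SQL parser.
            if written >= max_bytes and line.rstrip().endswith(";"):
                out.close()
                part += 1
                out = open(f"dump.part{part:03d}.sql", "w", encoding="utf-8")
                written = 0
    out.close()
    print(f"wrote {part} part file(s)")

if __name__ == "__main__":
    split_dump()

That statement-boundary detail is the whole trick: a chunk that cuts an INSERT in half won’t import cleanly, which is why a dedicated splitter beats simply cutting the file at fixed byte offsets.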
This splitter isn’t working at all, it just creates 0-byte files, huh…
Hi Nik,
It’s giving you no errors, and no log files to look at?
It seems the software only supports .sql files smaller than 2 GB. How do I deal with larger .sql files?
Hi Xiaoxiao, for files larger than 2 GB I am not sure. I have never had to split a file that large, but I’m sure there are utilities to do it. Good luck!
I have the very same problem
:( I don’t need to split SQL dumps smaller than 2GB, so a 2GB limit doesn’t help at all.
Same problem… :( 2GB Limit…
The site is not working at all, could you please check?
Hi Kusaboo,
I have updated the post with the Sql Dump Splitter 2 link. The site has changed, but the URL is: http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/
Hi.
Try out http://sqlsplit.com
This is an online .sql file splitter.
Thanks Vladimir. Although I see that the source code is available on GitHub, I would advise anyone to be cautious about uploading SQL scripts containing passwords or other sensitive information to an online service. Just make sure you use common sense.
Have you compressed the files before uploading? If not, just compress the .sql into .zip or .rar format and import it into your database. A 2 GB file compresses to approximately 6000 MB.
phpMyAdmin will automatically extract the uploaded file and import it into the database or table you want to update.
Hope this helps….
Oh sorry, approximately 600 MB.
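For what it’s worth, the compression step itself is tiny; here is a quick Python sketch (assuming a hypothetical dump.sql in the current directory). Whether phpMyAdmin will accept the resulting archive still depends on the server’s upload limit and PHP configuration.

import zipfile

# Compress dump.sql (hypothetical filename) into dump.sql.zip before uploading;
# phpMyAdmin can generally import zip/gzip/bzip2 dumps directly.
with zipfile.ZipFile("dump.sql.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write("dump.sql")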
Hi there,
I just wanted to mention that I’ve rewritten this very old tool of mine, now cross-platform with a real parser and in 64-bit. So it handles files more intelligently and should be able to deal with files over 2 GB. :)
Check out https://philiplb.de/sqldumpsplitter3/
Thank you for the update, and for creating this tool, Philip!
Your tool just saved my life! Thank you!
Hi,
Thank you so much, I used it and it worked!
❤❤❤❤❤❤❤❤❤
Love from Botswana :)
I’m glad it worked!
Thanks a lot
This saved me so much time, and hair!! Thank you very much.
Hi
I am trying to split a very large SQL file (44 GB). While it has created 44 files of 1 GB each for me, they are identical in content. I was expecting the inserts to be split across the files. Is there something that I am doing incorrectly?
Thanks
Akshay
Hi Akshay, I would also expect it to split the content. I can’t say what went wrong; you’d have to talk to the developer of the tool. Thanks for commenting!