Ask Ubuntu Asked by Software Developer on November 19, 2021
I have a large MySQL database backup file that is larger than the maximum upload size allowed by phpMyAdmin. I have created the database and now wish to import the data back, but the file is so large I can't import it as a single file. I know I could split the file, but I'm afraid that may break queries mid-statement and corrupt the database. Is there a way to split it automatically at exactly the point where one query ends, so that I can upload the pieces to my live server?
I'm using Ubuntu 18.04 LTS.
My server is remote (shared hosting), so I don't have much control over configuring phpMyAdmin.
This solution only works if you are creating the dump file yourself as well as restoring it, e.g. using mysqldump/phpMyAdmin.
I found another, even simpler way: find out which table(s) are the heaviest (using ls); those are usually the log tables or the primary tables.
sudo ls -Shlr /var/lib/mysql/[database_name]
Btw: -S sorts by file size; -h is for human-readable sizes (MB etc. instead of bytes); -l is for a long listing; and -r reverses the order, so the largest file comes last.
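If you can't read the datadir directly (e.g. on shared hosting, as in the question), a rough equivalent is to ask information_schema for per-table sizes from the mysql client. This is only a sketch using the same [databaseName] placeholder; adjust host and user to whatever your host gives you:
mysql -h localhost -u root -p -e "SELECT table_name, ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb FROM information_schema.tables WHERE table_schema = '[databaseName]' ORDER BY size_mb DESC;"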
Then, when dumping with mysqldump, exclude those big tables from the first dump (don't worry, they will be included in the second dump file). For example:
First dump file: everything except the big table(s):
mysqldump -h localhost -u root --password='[password]' --add-drop-database --add-drop-table --add-drop-trigger --dump-date --single-transaction --routines --events --ignore-table=[databaseName].[tableName1] --ignore-table=[databaseName].[tableName2] [databaseName] > /mnt/[path]/backup/2020/07Jul/dbName_year_month_day_all.sql
Second dump file: only the big table(s):
mysqldump -h localhost -u root --password='[password]' --dump-date --single-transaction [databaseName] [tableName1] [tableName2] > /mnt/[path]/backup/2020/07Jul/dbName_year_month_day_tab1_tab2.sql
Using this method you get two or more smaller dump files of approximately equal size, which upload and execute on the server easily.
On the server side, just import those dump files using any client, e.g. phpMyAdmin.
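If your host also gives you shell (SSH) access, a minimal alternative sketch is to import each part with the command-line client instead of phpMyAdmin (same placeholders as above; the path is wherever you uploaded the file on the server):
mysql -h localhost -u root --password='[password]' [databaseName] < dbName_year_month_day_all.sql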
FYI: I noticed that compressing the dump file (i.e. uploading it in .sql.zip format) gave a bad gateway error on the server.
Answered by Adeel Raza Azeemi on November 19, 2021
Emacs is a good option. It can open and edit massive SQL dump files.
Please remember that making changes to a dump file (especially one larger than 100 MB) might consume all your memory (RAM and swap on a small desktop or laptop) if you are not careful. By careful I mean: if you try to delete more than 8,000 lines all at once, you will starve your system of memory, because deleting 8k or 9k lines in one go requires a massive amount of RAM in Emacs. On my system with 16 GB RAM and 7.5 GB swap, deleting 9k lines consumed all of it (15 GB RAM and 7.5 GB swap), forcing the OS (Ubuntu) to terminate the Emacs process.
Recommendation: every byte counts, so close everything that is unnecessary. Don't do this if you are in a hurry.
How to divide the dump file properly
My dump file contained 10,917 SQL statements, and I wanted to divide it into 4 equal parts; dividing by 4 gives approximately 3,000 lines per file. Splitting by line number failed, because 99% of the file size ended up in the first file (132 MB) while the remaining files were only 100 to 300 KB each. So dividing by line numbers didn't work in my case, because 99% of the dump (by file size) is in the first quarter. (By first quarter I mean the first quarter of the total number of lines in the dump.) The first quarter is all the CREATE and INSERT statements; the remaining quarters are the stored procedures, functions, views and DCL commands.
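For reference, splitting by line count means something like the following (GNU coreutils split; the 3000 comes from 10,917 / 4, and dump.sql stands in for your dump file). This is exactly the approach that produced the lopsided files described above:
split -l 3000 -d dump.sql dump_part_
This writes dump_part_00, dump_part_01, and so on, each 3,000 lines long regardless of how many bytes those lines contain.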
Dividing by tables would also not work in my case: my dump file contains approximately 70 tables, but only 11 tables in the datadir (datadir = /var/lib/mysql) are in the MB range, whereas all the others are in KB.
I highly recommend doing a "save as" before anything else, because it would be totally insane to corrupt the original file by mistake, or to work for several minutes before realizing that you have to undo the steps.
We will divide the dump file by focusing on the file size instead. E.g. my dump file is 133 MB, so I will try to divide it into three equal-size files of about 44 MB each, mainly by splitting the first quarter into equal parts (3, for instance) and keeping all the remaining quarters in a single file. We will "save as" the original file 3 times, and then use Emacs to divide the copies into 3 roughly equal parts.
1. Open the dump file in emacs
2. "Save as" the original dump file 3 times; i.e. for save as press ctrl+x and then w and then give filename_1.sql. Then second time filename_2.sql and then on third time filename_3.sql
3. Open the first file i.e. filename_1.sql.
4. Go to line 1000 (1K) with M-g g (i.e. press Alt+g and then g).
5. Usually this line will be in the middle of some SQL statement (DDL/DML). Make a sensible choice by moving up or down to where the previous statement ended. In my case it was in the middle of a stored procedure, so I opted to go to where the procedure ended. Remember that line number.
6. Delete all the lines after that point: press Ctrl+Space (C-SPC) to set the mark, then Ctrl+Shift+End to select to the end of the buffer, then press Delete to remove the selection.
7. Save the file with Ctrl+x then Ctrl+s (C-x C-s), then press Ctrl+x and then Ctrl+c (C-x C-c) to exit.
8. Produce the remaining parts by repeating from step 4 for the other copies, keeping a different portion of the dump in each (a quick size check follows below).
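Once all the copies have been trimmed, a quick sanity check that the parts came out roughly equal in size is:
ls -lh filename_*.sql
wc -l filename_*.sql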
Using this very simple process you can create multiple smaller dump files from a single giant sql dump.
On Windows you can use Notepad++ instead of Emacs (I still prefer Emacs).
Answered by Adeel Raza Azeemi on November 19, 2021
I used a tool called SQL Dump Splitter to chop the large file into small chunks of a size of my choice. The tool is cross-platform and comes in AppImage format, meaning it can run on almost any Linux distro. Just make it executable and run it (double-click it).
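For example, assuming the downloaded file is named sql-dump-splitter.AppImage (your actual filename will differ), making it executable and launching it from a terminal looks like:
chmod +x sql-dump-splitter.AppImage
./sql-dump-splitter.AppImage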
Answered by Software Developer on November 19, 2021