These are useful when I occasionally need to migrate schemas and/or data to a new database, especially where triggers are involved. Also, if I need to reformat any of the table data, these provide some options to help with that process.
In my case, I have the mysqldump program installed here:
C:\Program Files\MySQL\MySQL Server 8.0\bin\mysqldump.exe
These examples use ^ as the line-continuation character on Windows. Use \ instead if you are on Linux.
It is up to you where you place your my.cnf file. In my case, I explicitly specify its location in my commands. The file contains a mysqldump section similar to this:
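Something like this (the values are placeholders - use your own credentials):
[mysqldump]
user=someuser
password=your_password_here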
See here for more notes on config files.
There are many valid variations on the following - these are just to get me started.
The following command dumps only the schema. It does not dump data or triggers:
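Something like this (a sketch: the my.cnf location and output file are placeholders, and my_db stands in for your database name):
"C:\Program Files\MySQL\MySQL Server 8.0\bin\mysqldump.exe" ^
  --defaults-extra-file=C:\mysql\my.cnf ^
  --host=localhost --port=3306 ^
  --no-data ^
  --skip-triggers ^
  --add-drop-database ^
  --result-file=C:\mysql\dumps\my_db_schema.sql ^
  --databases my_db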
Note that the use of --databases is needed to ensure the --add-drop-database option actually generates a DB drop statement.
Only INSERT statements are generated. In particular, triggers are skipped:
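Something along these lines (same placeholder paths as above; the database name goes last):
"C:\Program Files\MySQL\MySQL Server 8.0\bin\mysqldump.exe" ^
  --defaults-extra-file=C:\mysql\my.cnf ^
  --host=localhost --port=3306 ^
  --no-create-info ^
  --skip-triggers ^
  --complete-insert ^
  --result-file=C:\mysql\dumps\my_db_data.sql ^
  my_db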
Note the use of --no-create-info to prevent CREATE TABLE statements from being generated, and --complete-insert so that the INSERT statements also include all column names.
Only trigger statements are generated:
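A sketch of one way to get that - turn everything else off and leave --triggers on (it is on by default anyway):
"C:\Program Files\MySQL\MySQL Server 8.0\bin\mysqldump.exe" ^
  --defaults-extra-file=C:\mysql\my.cnf ^
  --host=localhost --port=3306 ^
  --no-create-info ^
  --no-data ^
  --no-create-db ^
  --triggers ^
  --result-file=C:\mysql\dumps\my_db_triggers.sql ^
  my_db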
This can be particularly useful if you need to set up a copy of a schema containing data - but you need to avoid having all the triggers fire when you insert that data. First insert all the data - and then create the triggers.
You can control how dumped data is formatted. This can be useful if you need to parse that data (e.g. to modify/transform it after extracting it from the database).
By default, mysqldump will generate INSERT statements for your data, where:
- each row of values is wrapped in ( and ) parentheses;
- text values are wrapped in ' single quotes;
- values are separated by , commas.
Example:
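(A sketch, with made-up sample values that line up with the fragments quoted below:)
INSERT INTO `my_table` VALUES (1,'foo','bar',...),(2,'baz','bat, "with a comma"',...),...;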
However, this is not easily parseable. For example, look at that column data with a comma in it:
'bat, "with a comma"'
Don’t even try to use a regex - that road eventually leads to madness.
Instead, you can do something like this:
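For example (a sketch: I am assuming the --tab option here, which writes a .txt data file per table into the named directory and is what enables the --fields-*/--lines-* options; the directory falls under the secure_file_priv rules covered further down, and my_other_table is just a made-up second table name):
"C:\Program Files\MySQL\MySQL Server 8.0\bin\mysqldump.exe" ^
  --defaults-extra-file=C:\mysql\my.cnf ^
  --host=localhost --port=3306 ^
  --tab="C:\ProgramData\MySQL\MySQL Server 8.0\Uploads" ^
  --fields-terminated-by=, ^
  --fields-optionally-enclosed-by=\" ^
  --lines-terminated-by=0x0a ^
  my_db --tables my_table my_other_table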
The use of --tables means that all the arguments following the my_db database name will be interpreted as the list of tables you want to export. See the final line of the above command.
The output formatters are:
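In the sketch above, those are:
--fields-terminated-by=,             separate values with commas
--fields-optionally-enclosed-by=\"   wrap textual values (only) in double quotes
--lines-terminated-by=0x0a           end each record with a newline (hex notation for \n)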
That “optionally” means that non-textual data (such as numbers) will not be enclosed in double-quotes.
This gives us formatted data similar to standard CSV data - and that is much easier to parse using any decent CSV parsing library in the language of your choice:
Instead of this:
(1,'foo','bar',...),(2,'baz','bat, "with a comma"',...),...;
we get this:
1,"foo","bar",...
2,"baz","bat, \"with a comma\"",...
...
So, we know exactly where each row of data starts and ends - and any double-quotes encountered in text are automatically escaped as \". (Commas inside double-quotes no longer cause parsing difficulties.)
You may prefer to use SELECT INTO statements or TABLE statements to create formatted data. You select data from a table into a file, and specify how the data is to be formatted in that file. This is very similar to the way you can format data using mysqldump, as shown above.
The following extracts data from a table into a CSV-formatted file:
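A sketch (the output path is my secure_file_priv Uploads directory, described further down; the target file must not already exist):
TABLE my_table
INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my_table.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';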
Note in the above example I used TABLE my_table to export all columns and all rows from my_table. See table statements for details. (I could also have written a specific SELECT statement instead, to have more control over what rows/columns get exported to my file.)
This is one way to ensure you have a file which can be read correctly by a LOAD DATA statement. LOAD DATA statements are the complement of SELECT INTO statements.
The following loads data from a CSV-formatted file into a table:
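A sketch, mirroring the export above:
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my_table.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';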
Run these commands from a MySQL batch file, or at a mysql> command prompt (or in MySQL Workbench or similar).
MySQL may complain that you do not have the required FILE privilege to export formatted data, so be prepared for that:
GRANT FILE ON *.* TO `someuser`@`%`;
FLUSH PRIVILEGES;
A user who has the FILE privilege can read any file on the server host that is either world-readable or readable by the MySQL server. (This implies the user can read any file in any database directory, because the server can access any of those files.) [ref.]
See here for further guidance.
Additionally (and separately from the above grant), you may only be able to write files to (and read files from) a specific directory, as specified by the following:
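The setting in question is the secure_file_priv system variable:
SHOW VARIABLES LIKE 'secure_file_priv';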
This shows me what directory I can use for uploading/creating files - in my case, this is:
C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\
This path needs to be used as follows in the above LOAD and SELECT INTO statements:
infile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my_table.csv'
and:
into outfile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my_table.csv'
Note the use of forward slashes (/) in place of Windows backslashes (\).
You may also need to increase the MySQL server parameter innodb_buffer_pool_size from its default value (8 MB on my installation). Edit this setting in your server's my.ini file, and then restart the server. Recommended: 64 MB. Check the current value with this command:
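One way to check (the value is reported in bytes):
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';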
If you have MySQL Shell installed (and a compatible version of MySQL server), you can do similar things - both at the mysql-shell command line and in Windows/Linux shell scripts. This is not a MySQL Shell tutorial, but here are two examples:
At the mysql-shell interactive command line:
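A sketch using util.dumpTables() - the target directory is a placeholder, and I pass the all option so that the empty table list means every table in the schema:
// assumes an open connection, e.g. via \connect someuser@localhost
util.dumpTables("my_db", [], "C:/mysql/dumps/my_db", { all: true });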
The empty array [] means “all tables”.
Or, at the Windows command line (e.g. for use in shell scripts):
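For example, using the --execute (-e) option (a sketch; account, schema and output directory are placeholders):
mysqlsh someuser@localhost:3306 --js ^
  -e "util.dumpTables('my_db', [], 'C:/mysql/dumps/my_db', {all: true})"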
The MySQL Shell util object (as used above) supports the following for script integration:
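These are the ones I find most relevant here (not an exhaustive list - run util.help() in the shell for the full set):
util.dumpInstance()  - dump a whole server instance
util.dumpSchemas()   - dump one or more schemas
util.dumpTables()    - dump selected (or all) tables from one schema
util.exportTable()   - export a single table to a flat file (e.g. CSV)
util.importTable()   - parallel import of a flat file into a table
util.loadDump()      - load a dump created by the dump utilities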