MySQL mass dump/insert/copy:
DUMP:
mkdir /ssd/tmp/ads_lp_html2
chmod 0777 /ssd/tmp/ads_lp_html2
mysqldump -u[user] -p[pass] fbstat ads_lp_html2 --where="aid>19519480" --no-create-info -T /ssd/tmp/ads_lp_html2
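With -T, the server process itself writes one tab-separated data file per table into the given directory (which is why it must be world-writable, and why it must fall under secure_file_priv if that is set); --no-create-info should suppress the companion .sql schema file. A sketch of the expected result:
ls /ssd/tmp/ads_lp_html2
# ads_lp_html2.txt  <- tab-separated rows matching the WHERE clause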
LOAD:
Rename the dump file to the name of the table the data should be loaded into (mysqlimport derives the target table name from the file name, minus the extension):
mv ads_lp_html2.txt html_test.txt
mysqlimport -u[user] -p[pass] --local fbstat html_test.txt
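mysqlimport is a command-line wrapper around LOAD DATA INFILE, so a roughly equivalent load (a sketch, assuming the same file and table names) is:
mysql -u[user] -p[pass] --local-infile=1 fbstat -e "LOAD DATA LOCAL INFILE 'html_test.txt' INTO TABLE html_test"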
Dumping the result of a custom query to an archive compressed with pigz (multiprocess compression):
mysql --host=[host] --user=[user] --password=[pass] --quick [db_name] < /home/ubuntu/query.sql | pv -b | pigz > ./archive.gz
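The archive holds the raw tab-separated query output, not SQL, so it is read back by decompressing rather than replaying into mysql; a quick inspection (a sketch):
pigz -dc ./archive.gz | head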
MAX_EXECUTION_TIME can be set from within the query itself via an optimizer hint:
SELECT
/*+ MAX_EXECUTION_TIME(999999999) */
id
FROM scheduling_eventlogvalue
WHERE
last_modified >= '2022-01-01 00:00:00';
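A session-level alternative is the max_execution_time system variable (milliseconds, MySQL 5.7+, applies to SELECT statements only); a sketch that runs both statements in one session:
mysql --host=[host] --user=[user] --password=[pass] [db_name] -e "SET SESSION max_execution_time = 999999999; SELECT id FROM scheduling_eventlogvalue WHERE last_modified >= '2022-01-01 00:00:00';"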
Using mysqldump to dump selected tables only:
mysqldump --host=[db_host] --user=[user] --set-gtid-purged=OFF --column-statistics=0 -p db_name table1 table2 > output.sql
and load it back:
mysql --host=[db_host] --user=[user] -p db_name < output.sql
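The same pair can be piped through gzip to keep the dump compressed on disk (a sketch with the same placeholder credentials):
mysqldump --host=[db_host] --user=[user] --set-gtid-purged=OFF --column-statistics=0 -p db_name table1 table2 | gzip > output.sql.gz
gunzip < output.sql.gz | mysql --host=[db_host] --user=[user] -p db_name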
Dumping without locking tables:
mysqldump --user=[user] --host=[host] --password=[pass] --lock-tables=false db_name > db_dump.sql
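Note that --lock-tables=false alone does not guarantee a consistent snapshot; for InnoDB tables the usual approach is --single-transaction (a sketch, assuming all dumped tables are InnoDB):
mysqldump --user=[user] --host=[host] --password=[pass] --single-transaction db_name > db_dump.sql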