Recently, for testing purposes, I needed to import a roughly 7 GB MySQL dump into a local MySQL instance. Because the largest tables used the InnoDB engine, the import was painfully slow: after nearly 40 hours it had only just reached the members table.
Fed up, I killed the import and benchmarked MyISAM instead. For the same table of 4.4 million rows, the InnoDB import had taken more than 20 hours; with MyISAM it finished in under 6 minutes.
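Much of the InnoDB import cost comes from per-statement commits and key checking, so an alternative to switching engines is to wrap the dump in session statements that turn those off for the duration of the import. A minimal sketch of that idea, assuming a helper named `wrap_dump` (the function name and file handling are illustrative, not from the original post; the `SET` statements are standard MySQL session variables):

```python
def wrap_dump(src_path, dst_path):
    """Copy a MySQL dump, wrapping it in statements that speed up
    InnoDB bulk imports (standard MySQL session variables)."""
    with open(src_path, 'r') as src, open(dst_path, 'w') as dst:
        # disable per-statement commits and key checks up front
        dst.write("SET autocommit=0;\n")
        dst.write("SET unique_checks=0;\n")
        dst.write("SET foreign_key_checks=0;\n")
        for line in src:          # stream line by line, so a 7 GB dump is fine
            dst.write(line)
        # commit once at the end and restore the defaults
        dst.write("COMMIT;\n")
        dst.write("SET unique_checks=1;\n")
        dst.write("SET foreign_key_checks=1;\n")
```

Whether this beats the MyISAM conversion below depends on the data, but it avoids changing the table definitions.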
So I grabbed Python and wrote a conversion script. In about 21 minutes it scanned the entire 7 GB dump, replaced InnoDB with MyISAM, and finished processing more than 40 million lines.

#! /usr/bin/env python
#-*- coding:utf-8 -*-
# Convert the storage engine in a MySQL dump from InnoDB to MyISAM
import os, sys, re, datetime

def write_file(file_name, content, mode = 'ab+'):
    wf = open(file_name, mode)
    wf.write(content)
    wf.close()

def print_lines(lines):
    # progress report every 1000 lines
    if lines % 1000 == 0:
        print 'Found ' + str(lines / 1000) + ' * 1000 records'

SCRIPT_ROOT = os.path.dirname(sys.argv[0])
MySQL_FILE = r'E:\bbs_ipad.sql'          # original dump containing InnoDB tables
MySQL_NEW_FILE = r'E:\bbs_ipad_new.sql'  # converted output file
Log_FILE = r'E:\logs.log'                # log of every line that was converted
currentLine = 1
emptyLine = 0
eofLine = 100    # treat more than 100 consecutive empty lines as end of file

if not os.path.isfile(MySQL_FILE):
    print 'MySQL file does not exist!'
    sys.exit(1)

lf = open(MySQL_FILE, 'rb')
nf = open(MySQL_NEW_FILE, 'ab+')

# matches the ENGINE=InnoDB clause, case-insensitively, with optional spaces
engine_re = re.compile(r'(?i)ENGINE\s*=\s*InnoDB')

starttime = datetime.datetime.now()
while True:
    line = lf.readline().strip()

    if emptyLine > eofLine:
        break

    if not line:
        print_lines(currentLine)
        emptyLine += 1
        currentLine += 1
        continue
    else:
        emptyLine = 0

    if not engine_re.search(line):
        # no engine clause on this line: copy it through unchanged
        print_lines(currentLine)
        nf.write(line + "\r\n")
        currentLine += 1
        continue

    newLine = line.replace('InnoDB', 'MyISAM')
    nf.write(newLine + "\r\n")
    write_file(Log_FILE, str(currentLine) + "\t" + line + "\tTo\t" + newLine + "\r\n")
    print_lines(currentLine)
    currentLine += 1

lf.close()
nf.close()
endtime = datetime.datetime.now()
print (endtime - starttime)     # e.g. datetime.timedelta(0, 23, 797000)
print currentLine - eofLine     # lines processed, minus the trailing empty run
print 'Finished'
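The heart of the script is the engine match and replacement. That step can be sketched in isolation like this (Python 3 here; the sample input line is made up for illustration):

```python
import re

# matches an ENGINE=InnoDB clause, case-insensitively, with optional spaces
ENGINE_RE = re.compile(r'(?i)ENGINE\s*=\s*InnoDB')

def convert_engine(line):
    """Rewrite the engine clause to MyISAM; leave other lines untouched."""
    return ENGINE_RE.sub('ENGINE=MyISAM', line)

print(convert_engine(") ENGINE=InnoDB DEFAULT CHARSET=utf8;"))
# -> ) ENGINE=MyISAM DEFAULT CHARSET=utf8;
print(convert_engine("INSERT INTO t VALUES (1);"))   # printed unchanged
```

Gating the replacement on the regex match is what keeps the script from touching the literal string "InnoDB" inside ordinary data rows.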
The code above is for reading only; to make sure nothing gets mangled, download the attachment for the exact file.
Here is a screenshot of the conversion log, heh.