I have a table with 5 million rows of data.
I need to do an export without including the PK column, and mysqldump can't select specific columns.
sql = """SELECT `col2`,`col3`,`col4`,`col5`,`col6`,`col7` FROM `tbl1`;""" cursor.execute(sql) while True: r = cursor.fetchone() if r is None: break print "INSERT IGNORE INTO `tbl2` VALUES " , ("",str(r['col2']),r['col3'],str(r['col4']),str(r['col5']),str(r['col6']),str(r['col7'])),";"
I run it with `python export.py > insert.sql`, but this is taking forever.
I understand that fetchall() pulls all the rows into a buffer first, but I thought fetchone() would only keep one row at a time in the buffer.
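If client-side buffering really is the problem: with MySQLdb's default cursor, execute() transfers the entire result set to the client before fetchone() is ever called, so fetchone() just walks an in-memory list. A server-side cursor (MySQLdb.cursors.SSCursor) keeps the result set on the server and fetches rows on demand. A minimal Python 3 sketch; the connection parameters are placeholders:

```python
# Sketch: stream rows instead of buffering all of them client-side.
# With MySQLdb's default cursor, cursor.execute() pulls every row over the
# wire up front; an SSCursor fetches rows on demand instead:
#
#   import MySQLdb
#   import MySQLdb.cursors
#   conn = MySQLdb.connect(host="localhost", user="user", passwd="secret",
#                          db="mydb", cursorclass=MySQLdb.cursors.SSCursor)
#   cursor = conn.cursor()

def iter_rows(cursor, sql):
    """Execute sql and yield one row at a time via fetchone()."""
    cursor.execute(sql)
    while True:
        row = cursor.fetchone()
        if row is None:
            break
        yield row
```

The generator works with either cursor class; only an SSCursor actually avoids materializing all 5 million rows in client memory.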
Here is the output of `free` at the time:

```
             total       used       free     shared    buffers     cached
Mem:       8061084    8014704      46380          0       1228      40300
-/+ buffers/cache:    7973176      87908
Swap:      2000084    1673828     326256
```

So how do I select all rows and print/update each one?