How to efficiently insert millions of datasets into a MySQL database using PHP?
I run a PHP script that reads data rows from a file, analyses them, and inserts them one by one into a local MySQL database:
$mysqli = new mysqli($db_host, $db_user, $db_password, $db_db);
if ($mysqli->connect_errno) {
    echo "Failed to connect to MySQL: (" . $mysqli->connect_errno . ") " . $mysqli->connect_error;
} else {
    /* As long as there is data in the file */
    while (...) {
        ... // analyse each row (contained in the object $data)
        /* Write to the database table. */
        $mysqli->query($data->getInsertQuery($db_table));
    }
}
I have about 40 million data rows. The first couple of million datasets were inserted quite fast, but in the last 6 hours only 2 million were inserted (I'm at 30 million now), and it seems to become slower and slower (so far, no index is defined!).
I am wondering if there is a more efficient way of writing the data to the table. If possible, I'd prefer a solution without (temporary) files.
You would be more efficient if you first translated the file into an SQL one (so simply change your script to write the statements down to a file) and then loaded it using the mysql command line like this:
mysql -uuser -p dbname < file.sql
Over such a large import, you will save quite a bit on the overhead that comes with using PHP. Just remember to stream the data into the file one query at a time ;)
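For illustration, here is a minimal sketch of that approach: read the source file, build one INSERT statement per row, and append it to an SQL file instead of executing it. The file names (data.txt, dump.sql), the table and column names, and the simple comma-split "analysis" are assumptions for the example only; the real parsing stays whatever the original script does.

<?php
// Sketch only: data.txt, dump.sql, my_table and its columns are made-up names.
$in  = fopen('data.txt', 'r');   // source data, one row per line
$out = fopen('dump.sql', 'w');   // generated SQL statements go here

if ($in === false || $out === false) {
    die("Could not open input or output file\n");
}

while (($line = fgets($in)) !== false) {
    // Stand-in for the real row analysis: split the line into two fields.
    $parts = explode(',', trim($line), 2);
    if (count($parts) < 2) {
        continue;   // skip malformed lines in this sketch
    }

    // Write the statement to the file instead of running it against MySQL.
    // (addslashes() is a simplification; real code should escape properly.)
    fwrite($out, sprintf(
        "INSERT INTO my_table (col_a, col_b) VALUES ('%s', '%s');\n",
        addslashes($parts[0]),
        addslashes($parts[1])
    ));
}

fclose($in);
fclose($out);

// Afterwards, load everything in one go from the shell:
//   mysql -uuser -p dbname < dump.sql

Note that the sketch writes each statement to the file as soon as it is built, rather than assembling all 40 million statements in memory first, which is what the "one query at a time" remark above is getting at.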