
Inserting a lot of records into a DB at one time

Hello,

I have a CSV file which I need to insert into a DB for future use; at present the CSV contains 10k+ records.

If I comment out my SQL insert code, my row counter and the CSV data print into my HTML table with all 10k records.

The second I uncomment my code, the page prints (and inserts into MySQL) about 350 rows of data and does not print my footer or finish rendering the table. It is like the connection dies, or something decides I am being greedy.

Is there a limit on how many records you can insert into MySQL at one time, and if so, how can I fix my code?

I do not get any information from mysql_error(); it just stops printing "Successful" records somewhere between 340 and 365.

Any help would be greatly appreciated.

[code=php]
<?php
include 'dbConnection.inc.php';
dbConnection("localhost", "cms", "root");
$fileTempName = $_FILES['locationFile']['tmp_name'];
$fileOrigName = $_FILES['locationFile']['name'];
$fileSize = $_FILES['locationFile']['size'];
$fileType = $_FILES['locationFile']['type'];

$targetFileName = $_SERVER['DOCUMENT_ROOT'] . "ha-cms/" . $fileOrigName;

if (!copy($fileTempName, $targetFileName)) {
    echo "failed to copy file.";
}

// read the CSV into parallel arrays
$data = array(
    'country'  => array(),
    'status'   => array(),
    'officeId' => array()
);

$f = fopen($targetFileName, "r");
$rowCtr = 1;
$emptyCtr = $closedCtr = $openCtr = 0;
while (($line = fgetcsv($f)) !== false) {
    $data['country'][]  = $line[9];
    $data['status'][]   = $line[4];
    $data['officeId'][] = $line[2];
}

echo "<table>";
foreach ($data['country'] as $ix => $printCols) {
    $printCols = ucfirst($printCols);
    switch ($printCols) {
        default:
            echo "<tr>";
            echo "<td>$rowCtr</td>";
            echo "<td>" . $data['officeId'][$ix] . "</td>";
            echo "<td>$printCols</td>";
            echo "<td>" . $data['status'][$ix] . "</td>";
            echo "</tr>";
            $rowCtr++;
            break;
        case 'Country': // the CSV's header row
            echo "<thead>";
            echo "<tr>";
            echo "<th>Row Number</th>";
            echo "<th>Office ID</th>";
            echo "<th>Country</th>";
            echo "<th>Status</th>";
            echo "</tr>";
            echo "</thead>";
            break;
        case '':
            $emptyCtr++;
            echo "<tr>";
            echo "<td>$rowCtr</td>";
            echo "<td>" . $data['officeId'][$ix] . "</td>";
            echo "<td>BLANK</td>";
            echo "<td>" . $data['status'][$ix] . "</td>";
            echo "</tr>";
            $rowCtr++;
            break;
        case $data['status'][$ix] == 'open':
            $openCtr++;
            echo "<tr>";
            echo "<td>$rowCtr</td>";
            echo "<td>" . $data['officeId'][$ix] . "</td>";
            echo "<td>$printCols</td>";
            echo "<td>" . $data['status'][$ix] . "</td>";
            echo "</tr>";
            $rowCtr++;
            break;
        case $data['status'][$ix] == 'closed':
            $closedCtr++;
            echo "<tr>";
            echo "<td>$rowCtr</td>";
            echo "<td>" . $data['officeId'][$ix] . "</td>";
            echo "<td>$printCols</td>";
            echo "<td>" . $data['status'][$ix] . "</td>";
            echo "</tr>";
            $rowCtr++;
            break;
    }
    $dC = $data['country'][$ix];
    $dC = trim($dC);
    $dC = htmlspecialchars($dC);
    $dO = $data['officeId'][$ix];
    $insLocs = "insert into countrytable set officeId=$dO, countryname='$dC'";
    if ($data['country'][$ix] != 'Country') {
        if (@mysql_query($insLocs)) {
            echo "<p>Successful: $rowCtr</p>";
        } else {
            echo "<p>$insLocs " . mysql_error() . "</p>";
        }
    }
}
echo "</table>";

$updateDate = date('ymd');
$updateStatus = array(
    'locationsTotal'  => $rowCtr,
    'blankCtry'       => $emptyCtr,
    'locationsClosed' => $closedCtr,
    'locationsOpen'   => $openCtr,
    'locationsDate'   => $updateDate
);
foreach ($updateStatus as $printUpdateStatusK => $printUpdateStatusV) {
    echo "<p>" . $printUpdateStatusK . ": " . $printUpdateStatusV . "</p>";
}

$insLocUpdateInfo = "insert into cms set openLoc=$openCtr, closeLoc=$closedCtr, updateDate=$updateDate, totalLoc=$rowCtr";
if (@mysql_query($insLocUpdateInfo)) {
    echo "<p>Database updated.</p>";
}
fclose($f);
?>
[/code]

@TecBrat (May 12, 2011) — It sounds like your server host might be limiting you. Try putting in a sleep() call to let the script sit for a second or two after every 300 entries.

Add something like this near the insert query:
[code=php]
<?php
// set $count to 0 outside of the foreach
$count++;
if ($count > 300) { $count = 0; sleep(2); }
?>
[/code]

You might run into script timeouts, though; see the sketch after the list below.

See also http://php.net/manual/en/function.sleep.php and these related functions:

■ usleep() - Delay execution in microseconds

■ time_nanosleep() - Delay for a number of seconds and nanoseconds

■ time_sleep_until() - Make the script sleep until the specified time

■ set_time_limit() - Limits the maximum execution time
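
Putting those together, a minimal sketch (the 300-row batch and 2-second pause are the values suggested above, which are guesses rather than known host limits):

[code=php]
<?php
// assumes safe mode is off; an argument of 0 removes PHP's execution-time cap
set_time_limit(0);

$count = 0;
foreach ($data['country'] as $ix => $printCols) {
    // ... build and run the insert query here, as in the original script ...

    // pause briefly every 300 inserts so a throttling host gets a breather
    if (++$count >= 300) {
        $count = 0;
        sleep(2);
    }
}
?>
[/code]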
@NogDog (May 12, 2011) — Your script may simply be exceeding the PHP [url=http://www.php.net/manual/en/info.configuration.php#ini.max-execution-time]max_execution_time[/url], which you can normally override with [url=http://www.php.net/manual/en/function.set-time-limit.php]set_time_limit[/url]() (if not running in safe mode).

My first recommendation, though, would be to use the SQL [url=http://dev.mysql.com/doc/refman/5.1/en/load-data.html]LOAD DATA INFILE[/url] command to read the CSV file directly. It is [B]much[/B] more efficient than calling mysql_query() a few thousand times from a PHP script.
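
For this script, a minimal sketch might look like the following. The column positions and the skipped header line are assumptions read off the posted code ($line[2] = officeId, $line[9] = country, and the 'Country' header check), and LOCAL requires local_infile to be enabled on the server:

[code=php]
<?php
// Sketch only: @dummy discards the CSV columns the table doesn't need.
$sql = "LOAD DATA LOCAL INFILE '" . mysql_real_escape_string($targetFileName) . "'
        INTO TABLE countrytable
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (@dummy, @dummy, officeId, @dummy, @dummy,
         @dummy, @dummy, @dummy, @dummy, countryname)";
if (!mysql_query($sql)) {
    echo "<p>" . mysql_error() . "</p>";
}
?>
[/code]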
This can be a bit problematic, though, if you use a separate database server (i.e. not "localhost") and cannot copy the CSV file to that server. In that case, it's a good idea to group your values so that you don't have to run so many queries:

[code]
INSERT INTO table_name (col1, col2, col3) VALUES
('v11', 'v12', 'v13'),
('v21', 'v22', 'v23'),
('v31', 'v32', 'v33'),
('v41', 'v42', 'v43'),
[etc....]
;
[/code]

You can implement this in PHP with something along the lines of:
[code=php]
$groupSize = 20; // number of rows to insert per query
$data = array();
while (($line = fgetcsv($f)) !== false) {
    $data[] = sprintf(
        "('%s', '%s', %d)",
        mysql_real_escape_string($line[0]),
        mysql_real_escape_string($line[1]),
        (int) $line[2]
    );
    if (count($data) == $groupSize) {
        doQuery($data);
        $data = array();
    }
}
if (count($data)) { // insert any leftover rows from a final, smaller group
    doQuery($data);
}

function doQuery($data)
{
    $sql = "INSERT INTO table_name (col1, col2, col3) VALUES " . implode(",\n", $data);
    $result = mysql_query($sql);
    return $result;
}
[/code]
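
Note that the whole grouped statement still has to fit within MySQL's max_allowed_packet setting, so $groupSize shouldn't be pushed arbitrarily high; a few dozen to a few hundred rows per query is usually plenty.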