I have to insert loads of data at a time into a MySQL table (like 4000-9000 rows at a time). What's the best way to do this? I can build one huge query and insert everything with that, or I can run a LOT of small queries, one per row... Which is better?
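For reference, the "one huge query" option usually means MySQL's multi-row INSERT syntax, where a single statement carries many VALUES tuples. A minimal sketch in Python of how such a statement is assembled (the table and column names are made up; values are inlined with `repr` purely for illustration, real code should use parameterized queries to avoid SQL injection):

```python
def build_multirow_insert(table, columns, rows):
    """Build one multi-row INSERT statement covering all rows.

    WARNING: values are inlined for illustration only; use
    parameterized queries (placeholders) in production code.
    """
    cols = ", ".join(columns)
    value_lists = ", ".join(
        "(" + ", ".join(repr(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({cols}) VALUES {value_lists}"

# One statement for two rows instead of two separate INSERTs:
sql = build_multirow_insert("scores", ["player_id", "points"], [(1, 10), (2, 25)])
```

The multi-row form saves per-statement overhead (parsing, network round trips, and, outside a transaction, per-statement commits), which is why it tends to beat thousands of single-row INSERTs; the trade-off is that the whole statement must fit in memory and under MySQL's `max_allowed_packet` limit.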
@JDM71488Aug 22.2007 — #I don't see why you couldn't try all 4000-9000 rows the first time as a test, see how the server handles it, and figure out whether you need to break it up.
@ZnupiauthorAug 22.2007 — #Well, that's what I did, and it works for ~5000 rows (I can't test with more; I don't have the data yet, I'm still working on the project). The server handles it remarkably well: it runs the query in under 1 second. And right now this runs on localhost, which is my desktop computer running many apps, from Firefox to Beryl. When the site goes live, it will run on a proper server.
Thing is, it will probably run 100 queries like this one after another, and on my localhost it gave a 'memory limit exceeded' error once when I was only running a couple of big queries, so I had to use [code=php]ini_set("memory_limit", "-1");[/code]
...So, would inserting it bit by bit be easier on the machine (memory, CPU, etc.)?
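A common middle ground between one giant statement and thousands of tiny ones is to insert in fixed-size chunks, which bounds peak memory per statement while keeping most of the batching benefit. A sketch assuming a DB-API style cursor (as in `mysql-connector` or `MySQLdb`, which use `%s` placeholders); the chunk size of 1000 is an arbitrary starting point to tune:

```python
def chunked(rows, size=1000):
    """Yield successive fixed-size chunks from a list of rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def insert_in_chunks(cursor, table, columns, rows, size=1000):
    """Insert rows in batches so no single statement grows unbounded.

    Assumes a DB-API cursor; executemany() lets the driver batch the
    parameterized rows, and each call stays under a bounded size.
    """
    placeholders = "(" + ", ".join(["%s"] * len(columns)) + ")"
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES {placeholders}"
    for chunk in chunked(rows, size):
        cursor.executemany(sql, chunk)
```

Wrapping the whole loop in a single transaction (`connection.commit()` once at the end) avoids paying a commit per chunk, which is often where most of the time goes.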