
Hi All,

I am trying to put CSV file data into a MySQL database. Now I have a few questions:

My CSV file data looks like this (in the MySQL table, UID is the auto-increment primary key; the rest are name, SID and phone, so there are four fields in total):

name SID phone
aa 1 234
bb 2 456

Should I include UID as the first column in the CSV file, or will the IDs be generated automatically in the table?

1. Is it better to use LOAD DATA LOCAL INFILE or fgetcsv()?

2. The CSV file may be updated frequently, so it can be uploaded several times, either with a few updated rows or with new rows added alongside the old ones. Should I first delete all the data and then insert the new data, or is there an alternative? I am not keen on the delete-then-insert approach.

For now I have these two questions.

Please suggest the best possible way.

Many thanks,
Raj

PHP

12 Comment(s)

@MindzaiJul 08.2010 — 1: If you have specified AUTO_INCREMENT there is no need to provide a value; it will happen automatically.
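For illustration, assuming a table called `students` (the table name is hypothetical; the four fields match the question), the UID column is simply left out of the inserts and MySQL fills it in:

```sql
-- Hypothetical table layout for the four fields described above
CREATE TABLE students (
    UID   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name  VARCHAR(100) NOT NULL,
    SID   INT NOT NULL,
    phone VARCHAR(20) NOT NULL
);

-- UID is not listed, so MySQL assigns 1, 2, 3, ... automatically
INSERT INTO students (name, SID, phone) VALUES ('aa', 1, '234');
```

So the CSV only needs the name, SID and phone columns.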

2: I would be inclined to stick with fgetcsv as it is easier (imo) to handle any extra logic via PHP, plus afaik LOAD DATA INFILE is MySQL-specific and I try to avoid tying myself to one DBMS where possible. This is just my preference though, either way will work fine.
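A minimal sketch of the fgetcsv() route, assuming a comma-separated file with a header line and a hypothetical `students` table (the sample in the question is shown space-separated, so adjust the delimiter to match the real file):

```php
<?php
// Parse a CSV upload into rows, skipping the header line.
// Returns an array of [name, SID, phone] rows as strings.
function parse_csv(string $path): array {
    $rows = [];
    $fh = fopen($path, 'r');
    fgetcsv($fh); // skip the header line (name,SID,phone)
    while (($row = fgetcsv($fh)) !== false) {
        $rows[] = $row;
    }
    fclose($fh);
    return $rows;
}

// Insert side (sketch only): with mysqli, each parsed row would go
// through a prepared statement, e.g.:
//
//   $stmt = $mysqli->prepare(
//       'INSERT INTO students (name, SID, phone) VALUES (?, ?, ?)');
//   foreach (parse_csv('upload.csv') as [$name, $sid, $phone]) {
//       $stmt->bind_param('sis', $name, $sid, $phone);
//       $stmt->execute();
//   }
```

The prepared statement also takes care of escaping, which is one of the "extra logic" advantages of doing this in PHP rather than with LOAD DATA INFILE.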

3: Personally I would load the data then empty the csv file. That way you can just insert the new data next time, delete from the csv file again and so on, without truncating your database tables every time. This is more efficient as you are not having to reinsert old data multiple times.
@raj_2006authorJul 08.2010 — Thanks for your suggestion...


[QUOTE]1: If you have specified AUTO_INCREMENT there is no need to provide a value, it will happen automatically.[/QUOTE]

So I don't have to include UID as a column in the CSV file; it will be managed automatically.

[QUOTE]2: I would be inclined to stick with fgetcsv as it is easier (imo) to handle any extra logic via PHP, plus afaik LOAD DATA INFILE is MySQL-specific and I try to avoid tying myself to one DBMS where possible. This is just my preference though, either way will work fine.[/QUOTE]

Yes, I agree to go with fgetcsv(); at least I can handle the logic part if there is any.

[QUOTE]3: Personally I would load the data then empty the csv file. That way you can just insert the new data next time, delete from the csv file again and so on, without truncating your database tables every time. This is more efficient as you are not having to reinsert old data multiple times.[/QUOTE]

So you mean that each time the file content is loaded into the table, I then delete the content of the file (not manually, but in code)? But what if I want to update an old record?

This part is not clear to me. Suppose I have uploaded 10 records; now I want to update the 5th record, and I have also added a new 11th row to the CSV. So at the same time I want the table to be updated as well as inserted into. Is there any shortcut, or do I go back to the old method: first delete, then insert?
@adam_jimenezJul 09.2010 — [QUOTE]...so at the same time I want the table to be updated as well as inserted into. Is there any shortcut, or do I go back to the old method: first delete, then insert?[/QUOTE]
I reckon truncate and insert to guarantee it matches up.

Just so long as no other table is reliant on the ids - which will go out of kilter.
@raj_2006authorJul 09.2010 — Sorry Adam...can you please explain a bit more...I have not understood completely.
@adam_jimenezJul 09.2010 — [QUOTE]Sorry Adam, can you please explain a bit more? I have not understood completely.[/QUOTE]

My suggestion is to clear out the MySQL table using TRUNCATE, and then re-insert all the CSV data fresh. This ensures that all the data matches up.

However, I would not do this if you have another table that references an ID column from this table, because when you re-insert the data the IDs will be different and may no longer match up.

Hope this is clearer!
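One alternative worth noting, if truncating is undesirable: MySQL's INSERT ... ON DUPLICATE KEY UPDATE can update changed rows and insert new ones in a single pass, and it leaves existing UID values intact so references from other tables stay valid. This assumes the CSV rows can be matched to existing rows on a unique key (SID here, assuming each SID identifies one record; table and index names are illustrative):

```sql
-- Assumes a UNIQUE key on SID so re-uploaded rows match their originals
ALTER TABLE students ADD UNIQUE KEY uniq_sid (SID);

-- For each CSV row: insert if SID is new, otherwise update in place.
-- Existing UID values are preserved, so foreign references stay valid.
INSERT INTO students (name, SID, phone)
VALUES ('aa', 1, '234')
ON DUPLICATE KEY UPDATE name = VALUES(name), phone = VALUES(phone);
```

With this, the same CSV can be re-uploaded as often as needed: updated rows are applied and new rows are appended, with no truncate step.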
@criterion9Jul 09.2010 — I guess the bigger question here is why you are editing/deleting data in a CSV instead of in the database in the first place? I understand adding data, but edits/deletes to a flat file, then replicated in a database, seem like duplication of effort and over-complication to me.
@raj_2006authorJul 09.2010 — Hi Adam, I have understood what you meant.

Hi criterion9,

Yes, you are correct that it seems like duplication of the same thing, but I can't find a good way to simplify this procedure. It's obvious that someone may want to update or delete a record from the DB after inserting a set of records.

Now, the point is that instead of querying the table and deleting/editing rows one by one, it is easier to do a bulk update all at once, using the same CSV file that was used for the first insert.

I just can't find a good trick... I will get back to you with my code.
@criterion9Jul 09.2010 — You could pretty easily use a server-side language to administer multiple updates/deletes. It is in fact a pretty common thing to do (especially with eCommerce).
@raj_2006authorJul 09.2010 — Yes, that's a common way, but I am trying to do it with the same CSV file, which can be used several times (and follow what Adam suggested)... or else I have to think about some other way.
@criterion9Jul 09.2010 — Then why bother with using a database at all? You could parse the csv on the fly as needed...
@raj_2006authorJul 09.2010 — I have to use the db..bcoz after insert i have to search the db for specific ids and if matched it will return the data in a csv format.
@criterion9Jul 09.2010 — Truncating the table repeatedly kind of defeats the purpose of storing the data in the database in the first place. If all you are doing is searching through the CSV file, you can pretty easily parse the file for the needed data... or, better yet, use Excel (or other spreadsheet software) to search through the data, since you are making it basically a manual process anyway.