Hello, does anybody knows way how to scrap some site where i have from menu 5000 tabs, and every tabs have houndreds datas. Now i need that site in offline mode. How to do that ?
@ginerjm Dec 29, 2017 — You want to try and explain that a little better?
'scrap'?
'houndreds'?
'from menu 5000 tabs'?
'datas'?
'in offline mode'?
Yes - I am picky. It makes it easier to figure out what one is trying to relate.
PS - once a website is up and running, it is almost impossible to recognize it as a "PHP website". How does one know it was written in PHP unless the URL shows it as such? And why does that matter for the purposes of your problem?
@ginerjm Dec 29, 2017 — I barely gleaned that from the original post, but I thought I would drive home the need for posters to re-read their posts, or at least to use their language skills as well as possible.
But even knowing that the task involves scraping, I still couldn't understand what all the other references were about. And so far no more info seems to be forthcoming...
@phpmillion Dec 31, 2017 — It depends on your needs. If you just want to scrape the whole site at once, use a basic tool such as HTTrack ( http://www.httrack.com/ ).
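For what it's worth, HTTrack also ships a command-line client, so a one-off mirror can be scripted. A minimal sketch (the URL and domain filter below are placeholders, not the poster's actual site):

```shell
# Mirror a site into ./offline-copy for offline browsing, staying
# within the same domain. example.com is a placeholder -- substitute
# the real site's address.
httrack "http://example.com/" -O "./offline-copy" "+*.example.com/*" -v
```

The `-O` flag sets the output directory, the `+*.example.com/*` filter keeps the crawler from wandering onto other domains, and `-v` enables verbose output.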
If you only need very specific parts of the site (just some text data, no images, etc.), and want them re-downloaded periodically every X hours/days/weeks, you may need to write your own scraper (PHP + cURL should be enough) that extracts just the text you want.
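To sketch what such a PHP + cURL scraper could look like: one function fetches a page, another extracts text with DOMDocument. The URL, user agent, and the choice of `<td>` cells as the target are all illustrative assumptions — adapt them to the real site's markup:

```php
<?php
// Sketch of a minimal PHP + cURL scraper. The target URL and the
// "<td> cells" extraction rule are hypothetical examples.

// Fetch a page's HTML over HTTP with cURL.
function fetch_page(string $url): string {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return body as string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
    curl_setopt($ch, CURLOPT_USERAGENT, 'MyScraper/1.0');
    $html = curl_exec($ch);
    curl_close($ch);
    return $html === false ? '' : $html;
}

// Extract the trimmed text of every <td> cell from an HTML document.
function extract_cells(string $html): array {
    $doc = new DOMDocument();
    @$doc->loadHTML($html);   // @ suppresses warnings on messy real-world markup
    $cells = [];
    foreach ($doc->getElementsByTagName('td') as $td) {
        $cells[] = trim($td->textContent);
    }
    return $cells;
}

// Demo on a hardcoded snippet (no network needed):
$sample = '<table><tr><td>Alpha</td><td>Beta</td></tr></table>';
print_r(extract_cells($sample));
```

Scheduling the re-downloads every X hours/days is then just a matter of running the script from cron.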
Or, as . wisely suggested, contact the site's owner and ask if they can provide you with the data or some kind of API.