themitch said:
Hi all,
I am looking to scrape a webpage and e-mail myself when it has been updated. I think I should use file_get_contents, say, and save the contents into a MySQL database, then check it every hour or so and, if it has changed, e-mail myself.
Firstly, does this sound possible?
Secondly, how do I do the hourly thing? Is that a 'cron' job?
Finally, the page has, I think, HTTP authentication (a browser popup appears asking for a username and password). I obviously have a username and password, but how do I pass that through to the file_get_contents function?
Thanks in advance for your help.
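That general approach should work. Here is a minimal sketch of the fetch / compare / notify cycle. The URL, e-mail address, and hash-file path are placeholders, and it stores the previous state in a flat file just to keep the example short; the MySQL table you describe would work the same way.

<?php
// check_page.php - minimal sketch of the fetch / compare / notify cycle.
// The URL, e-mail address, and hash file path are placeholders.

$url      = 'http://www.example.com/page-to-watch';
$hashFile = '/tmp/page_hash.txt';
$mailTo   = 'me@example.com';

$contents = file_get_contents($url);
if ($contents === false) {
    exit("Could not fetch $url\n");
}

$newHash = md5($contents);
$oldHash = is_readable($hashFile) ? trim(file_get_contents($hashFile)) : '';

if ($newHash !== $oldHash) {
    // First run or the page changed: send a note and remember the new state.
    mail($mailTo, 'Page updated', "The page at $url has changed.");
    file_put_contents($hashFile, $newHash);
}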
1. There are two ways to capture the page contents:
a) use file() to read the page into an array, where each line of the file becomes one element of the array;
b) use the output control functions (the ones whose names start with ob_), for example around readfile(), to buffer and capture the page output.
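Both of those are alternatives to file_get_contents and, like it, only fetch remote URLs when allow_url_fopen is enabled. A short sketch of each (the URL is a placeholder):

<?php
// a) file(): each line of the page becomes one element of the array.
$lines = file('http://www.example.com/page-to-watch');
$page  = implode('', $lines);

// b) output buffering: capture what readfile() would normally print.
ob_start();
readfile('http://www.example.com/page-to-watch');
$page = ob_get_clean();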
2. Yes, that is a cron job. If you are on a *nix system, you can use cron to run the check every hour.
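For example, an entry added with crontab -e to run the checker at the top of every hour might look like this (the PHP binary and script paths are placeholders):

# m h dom mon dow  command
0 * * * * /usr/bin/php /home/you/check_page.php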
3. You need to look at how the authentication information is actually passed to the site. You could try something like http://www.site.com?user=<user>&password=<password> if the site accepts the credentials as query-string parameters.
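Since the question mentions a browser login popup, the page is most likely behind HTTP Basic authentication rather than a login form. In that case the credentials can be passed to file_get_contents through a stream context; a sketch under that assumption (URL and credentials are placeholders):

<?php
$user = 'myuser';
$pass = 'mypassword';

// Send an Authorization header along with the request.
$context = stream_context_create(array(
    'http' => array(
        'header' => 'Authorization: Basic ' . base64_encode("$user:$pass"),
    ),
));

$contents = file_get_contents('http://www.example.com/protected-page', false, $context);

// Many servers also accept the credentials embedded in the URL:
// $contents = file_get_contents("http://$user:$pass@www.example.com/protected-page");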