Friday 14 November 2003 10:13:55 am
Well, my site is hosted on a third-party server with wget preinstalled, and there may be some sysadmin limits on how the wget command can be used. I'll check that on Monday. How do I run the command? Just like I wrote in the previous posting, only with a different URL, of course. As I mentioned, I've never used wget before, so it could be I'm just not using it correctly:
wget --spider http://www.mysite.com/ez/
-----------------------------------------
returns:
wget --spider www.mysite.com/ez/
--19:08:05-- http://www.mysite.com:80/ez/
=> `index.html.1'
Connecting to www.mysite.com:80...
Connection to www.mysite.com:80 refused.
------------------------------------------
By the way: I've achieved somewhat better performance by including my custom-made views in the site.ini.append of my siteaccess (sitemap2, sitemap3 and so on). Still, it seems the caching of my site only lasts for so long... I wish I could make this caching permanent unless an article is republished, and then be able to do the wget thing to recache now and then when necessary.

Thanks for your feedback, Alex, it's useful!

Valentin
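For reference, here is roughly how I declared the custom views as cacheable in the siteaccess override. I'm writing the setting names from memory of eZ Publish 3's site.ini, so double-check them against the version you're running:

```
# settings/siteaccess/<your_siteaccess>/site.ini.append
# Sketch only -- verify setting names against your eZ Publish version.
[ContentSettings]
ViewCaching=enabled
# List every view mode whose output should be view-cached,
# including the custom ones (sitemap2, sitemap3, ...):
CachedViewModes=full;sitemap;sitemap2;sitemap3
```

As far as I understand it, the view cache for an object is cleared when that object is republished, which is close to the "permanent until republish" behaviour I'm after.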
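Once the connection-refused problem is sorted out, the recaching could be scripted along these lines. The site URL and view names are placeholders for my own; the script just prints the wget commands (a dry run), so drop the `echo` to actually spider the pages:

```shell
#!/bin/sh
# Warm the view cache by spidering each view with wget.
# SITE and VIEWS are placeholders -- substitute your own siteaccess URLs.
SITE="http://www.example.com/ez"
VIEWS="sitemap sitemap2 sitemap3"

for view in $VIEWS; do
  url="$SITE/$view"
  # --spider requests the page without saving it to disk;
  # remove the leading "echo" to run the command for real.
  echo wget --spider "$url"
done
```

This could then go into a cron job so the cache gets rebuilt shortly after it is cleared.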
Resources on the net:
http://www.lns.cornell.edu/public/COMP/info/wget/wget_2.html
http://www.lns.cornell.edu/public/COMP/info/wget/wget_7.html
http://www.gnu.org/manual/wget-1.8.1/html_chapter/wget_4.html