[aclug-L] Re: Synchronizing data across the Internet
At 02:05 AM 4/16/00, you wrote:
For lack of knowing what "Fantastic Manual" to read, I ask this question.
I have data - usually either a directory of source code files or an entire
web site - that I'd like to have updated periodically. I will always be
synchronizing it from the same machine, and hope to eventually set up a cron
job to run this, while still keeping the option to run it manually.
I'd like it to work so that if I update a file on either machine, it can
update things so that the newest version of the file will then be on both
machines. I don't want to mess with CVS or anything like that. Is there
some standard command to do this, or will I need to write a script? It will
run on my home Linux box, and will need to log in to (ftp) another machine
running UNIX to accomplish this.
Thanks for any tips or suggestions,
John
John,
I was doing something similar back when I was still on a dial-up connection.
Since being disconnected was fairly common, I ran a cron job every 30 minutes
to check whether I was connected. If I was not, it would trigger a script that
reconnected me to Southwind. At the end of that script was a call to another
script that updated an HTML file on my SW site to reflect the new IP of my
web server here at home. I'll paste it in below in the hope that it may give
you some ideas on how to do what you need to. It's not a drop-in solution,
but perhaps it will help.
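For context, the cron half of that setup looked roughly like the sketch
below. The script path, name, and the ifconfig-based check are illustrative
assumptions, not my exact setup; the redial command in particular depends
entirely on how your ISP connection is configured.

# Hypothetical crontab entry: run the connection check on the hour and
# half hour.
0,30 * * * * /usr/local/bin/check-connection

# /usr/local/bin/check-connection (hypothetical): if the ppp0 interface
# is gone, redial; the dial script itself is site-specific.
#!/bin/bash
if ! /sbin/ifconfig ppp0 > /dev/null 2>&1; then
    /usr/local/bin/dial-southwind    # hypothetical redial script
fi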
wayne
#!/bin/bash
# Give pppd time to finish bringing the link up and log the new address.
sleep 90
# Pull the newly assigned IP address out of the recent syslog entries.
IPnum=`tail /var/log/messages | grep local | awk '{ print $9 }'`
# Substitute the new address for the old one in the template page.
sed "s/209\.134\.90\.[0-9]*/$IPnum/" /temp/pixx.html > /temp/pix.html
# Upload the updated page; /sbin/ftp.put holds the ftp batch commands.
ftp -vn southwind.net < /sbin/ftp.put
echo "$IPnum"
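For reference, /sbin/ftp.put is a plain ftp batch file read on stdin
(that's what -n and the redirect are for). A hypothetical version is
below; the username, password, and remote directory are placeholders,
not my real ones.

# /sbin/ftp.put (sketch): log in manually since -n suppresses auto-login,
# then upload the rewritten page and disconnect.
user jdoe secret
cd public_html
put /temp/pix.html pix.html
bye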
-- 
This is the discussion@xxxxxxxxx list. To unsubscribe,
visit http://tmp2.complete.org/cgi-bin/listargate-aclug.cgi