
bash scripting question

Sat Oct 20, 2012 12:50 pm

I guess I have asked this before
and didn't find a solution.

Say I want to run one command ("rsync") for a few directories.
Instead of running it again and again:
rsync -ruv --delete ~/Documents .
rsync -ruv --delete ~/Programming .
rsync -ruv --delete ~/Pictures .
couldn't I put Documents, Programming and Pictures in an array, and then loop through the array?
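
Something like this is what I have in mind (just a rough sketch, directory names taken from above):
Code:
# list the directories once, then loop over them
dirs=(~/Documents ~/Programming ~/Pictures)
for d in "${dirs[@]}"; do
        rsync -ruv --delete "$d" .
done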

The question arises here:
https://github.com/tornow/snippets/blob ... ettings.sh
Any other hints are welcome too. It is a first write-up (does that word exist, "writeup"? It sounds good; if it doesn't exist, someone should add it to the English language soon).


-----------------------------
I made a shorter version. Still pointless as long as one doesn't have a git repo on the LAN.
https://github.com/tornow/snippets/blob ... s-short.sh

Re: bash scripting question

Sun Oct 21, 2012 1:06 am

Say I want to run one command ("rsync") for a few directories.
Instead of running it again and again:

Can't you specify multiple sources? I thought you could.

couldn't I put Documents, Programming and Pictures in an array, and then loop through the array?

dunno, too much for my skillz

Re: bash scripting question

Sun Oct 21, 2012 1:29 am

Code:
--files-from=FILE       read list of source-file names from FILE

is probably more efficient than looping through an array and starting rsync for each item. I've never used that option, so I'm not exactly sure how it works. I assume it'll take directory names.
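
I imagine it would look something like this (untested, the file name is made up):
Code:
# dirs.txt holds one name per line (Documents, Programming, Pictures),
# read relative to the source argument (~ here)
rsync -ruv --delete --files-from=dirs.txt ~ .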

Edit: Or you could just list multiple source files/directories. I just tried it. Remember to omit the final slash on source directories if you want to copy the directory and its contents, rather than just its contents.
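
For your example that would be something like:
Code:
# no trailing slashes, so Documents, Programming and Pictures
# end up as directories inside the destination
rsync -ruv --delete ~/Documents ~/Programming ~/Pictures .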

Re: bash scripting question

Sun Oct 21, 2012 1:55 pm

Thanks to both for your answers.

As I understand it, you suggested:
a) just list all the files/folders to be rsync'ed in one command, like
rsync source_foo source_bar source_baz destination_folder

and
b) use a file which contains the files to be synced and pass it with --files-from=FILE

Both sound good.
What I'm aiming for is to keep the source code as plain and clear as possible.
I would want to store the list of files to be synced at the top of the script,
so that anything someone might want to edit is in one place (the top of the script).

Putting all the files in one command makes the command look more complicated than it is (if there are many files). Besides, it sits in the middle of the functions, not all in one place.
Still a good solution.

Using --files-from would let me edit everything in one place, at the top of the script. It would be perfect, but then I'd have two files or more, and for some reason I don't like that very much
(I could create the file on each run... hmm... not sure what to think of that... it actually sounds good).
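
Roughly what I'm picturing (just a sketch, names are placeholders):
Code:
# the editable list stays at the top of the script
SYNC_LIST=(Documents Programming Pictures)

# write it to a temporary file for --files-from, then clean up
listfile=$(mktemp)
printf '%s\n' "${SYNC_LIST[@]}" > "$listfile"
rsync -ruv --delete --files-from="$listfile" ~ .
rm -f "$listfile"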

Now if you tell me that the way it is, everything is easy enough, I am happy.
I will rewrite it and try both options you suggested; perhaps I'll like them better.

In case that didn't make any sense: I just woke up :-)
Again: thanks.

Re: bash scripting question

Sun Oct 21, 2012 7:42 pm

Code:
mylist=(
tmp/
tmp2
)
rsync -avx "${mylist[@]}" tmp3/

tmp3 then contains copies of the files that are in tmp/ (without the tmp folder itself),
and a copy of tmp2 with its contents.

So yeah, you can make it an array, and you can put one item per line, so it's easy to edit.

Re: bash scripting question

Tue Oct 23, 2012 12:37 am

fsmithred wrote:
Code:
mylist=(
tmp/
tmp2
)
rsync -avx "${mylist[@]}" tmp3/

tmp3 then contains copies of the files that are in tmp/ (without the tmp folder itself),
and a copy of tmp2 with its contents.

So yeah, you can make it an array, and you can put one item per line, so it's easy to edit.

Perfect!
Not even a loop. BTW: that makes the functions for backing up home and etc one-liners (hence superfluous...)
Code:
#######################
# sync everything in HOME_LIST to the home backup target
function backup_private {
        rsync -auv --delete "${HOME_LIST[@]}" "$BU_HOME"
}

#####################
# sync everything in ETC_LIST to the etc backup target
function etc_backup {
        rsync -auv --delete "${ETC_LIST[@]}" "$BU_ETC"
}
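
With the lists and targets declared once at the top, e.g. (the values here are only placeholders, not what's actually in the script):
Code:
# everything someone might want to edit lives up here
# (example values only)
HOME_LIST=( "$HOME/Documents" "$HOME/Programming" "$HOME/Pictures" )
ETC_LIST=( /etc/fstab /etc/hosts )
BU_HOME=/path/to/backup/home
BU_ETC=/path/to/backup/etc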



I got another question, but not yet. Thanks a lot.