rsync

Resuming rsync transfers

When transferring files with rsync, a dropped connection doesn't mean starting over: you can resume from where you left off.
How?
With the --partial option.
To see how much of the transfer remains, add the --progress option.
Example: rsync -avz --partial --progress foo:src/bar /data/tmp
After the transfer, you can verify the received file with md5sum.

Backup: the rsync client

In the previous article we looked at rsync's capabilities in broad strokes, as well as its configuration in daemon mode. For now I won't cover rsync through an SSH tunnel, since in my setup encryption isn't needed: the workstations sit on the same local network as the server, and synchronization between servers runs through a secure VPN tunnel. I hope to get to it in a future article. So, off we go.

Read More

added growl support to osx user backup script

I’ve just added Growl support to my little user backup script.

Mainly for use with cron jobs, so you can get a simple notification when a backup has taken place.

It’s pretty simple, but the script saves log files, so if there’s trouble you can still check those.
Any suggestions on how to improve the script are appreciated, as I’ve only just moved from Linux to OS X, so Growl and the like are new to me.

View/Download the script at github

back up files and mySQL database

Backup remote directory to local

rsync -r -a -v -e ssh username%domain.tld@domain.tld:/path/to/directory /local/path/directory

MySQL Backup

All Database

mysqldump --user=XXXXXXXX --password=XXXXXXX -A > /PATH/TO/DUMPFILE.SQL

Individual / Multiple Databases

mysqldump --user=XXXXXXXX --password=XXXXXXX --databases DB_NAME1 DB_NAME2 DB_NAME3 > /PATH/TO/DUMPFILE.SQL

Specific table

mysqldump --user=XXXXXXXX --password=XXXXXXXX --databases DB_NAME --tables TABLE_NAME > /PATH/TO/DUMPFILE.SQL

Restore MySQL Database

mysql --verbose --user=XXXXXXXX --password=XXXXXXXX DB_NAME < /PATH/TO/DUMPFILE.SQL
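For scheduled use, the full dump above is often wrapped in a small script that timestamps the output file. A minimal sketch (credentials and paths are placeholders, and the actual mysqldump call is left commented out since it needs a live server):

```shell
#!/bin/sh
# Nightly dump wrapper sketch; swap in real credentials before use.
set -e
STAMP=$(date +"%Y%m%d")
OUT="/tmp/mysql-backups/all-databases-$STAMP.sql"
mkdir -p "$(dirname "$OUT")"
# mysqldump --user=XXXXXXXX --password=XXXXXXX -A > "$OUT"
echo "would dump all databases to $OUT"
```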

How to backup your Mac to Dreamhost with rsync
All Dreamhost hosting accounts now come with 50GB of free personal backup. I’ve been searching for a while for a good tutorial on this and couldn’t find one, so I thought I would make my own.

1. Setup passwordless login for rsync (can be skipped if you don’t mind typing your password each time you want to sync)
  • Open terminal and type the following (press return after each line)
  • "ssh-keygen -t dsa" (press enter after all three prompts)
  • "scp ~/.ssh/id_dsa.pub remote"
  • "cat ~/.ssh/id_dsa.pub » ~/.ssh/authorized_keys"
  • "chmod 644 ~/.ssh/authorized_keys"
  • "sftp backupusername@backup.dreamhost.com”
  • "lcd /Users/OSXUsername/.ssh
  • "mkdir .ssh"
  • "put authorized_keys .ssh"

2. Sync your files / folders
  • Open terminal and type the following, then press return
  • "rsync -avz BackupFolder backupusername@backup.dreamhost.com:DestinationFolder" (example: "rsync -avz ~/Documents bxxxxxx@backup.dreamhost.com:macbook_backup")
  • Optionally, you can add "--delete" after "-avz" (i.e. "-avz --delete") if you want to delete files on your backup once they’ve been deleted locally



3. Automate the backup process
  • Still working on this, check back later or leave some advice in the comments. Thanks!
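Until step 3 gets fleshed out, a cron entry is one way to automate the sync once the passwordless login from step 1 works. This is a sketch; the 2 a.m. schedule, folders, and log path are assumptions:

```shell
# Edit your crontab with `crontab -e` and add a line like this to run
# the backup every night at 2:00 (paths and username are placeholders):
0 2 * * * rsync -avz ~/Documents backupusername@backup.dreamhost.com:macbook_backup >> ~/backup.log 2>&1
```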

Sources of information:

Michael Twomey - http://blogs.translucentcode.org/mick/archives/000230.html
Marc Climent - http://codelog.climens.net/2008/12/03/using-dreamhost-backup-account-with-rsync/



rsnapshot is a filesystem snapshot utility for making backups of local and remote systems. Using rsync and hard links, it is possible to keep multiple, full backups instantly available. The disk space required is just a little more than the space of one full backup, plus incrementals. Depending on your configuration, it is quite possible to set up in just a few minutes. Files can be restored by the users who own them, without the root user getting involved.
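As a taste of that few-minute setup, here is a minimal /etc/rsnapshot.conf sketch. The retention counts and backup paths are assumptions; note that rsnapshot requires literal tab characters, not spaces, between fields:

```shell
# /etc/rsnapshot.conf (excerpt) -- fields MUST be separated by TABs
config_version	1.2
snapshot_root	/.snapshots/
retain	daily	7
retain	weekly	4
backup	/home/	localhost/
backup	/etc/	localhost/
```

Cron then drives the rotation, e.g. running `rsnapshot daily` once a night and `rsnapshot weekly` once a week.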

Backup: writing the script

Once everything is ready for backups (the scheme is thought out, the rsync client and server are configured, and a list of files and directories to archive has been drawn up for each workstation), it would be nice to automate the whole thing. After all, you're not going to set yourself an alarm clock and run the command on every station by hand, are you? :-) In this article I'll show how to put together a simple, reasonably universal bash script that can be deployed to every workstation and server whose data needs backing up. Once the script is in place, all that's left is to give the scheduler (I use cron for tasks like this) a run schedule and relax while every machine on your network takes care of itself. Of your data, that is, of course!

Read More

New Project - rdiff-backup-web

I am proud to announce that I am now managing a new project on Sourceforge.net called rdiff-backup-web.

For those that didn’t know, SourceForge is a great site that allows developers to advertise their open-source coding projects, and supplies services such as CVS, storage, web space and collaboration tools to manage your project professionally. The best bit is that it’s free; I can’t talk it up enough. Another site that offers the same kind of features is Freshmeat.net.

Back to the announcement: rdiff-backup-web is a web-based tool for configuring, managing, and deploying backup jobs for rdiff-backup, which is built on librsync. The project was started by David Evans, and I am taking it on.

The aim of the project is to provide a simple interface for administrators to create and maintain backup jobs, and for users to be able to easily recover files.

The project is currently at 0.4.0 alpha. In the next two weeks, once I have familiarised myself with the code and tidied a few things up, 0.4.1 will be released, with 0.5.0 following soon after.

The project’s SourceForge page is here, and the project’s website is rdiffbackupweb.sourceforge.net for now.

Anyone looking for a bit more information, or looking to get involved in the project, feel free to leave a comment and I will get back to you.

Best file backup solution for me so far

I tried a lot of backup solutions in the past. Different programs, different scripts, partition imagers, folder synchronizers. I have found my favourite so far.

cwRsync is a port of the well-known Unix tool rsync, which lets you synchronize folders over the network while transferring as little data as needed to do the job. Normally you need a server and a client, with the latter connecting to the server to transfer the files. But as I am doing my backups locally, I just need the client. I then have a batch file that calls the cwrsync command several times with different parameters to copy the folders I want to back up to an external drive. This has benefits:

  • The files are not packed into some archive file and therefore do not require special software to be recovered or searched.
  • Because of the way rsync works, only the parts of files that have been altered are transferred. This saves a lot of time when working with big files. Imagine an e-mail database file of 600 MB: you receive one e-mail and the file has changed. But instead of transferring 600 MB, the rsync command only transfers the portion of the file that has changed.

If you like encryption you can use Truecrypt at the backup destination.

Backup: настраиваем rsync-сервер

Систем для создания и управления резервными копиями в Linux создано великое множество. Не столько создано, сколько портировано с других систем, но суть та же. В этой заметке я хочу рассказать о простом и удобном для меня способе создания резервных копий важных данных, имея в наличии небольшой набор компьютеров и установленную по-умолчанию практически во всех дистрибутивах утилиту rsync.

Read More

Why You No Rsync!

rsync -Prn /local/dir/ /external/dir

-P = show progress and allow partial transfers
-r = traverse directories recursively
-n = dry run 

So this is supposed to show me a dry run of updating files from /local/dir to /external/dir. The key here is /local/dir/ (note the trailing slash), as it tells rsync to copy the contents of dir and not dir itself. Alas, for some reason, this isn’t working as expected. It’s pretty much forcibly replacing all files and folders. Not what I want!

So I stumbled upon the issue. I’m transferring from a Mac filesystem (HFS+) over to a Windows one (NTFS). Rsync picks this up as a change in the file and will proceed with updating everything. And the simple solution? Make rsync run a file-size check only.

rsync -Prn --size-only /local/dir/ /external/dir

Done and done! Once my dry run checked out, I ran it with no problems. I also had to run it with --force --delete since I renamed and removed some directories.

I love what I know, but rsync is such a powerful tool that my knowledge is but the tip of the iceberg. 

Backup your Minecraft world daily

This will work for both single-player and multi-player worlds. The steps outlined here can be used individually or in conjunction with my previous post.

[1] Set up a folder to store the backups. For added peace of mind, I store my backups in a Dropbox folder to be synced online. Call this folder /home/[user]/Dropbox/Public/Minecraft/world-backups/. Also take note of where your minecraft world is located. I’ll call this /home/[user]/minecraft-server/world.

[2] Create a bash script to be run on a regular basis. rsync will be used to mirror the world files to a separate folder to avoid file errors. The interval at which it runs will be defined in step [5].

#!/bin/bash
# Mirror the live world to a temp folder, then zip it with a timestamped name.
rsync -r -t -v /home/[user]/minecraft-server/world /home/[user]/minecraft-temp-world
mydate=$(date +"%Y%m%d-%H%M%S")
cd "/home/[user]/minecraft-temp-world"
zip -rq "/home/[user]/Dropbox/Public/Minecraft/world-backups/world$mydate" "world"

If you followed my pigmap post, change /home/[user]/minecraft-temp-world to /home/[user]/pigmap-input. Save this as ~/scripts/minecraft-backup.sh and make sure it’s executable.

[3-optional] Create a second script to trim backups, keeping only the 5 most recent. The format we’ve chosen for our filenames allows us to easily find the most recent backups: when viewing the files alphabetically, the ones listed last are the most recent.

#!/bin/bash
cd "/home/[user]/Dropbox/Public/Minecraft/world-backups/"
# -r stops xargs from running rm at all when there are 5 or fewer backups
ls world*.zip | head -n -5 | xargs -r rm

To keep more than 5 backups, change head -n -5 to the value of your choice. Save this as ~/scripts/minecraft-trim-backups.sh and make sure it’s executable.
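You can sanity-check the trim logic against dummy files before pointing it at real backups (throwaway /tmp directory assumed for illustration):

```shell
#!/bin/sh
# Create seven fake timestamped backups, then trim to the five newest.
set -e
mkdir -p /tmp/trim-demo && cd /tmp/trim-demo
for day in 01 02 03 04 05 06 07; do
    touch "world202401$day-070000.zip"
done
ls world*.zip | head -n -5 | xargs -r rm   # drops the two oldest
ls world*.zip                              # five newest remain
```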

[4-optional] If you’ve chosen to use the trim script or if you followed my pigmap post and already have ~/scripts/pigmap-update.sh created, create one more to be run daily, which will execute all of our scripts in the proper order.

#!/bin/bash
cd /home/[user]/scripts
./pigmap-update.sh
./minecraft-backup.sh
./minecraft-trim-backups.sh

Save this as ~/scripts/crontab-daily.sh and make sure it’s executable.

[5] Add an entry to the end of your crontab file to run the script daily. To run the script at a different time or at different intervals check out this crontab quick reference.
$ crontab -e
0 7 * * * /home/[user]/scripts/minecraft-backup.sh

Or, if you followed optional steps [3] or [4]:
0 7 * * * /home/[user]/scripts/crontab-daily.sh

Fixed: rsyncing a bootable backup #programming #answer #it


rsyncing a bootable backup

jwz mentions PSA: Backups, a backup system that’s fairly clever. Your backup method is essentially to restore to the replacement drive. This seems to work simply on OS X. What all do I need to do to adapt this guide to Ubuntu?

Answer [by womble]: rsyncing a bootable backup

Partition the two external drives the same way as your system drive, and rsync all your partitions…

View On WordPress

rsync

A while back, I was looking for a command line replacement for FTP. Even with the best FTP app out there, dragging and dropping files can get tedious, especially for little updates that you just want to see on the remote server (yeah, you can sync in Transmit… but you still have to open the app and whatnot).

In review, SFTP is decent if you want to browse around the remote server. SCP is quick and easy for uploading a single file. But if you’re looking to sync up a whole directory, rsync is the way to go.

Because rsync is such an old utility, I had some trouble finding a decent tutorial or blog post. The best reference is its man page. As long as you have SSH access, using rsync is fairly straightforward. To upload:

rsync -avz /local/myproject/ user@server:/remote/myproject

To download:

rsync -avz user@server:/remote/myproject/ /local/myproject/

Looking at the arguments, -a is “archive” mode (recursion plus preservation of permissions, ownership, timestamps, and symlinks), -z compresses the transfer, and -v makes the procedure verbose.

Either command recursively transfers all files from the source directory to the destination. The files are transferred in “archive” mode, which ensures that symbolic links, devices, attributes, permissions, ownerships, etc. are preserved in the transfer. Additionally, compression is used to reduce the size of the data portions of the transfer.

Note that the remote path in the download script has a trailing /.

A trailing slash on the source changes this behavior to avoid creating an additional directory level at the destination. You can think of a trailing / on a source as meaning “copy the contents of this directory” as opposed to “copy the directory by name.”

I’m using this command in a couple li’l bash scripts to deploy projects to my remote server. Here’s demo/deploy.sh

#!/bin/bash
rsync -avz --exclude '.git' --exclude '*.sh' ./ $BERNA:~/www/demo

The --exclude argument specifies which files not to sync. For desandro.com/deploy.sh, I use --exclude-from and a separate file, exclude.txt. ($BERNA is a local alias I have for user@server, so I can keep my username and server semi-private.)

I use the following script, sync.sh, to upload and download:

#!/bin/bash

REMOTE=user@server:remote/dir

if [ "$1" == 'down' ]; then
    PATHS="$REMOTE/ ./"
else
    PATHS="./ $REMOTE"
fi

rsync -avz --exclude 'ignoreme.txt' $PATHS

./sync.sh uploads, and ./sync.sh down downloads.

Using rsync for scheduled remote backup
  1. Generate public SSH key on the source server (leave passphrase blank):
    ssh-keygen -t rsa
  2. SCP the new public key to the backup server, e.g.
    scp cookie_public_key.pub 192.168.1.10:/home
  3. Append the public key to the end of /root/.ssh/authorized_keys on the backup server:
    cat cookie_public_key.pub >> authorized_keys
  4. Create an rsync update file on the source server (/home/rsync/rsync-update) containing:
    rsync -arzve ssh /mnt/music/ 192.168.1.10:/home/cookiemonster/
  5. Edit /etc/crontab on source server and add something like:
    # backup music
    00 04 * * * root /home/rsync/rsync-update > /home/rsync/cronlogs/rsync-update.log
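Step 1 can also run without prompts, which helps when scripting the setup (the key path here is a placeholder, and -N '' sets the blank passphrase the step calls for):

```shell
#!/bin/sh
# Generate an RSA key pair non-interactively for the rsync backup user.
set -e
rm -f /tmp/backup_key /tmp/backup_key.pub
ssh-keygen -q -t rsa -N '' -f /tmp/backup_key
ls -l /tmp/backup_key /tmp/backup_key.pub
```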
Resolved: rsync: How to exclude Dotfiles only in topmost directory? #computers #it #solution


rsync: How to exclude Dotfiles only in topmost directory?

When performing backups under Bash with rsync, I’m trying to exclude all dotfiles and hidden directories in the top directory, but not those in otherwise targeted directories. For example:

/copyme.c
/.dontcopythisfile
/.dontcopythisdirectory/or_its_contents
/directory/.butcopymetoo

rsync -a --include=".includeme" --exclude=".*" . DEST fails…

View On WordPress