Auto-sync in GNOME

I really want to just synchronize two directories on multiple machines. I don’t want to worry about IP addresses and things like that and I don’t want to store my private files “in the cloud”. Has anyone done a cute hack using ssh, avahi and inotify for GNOME? Note: I don’t want to backup a folder like DejaDup wants me to do, I want live multi-master replication. Unison also fails for me, as it currently doesn’t work in Fedora 15, and has to be run manually. Ideas?
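
For anyone who wants to experiment, here is a toy sketch of the "newest copy wins" idea in Python. It polls rather than using inotify, and it syncs two local directories rather than going over ssh, so it's a stand-in for the real hack rather than an answer; all names and paths are made up:

```python
#!/usr/bin/env python3
"""Toy "newest copy wins" sync between two local directories.

A polling stand-in for the inotify + ssh hack described above;
everything here is illustrative, not a finished tool.
"""
import os
import shutil


def sync_pair(a, b):
    """One pass: for every relative path, copy whichever side is newer."""
    # Collect the union of relative file paths on both sides.
    rels = set()
    for root in (a, b):
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                rels.add(os.path.relpath(os.path.join(dirpath, name), root))
    for rel in rels:
        fa, fb = os.path.join(a, rel), os.path.join(b, rel)
        ta = os.path.getmtime(fa) if os.path.exists(fa) else -1.0
        tb = os.path.getmtime(fb) if os.path.exists(fb) else -1.0
        if ta == tb:
            continue  # same timestamp on both sides: nothing to do
        src, dst = (fa, fb) if ta > tb else (fb, fa)
        d = os.path.dirname(dst)
        if d:
            os.makedirs(d, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves the mtime
```

A real version would watch for changes instead of polling, handle deletions and conflicts, and run the copy over ssh; this only shows the "last writer wins" reconciliation step.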

Published by

hughsie

Richard has over 10 years of experience developing open source software. He is the maintainer of GNOME Software, PackageKit, GNOME PackageKit, GNOME Power Manager, GNOME Color Manager, colord, and UPower, and also contributes to many other projects and open source standards. Richard has three main areas of interest on the free desktop: color management, package management, and power management. Richard graduated a few years ago from the University of Surrey with a Masters in Electronics Engineering. He now works for Red Hat in the desktop group, and also manages a company selling open source calibration equipment. Richard's outside interests include taking photos and eating good food.

37 thoughts on “Auto-sync in GNOME”

          1. No, but it reduces it.

            Playing with Mono is like playing with a grenade offered by your enemy.

          2. (replying to korbe)

            Yeah, why don’t you go wank yourself with your reduced-patent-danger system on YouPorn, with your proprietary Flash plugin.

          3. (replying to antimonio)

            1: I don’t have the Flash player.
            2: I don’t want to go to YouPorn.
            3: I don’t want to wank.
            4: Stay polite. Thanks.

  1. Alberto: neat – didn’t know about that one!

    Though – can you use it to share a folder containing a git repository? Looks a little tricky.

  2. I’d say a D-Bus service that uses fanotify to keep a list of changed files across the filesystem, and replicates the files across the network to a separate incremental directory, à la Time Machine.

    Or make it possible to export btrfs snapshots across the network.

  3. Could always give Gluster a shot. You’d configure the two servers as a replica pair, and then mount the export locally. You can then expand the replica over more servers (not sure what the limit on the replica count is) if you want, and it is accessible over the network by default.
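
    For the curious, the Gluster setup sketched above looks roughly like the following; the hostnames, volume name, and brick paths are all made up, so check the GlusterFS documentation for your version before trying it:

    ```shell
    # On one of the servers: create and start a two-way replica volume.
    gluster volume create syncvol replica 2 server1:/export/brick server2:/export/brick
    gluster volume start syncvol

    # On each machine: mount the volume locally.
    mount -t glusterfs server1:/syncvol /mnt/syncvol
    ```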

  4. rsync or backintime
    rsync: command line tool for synchronizing
    backintime: powerful gui for rsync with automatic sync function
    or:
    repair the unison-starter with a start-script (I think the autostarter doesn’t work because of systemd)
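
    To make the rsync option concrete, here is a minimal self-contained demo using local directories; in a real sync either side could be a remote `user@host:path` over ssh, and the paths here are invented:

    ```shell
    # Create a tiny source tree to mirror (demo setup only).
    mkdir -p src dst
    echo "hello" > src/note.txt

    # Mirror src/ into dst/; --delete also removes files that have
    # vanished from the source, so try rsync -n (dry run) first on real data.
    rsync -av --delete src/ dst/
    ```

    Note that rsync is one-way per invocation; for two-way sync you would need two runs, or a tool like Unison that does real reconciliation.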


  5. Rsync rules. There are many scripts around that are quite neat. Some even do backups and have fallback options.

  6. As far as I know Unison has a command-line interface and can run automatically. Of course you have to set the appropriate options in the .conf file. I used Unison to automatically update a backup server from my girlfriend’s laptop once a day, and the setup worked for years without manual interaction.
    But perhaps I did not quite get what type of solution you are looking for?
    If you are looking for something that makes efficient updates, say once per hour, Unison might be worth considering.

  7. For what it’s worth, I put together a thin wrapper around git to implement that:

    https://github.com/agateau/deveba

    I have been using it to sync documents and pictures across two laptops and a desktop machine. One of the laptops is my wife’s, so one of my goals was to ensure no special manipulation was required (that is, until there is a conflict…)

    It is not inotify-based, I just run it from cron.

    As Jeff said, it is not a good solution if you want to sync large files. I investigated integrating with git-annex, but my understanding was that it was difficult to integrate seamlessly (I don’t want to have to run git-annex commands to get my files).
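
    For anyone wanting the same setup, the cron side is just a crontab entry; the script path below is hypothetical:

    ```
    # min  hour dom mon dow  command
    */15   *    *   *   *    $HOME/bin/sync-documents.sh >>$HOME/.sync-documents.log 2>&1
    ```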

  8. “Unison also fails for me, as it currently doesn’t work in Fedora 15, and has to be run manually.”

    No, that’s how unison works. It doesn’t do live syncing, it just syncs when you tell it to.

  9. DRBD and GFS2. Maybe a little overkill, but it works beautifully. At least on my two servers.

  10. > I really want to just synchronize two directories on multiple machines.
    > I don’t want to worry about IP addresses and things like that and
    > I don’t want to store my private files “in the cloud”.

    That’s what I want too.

    You could have a look at:
    http://git.csync.org/users/cjann/csync.git/plain/doc/userguide/csyncd.html

    The GUI is written for KDE but the daemon only needs a C compiler.

    But it’s not finished yet and I still have a lot to do.


  11. That’s the holy grail for me too.

    Live async multi-master replication. And oh yeah, disconnected mode, because of laptops.

    Contenders are unison, csync (with pam_csync to replicate on login and logout), chironfs (fuse based).

    Things I’ve looked at too were DRBD, Zumastor (seems dead now, but some ddsnap concepts are going in-kernel in devicemapper), Coda, InterMezzo (never went anywhere), etc.

    In the end, unison has always been most reliable for me.

    In practice, there’s a lot of stuff you shouldn’t sync when syncing home directories.

    Syncing gconf/dconf is problematic (especially if you want exceptions for different machines). Maybe a one-file-per-key backend would help – but I guess that would slow things too much.

    Proper use of XDG_*_DIRs would also be helpful:
    – Epiphany still does too much in ~/.gnome2/
    – Firefox and Thunderbird also mix caches and settings too much
    – ~/.thumbnails, ~/.fontconfig etc

    Well – enough said.

  12. … well, for the rest of us, using cloud-based backup à la Ubuntu One, Dropbox, SpiderOak and many others works just wonderfully.

  13. If you don’t mind closed source there’s AeroFS [1]. It has optional backup to the cloud, but works just fine without it (the default). It’s still in alpha, but I can give you an invite.

    On the vcs-home mailing list [2] there’s been a lot of discussion [3] about dvcs-autosync [4]. It aims to automatically keep DVCS repositories in sync with each other. It has been described as being similar to SparkleShare, but without Mono and without a GUI.

    [1] http://www.aerofs.com/
    [2] http://lists.madduck.net/listinfo/vcs-home
    [3] http://lists.madduck.net/pipermail/vcs-home/2011-March/000314.html
    [4] http://gitorious.org/dvcs-autosync

  14. I’ve used Unison to perform automatic synchronization in the past. There is a “batch” option that can be set in a Unison profile for this purpose. Also, a Unison profile can be created rather than specifying everything on the command line, if you wish. The options listed in the preferences section of the Unison manual can be used in the profile.

    http://www.cis.upenn.edu/~bcpierce/unison/download/releases/stable/unison-manual.html#profile

    Use “batch = true” in the profile or “-batch” option to automate the synchronization. Other options can be specified to control behaviour. For instance, in my case, I specified “prefer = newer”. These options would depend on how you want to resolve conflicts.
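
    A minimal profile along these lines (e.g. ~/.unison/sync.prf; the roots and paths below are made up) might look like:

    ```
    # ~/.unison/sync.prf -- run with: unison sync
    root = /home/rh/Documents
    root = ssh://otherbox//home/rh/Documents
    batch = true
    prefer = newer
    times = true
    ```

    Setting times = true (propagate modification times) is commonly recommended alongside prefer = newer, so the “newer” comparison stays meaningful across replicas.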

    I just did this with two systems, but I’m pretty sure it could be extended to more. With multiple hosts it would best be accomplished by coordinating changes via one primary host: the primary syncs its changes to all hosts, and the secondary hosts only initiate syncs with the primary. This should propagate changes to all hosts with minimal issues.

    I’ve also had success using incrond. With this you can make use of inotify, if you find that more useful than scheduling with cron.
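
    For reference, an incrontab entry (edited via incrontab -e; the watched path and the command are hypothetical) looks something like:

    ```
    # <path> <event mask> <command>
    /home/rh/Documents IN_CLOSE_WRITE,IN_MOVED_TO /usr/bin/unison sync -batch
    ```

    incron also provides $@ and $# wildcards (the watched directory and the file name) if the command needs them.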

    I tried Unison on F15 and the -ui text option seems to work.

    –CS

Comments are closed.