More Backups and Synchronization

I have spent far too much time this past week reviewing and updating the backup procedures on my development workstation. In the past, I was:

  • Using Acronis to back up the files on drives C and D to drive Z.
  • Using Syncback to occasionally copy Videos, Music and Photos to an SMB share on a Synology DS209J.
  • Using the Synology DS209J as a media server.

The challenges I faced:

  • Acronis has failed in the past, so I need to either trust it again or dump it.
  • The Synology server is embarrassingly slow: file copying drags, and videos occasionally stutter.
  • I’m concerned about SMB shares being trashed by CryptoWall-style ransomware.
  • I’m a bit low on disk space in places.

Hardware changes made:

  • Replace the Synology with a CentOS 7 VM running on an existing ESXi server, and install Plex on it.
  • Add a 4TB WD Red drive to the ESXi server, create a 4TB ext4 volume and attach it to the CentOS VM (a sketch of the disk setup follows this list).
  • Replace an existing 10/100 ethernet switch with a 10/100/1000 ethernet switch.
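
For reference, here’s a rough sketch of preparing that disk inside the CentOS VM once ESXi presents it as a new virtual disk. The device name /dev/sdb and the mount point /srv/media are assumptions; confirm the device with lsblk on your own system.

    # Identify the new virtual disk (assumed here to appear as /dev/sdb)
    lsblk

    # Label it GPT and create one partition spanning the disk (>2TB needs GPT)
    parted --script /dev/sdb mklabel gpt mkpart primary ext4 0% 100%

    # Create the ext4 filesystem, then mount it permanently
    mkfs.ext4 /dev/sdb1
    mkdir -p /srv/media
    mount /dev/sdb1 /srv/media
    echo '/dev/sdb1  /srv/media  ext4  defaults  0 2' >> /etc/fstab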

Functionality to test:

  • Disk and file backups with incremental and block-level support for extra-large files.
  • Cataloging of the backups.
  • Directory/file synchronization.
  • Cloud and SFTP support.

Software tested:

  • Syncback (I have been using Syncback free)
  • Syncovery
  • Acronis Backup/Recovery (I already own this)
  • Acronis True Image
  • Cloudberry


A Realization

It slowly dawned on me that when performing occasional full backups with Syncback, what I really wanted was file synchronization. In this case, I didn’t want incrementals or the ability to restore an old version; I just wanted source-to-destination synchronization of my media directories. It also turns out that dedicated backup software such as Acronis or Cloudberry doesn’t really support that functionality. So, let’s look at each of the packages I tried.

Syncback

There are several versions of this product, and Syncback Free has allowed me to back up across SMB shares without any problems in the past.

It does not support block-level backups, which I felt were important given the number of virtual machines I’m using.

In order to synchronize to the CentOS server via the more secure SFTP, I needed to purchase Syncback Pro (and not SE).

Being about the same price, Syncovery’s block-level support put it ahead of Syncback Pro.

Syncovery

This product costs about the same as Syncback Pro, but it does support block-level backups, as well as file/directory synchronization.

I ended up buying a copy of this to use strictly for synchronization with the new media server. Maybe I’ll use it for full backups down the road once it gains my trust. I’m not there yet.

Acronis Backup/Recovery

I already own this product, so I spent time determining whether I wanted to continue using it or replace it with an alternative. After significant testing, I decided to update it and use it for backups. One of the main reasons was that its catalog made file recovery easier than the alternatives did.

Acronis True Image

This package had great reviews; however, the simplistic interface was not for me. I ditched it early on.

Cloudberry

This product performs backups to various local and Cloud services, and it worked very well. I have a client that uses it and it’s a good product. In order for me to purchase Cloudberry, Acronis Backup/Recovery had to fail in some significant fashion, and the only weakness I could see with Acronis (other than the fact that it’s huge) was that the version I own does not support Cloud services. Fortunately for me, that was not a requirement, and as a result, Cloudberry was off the table.

Current Status

Acronis now performs custom daily/weekly backups of drives C and D to drive Z, as well as to the CentOS server via SFTP. One of the advantages of the custom configuration is that I can tell it to purge old backups when the destination drive gets full. At 4TB, that should take a while.

Syncovery synchronizes the Photo, Music and Video directories to the CentOS server via SFTP for use by the Plex media server. It also does so at a MUCH faster clip with the new gigabit switch.

Oh, and I’m also running Crashplan as a backup of last resort. When it comes to backups, you can never have too many.


Backing up my web sites

I’ve implemented web sites of varying types, including:

  • Static
  • Drupal
  • Joomla
  • WordPress
  • MediaWiki
  • Probably a couple of others

With the exception of the static sites, these ALL rely on databases, and invariably they ALL need to be backed up, files and databases included.

Many web site administrators rely on their web host provider to perform these backups, implement a backup plug-in for their CMS of choice, or both.

The advantage of webhost-provider-based backups is that a complicated matter is taken off your plate. The drawback is that backup and recovery for YOUR site is now out of your hands. Let’s hope your webhost provider has got it right; otherwise, you may not have a site to recover. In addition, if your site gets hacked, you could have a really difficult time finding where the bad code is; I’ll cover this later.

The advantage of plug-ins is that, depending on the plug-in, you can get a LOT of functionality, such as file/database backup and recovery, site migrations and much more. The disadvantage is that the site needs to be RUNNING in order to use them.

For the past few years, I’ve been logging into my sites via ‘ssh’ and performing file/database backups to a remote host. I’ve been performing daily as well as weekly backups and have kept about three months’ worth of backups for these sites. The advantage is that I can go back several months to restore a site, and I have used this to compare recent and old site files to find a site hack. In addition, it doesn’t matter WHICH CMS I’m using, or even whether it’s running; as long as I can ssh into the site and dump the files and database, I’m good to go. The disadvantage is that each backup is a FULL backup, and these can consume considerable disk space.
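
To give a concrete flavour, a single site’s nightly run looks roughly like the sketch below. The host, paths and database name are placeholders, and it assumes key-based ssh access with MySQL credentials stored in ~/.my.cnf on the remote host.

    #!/bin/bash
    # Hypothetical nightly backup of one remote site (files + database)
    SITE=example.com
    REMOTE=user@example.com
    STAMP=$(date +%Y%m%d)
    DEST=/backups/$SITE
    mkdir -p "$DEST"

    # Dump the database on the remote host and pull it down compressed
    ssh "$REMOTE" "mysqldump --single-transaction dbname | gzip" \
        > "$DEST/db-$STAMP.sql.gz"

    # Archive the site files (document root assumed to be ~/public_html)
    ssh "$REMOTE" "tar czf - public_html" > "$DEST/files-$STAMP.tar.gz"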

Really, what I need is a decent backup utility that:

  • Can access multiple sites via ssh or rsync
  • Can back up files
  • Can back up the databases
  • Is automated
  • Works regardless of which CMS I’m using
  • Supports incremental backups (thus saving huge amounts of disk space)

In the Linux world, Bacula is very popular; however, I’ve chosen an application called rsnapshot. This uses Perl scripts and the rsync command to create a repository of backups, and I’ve been able to configure it to access multiple sites and databases and to back up much more data than I otherwise would have been able to with FULL daily/weekly backups.
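
To give a flavour of the setup, here’s a minimal rsnapshot configuration for a single remote site. The host, paths, helper script and retention counts are all placeholders, and note that rsnapshot requires tabs, not spaces, between configuration fields.

    # /etc/rsnapshot.conf (excerpt) -- fields MUST be tab-separated
    snapshot_root   /backups/snapshots/

    # Keep 7 daily, 4 weekly and 3 monthly snapshots (example values)
    retain  daily   7
    retain  weekly  4
    retain  monthly 3

    # Pull the site files over ssh/rsync (placeholder host and path)
    backup  user@example.com:public_html/  example.com/

    # Dump the database first via a hypothetical helper script
    backup_script   /usr/local/bin/dump-example-db.sh   example.com-db/

Cron then invokes ‘rsnapshot daily’ and ‘rsnapshot weekly’ on the appropriate schedule, and rsnapshot hard-links unchanged files between snapshots, which is where the huge disk savings come from.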


Getting FastLED help

The FastLED community is on Google+.

Before asking the community for support, please remember that they need detailed information in order to help. Depending on the issue (and there have been far too many), here are some important details to include:

  • What kind of LEDs you’re using.
  • What microcontroller board you’re using.
  • What version of the FastLED library you’re using.
  • What version of the Arduino IDE you’re using (it may even be an alternate fork).
  • Other functionality you’ve added.
  • What OS you’re building from.
  • A circuit diagram/layout (try using Fritzing to show it).
  • A copy of your .ino file (copy it to www.pastebin.com or gist.github.com).
  • Essentially, give us EVERYTHING in excruciating detail.
  • Provide specifics on how/where it’s all being powered up: pins, voltages, power supply used... every... single... wire.
  • Try to remove all superfluous code and minimize the amount of code that exhibits the issue.
  • Oh, and triple check EVERYTHING. Again!

Some ‘yum’ Commands for CentOS

The instructions in this article do not work with Debian and its variants, such as Ubuntu or Mint, as they use apt-get instead of yum. Read on if you’re a CentOS or related user.

When I first started using CentOS 7, I often selected the advanced packages during the graphical installation process, such as ‘Web Server’ or ‘GNOME Desktop’. The problem was that I wanted to create a LAMP server with Samba, Git and phpMyAdmin for web development, and none of the options seemed to fit. As a result, I eventually chose a minimal install and manually downloaded individual packages with the CentOS/RedHat-specific ‘yum’ command.
I’m happy with how that went and have thoroughly documented it in my ‘A CentOS Web Development Environment’ article. Since then, I’ve wanted to delve a little further into the ‘yum’ command to find out a bit more about the available packages.
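
As a taste of what’s covered, here are a few of the everyday ‘yum’ commands in question; the package and group names are just examples:

    # Search for and inspect packages
    yum search phpmyadmin
    yum info httpd

    # List what's installed and what has pending updates
    yum list installed
    yum check-update

    # List and install package groups (quote names containing spaces)
    yum grouplist
    yum groupinstall "Development Tools"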

Finding the Right OS

Quite often, I use the Pareto principle of spending 20% of the effort to get 80% of the results. On other occasions, I spend far too much time to get something just right.

From DOS 1.0 in 1981 to Windows 8 and Linux today, I’ve installed dozens of micro and minicomputer operating systems over the years. To me, an OS needs to fit like a glove, which is why I’ve spent considerable time looking for a version of Linux that will complement my Windows 7 desktop environment. Being on a tight budget, I use Oracle’s VirtualBox instead of VMware Player, as VirtualBox includes a Snapshot facility.
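
That Snapshot facility can also be driven from the command line before risky experiments; here’s a quick sketch, where the VM name is hypothetical:

    # Take a snapshot of a VM before experimenting
    VBoxManage snapshot "CentOS7-dev" take "before-updates"

    # List snapshots, and roll back if things go wrong
    VBoxManage snapshot "CentOS7-dev" list
    VBoxManage snapshot "CentOS7-dev" restore "before-updates"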

Site Backups


If you have ever experienced the horror of losing files or a database, then you’ll understand the importance of implementing and TESTING your backups and site recovery capabilities.

I’ve developed some scripts that run on my local Debian server and back up my Hostgator sites and associated databases on a daily as well as a weekly basis.
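
Scheduling those from the Debian box takes just a pair of cron entries; the script path, user and times below are placeholders:

    # /etc/crontab entries (times, user and paths are hypothetical)
    # Daily site backup at 02:30
    30 2 * * *   root   /usr/local/bin/backup-sites.sh daily
    # Weekly site backup early Sunday at 03:15
    15 3 * * 0   root   /usr/local/bin/backup-sites.sh weekly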

Note: Unless you have a pre-existing arrangement in place, do not assume that your web host provider has adequate backup/recovery protection for your web sites. You’ve been warned.
