Other Backup Solutions

Request: Backup to OneDrive

Hello, I see that DSM 5.1 on Synology NAS devices now supports OneDrive cloud storage as a backup destination. I already use OneDrive extensively, so I would like to be able to back up my files there. Now that the competition has it, any chance that NETGEAR could pull this off too? It would please me greatly, and a few others no doubt. Thanks, Fraser

How to backup to Google Drive, S3 etc using rclone

Hi, I'm new to the NETGEAR Community. In response to these two closed community posts, "Developer Request: Full Cloud Sync Access via iClone & incond" and "Backing up to your ReadyNAS": I can confirm that rclone can (easily) be installed on a ReadyNAS and successfully synced with Google Drive. While I haven't deployed an official ReadyNAS developer app, and at this stage don't intend to, the setup took me under 10 minutes and can easily be replicated. Tested on a ReadyNAS 104, firmware 6.6.1.

Installation

SSH onto the ReadyNAS, then follow the rclone installation quickstart with the following changes:

Substitute the ARM32 precompiled binary for the AMD64 one, i.e.:

curl -O https://downloads.rclone.org/rclone-current-linux-arm.zip

Don't worry about copying the man file. As of firmware 6.6.1, ReadyNAS Linux doesn't come with man installed. Also, rclone --help is really thorough.

Usage

Follow the rclone for Google Drive doc. If you want to make it a supported Google APIs app, check this part of the rclone docs.

Regards, Clive
Typhon Solutions Pty Ltd
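
For anyone who wants the whole flow in one place, here is a minimal sketch of the above, assuming a remote named gdrive and a share at /data/Documents (both placeholders, adjust to your setup):

    cd /tmp
    curl -O https://downloads.rclone.org/rclone-current-linux-arm.zip
    unzip rclone-current-linux-arm.zip
    cp rclone-*-linux-arm/rclone /usr/bin/rclone && chmod 755 /usr/bin/rclone
    rclone config                                       # interactive: create a Google Drive remote named "gdrive"
    rclone sync /data/Documents gdrive:ReadyNAS-backup  # one-way sync of the share up to Drive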

SpiderOakONE install/upgrade

I have documented how to install and configure SpiderOak in this thread: "How I got SpiderOak --headless to run on OS6 x64/x86". I thought I'd post, in the correct forum, on how to uninstall SpiderOak and install SpiderOakONE. SpiderOak have recommended not to simply install over the top of the existing SpiderOak install.

I assume you have SpiderOak installed as a systemd service that starts on boot-up. If that's not the case, skip steps 1 and 2.

Step 1: Stop the service and disable it from starting on boot-up:

systemctl stop spideroak
systemctl disable spideroak

Step 2: Uninstall/remove the existing SpiderOak install:

apt-get remove spideroak

Step 3: Install SpiderOakONE:

apt-get install SpiderOakONE

Step 4: Rename (or create, if this is a clean install) the /lib/systemd/system/spideroakone.service file:

mv /lib/systemd/system/spideroak.service /lib/systemd/system/spideroakone.service

Step 5: Edit the file to point to the correct environment file (I installed and used 'nano' as my text editor). See the 'EnvironmentFile' line below. Also, I edited 'User' to be the owner of SpiderOakONE so I didn't need to pass the variable on the command line. Note the 'ExecStart' line also points to SpiderOakONE.

[Unit]
Description=SpiderOakONE backup service

[Install]
WantedBy=multi-user.target

[Service]
Type=simple
EnvironmentFile=-/etc/sysconfig/SpiderOakONE
User=root
ExecStart=/usr/bin/SpiderOakONE --headless $SPIDEROAKOPTS

Step 6: Move (or create) the SpiderOakONE environment file:

mv /etc/sysconfig/SpiderOak /etc/sysconfig/SpiderOakONE

Edit it (I used 'nano'). This file serves both the systemd and SysV init setups. For systemd only, no change is needed, as the SPIDEROAKOPTS line is the only one that matters.

#
# Configuration for the SpiderOakONE service
#
# Options that get passed to SpiderOak. '--headless' is automatically specified
# by the appropriate script or configuration file. The available options are at
# https://spideroak.com/faq/questions/67/how_can_i_use_spideroak_from_the_commandline/
SPIDEROAKOPTS="--purge-historical-versions h24,d14,w4,m6,y"

# The following two options are only used by the SysV init script.
#
# SPIDEROAKUSER is the user(s) that will run the SpiderOak service. If multiple
# (space-separated) users are listed, a SpiderOak process will be started for
# each user.
SPIDEROAKUSER="bcotton"    # "root" or "admin" etc.

# SPIDEROAKCMD is the binary that is executed. Normally it should not be
# necessary to set this.
# SPIDEROAKCMD='/usr/bin/SpiderOak'

Step 7: Now we can enable the service to start at boot:

systemctl enable spideroakone
systemctl start spideroakone
systemctl status spideroakone

One interesting aspect I noted, given my previous post on moving the cache folder from /root (internal storage) to the /apps directory on the spinning-disk storage: the SpiderOakONE install created a link in my root dir to SpiderOakONE, and SpiderOakONE links to /apps/SpiderOak. I didn't do this - the install did!

drwx------ 1 root root 34 Jul 15 13:17 autostart
lrwxrwxrwx 1 root root 26 Sep 24 10:29 SpiderOak -> /root/.config/SpiderOakONE
lrwxrwxrwx 1 root root 15 Sep  7 09:21 SpiderOakONE -> /apps/SpiderOak
root@Sextuple:~/.config#

So it appears that my /apps/SpiderOak directory, which holds the cache, is automatically "re-instated" as the cache for SpiderOakONE. Good luck!
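
One caveat worth flagging between Step 5 (editing the unit file) and Step 7 (enabling it): systemd caches unit files, so it needs an explicit reload before the rename and edits take effect. A quick sketch, assuming the unit name used above:

    systemctl daemon-reload          # re-read unit files after the rename and edits
    journalctl -u spideroakone -f    # follow the service log to confirm the first scan starts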

Crashplan JRE update needed

CrashPlan released an update on 21 September 2015 (4.4.0) which fails to install on my Pro 6 (running 4.2.28). The issue is that the new service requires JRE 1.7 or JRE 1.8, and my system was running 1.6. The upgrade not only repeatedly failed, it also filled the OS partition with repeated downloads. So if you are running CrashPlan on OS 4.2.x, take a look right away.

You can install JRE 1.8 on the NAS - I used the procedure here: http://minimserver.com/ejre-installer.html

CrashPlan has not tried to upgrade since I installed the JRE - I'll update the thread after that happens.
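
If you want to check the state of your own box before the next upgrade attempt, something like the following should do it. The upgrade directory is an assumption based on the default Linux CrashPlan layout (/usr/local/crashplan); the ReadyNAS add-on may put it elsewhere:

    java -version                                     # should report 1.7 or 1.8 after the JRE install
    df -h /                                           # see how full the OS partition is
    du -sh /usr/local/crashplan/upgrade 2>/dev/null   # assumed path; failed upgrade downloads accumulate here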

Amazon S3 backup for OS6

A long time ago, I got S3 backup working on a Duo, and later an NVX. This all worked using the S3FS package and the FUSE filesystem. So I want the same on my new RN314 running OS 6.2.2. This seems to have FUSE filesystem support already there, so here is what I have done.

(As a side note: S3FS on Google Code was moved to GitHub in 2013. The Google Code stuff is still there. The newer code on GitHub seems to have moved on so much that it uses autogen, and has such a stack of dependencies that I've given up trying to use it until a later time. So this is all based on the now-quite-old code at googlecode.com.)

First, an acknowledgement: much of this comes from here: http://www.glaurent.com/fr/howto/readynas#s3. However, I did find those instructions to be a little incomplete. Specifically, the packages pkg-config and libcrypto are not installed, but are needed. (libcrypto is supplied in the package libssl-dev. Go figure.) So:

cd /usr/local/src
mkdir s3fs
cd s3fs
apt-get install build-essential libfuse-dev fuse-utils libcurl4-openssl-dev libxml2-dev mime-support pkg-config libssl-dev
wget http://s3fs.googlecode.com/files/s3fs-1.74.tar.gz
tar xvzf s3fs-1.74.tar.gz
cd s3fs-1.74
./configure --prefix=/usr
make
make install

Optionally (for non-root users), edit the file /etc/fuse.conf and uncomment user_allow_other by removing the leading # only.

Create a new file to store your Amazon S3 credentials:

vi /etc/passwd-s3fs
chmod 640 /etc/passwd-s3fs

The content needs to be the credentials provided by Amazon (don't disclose these to anyone!). It is of the form:

# Access Key ID:Secret Access Key ID
xxx:xxx

Create the directory to mount the bucket. I prefer to use /s3 (rather than the suggested /mnt/s3) because sometimes other things or applications will use /mnt as a mount point, and then either that would fail or your mount point /mnt/s3 would become invisible. So:

mkdir /s3

To test:

s3fs <your bucket name> <mountpoint>

For example:

s3fs fred_bloggs /s3

You can also optionally add a line in /etc/fstab to mount your bucket:

s3fs#<your bucket name> /s3 fuse defaults 0 0

NOTE: THE ABOVE WILL ACCESS S3 USING INSECURE HTTP. If you want to access your S3 storage using HTTPS, use:

(manual mount): s3fs fred_bloggs /s3 -o url=https://s3.amazonaws.com
(auto mount): s3fs#<your bucket name> /s3 fuse url=https://s3.amazonaws.com 0 0

You can unmount the Amazon S3 bucket using:

umount /s3

And for backup using rsync from the command line (for example, of my subversion repository):

rsync -a --delete -v -m /data/.svn /s3

Backup using Frontview is, at present, a step too far.
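
To turn this into a scheduled job rather than a manual one, a small wrapper can mount, mirror, and unmount in one pass. A minimal sketch using the bucket and paths from above (all placeholders), which could be dropped into /etc/cron.daily/:

    #!/bin/sh
    # Nightly S3 backup: mount the bucket, mirror the share, unmount.
    BUCKET=fred_bloggs
    MNT=/s3

    s3fs "$BUCKET" "$MNT" -o url=https://s3.amazonaws.com || exit 1
    rsync -a --delete -m /data/.svn "$MNT"
    umount "$MNT"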

CrashPlan updated to 4.3 and now my GUI will not connect.

OK, CrashPlan updated to 4.3.0 and now my Windows 7 GUI will not connect. The fix:

1) Make sure the Windows 7 GUI is 4.3, and make sure it is not running while you do the steps below.

2) Copy your .ui_info from your NAS to your desktop:
a) .ui_info on the NAS is probably located in /var/lib/crashplan and is hidden (unless you changed it from the default when installing, i.e. "What directory do you wish to store backups in? [/usr/local/var/crashplan]").
b) .ui_info on your desktop is located in C:\ProgramData\CrashPlan. (Rename your old .ui_info to .ui_info.orig.)

EDIT: Configure the ui.properties file on your PC appropriately, i.e. the service port should be 4200 if you are using ssh to tunnel, or the service host should be changed to your NAS IP address if you made the proper changes in the my.service.xml file on your NAS.

3) Start up CrashPlan 4.3 on your desktop. Done!
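
For the tunnel variant mentioned in the EDIT, the usual approach is to forward a local port to the CrashPlan service port on the NAS. A sketch, assuming the service still listens on its default port 4243 and using a placeholder NAS address:

    ssh -N -L 4200:localhost:4243 root@192.168.1.10
    # then set servicePort=4200 in ui.properties on the PC and start the GUI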

Goodsync Server on the ReadyNAS

Hi, FYI, I successfully set up a GoodSync server (with a paid licence) on my ReadyNAS Pro Pioneer. The benefit over using the NAS as a plain SAMBA share with a GoodSync client on a laptop is that it enables their new block-sync protocol, which speeds things up *a lot*! If you want details, PM me!

Backup to google drive. Is it at all possible?

Hello people, I'm a "happy" owner of an RND4000-100EUS (sparc) which is working fairly well. However, I'd like to back up some data to Google Drive, but for some reason my backup tasks aren't saving anything to the cloud. Here is the config I made:

Source: share xxx
Destination: HTTP protocol
Host: drive.google.com
Path: ReadyNAS/xxx (this folder exists on the Google Drive)
User: my Google username
Password: my Google password

Though, when I run the backup, the task starts and stops immediately, and nothing has been backed up. Does anyone happen to know which settings I should apply, or is it just not possible to back up to the cloud? Thanks
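
A guess, offered as an assumption rather than a confirmed answer: drive.google.com is not a plain HTTP/WebDAV server, so the Frontview HTTP backup job has nothing it can authenticate against, which would match the task stopping immediately. Since rclone (covered earlier on this board) ships no sparc build for this unit's CPU, one hedged workaround is to run it from a PC that can reach the share; the remote name and paths are placeholders:

    rclone sync \\nas\xxx gdrive:ReadyNAS/xxx    # run on a Windows PC; rclone reads the UNC path as a local source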

Amazon S3 backup for Windows Server 2016

Is it possible to send my Windows Server 2016 backup to Amazon S3 storage? The NAS is handling my local backups on a LUN, but I need a backup stored off-site as well. If it's not possible, please explain. So far, all I see on the cloud tab for S3 is the option to connect data folders, not iSCSI LUNs. If it is not possible, can you recommend some simple, reliable, inexpensive backup software for my server that will back up both to the NAS and to Amazon S3?
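
One hedged option rather than an official answer: because the LUN is mounted on the server itself as a local disk, its contents can be pushed to S3 from the server with the AWS CLI, side-stepping the NAS cloud tab entirely. The bucket name and backup path are placeholders:

    aws s3 sync "E:\WindowsImageBackup" s3://my-offsite-backups/server2016 --storage-class STANDARD_IA

A scheduled task wrapping that one line gives a simple off-site copy of the server backups.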