Forum Discussion
ArchPrime
Sep 10, 2019Guide
RN102 migrating to ReadyNAS Ultra 6
Hi, I am looking to establish the optimal migration process from my RN102 (ARM-based CPU), containing 2 x 6TB drives currently set up as individual volumes under JBOD and holding roughly 6TB of data, ...
ArchPrime
Sep 21, 2019Guide
Hi Stephen
Apparently my C-panel webhost uses PureFTPd for providing its ftp services.
They looked into things at their end but apparently can't find any sign of FTP errors on the server.
They also confirm they have no resource or file-size limitations at their end that could cause this error. The only restriction is a speed limit, and that limit causes no problems for the other, smaller files getting FTP'ed to the ReadyNAS this way.
I can also confirm that the files still turn up each day, even when accompanied by error messages - so there is no issue with the login credentials supplied to the ReadyNAS destination, the path supplied, or the writability of the destination folder on the ReadyNAS.
But something appears to be cutting the FTP transfer off in a non-graceful way at the end, for just these larger files.
StephenB
Sep 21, 2019Guru - Experienced User
If the cron job is running in the NAS, then the ftp client is running on the NAS, with PureFTPd being the FTP server.
What ftp client are you using?
- ArchPrimeSep 21, 2019Guide
Hi Stephen
The cron job is being run at my web hosting company's server, and it initiates the sending of files via FTP to the ReadyNAS.
Yes, I understand the webhost uses PureFTPd for both incoming and outgoing FTP connections to the web server. The client in this context is presumably whatever OS6 uses by default? I do not have any 3rd-party FTP app installed.
- StephenBSep 22, 2019Guru - Experienced User
ArchPrime wrote:
Hi Stephen
The cron job is being run at my web hosting company's server, and it initiates the sending of files via FTP to the ReadyNAS.
Ok, I didn't get that from your earlier posts. So the NAS is the server in this case. Do you have upload resume enabled on the NAS?
Also, how many passive ports have you configured? Is the full passive port range forwarded in your router?
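(As an aside on why the forwarded range matters: in passive mode the server itself announces, in its raw `227` PASV reply, which data port the client must connect to, and the transfer fails if that port is not forwarded. A minimal Python sketch of how that port is encoded; the reply strings below are illustrative values, not taken from this thread:)

```python
import re

def pasv_data_port(reply: str) -> int:
    """Extract the data port from a raw FTP 227 PASV reply.
    The last two numbers encode the port as high * 256 + low."""
    nums = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", reply).group(1)
    high, low = map(int, nums.split(",")[-2:])
    return high * 256 + low

# A server restricted to ports 50001-50005 might answer like this:
reply = "227 Entering Passive Mode (192,168,1,10,195,83)"
port = pasv_data_port(reply)
print(port)                    # 195 * 256 + 83 = 50003
print(50001 <= port <= 50005)  # must be True for the forwarded range
```

If the server ever offers a port outside the range the router forwards, the data connection silently fails even though the control connection (login, directory listing) works fine.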
- ArchPrimeSep 22, 2019Guide
Yes, I have upload resume enabled, but just ports 50001-50005 were forwarded.
I have just set the whole range 32768-65535 to forward - will see if that helps - thanks for your suggestions!
- ArchPrimeSep 23, 2019Guide
Hmm, apparently having all ports forwarded does not help.
I had a response from a developer of the Xcloner application that generates the cron jobs. He believes the error message is consistent with an FTP timeout.
I am not sure where any timeout could be coming from, but presumably, because the error messages are generated at the webhost end, any detected timeout at the webhost would relate to the lack of some expected response from the ReadyNAS, and only at the end of a large enough file.
- StephenBSep 24, 2019Guru - Experienced User
ArchPrime wrote:I have just set the whole range 32768-65535 to forward - will see if that helps - thanks for your suggestions!
How many are needed depends on how many simultaneous uploads the client is trying to make. Since it didn't help, just set it back (making sure to match the forwarding range to the port range configured in the NAS).
ArchPrime wrote:
I had a response from a developer of the Xcloner application that generates the cron jobs. He believes the error message is consistent with an FTP timeout.
I was thinking that too, and that is why I asked about upload resume. It's possible you are getting timeouts, and upload resume is recovering. Though I would expect them to be logged on the server end, it's possible that they aren't.
Do you have disk spindown enabled? That could cause a timeout at the beginning of the transfer (since the disks might not be spun up). So if that is enabled, you can try disabling it as a test. If it helps, then set up a spindown schedule.
FWIW, if the web host also has an ftp server, you can reverse the client/server relationship by using a NAS backup job to download the folder from the source.
- ArchPrimeSep 24, 2019Guide
Hi thanks Stephen
Will reduce the number of forwarded ports to 10.
I checked and did not have spindown enabled.
A good idea about reversing roles between the ReadyNAS and webhost - except that the cron job on the webhost also compresses the website and includes a copy of the website's SQL database from a completely separate part of the webhost (no directly user-accessible database files that could be accessed remotely exist until cron creates them) before uploading it all as a single TAR file. I don't know how I could instigate that collation and archiving process remotely.
As a test of concept anyway, I set up the Monsta FTP app on ReadyNAS and tried to manually download an example of a 2GB site backup TAR file that generates error reports when sent via cron job.
Though I was able to log on to the web host and view directories via the app, MonstaFTP simply stopped responding once I clicked 'download' on the file. No error message or progress indicator. Not sure if this is relevant in any way?
- StephenBSep 24, 2019Guru - Experienced User
ArchPrime wrote:
I don't know how I could instigate that collation and archiving process remotely.
You could create the tar files with the cron job, but upload the tar to the NAS using a backup job. Deleting the tars automatically might be tricky, but you could potentially create a script that would keep a couple, and delete the oldest before creating a new one.
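(A minimal sketch of the rotation described here - keep the newest couple of tars and delete the older ones before the cron job creates a new one. The backup directory and the `*.tar` naming pattern are assumptions for illustration, not anything Xcloner guarantees:)

```python
import glob
import os

def rotate_backups(backup_dir: str, keep: int = 2) -> list:
    """Delete all but the `keep` newest *.tar files in backup_dir.
    Returns the paths that were removed, oldest first."""
    # Sort by modification time, newest first.
    tars = sorted(glob.glob(os.path.join(backup_dir, "*.tar")),
                  key=os.path.getmtime, reverse=True)
    removed = []
    for old in tars[keep:]:   # everything past the `keep` newest
        os.remove(old)
        removed.append(old)
    return removed
```

Run from the same cron job, just before Xcloner writes the new archive, this keeps disk use on the host bounded while still leaving a previous backup to fall back on.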
ArchPrime wrote:
Though I was able to log on to the web host and view directories via the app, MonstaFTP simply stopped responding once I clicked 'download ' on the file. No error message or progress indicator. Not sure if this is relevent in any way?
Not sure either - it's not a client I've used. Perhaps try installing FileZilla or some similar client on a PC, and see if you can download files from the server.
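(Another way to run the same test without a GUI client is a short script using Python's standard ftplib, which also lets an interrupted large download resume via the FTP REST command. The host, credentials, and file names below are placeholders, not details from this thread:)

```python
import os
from ftplib import FTP

def start_offset(local_path: str) -> int:
    """Byte offset to resume from: size of any partial local copy, else 0."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def download_with_resume(host: str, user: str, password: str,
                         remote: str, local: str) -> None:
    """Fetch `remote` over passive FTP, appending to any partial local copy."""
    with FTP(host, timeout=60) as ftp:
        ftp.login(user, password)
        ftp.set_pasv(True)
        offset = start_offset(local)
        with open(local, "ab") as out:
            # rest= issues a REST command, so a cut-off 2 GB transfer
            # restarts where it stopped instead of from zero.
            ftp.retrbinary("RETR " + remote, out.write,
                           rest=offset or None)

# Placeholder usage (fill in real host/credentials/paths):
# download_with_resume("ftp.example-host.com", "user", "secret",
#                      "backups/site.tar", "site.tar")
```

If this plain scripted download also stalls at the end of the large file, that would point at the host or the path between the two machines rather than at any particular client.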
- ArchPrimeSep 24, 2019Guide
Hi Stephen
I will play with that approach - good idea.
Looks like the issue with Monsta FTP may just have been a built-in file-size limit of 128 MB.
No problems struck so far just using a standard system backup job via FTP to do it, though I have to back up the whole remote folder rather than individual TAR files this way.
Problems thus seem confined to the webhost-instigated FTP connection to the ReadyNAS.
- ArchPrimeOct 22, 2019Guide
Thank you again Stephen for your help with this.
A solution in the end turned out to lie in replacing the standalone Xcloner website backup application I was using on the webhost with a different version of Xcloner (where effectively the standalone application's functionality was repackaged into a plugin, run from a dummy WordPress website set up for the purpose).
Suddenly the large files made it through again.
I can only assume the original standalone version had hit some sort of undisclosed resource or execution-time limit policy on the host that the WordPress-based version did not strike - perhaps the host has a more tolerant policy towards WordPress installations than towards little-known 3rd-party app installations.