
Random connection loss to Netgear ReadyNAS 314

aaron921
Aspirant

Random connection loss to Netgear ReadyNAS 314

We have been having connection issues.  Everything works fine, then we randomly lose the connection.  Our setup is a Netgear ADSL2+ router (model B90-755025-15) that our DSL line connects to.  A Cisco 8-port workgroup switch is connected to it, and the ReadyNAS 314 is plugged into the switch.  At the office there is a computer plugged into the switch via ethernet, plus a laptop that connects wirelessly through the Netgear router.  We also use the Cloud service with people outside of the office.

There are days when we have no problems with the setup.  The next day we will be working and lose the connection to the Netgear.  I can still get into the Admin page via a web browser, and I have found that resetting the ReadyNAS typically fixes the problem.  One day I removed the switch from the setup and had only the computer and the ReadyNAS plugged into the Netgear router.  That worked fine for half a day, then stopped working.  I restarted the ReadyNAS, and still nothing.  I went back to having the computer and ReadyNAS on the switch, did a restart, and everything worked.  Monday it randomly stopped working again.  I changed ports on the switch and the computer plugged into the switch worked, but the wirelessly connected laptop didn't.

I can't find anything consistent, other than that resetting the ReadyNAS normally fixes the problem.  I can't confirm every occasion, but normally the Cloud doesn't have access to the files either.  During any occurrence at the office, neither the wirelessly connected laptop nor the computer connected via ethernet loses its internet connection.  Wondering if anyone has any ideas on what could be wrong.

Model: RN31400|ReadyNAS 300 Series 4- Bay
Message 1 of 7
StephenB
Guru

Re: Random connection loss to Netgear ReadyNAS 314


@aaron921 wrote:

Wondering if anyone has any ideas on what could be wrong.  


First - if you purchased the RN314 new between 1 June 2014 and 31 May 2016, then you are entitled to free lifetime chat support.  Even if you are not entitled to free support, you might want to create a support case via my.netgear.com.  Note that they will not support used equipment.

 

What firmware is the NAS running?

 

If you are using NIC bonding on the ReadyNAS, I suggest breaking the bond and running with a single ethernet connection.  Some bonding modes can create connectivity problems.

 

 

Also, download the full log zip file.  You can request log analysis by PMing a Netgear mod (perhaps @mdgm-ntgr or @FramerV) and submitting the logs using the procedure here: http://kb.netgear.com/21543/How-do-I-send-all-logs-to-ReadyNAS-Community-moderators?cid=wmt_netgear_...

 

One possible cause is an OS partition that is filling up.  Open the log zip file and examine volume.log.  There is a section that starts with === df -h ===.  Please post the lines up to (and including) tmpfs (there should be about 6 file systems altogether).
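If SSH access is enabled on the NAS, the same check can be scripted. A minimal sketch; the sample df output and the 90% threshold are illustrative, not taken from this system (on the NAS you would pipe real output, e.g. `df -h | awk ...`):

```shell
# Flag any filesystem whose usage meets or exceeds a threshold.
# Sample df -h output is embedded so the sketch is self-contained.
df_sample='Filesystem Size Used Avail Use% Mounted on
/dev/md0 4.0G 3.9G 0.1G 98% /
tmpfs 992M 8.0K 992M 1% /dev/shm'

echo "$df_sample" | awk 'NR > 1 {
    gsub(/%/, "", $5)                       # strip the % sign from Use%
    if ($5 + 0 >= 90)                       # numeric compare vs. threshold
        print $6 " is " $5 "% full"
}'
```

A nearly full `/` (the /dev/md0 OS partition) is the condition to look for; the data volume (/dev/md127) filling up is a separate issue.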

 

Message 2 of 7
aaron921
Aspirant

Re: Random connection loss to Netgear ReadyNAS 314

Well, I checked my purchase date and it was June 17, 2016.  I will work on getting the logs posted.

Message 3 of 7
aaron921
Aspirant

Re: Random connection loss to Netgear ReadyNAS 314

The firmware is 6.6.1  

 

Not exactly sure what NIC bonding is, so I'm assuming I don't have that.

 

I'll be sending the logs to a Netgear moderator next.

 

Here is the volume log.  

 

=== df -h ===
Filesystem Size Used Avail Use% Mounted on
udev 10M 4.0K 10M 1% /dev
/dev/md0 4.0G 734M 2.9G 20% /
tmpfs 992M 8.0K 992M 1% /dev/shm
tmpfs 992M 6.1M 986M 1% /run
tmpfs 496M 3.3M 493M 1% /run/lock
tmpfs 992M 0 992M 0% /sys/fs/cgroup
/dev/md127 5.5T 1.7T 3.9T 30% /data
/dev/md127 5.5T 1.7T 3.9T 30% /apps
/dev/md127 5.5T 1.7T 3.9T 30% /home
/dev/sdg1 1.9T 1.6T 226G 88% /media/USB_HDD_1
tmpfs 4.0K 0 4.0K 0% /data/CurrentFiles/snapshot
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_06_25__00_00_14
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_07_30__00_00_09
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_08_27__00_00_12
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_09_24__00_00_21
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_10_29__00_00_13
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_11_26__00_00_17
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2016_12_31__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_01_07__00_00_27
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_01_14__00_00_23
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_01_21__00_00_06
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_01_28__00_00_11
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_02_04__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_02_11__00_00_01
/dev/md127 5.5T 1.7T 3.9T 30% /data/CurrentFiles/snapshot/c_2017_02_18__00_00_15
tmpfs 4.0K 0 4.0K 0% /data/Data/snapshot
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_06_25__00_00_14
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_07_30__00_00_09
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_08_27__00_00_12
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_09_24__00_00_21
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_10_29__00_00_13
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_11_26__00_00_17
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2016_12_31__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_01_07__00_00_27
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_01_14__00_00_23
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_01_21__00_00_06
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_01_28__00_00_11
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_02_04__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_02_11__00_00_01
/dev/md127 5.5T 1.7T 3.9T 30% /data/Data/snapshot/c_2017_02_18__00_00_15
tmpfs 4.0K 0 4.0K 0% /data/Payroll/snapshot
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_06_25__00_00_14
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_07_30__00_00_09
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_08_27__00_00_12
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_09_24__00_00_21
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_10_29__00_00_13
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_11_26__00_00_17
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2016_12_31__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_01_07__00_00_27
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_01_14__00_00_23
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_01_21__00_00_06
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_01_28__00_00_11
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_02_04__00_00_15
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_02_11__00_00_01
/dev/md127 5.5T 1.7T 3.9T 30% /data/Payroll/snapshot/c_2017_02_18__00_00_15
=== df -i ===
Filesystem Inodes IUsed IFree IUse% Mounted on
udev 253310 466 252844 1% /dev
/dev/md0 0 0 0 - /
tmpfs 253928 3 253925 1% /dev/shm
tmpfs 253928 648 253280 1% /run
tmpfs 253928 38 253890 1% /run/lock
tmpfs 253928 9 253919 1% /sys/fs/cgroup
/dev/md127 0 0 0 - /data
/dev/md127 0 0 0 - /apps
/dev/md127 0 0 0 - /home
/dev/sdg1 0 0 0 - /media/USB_HDD_1
tmpfs 253928 15 253913 1% /data/CurrentFiles/snapshot
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_06_25__00_00_14
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_07_30__00_00_09
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_08_27__00_00_12
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_09_24__00_00_21
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_10_29__00_00_13
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_11_26__00_00_17
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2016_12_31__00_00_15
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_01_07__00_00_27
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_01_14__00_00_23
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_01_21__00_00_06
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_01_28__00_00_11
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_02_04__00_00_15
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_02_11__00_00_01
/dev/md127 0 0 0 - /data/CurrentFiles/snapshot/c_2017_02_18__00_00_15
tmpfs 253928 15 253913 1% /data/Data/snapshot
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_06_25__00_00_14
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_07_30__00_00_09
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_08_27__00_00_12
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_09_24__00_00_21
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_10_29__00_00_13
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_11_26__00_00_17
/dev/md127 0 0 0 - /data/Data/snapshot/c_2016_12_31__00_00_15
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_01_07__00_00_27
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_01_14__00_00_23
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_01_21__00_00_06
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_01_28__00_00_11
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_02_04__00_00_15
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_02_11__00_00_01
/dev/md127 0 0 0 - /data/Data/snapshot/c_2017_02_18__00_00_15
tmpfs 253928 15 253913 1% /data/Payroll/snapshot
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_06_25__00_00_14
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_07_30__00_00_09
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_08_27__00_00_12
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_09_24__00_00_21
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_10_29__00_00_13
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_11_26__00_00_17
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2016_12_31__00_00_15
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_01_07__00_00_27
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_01_14__00_00_23
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_01_21__00_00_06
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_01_28__00_00_11
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_02_04__00_00_15
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_02_11__00_00_01
/dev/md127 0 0 0 - /data/Payroll/snapshot/c_2017_02_18__00_00_15

Message 4 of 7
StephenB
Guru

Re: Random connection loss to Netgear ReadyNAS 314


@aaron921 wrote:

The firmware is 6.6.1  

 

Not exactly sure what NIC bonding is, so I'm assuming I don't have that.

 

 


"NIC bonding" is described here: https://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-network-bonding.html.  If you have both ethernet ports connected on the RN314, then disconnect one of them.
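For completeness: if SSH is enabled, the Linux kernel exposes the state of any configured bond under /proc/net/bonding/. A sketch, assuming the conventional interface name bond0 (adjust to match the NAS's configuration):

```shell
# Report the bonding mode and link state if a bond interface exists;
# if the file is absent, no bond with that name is configured.
bond_status() {
    f="/proc/net/bonding/$1"
    if [ -r "$f" ]; then
        grep -E 'Bonding Mode|MII Status' "$f"
    else
        echo "no $1 configured"
    fi
}

bond_status bond0
```

If this reports no bond, you can rule bonding out as the cause.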

 

Also (since you are running 6.6.1), I recommend disabling the AntiVirus service if it is enabled.  The introduction of the new AV service wasn't very smooth (to put it mildly).  I'm not sure whether its bugs are related to your particular problem, but it's best to disable that service (or upgrade to the 6.7.0 beta release).


@aaron921 wrote:

 

Here is the volume log.  

 

=== df -h ===
Filesystem Size Used Avail Use% Mounted on
udev 10M 4.0K 10M 1% /dev
/dev/md0 4.0G 734M 2.9G 20% /
tmpfs 992M 8.0K 992M 1% /dev/shm
tmpfs 992M 6.1M 986M 1% /run
tmpfs 496M 3.3M 493M 1% /run/lock
tmpfs 992M 0 992M 0% /sys/fs/cgroup


Thanks.  The OS partitions aren't full, so that possibility is ruled out.

Message 5 of 7
mdgm-ntgr
NETGEAR Employee Retired

Re: Random connection loss to Netgear ReadyNAS 314

I see you only have a 100Mbit connection.  Why not use a gigabit router/switch?
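One quick way to confirm the negotiated link speed, if SSH is enabled, is to read it from sysfs. A sketch; eth0 is an assumption, so substitute the NAS's active interface:

```shell
# Read the negotiated link speed (in Mb/s) for an interface from sysfs.
# When the link is down or the interface is absent, the read fails,
# so errors fall through to "unknown".
link_speed() {
    f="/sys/class/net/$1/speed"
    if speed=$(cat "$f" 2>/dev/null); then
        echo "${speed} Mb/s"
    else
        echo "unknown"
    fi
}

link_speed eth0
```

A reading of 100 Mb/s on a gigabit-capable NIC would point at the router/switch port, the cable, or autonegotiation rather than the NAS itself.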

Does disabling the anti-virus service make any difference?

Message 6 of 7
hrenz01
Aspirant

Re: Random connection loss to Netgear ReadyNAS 314

Same problem happens to me. I have connected the NAS to 2 different networks and now I have an intermittent connection loss.

Newest firmware (6.6.1) is installed. No clue why this happens. I found out there was the same problem a few years ago, but that should be solved by now. Can Netgear please check what's the reason and provide a solution? This is not nice and very unsatisfying.

 

Message 7 of 7