Re: System volume root's usage is 90%.
Dear All
After an extensive search on here, I am at a loss. I know this error seems to come up very often, but no solutions have worked for me yet.
I installed the latest firmware and the error is still being reported.
I deleted the tmp file in /var/lib/clamav; still the same error.
Files in that folder:
root@CEB-NAS:/var/lib/clamav# ls -lsh
total 113M
113M -rw-r--r-- 1 root root 113M Mar 6 2020 main.cvd
4.0K -rw------- 1 root root 52 Mar 6 2020 mirrors.dat
I am at a loss, to be honest, and running the CLI is rather frustrating, as I am fumbling in the dark and do not wish to mess anything up.
I ran this command and got this:
root@CEB-NAS:~# cd /var/lib/clamav
root@CEB-NAS:/var/lib/clamav# ls -lsh
total 113M
113M -rw-r--r-- 1 root root 113M Mar 6 2020 main.cvd
4.0K -rw------- 1 root root 52 Mar 6 2020 mirrors.dat
root@CEB-NAS:/var/lib/clamav# du -hsx /* | sort -rh | head -10
du: cannot access '/proc/31581/task/31581/fd/3': No such file or directory
du: cannot access '/proc/31581/task/31581/fdinfo/3': No such file or directory
du: cannot access '/proc/31581/fd/3': No such file or directory
du: cannot access '/proc/31581/fdinfo/3': No such file or directory
2.0G /apps
1.1G /var
387M /usr
31M /frontview
28M /lib
11M /sbin
11M /etc
6.2M /opt
6.0M /bin
540K /run
Help please
Regards Patrick
Accepted Solutions
Dear All
All alerts have stopped after the reboot, so it's sorted.
Thanks to everybody who gave me a reply.
Closing this off as solved.
Regards Patrick
All Replies
Re: System volume root's usage is 90%.
A full 4 GB root volume can have a huge variety of causes. ClamAV is not the problem here.
Post the output of these commands:
# df -h
# df -i
# mount --bind / /mnt
# du -csh /mnt/*
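To make the last step concrete, here is a self-contained sketch of the same du / sort drill-down, run against a throwaway directory so nothing on the NAS is touched (the directory and file names are made up for illustration):

```shell
# Illustration only: find the biggest subdirectory the way du -csh would,
# using a scratch directory instead of the real root volume.
tmp=$(mktemp -d)
mkdir -p "$tmp/var/log" "$tmp/etc"
# Fake "runaway log" of about 2 MB, plus a tiny config file.
dd if=/dev/zero of="$tmp/var/log/big.log" bs=1024 count=2048 2>/dev/null
printf 'small\n' > "$tmp/etc/config"
# Sort per-directory usage, largest first; "var" should dominate.
biggest=$(du -sk "$tmp"/* | sort -rn | head -1)
echo "$biggest"
rm -rf "$tmp"
```

Repeating the same `du -sk DIR/* | sort -rn` one level deeper inside the biggest directory quickly narrows down the culprit file.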
Re: System volume root's usage is 90%.
Also, let us know what firmware you are running.
FWIW, ClamAV hasn't been supported on the RN104 for quite some time now.
Re: System volume root's usage is 90%.
Actually, it doesn't really come up often in the grand scheme of things. The forum is far more full of problems than "everything is working swell" posts, so it may be common among the problems, but overall it's a rare occurrence. You have to be careful poking around the OS partition; deleting the wrong thing can make things go amok. There are also a lot of links that actually point to something outside the partition, which may look like they take up space but do not.
Re: System volume root's usage is 90%.
Hi, and thanks for the reply.
I might have sorted it: I found a 2 GB log file in the apps folder that had been logging a PHP error from 2015 to date! I have now cleaned it up. But let me see what you have to say; so far it is all working.
OK, first command:
root@CEB-NAS:~# df -h
Filesystem Size Used Avail Use% Mounted on
udev 10M 4.0K 10M 1% /dev
/dev/md0 4.0G 3.5G 218M 95% /
tmpfs 249M 0 249M 0% /dev/shm
tmpfs 249M 540K 248M 1% /run
tmpfs 125M 4.6M 120M 4% /run/lock
tmpfs 249M 0 249M 0% /sys/fs/cgroup
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA
/dev/md127 1.9T 23G 1.8T 2% /BACKUP_DISK
tmpfs 4.0K 0 4.0K 0% /NAS-DATA/CEB-DATA/snapshot
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_03_20__00_00_57
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_03_27__00_00_33
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_03__00_00_34
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_10__00_00_24
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_17__00_01_02
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_24__00_00_55
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_05_01__00_00_36
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_05_08__00_00_28
root@CEB-NAS:~#
Second command:
root@CEB-NAS:~# df -i
Filesystem Inodes IUsed IFree IUse% Mounted on
udev 63127 448 62679 1% /dev
/dev/md0 65536 17767 47769 28% /
tmpfs 63582 1 63581 1% /dev/shm
tmpfs 63582 609 62973 1% /run
tmpfs 63582 30 63552 1% /run/lock
tmpfs 63582 9 63573 1% /sys/fs/cgroup
/dev/md126 0 0 0 - /NAS-DATA
/dev/md127 0 0 0 - /BACKUP_DISK
tmpfs 63582 9 63573 1% /NAS-DATA/CEB-DATA/snapshot
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_03_20__00_00_57
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_03_27__00_00_33
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_03__00_00_34
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_10__00_00_24
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_17__00_01_02
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_24__00_00_55
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_05_01__00_00_36
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_05_08__00_00_28
root@CEB-NAS:~#
Third and fourth commands:
root@CEB-NAS:~# mount --bind / /mnt
root@CEB-NAS:~# du -csh /mnt/*
780K /mnt/apps
4.0K /mnt/BACKUP_DISK
6.0M /mnt/bin
4.0K /mnt/boot
2.8M /mnt/data
12K /mnt/dev
11M /mnt/etc
31M /mnt/frontview
4.0K /mnt/home
28M /mnt/lib
16K /mnt/lost+found
4.0K /mnt/media
4.0K /mnt/mnt
4.0K /mnt/NAS-DATA
6.2M /mnt/opt
4.0K /mnt/proc
28K /mnt/root
8.0K /mnt/run
11M /mnt/sbin
4.0K /mnt/selinux
4.0K /mnt/srv
4.0K /mnt/sys
44K /mnt/tmp
387M /mnt/usr
1.1G /mnt/var
1.5G total
root@CEB-NAS:~#
Regards Patrick
Re: System volume root's usage is 90%.
Hi All
Thanks for the reply; my first reply got lost in the voids of the interweb!
So I seem to have solved it: I found a 2 GB log file in apps, logging a PHP error from 2015 to date! I cleaned that up and prayed it would work.
Everything is still working as it should, but here are the commands you wanted.
First:
root@CEB-NAS:~# df -h
Filesystem Size Used Avail Use% Mounted on
udev 10M 4.0K 10M 1% /dev
/dev/md0 4.0G 3.5G 218M 95% /
tmpfs 249M 0 249M 0% /dev/shm
tmpfs 249M 540K 248M 1% /run
tmpfs 125M 4.6M 120M 4% /run/lock
tmpfs 249M 0 249M 0% /sys/fs/cgroup
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA
/dev/md127 1.9T 23G 1.8T 2% /BACKUP_DISK
tmpfs 4.0K 0 4.0K 0% /NAS-DATA/CEB-DATA/snapshot
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_03_20__00_00_57
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_03_27__00_00_33
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_03__00_00_34
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_10__00_00_24
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_17__00_01_02
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_04_24__00_00_55
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_05_01__00_00_36
/dev/md126 1.9T 1.3T 573G 70% /NAS-DATA/CEB-DATA/snapshot/c_2021_05_08__00_00_28
root@CEB-NAS:~#
Second:
root@CEB-NAS:~# df -i
Filesystem Inodes IUsed IFree IUse% Mounted on
udev 63127 448 62679 1% /dev
/dev/md0 65536 17768 47768 28% /
tmpfs 63582 1 63581 1% /dev/shm
tmpfs 63582 609 62973 1% /run
tmpfs 63582 29 63553 1% /run/lock
tmpfs 63582 9 63573 1% /sys/fs/cgroup
/dev/md126 0 0 0 - /NAS-DATA
/dev/md127 0 0 0 - /BACKUP_DISK
tmpfs 63582 9 63573 1% /NAS-DATA/CEB-DATA/snapshot
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_03_20__00_00_57
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_03_27__00_00_33
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_03__00_00_34
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_10__00_00_24
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_17__00_01_02
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_04_24__00_00_55
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_05_01__00_00_36
/dev/md126 0 0 0 - /NAS-DATA/CEB-DATA/snapshot/c_2021_05_08__00_00_28
root@CEB-NAS:~#
Mount and last command:
root@CEB-NAS:~# du -csh /mnt/*
780K /mnt/apps
4.0K /mnt/BACKUP_DISK
6.0M /mnt/bin
4.0K /mnt/boot
2.8M /mnt/data
12K /mnt/dev
11M /mnt/etc
31M /mnt/frontview
4.0K /mnt/home
28M /mnt/lib
16K /mnt/lost+found
4.0K /mnt/media
4.0K /mnt/mnt
4.0K /mnt/NAS-DATA
6.2M /mnt/opt
4.0K /mnt/proc
28K /mnt/root
8.0K /mnt/run
11M /mnt/sbin
4.0K /mnt/selinux
4.0K /mnt/srv
4.0K /mnt/sys
44K /mnt/tmp
387M /mnt/usr
1.1G /mnt/var
1.5G total
root@CEB-NAS:~#
Let me know
Regards Patrick
Re: System volume root's usage is 90%.
@CEBLTD wrote:
Thanks for the reply; my first reply got lost in the voids of the interweb!
They got trapped by the automatic spam filter. Periodically the mods check the filter, and manually release the false positives.
@CEBLTD wrote:
So I seem to have solved it: I found a 2 GB log file in apps, logging a PHP error from 2015 to date! I cleaned that up and prayed it would work.
Everything is still working as it should.
Looking at the details you posted, it appears to be fine now. The apps folder is normally a mount point to the data volume, so if you remount the root to /mnt, the apps folder should be totally empty.
Re: System volume root's usage is 90%.
Hi again, everybody.
I seem to have failed in my solution; the problem is still there. As a last resort, I think it would be wise to back up everything and do a factory reset.
System volume root's usage is 88%. This condition should not occur under normal conditions. Contact technical support.
I will have a last rummage around the root filesystem; maybe I missed something.
Any words of wisdom are more than welcome.
Thanks Patrick
Re: System volume root's usage is 90%.
# du -csh /mnt/*
1.1G /mnt/var
1.5G total
This suggests the problem is in /var. So you could do e.g.
# du -csh /mnt/var/*
and so forth, one level deeper each time, until you find some big files, e.g. some big log files.
If in doubt, ask whether something is safe to empty or remove.
If the anti-virus service is still running, you should stop and disable that service. Maybe I was wrong and there are some huge anti-virus-related files still there.
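To make the "find it, then empty it" step concrete, here is a self-contained sketch on a throwaway directory (not the NAS root; the file names are made up for illustration). Truncating a log rather than deleting it is the safer move, because any daemon still writing to the file keeps a valid handle:

```shell
# Illustration only: locate oversized *.log files and truncate in place.
tmp=$(mktemp -d)
# Fake 1 MB runaway log.
dd if=/dev/zero of="$tmp/huge.log" bs=1024 count=1024 2>/dev/null
# Find any .log file bigger than ~500 KB under the directory.
found=$(find "$tmp" -type f -name '*.log' -size +500k)
echo "$found"
# Zero the file without removing it; writers keep their open handle.
: > "$tmp/huge.log"
size=$(wc -c < "$tmp/huge.log")
rm -rf "$tmp"
```

On the real system the equivalent would be `find /var -type f -size +100M` followed by `: > /path/to/offending.log` for anything you are sure is just a log.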
Re: System volume root's usage is 90%.
Hi, thanks for the reply.
Unfortunately, I cannot spot anything abnormal.
I removed some *.old log files, but they were rather small in size.
I have no idea why I keep getting the message, to be honest.
Regards Patrick
Re: System volume root's usage is 90%.
Have you checked if the problem is still present after a reboot?
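For context on why a reboot can matter here (this is an assumption about the likely cause, not something confirmed in the thread): on Linux, deleting a file does not free its space while some process still holds it open, so df keeps reporting usage that du can no longer find. A minimal sketch, assuming a Linux /proc:

```shell
# Sketch (assumes Linux /proc): space from a deleted file is only freed
# once the last open handle goes away, e.g. on service restart or reboot.
tmp=$(mktemp)
dd if=/dev/zero of="$tmp" bs=1024 count=1024 2>/dev/null
exec 3< "$tmp"                    # keep the file open on fd 3
rm "$tmp"                         # unlink it; disk space is NOT yet freed
state=$(readlink /proc/$$/fd/3)   # path ends in "(deleted)" while open
echo "$state"
exec 3<&-                         # closing the fd releases the space
```

If lsof is installed, `lsof +L1` lists such deleted-but-open files system-wide; rebooting clears them all at once, which would explain the alert disappearing.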
Re: System volume root's usage is 90%.
Aha, no! I did not reboot; let me reboot and report back.
Patrick