NAS 326 TRANSFER ERROR VIA FTP: TOO MANY OPEN FILES

Comments

  • JUKE  Posts: 14  Freshman Member
    Hello again. I have connected the NAS directly to the PC via Ethernet, bypassing the router, and I still get the same problem over FTP: after about 1,000 files the error comes back. How should I proceed? Thanks for your help!
  • JUKE  Posts: 14  Freshman Member
    *I use IPv4
  • JUKE  Posts: 14  Freshman Member
    Do you think the problem is in the NAS itself, or could it be the Windows Firewall? I use Windows 10, and the issue also happens if I disable the firewall. This evening I will try with a MacBook and see if it changes.
  • Mihawk  Posts: 103  Ally Member
    This is really weird, as I tested with a different client and uploaded 13,000 photos without ever seeing the problem. What is your firmware version? I'm not sure if there is any configuration that could cause this to occur... waiting for your update on the MacBook.
  • JUKE  Posts: 14  Freshman Member
    Firmware v5.21. Is there a way to reset the NAS to factory settings without formatting the HD?
  • Fredzoul1  Posts: 97  Ally Member
    Normally, if you do the reset (3 beeps), it only resets the configuration without losing data.
  • Mijzelf  Posts: 2,741  Guru Member
    The message comes from the server, so I think it somehow leaks file handles. Those can be IP sockets or the actual files.

    You can enable the SSH server, log in over SSH, and when the problem arises execute

    su
    lsof -a -c pure-ftpd -d ^txt -d ^mem -d ^cwd -d ^rtd

    to see the list of open files.
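
    If you want to watch the handle count grow during a transfer, a small loop like the one below makes the leak easy to spot. It reuses the same lsof filter; the 2-second interval is an arbitrary choice.

    # Print the number of open handles pure-ftpd holds, every 2 seconds.
    # A steadily climbing count while uploads run points at a leak.
    while true; do
        lsof -a -c pure-ftpd -d ^txt -d ^mem -d ^cwd -d ^rtd | wc -l
        sleep 2
    done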

  • cozmin  Posts: 1  Freshman Member
    Hi, I have the same issue on a Zyxel NAS542.
    I believe the root cause is the open-files limit, which is set to 1024 (log in to the Zyxel NAS with an SSH terminal and type 'ulimit -n'). You could try to set a higher limit, but it doesn't 'stick': after a logout/reset it reverts to 1024.

    So, if you want to transfer a lot of SMALL files (like 1,300 photos), you could:

    1) Set up a batch transfer: 100 files, pause for 10 seconds, next 100 files, etc. (a sketch combining options 1 and 2 follows this list)
    2) Set a limit on the transfer rate (I'm transferring 18,073 small files at 1500 kbit/s = 1.5 Mbit/s)... it will take a while...
    3) Make one big storage-only (uncompressed) archive of all the small files, and transfer that single archive (dangerous! If the big archive is corrupted, you could lose your files)
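
    A minimal client-side sketch combining options 1 and 2, assuming the lftp client is installed on the client machine; USER, PASS, HOST and both paths are placeholders, and filenames are assumed to contain no spaces:

    #!/bin/sh
    # Upload in chunks of 100 files with a 10-second pause between chunks,
    # capped at ~1.5 Mbit/s (187500 bytes/s).
    cd /path/to/photos
    ls | split -l 100 - batch.     # chunk the file list, 100 names per chunk
    for list in batch.*; do
        lftp -u USER,PASS HOST -e "set net:limit-total-rate 187500; \
            cd /upload/dir; mput $(tr '\n' ' ' < "$list"); quit"
        sleep 10                   # give the server time to close its handles
    done
    rm -f batch.*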

    @Zyxel Team: please release a firmware update that lets us increase the kernel open-files limit via SSH, or just set it to a higher number like 20000 (ulimit -Hn 20000 and ulimit -Sn 20000).
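
    For reference, the check and the (non-persistent) raise look like this in a root SSH session; the values apply only to that shell and its children, which is why they revert after logout:

    ulimit -n          # show the current soft limit (1024 here)
    ulimit -Hn 20000   # raise the hard limit (needs root)
    ulimit -Sn 20000   # raise the soft limit up to the hard limit
    ulimit -n          # now prints 20000, but a fresh login is back at 1024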



  • JUKE  Posts: 14  Freshman Member
    edited November 2017
    Thanks for all your advice. What is the point of buying a NAS with 6 TB of storage and then hitting a limit after uploading 1,000 files in a row? The feeling right now is to throw everything away and buy a Dropbox subscription... frustrating :-(
  • Mijzelf  Posts: 2,741  Guru Member
    That limit shouldn't be there, and most people are not affected. If increasing the open-file limit solves the problem, then it's a workaround, not a fix for the underlying problem.

    It makes no sense to upload more than, let's say, 10 files simultaneously, because the disk(s) can't handle that. If a mechanical disk has to store several streams simultaneously, the throughput decreases dramatically.

    So any decent FTP client won't open more than 10 streams by default.

    Each stream causes 5 open file handles on the server: one for the actual file being written, and one for the TCP/IP socket (in FTP, each stream has its own TCP/IP connection). Further, each stream has its own process spawned, with an open stdin, stdout and stderr (all to /dev/null).
    So when using 10 streams, there should be 50 open files for the streams, plus 4 for the 'main process' (the listening socket on port 21, plus stdin, stdout and stderr).
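
    A quick way to see which kind of handle is piling up while uploads run (reusing the lsof filter from earlier in this thread) is to group pure-ftpd's descriptors by type, regular files (REG) versus IPv4/IPv6 sockets:

    # Count open descriptors per type; with 10 healthy streams the total
    # should be around 54 (5 per stream plus 4 for the listener process).
    lsof -a -c pure-ftpd -d ^txt -d ^mem -d ^cwd -d ^rtd \
        | tail -n +2 | awk '{print $5}' | sort | uniq -c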

    In your case I suspect a network problem which prevents the sockets from being closed. That could be a driver problem on the client side, maybe a buggy checksum offload.

    Does your Mac expose the same problem?

    @cozmin: If you install the Tweaks package you can use the cron applet to set your limits @reboot.
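
    For anyone trying that route, the crontab entry would look roughly like the line below. This is a sketch, not a tested recipe: the init-script path and restart command are assumptions to check against the actual firmware, and the daemon has to be restarted from the same shell so it inherits the raised limit.

    # Hypothetical @reboot entry for the Tweaks cron applet: raise the fd
    # limit, then restart the FTP daemon so it inherits the new value.
    @reboot /bin/sh -c 'ulimit -n 20000 && /etc/init.d/pure-ftpd restart'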
