NAS 540 failure

I have a NAS540 with 4 Hitachi 4TB drives. It was configured as RAID 10. It fails to boot with all 4 drives inserted. If I pull out drive 4 it does boot, but the interface says drive 1 is bad. If I reboot it, it will sometimes try to rebuild the volume, but after rebuilding it will work for a while and then fail. Then I need to pull the drives out and start the process over again.

1. What is the best way to get this back up and operating consistently?

2. Why will it only boot with drive 4 out?

3. If drive 1 is the failed drive, why does it not boot with the other drives in but the failed one removed?

Thanks, P

All Replies

  • PHa
    PHa Posts: 2
    It seems to do the repair, get to about .7%, then beep and restart.
  • Mijzelf
    Mijzelf Posts: 2,605  Guru Member
    If the system does not boot with 4 disks inserted, but does boot with 3, then either that disk is physically defective — drawing too much current, or pushing garbage onto the SATA bus — or the power supply is failing and can't power 4 disks anymore.

    As you seem to have 2 failing disks, odds are that it's the power supply. Do you have a way to test disk 4 in another system? It should spin up and show a partition table. A Windows PC (or a Mac) won't be able to see anything beyond that, as neither supports (this kind of) software RAID or the filesystem used.
    If the disk seems fine, then it's the power supply. If the disk is indeed bad, then possibly disk 1 is flagged bad because of the hard failure when disk 4 went down.
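
    A minimal sketch of what testing the disk on a Linux PC could look like — device name /dev/sdX is a placeholder for wherever the disk shows up, and these are generic smartmontools/mdadm diagnostics, not a NAS540-specific procedure:

    ```shell
    # Check overall SMART health and error counters on the suspect disk
    sudo smartctl -H /dev/sdX
    sudo smartctl -A /dev/sdX

    # The disk should still show the NAS's partition table
    sudo parted /dev/sdX print

    # Inspect the md RAID superblock on the data partition
    # (partition number is an assumption; pick the large one)
    sudo mdadm --examine /dev/sdX3
    ```

    If SMART reports reallocated or pending sectors climbing, the disk itself is suspect; if everything looks clean, that points back at the power supply.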

    It seems to do the repair get to about .7% the beep and restart.
    Does the box reboot itself? The beep indicates that it's not a hard failure, but that the OS is aware of something. But I am not aware of any condition that makes the OS reboot without confirmation (except for a planned reboot). How long does it take to get to 7%?

Consumer Product Help Center