NAS542 lost RAID5 after repair
Hi everybody,
I hope you can help me.
First of all, here is my system information:
OS: Windows 10, build 19045.2604
NAS: NAS542, Firmware V5.21(ABAG.7)
HDDs in NAS: 4 x Western Digital WD20EARX, 2 TB each
RAID level: 5
A couple of days ago, my NAS informed me that my RAID was degraded and that HDD #2 had a problem.
The NAS offered a repair option and I accepted it. A short time later, the NAS informed me again that the RAID was degraded, again because of HDD #2.
After that I stopped further repair attempts, switched off the NAS, and ordered 4 new HDDs (Seagate ST2000VM003), planning to repair the RAID first, make a backup, and then replace the other WD20EARX drives.
Then I switched the NAS on, removed HDD #2, replaced it with a new one (ST2000VM003), and started the browser interface.
Unfortunately there was no longer a repair option offered, and the RAID was lost (not present anymore). It looked like a virgin NAS with brand-new HDDs not configured yet.
So here is my question: how can I rescue my data from the NAS?
Please also refer to screenshots below.
Thanks a lot in advance for your kind support and help.
Kind regards
Alexander
Accepted Solution
Correct?
Nope. The procedure is:
- Move new disk 1 to slot 1, and add new disk 2 and old disks 3 and 4.
- Check if the 'disk name'/'Device Role' mapping is the same as in this post. That is expected.
- Re-create the degraded array with 'mdadm --create' like you did before (see the sketch after this list).
- Reboot
- Restart repair process
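For illustration only, a re-create of the degraded array could look roughly like the lines below. This is a sketch, not the exact command: /dev/md2 and the metadata 1.2 / 64K chunk / left-symmetric parameters come from the 'NAS542:2' --examine output further down in this thread, but the partition order is just an example. The order has to follow the 'Device Role' numbers you verified, with the empty slot given as the keyword 'missing'. Don't run it before the mapping is confirmed.
# stop a possibly half-assembled array first
mdadm --stop /dev/md2
# re-create the array degraded; partition order = Device Role 0,1,2,3 (example order, verify first!)
mdadm --create /dev/md2 --level=5 --raid-devices=4 --metadata=1.2 --chunk=64 --layout=left-symmetric /dev/sda3 missing /dev/sdc3 /dev/sdd3
# check that the array came up degraded as expected
cat /proc/mdstat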
Btw., the process will take some more time I guess. ;-)
It's not running at 80 MB/sec as I expected; instead it's 'just' 50 MB/sec. But the 'current rate' is lower. Maybe that disk has more problems than I expected from your problem description. On your screenshot it has 3 read errors in the first 66 GB, while I was hoping for 1 on the whole disk.
BTW, it seems the 'current rate' is used for the expected remaining time. At 50 MB/sec it should be 11 hours, at 26 MB/sec it's 21 hours.
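Rough arithmetic behind those estimates, assuming roughly 2 TB per member disk has to be rebuilt:
2,000,000 MB / 50 MB/sec ≈ 40,000 sec ≈ 11 hours
2,000,000 MB / 26 MB/sec ≈ 77,000 sec ≈ 21 hours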
All Replies
Can you enable the ssh server (somewhere in System > Network), log in over ssh, and post the output of
su
mdadm --examine /dev/sd[abcd]3
I tried to enter the mentioned commands using the PuTTY terminal (I have no Linux) after logging in on the NAS, but received only error messages (unknown command).
Here is the directory of the NAS:
received error messages only (unknown command).
That shouldn't be possible. Can you post the exact commands + the response? In PuTTY you can copy the text to the clipboard just by selecting it with the mouse.
It's possible obviously:
open 192.168.178.21
login as: admin
admin@192.168.178.21's password:
Remote working directory is /home/shares
psftp> su
psftp: unknown command "su"
psftp> mdadm -examine /dev/sd[abcd]3
psftp: unknown command "mdadm"
psftp>
So what might be the problem?
That is not ssh. It looks like an ftp prompt. But I don't know how you could get an ftp prompt using PuTTY. When you have logged in over ssh the prompt should be something like
admin@NAS542:~$
Now tried with Solar-PuTTY.
Here is the result:
~ # su
BusyBox v1.19.4 (2021-04-01 13:06:33 CST) built-in shell (ash)
Enter 'help' for a list of built-in commands.
~ # mdadm ?examine /dev/sd[abcd]3
mdadm: An option must be given to set the mode before a second device
(/dev/sda3) is listed
~ #
!@$#% forum software.
mdadm ?examine
Where there is a ? here, there should be 'minus minus' without a space in between. Not your fault; the forum software exchanges it for something you can't paste into PuTTY.
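Written out in plain text, the intended command is:
mdadm --examine /dev/sd[abcd]3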
Tried again:
login as: admin
admin@192.168.178.21's password:
BusyBox v1.19.4 (2021-04-01 13:06:33 CST) built-in shell (ash)
Enter 'help' for a list of built-in commands.
~ $ su
Password:
BusyBox v1.19.4 (2021-04-01 13:06:33 CST) built-in shell (ash)
Enter 'help' for a list of built-in commands.
~ #
~ # mdadm -examine /dev/sd[abcd]3
mdadm: -e does not set the mode, and so cannot be the first option.
You need /two/ minus signs before examine.
Ahh, now got it.
Sorry for being dumb, but this is my first time working with such tools.
Here is the result:
mdadm --examine /dev/sd[abcd]3
/dev/sda3:
Magic : a92b4efc
Version : 1.2
Feature Map : 0x0
Array UUID : da05a358:f28cfe4e:303ab122:2e3ff7a7
Name : NAS542:2 (local to host NAS542)
Creation Time : Sat Oct 23 20:27:27 2021
Raid Level : raid5
Raid Devices : 4
Avail Dev Size : 3898767360 (1859.08 GiB 1996.17 GB)
Array Size : 5848150464 (5577.23 GiB 5988.51 GB)
Used Dev Size : 3898766976 (1859.08 GiB 1996.17 GB)
Data Offset : 262144 sectors
Super Offset : 8 sectors
State : clean
Device UUID : 4ea2cfa9:1665747a:5b36e87d:968f949d
Update Time : Sun Mar 5 14:01:41 2023
Checksum : 58c7871b - correct
Events : 383
Layout : left-symmetric
Chunk Size : 64K
Device Role : Active device 0
Array State : AAAA ('A' == active, '.' == missing)
/dev/sdb3:
Magic : a92b4efc
Version : 1.2
Feature Map : 0x0
Array UUID : da05a358:f28cfe4e:303ab122:2e3ff7a7
Name : NAS542:2 (local to host NAS542)
Creation Time : Sat Oct 23 20:27:27 2021
Raid Level : raid5
Raid Devices : 4
Avail Dev Size : 3898767360 (1859.08 GiB 1996.17 GB)
Array Size : 5848151040 (5577.23 GiB 5988.51 GB)
Data Offset : 262144 sectors
Super Offset : 8 sectors
State : clean
Device UUID : 7dcc7141:61fd12e6:9a83c1cb:7e530328
Update Time : Sun Mar 5 14:21:15 2023
Checksum : 345260e0 - correct
Events : 397
Layout : left-symmetric
Chunk Size : 64K
Device Role : spare
Array State : ..AA ('A' == active, '.' == missing)
/dev/sdc3:
Magic : a92b4efc
Version : 1.2
Feature Map : 0x0
Array UUID : da05a358:f28cfe4e:303ab122:2e3ff7a7
Name : NAS542:2 (local to host NAS542)
Creation Time : Sat Oct 23 20:27:27 2021
Raid Level : raid5
Raid Devices : 4
Avail Dev Size : 3898767360 (1859.08 GiB 1996.17 GB)
Array Size : 5848151040 (5577.23 GiB 5988.51 GB)
Data Offset : 262144 sectors
Super Offset : 8 sectors
State : clean
Device UUID : 841ed1c2:f2e3a0ee:78d9bc43:369e2b5b
Update Time : Sun Mar 5 14:21:15 2023
Checksum : 696334a4 - correct
Events : 397
Layout : left-symmetric
Chunk Size : 64K
Device Role : Active device 2
Array State : ..AA ('A' == active, '.' == missing)
/dev/sdd3:
Magic : a92b4efc
Version : 1.2
Feature Map : 0x0
Array UUID : da05a358:f28cfe4e:303ab122:2e3ff7a7
Name : NAS542:2 (local to host NAS542)
Creation Time : Sat Oct 23 20:27:27 2021
Raid Level : raid5
Raid Devices : 4
Avail Dev Size : 3898767360 (1859.08 GiB 1996.17 GB)
Array Size : 5848151040 (5577.23 GiB 5988.51 GB)
Data Offset : 262144 sectors
Super Offset : 8 sectors
State : clean
Device UUID : ae491a40:555a8374:d53ba8cd:3f6d4d79
Update Time : Sun Mar 5 14:21:15 2023
Checksum : 149c0798 - correct
Events : 397
Layout : left-symmetric
Chunk Size : 64K
Device Role : Active device 3
Array State : ..AA ('A' == active, '.' == missing)
Hope this information is helpful and you can help me.