RAID Crashing - NULL performance

Hi everyone,

I'm encountering some strange behaviour with the embedded Intel RAID on one of my PCs.

Specs:
Mobo: http://www.jetwaycomputer.com/NF9J.html (latest BIOS installed)
Intel ROM version: 12.6.0.1867
Intel RST version: 13.6.0.1002

The RAID works perfectly with some disks but fails completely with others.

Working HDDs:
WD RE4 1TB
WD RE4 2TB
WD Red 1TB

Failing HDD:
WD Red 4TB

Being tested:
WD Red 2TB

Description of the problem:
I build a new RAID array (RAID 1 or RAID 5), reboot a few times, and the RAID is corrupted. Some drives show “Error Occurred”, others are back to "Non-RAID Disk".
This happens even if I don't install anything on the array.

It’s not always the same drive that shows the error.
The cables have been changed.

So either the system is incompatible with the WD Red 4TB (due to firmware or whatever), which seems odd, or there is a setting that's wrong somewhere (maybe in the HDD firmware).
In fact, I don’t know.

Does this ring a bell for anyone?

Any advice would be highly appreciated.

@Elm:
Welcome to the Win-RAID Forum!

What have you done with the “others” previously? Did you check their health using WD’s Data Lifeguard Diagnostics?
Have they already been used, maybe even as members of a RAID array? If yes, did you always make sure that the previously created RAID array had been deleted before you created a new one?

Regards
Dieter (alias Fernando)

Hi Dieter,

thanks for the reply.

The “other” drives (the WD Red 4TB) are brand new; they arrived last week.
They have been tested with WD’s Data Lifeguard Diagnostics and passed (although we didn’t run the full surface test).
Each time we re-test a RAID array with those drives, we reset their RAID status first.
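(As a side note, the same kind of health check can also be scripted with smartctl from smartmontools instead of Data Lifeguard; a rough Python sketch of what that might look like, where the /dev/sdb path is only a placeholder for the drive under test:)

```python
import subprocess

DEVICE = "/dev/sdb"  # placeholder; point this at the drive under test

# "smartctl -H" prints the drive's overall SMART health assessment;
# "smartctl -t short" starts the built-in short self-test (a couple of minutes).
health = subprocess.run(["smartctl", "-H", DEVICE],
                        capture_output=True, text=True)
print(health.stdout)
subprocess.run(["smartctl", "-t", "short", DEVICE])
```

(The self-test results can then be read back afterwards with "smartctl -l selftest".)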

From what I can see, most of the time one drive's activity LED stays lit after the reboot, and that is the drive that fails. It's not always the same one, though.
Here is a screenshot of what happens: https://communities.intel.com/servlet/JiveServlet/showImage/2-421500-291199/DSC_0002_5.JPG
I even swapped the CPU for a low-power one to make sure the PSU is not the problem.

With your expertise, do you think I can rule out an Option ROM issue, or is it worth trying to upgrade it?

If I were you, I would change the Intel RAID ROM and the Intel RAID driver version.
According to my performance test results (look >here<), the Intel RST driver/OROM combination v13.6 is not really recommended.
For your Intel 8-Series RAID system I would prefer the Intel RAID driver/OROM combination v13.2.4.1000/13.2.2.2224.

Thanks Fernando,

I’ve seen many threads on your forum about BIOS modding, but much of the language is not quite clear to me and I do not want to make a mistake.
Would you be kind enough to tell me which thread fits my situation?
I have an AMI Aptio UEFI firmware.

Thanks again.

@Elm :
The modification of AMI UEFI BIOSes is very easy and safe using the tool named “UEFI BIOS Updater” (UBU), which has been developed by our Forum member SoniX.
Please look into the start post of >this< thread.

So I got UBU, analysed the latest version of the firmware and updated the RAID modules

From:

OROM IRST RAID for SATA - 12.6.0.1867
EFI IRST RAID for SATA - 12.5.0.1815

To:

OROM IRST RAID for SATA - 13.2.2.2224
EFI IRST RAID for SATA - 13.2.0.2134

Is that correct?

Just so I understand: what is the role of the Option ROM vs. the IRST Windows driver in RAID management?

Yes, the update of both Intel RAID BIOS modules (Option ROM for "Legacy" mode, "RaidDriver" for UEFI mode) has obviously been done correctly!

The Intel RAID BIOS modules (the Option ROM or the EFI "RaidDriver", respectively) have to be loaded while booting into the OS; otherwise the existing RAID array will not be detected by the OS. Furthermore, the related Intel RAID BIOS module is responsible for the creation and the health of the Intel RAID array.
The Intel RAID driver, which is part of the Windows OS, manages the function and performance of the on-board Intel SATA Controller, if it has been set to "RAID" within the BIOS.

Thanks, I definitely have a better understanding now.
Last questions before I update my firmware:

- Do you have the feeling that my problem could really be firmware related?
- Which component handles the RAID functionality itself, i.e. writing to both drives when configured in RAID 1? Is that the BIOS, the Windows driver, or something else?

I am not sure about that, but in any case it may be a good idea to optimize the RAID system by using a better-matching Intel RAID BIOS module/driver combo.

It is mainly the Intel RAID driver, in combination with the specific Intel RAID Controller, that manages the RAID functionality.

Thanks Fernando, I’ll post the results here after the UEFI update (if I dare to do it).
For the Windows driver, do you definitely recommend 13.2.4.1000 over the latest one (14.8.0.1042)?

Yes, but the latest Intel RAID driver that is compatible with your 8-Series chipset is v14.8.12.1059, not v14.8.0.1042.

I was looking at https://downloadcenter.intel.com/product/55005/Intel-Rapid-Storage-Technology-Intel-RST- where I saw 14.8.0.1042 and 13.2.4.1000, but no trace of 14.8.12.1059.
Then I searched for it in the forum and found it here: Intel RST/RSTe Drivers (newest: v14.8.12.1059 WHQL/v4.5.4.1030 WHQL)

How is it that you have more recent drivers than the ones referenced on the official site?
Moreover, the guys on the Intel forum were asking whether you think it could be a TLER issue, even though I already checked with smartctl, which reported a 7 s read / 7 s write TLER timeout (the same as the working RE4 drives).
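(For reference, that smartctl check can be scripted as well; a minimal Python sketch reading the SCT Error Recovery Control (TLER) log, with /dev/sdb again only a placeholder for one of the Reds:)

```python
import subprocess

DEVICE = "/dev/sdb"  # placeholder; point this at the WD Red under test

# "smartctl -l scterc" prints the SCT Error Recovery Control (TLER) timeouts
# in tenths of a second, e.g. "Read: 70 (7.0 seconds)".
result = subprocess.run(["smartctl", "-l", "scterc", DEVICE],
                        capture_output=True, text=True, check=True)
print(result.stdout)
```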

Because Intel has always been very restrictive and slow regarding the presentation of their newest RST/RSTe drivers within their Download Center.
Furthermore, Intel has never offered drivers on their websites for everyone, but only for users of mainboards manufactured by Intel itself. Since Intel stopped producing mainboards a while ago, they obviously don’t see a reason to publish new Intel RST drivers within the Download Center.

So I have finally tested with the WD Red 2TB, and the problem is the same as with the 4TB.
I’ve updated the BIOS and am currently testing.

Fernando,

I’ve applied the new firmware and tested; the behaviour is no better than before, it even seems worse.
What I have seen is:

- sometimes, when quitting the Option ROM, one of the HDD LEDs stays lit and the subsequent Windows installation fails
- sometimes one of the HDD LEDs stays lit during the Windows installation, which is then very, very slow, and at the first reboot the RAID is corrupted
- sometimes the Windows installation seems to go well, but the reboot shows a corrupted RAID

It is almost as if the controller were unable to work correctly with these HDDs.

It is still not always the same HDD that fails.

The problem occurs only with the WD Red 2TB and 4TB.

Any idea what I can test?

If I were you, I would contact the WD Support.

That’s what I did.
Hopefully I’ll get an answer.
It is indeed very strange that those two models have problems and the others do not.

Fernando,

I solved my problem by bypassing the SATA backplanes to which my drives were connected.
With the HDDs connected directly to the SATA connectors on the mobo, everything is fine.
I can create all RAID configurations with all types of drives.

So now I need to understand what’s wrong with the backplanes and whether I can find replacements for them.

Thanks for the support.

Should you know a good supplier for SATA backplanes, I’m interested :slight_smile:

@Elm:
Thanks for your feedback. It is good that you now know the origin of your problem.

I am sorry, but I don’t have any experience with this kind of hardware.