RAID 0 not behaving reliably

Hi
My laptop came with a RAID 0 across 2x128GB SSDs. In order to dual-boot it with Linux, I used dmraid first, then realised that mdadm was the newer driver to use.
Since I set up the RAID, the second device tends to drop off on a frequent basis. Since that is the boot drive, it renders the system unusable.
In the BIOS I can see that /dev/sdb is not detected by the Intel SATA controller, and the RAID volume is clearly marked as failed.
Until this morning, booting into a live Linux and assembling the array through mdadm allowed me to reboot into the array, so the array was
sort of persistent for the Intel controller over one reboot. This is no longer the case.
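For reference, here is roughly what I run from the live session to get the array back (the md device number is what it happens to get on my machine and may differ on yours):

```shell
# Inspect the IMSM (Intel Matrix Storage) metadata on both members
sudo mdadm --examine /dev/sda /dev/sdb

# Assemble the IMSM container and the RAID 0 volume inside it
sudo mdadm --assemble --scan

# Check that the volume came up before rebooting
cat /proc/mdstat
sudo mdadm --detail /dev/md126   # md126 on my system; yours may vary
```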

I have several questions:
- Could it be an issue of slightly corrupted metadata? If so, I have tried recreating the array with the controller
(whenever possible), but mostly with mdadm and the metadata option set to imsm.
- Could it be a problem with the Intel driver? I use 13.10, but from a quick read through the forum it looks as
though the newest is not necessarily the best.
- Could it be a hardware issue? If so, are there tools to check the health of the drives?
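On the hardware question, this is what I have looked at so far with smartctl (from the smartmontools package); I am not sure whether these would catch an intermittent SATA dropout:

```shell
# Overall SMART health self-assessment for each drive
sudo smartctl -H /dev/sda
sudo smartctl -H /dev/sdb

# Full attribute table; things like Reallocated_Sector_Ct and
# UDMA_CRC_Error_Count seem relevant to a flaky link
sudo smartctl -A /dev/sdb

# Kick off a short self-test, then read the log a few minutes later
sudo smartctl -t short /dev/sdb
sudo smartctl -l selftest /dev/sdb
```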

Thanks in advance for the guidance.

Best

Athos