[NVMe Performance] Which is the best NVMe Driver and MB Connection (M.2 vs. PCIe)

Somewhere I have read that a PCIe connection may give an NVMe SSD better performance than inserting the SSD directly into an on-board M.2 port.
Reason: The data transfer lanes of the on-board M.2 ports are usually connected to the chipset’s Southbridge, whereas the lanes of the PCIe x16 slots are directly connected to the CPU.

Since my ASRock mainboard has 3 M.2 ports and a free PCIe 3.0 x16 slot, in August 2019 I did some benchmark comparison tests with 2 different NVMe SSDs as system drive, connected either to the on-board M.2_1 port or to the PCIe2 slot of my PC's mainboard.

This was the configuration of my test system:

  • Mainboard: ASRock Fatal1ty Z170 Prof. Gaming i7
  • System Drive and OS:
    1. a heavily used 250 GB Samsung 970 EVO SSD, running a freshly installed Win10 x64 Pro v20H1 Build 18950
    2. a rather new 500 GB Samsung 970 EVO Plus SSD, running a freshly installed Win10 x64 Pro v19H1 Build 18362.267
  • Tested NVMe drivers:
    1. generic MS Win10 in-box STORNVME.SYS dated 07/27/2019 and 03/19/2019, respectively (both wrongly shown as dated 06/21/2006)
    2. specific Samsung NVMe driver v3.1.0.1901 WHQL dated 01/17/2019
    3. generic mod+signed Open Fabrics Alliance (OFA) NVMe driver v1.5.0.0 dated 04/07/2017
  • NVMe SSD connection:
    • either directly to the first on-board M.2 port named M.2_1
    • or via DELOCK M.2>PCIe adapter to the on-board PCIe x16 slot named PCIe2

With both test systems I got absolutely surprising benchmark results, especially after having switched the connection of the NVMe SSD (directly via M.2 or via adapter into the PCIe slot).


A. Test results with a 250 GB Samsung 970 EVO SSD:
(all left pics: M.2 connection, all right pics: PCIe x16 connection)
    
    
  1. Usage of the MS Win10 in-box NVMe driver:

    Z170-250GB970EVO-NVMe-MSinbox-M.2.png

    Z170-250GB970EVO-NVMe-MSinbox-PCIe.png

  2. Usage of the Samsung NVMe driver v3.1.0.1901:

    Z170-250GB970EVO-NVMe-Samsung3101901-M.2.png

    Z170-250GB970EVO-NVMe-Samsung3101901-PCIe.png

  3. Usage of the mod+signed OFA NVMe driver v1.5.0.0:

    Z170-250GB970EVO-NVMe-OFA1500-M.2.png

    Z170-250GB970EVO-NVMe-OFA1500-PCIe.png




B. Test results with a 500 GB Samsung 970 EVO Plus SSD:
(all left pics: M.2 connection, all right pics: PCIe x16 connection)
    
    
  1. Usage of the MS Win10 in-box NVMe driver:

    Z170-500GB970EVO-NVMe-STORNVME.png

    Z170-500GB970EVOPlus-NVMe-STORNVME-PCIe2.png

  2. Usage of the Samsung NVMe driver v3.1.0.1901:

    Z170-500GB970EVO-NVMe-Samsung3101901.png

    Z170-500GB970EVOPlus-NVMe-Samsung3101901-PCIe2.png

  3. Usage of the mod+signed OFA NVMe driver v1.5.0.0:

    Z170-500GB970EVO-NVMe-OFA1500_Pic2_2.png

    Z170-500GB970EVOPlus-NVMe-OFA1500-PCIe2.png




Test results:
  1. With my test system the type of NVMe SSD connection to the mainboard (M.2 or PCIe) has a huge impact on the performance of the NVMe SSD.
    Irrespective of the NVMe driver in use, the measured performance differences (especially the WRITE speeds) between the M.2 and PCIe x16 connections were tremendous and not expected by me at all.
  2. In addition to the choice of the best possible connection (M.2 or PCIe), the choice of the best performing NVMe driver is very important as well.
    Contrary to all my earlier performance comparison tests with different AHCI drivers, the NVMe driver in use has a huge impact on the performance of the NVMe SSD.
  3. The clear performance winner among the tested NVMe drivers was again the OFA driver v1.5.0.0, due to its extremely good 128K READ speeds (twice the numbers of the others!).
    Only in the 4K WRITE scores did the Samsung driver v3.1.0.1901 perform better.


Conclusions:
  1. The performance of an NVMe SSD depends on the type of its data transfer connection to the CPU (direct or via chipset). Connecting an NVMe SSD to a PCIe x16 slot via adapter may give much better performance than connecting the SSD directly via an on-board M.2 port.
  2. Users of an NVMe SSD who want to get the best possible performance should test the OFA NVMe driver v1.5.0.0
    (to avoid boot and/or shutdown problems I recommend disabling the "Fast Boot" option within the BIOS and disabling hibernation, which deletes the hiberfil.sys file, via the Windows command "powercfg /hibernate off").
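
For those who want to apply conclusion 2, here is a minimal sketch of the recommended housekeeping as commands (my own sketch; run from an elevated PowerShell prompt - the "Fast Boot" option itself can only be changed within the BIOS setup):

    # Show which sleep states are currently available
    # (hibernation should disappear from this list after the second command):
    powercfg /a
    # Disable hibernation; this also deletes the hiberfil.sys file:
    powercfg /hibernate off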

Thank you for this comparative test!
PCIe lanes from the CPU are better than those from the Southbridge, especially for 4K speeds.
On most Z170 mobos the on-board M.2 ports get their PCIe lanes from the Southbridge.
But I think there are two kinds of PCI Express slots: one kind belongs to the CPU, the other to the Southbridge.
On the ASRock Fatal1ty Z170 Prof. Gaming i7, the PCIe lanes of the PCIE2/PCIE4/PCIE6 slots belong to the CPU, while those of the PCIE3 slot belong to the Southbridge.

@Fernando If conditions permit, would you like to test the performance difference between the PCIE2 and PCIE3 slots via PCIe connection?

I disagree with the initial question. This is not about NVMe via M.2 vs. PCIe with an adapter; this is about where the PCIe Lanes themselves come from. If you have a Motherboard with a M.2 Slot that uses lanes from the Chipset, the SSD will have higher latency than if you connect it directly to the Processor PCIe Lanes, as there is one extra hop (Processor -> PCIe NVMe SSD vs. Processor -> Chipset -> PCIe NVMe SSD), plus you have to consider the effects of the DMI (Processor -> Chipset) multiplexing bottleneck, since almost everything goes through there. This affects all Intel consumer platforms, because the Processor has only 16 available PCIe Lanes, which are typically used as a 16x slot or bifurcated as 8x/8x or 8x/4x/4x, depending on Chipset/Motherboard. The M.2 Slot pretty much always comes from the Chipset. On AMD AM4 you instead have 20 PCIe Lanes, where you typically have a M.2 Slot that is directly wired to the Processor itself.
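
If you want to check where your own NVMe SSD's lanes come from, here is a rough sketch (my own, assuming Windows 10's PnP cmdlets and the usual enumeration of NVMe controllers under the "Storage controllers" class) that dumps each NVMe controller's PCI location path; a Chipset-attached SSD shows one more PCI bridge hop in its path than a CPU-attached one:

    # List present NVMe storage controllers with their PCI location paths
    # (run in PowerShell on Windows 10):
    Get-PnpDevice -PresentOnly -Class SCSIAdapter |
      Where-Object FriendlyName -Match 'NVM' |
      ForEach-Object {
        $path = Get-PnpDeviceProperty -InstanceId $_.InstanceId `
                  -KeyName 'DEVPKEY_Device_LocationPaths'
        '{0} -> {1}' -f $_.FriendlyName, ($path.Data -join ' | ')
      }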

Also, in 2018 AnandTech hosted an AMA with the Intel Optane team. I asked this:

Response:


To do a proper comparison, you would need to find a platform that provides both a PCIe Slot 4x or higher and a M.2 Slot, with the PCIe lanes coming from either the Processor, the Chipset, or both.

@gloobox :

Are you sure about that statement (look at the benchmark results inserted below)?

Yes, I have tested it after having inserted the NVMe SSD into the PCIe3 slot.
Result: I got even better results than with the connection to the PCIe2 slot.

Here are the results I got with the OFA NVMe driver v1.5.0.0 (left pic: connection via PCIe2 slot, right pic: connection via PCIe3 slot):

Update of the start post
Changelog:

  • added:
    • new benchmark test results while running a 500 GB Samsung 970 EVO Plus SSD with the latest final Win10 x64 Pro Build 18362.267

Enjoy it!
Dieter (alias Fernando)

@zir_blazer :
Thanks for your interesting contribution.
As you can see within the start post, my results do not at all match the quoted statement given by the AnandTech forum's "Intel_Optane_Team". My measured performance differences between an M.2 and a PCIe connected NVMe SSD were huge, although only 1 single NVMe SSD was in use.

Quote from zir_blazer in post #3:

To do a proper comparison, you would need to find a platform that provides both a PCIe Slot 4x or higher and a M.2 Slot, with the PCIe lanes coming from either the Processor, the Chipset, or both.



I own such a mainboard, which provides both sorts of NVMe SSD connections, and I would additionally like to compare them using an Intel NVMe RAID0 array. The creation of such a RAID0 array is no problem when the SSDs are directly connected to the on-board M.2 ports M.2_1 and M.2_2, but unfortunately such an Intel RAID0 array is no longer detected by the BIOS and its Intel RAID utility named "Intel Rapid Storage Technology" after connecting both RAID0 array members to the PCIe slots PCIe2 and PCIe3. This leads me to think that the creation of an Intel RAID array requires the PCIe data transfer lanes to be routed through the chipset's Southbridge.
Is this possible? Do you know how to manage an Intel NVMe RAID0 PCIe configuration? Do you think it is possible at all with my currently used ASRock Z170 Fatal1ty Prof. Gaming i7 mainboard? Do I have to re-configure the M.2 connected array and break it before I can connect the NVMe SSDs to the PCIe slots and try to create a new RAID0 array?
Thanks for your support.

I have checked your Motherboard Manual. While it doesn't have a nice Block Diagram (like this) showing exactly what goes where, I can infer this:

PCIE2/PCIE4/PCIE5 Slots come from the Processor's 16 PCIe Lanes, bifurcated as 8x/4x/4x, respectively.
PCIE3 Slot comes from the Southbridge.
The 3 M.2 Slots should also come from the Southbridge. All 3 share lanes with the SATA and SATA Express Ports.


The proper comparison would be a PCIe-to-M.2 adapter in PCIE3 vs. the same M.2 NVMe SSD in any of the three M.2 Slots, like the first guy said. However, keep in mind that all 3 M.2 Slots of your Motherboard share lanes with both SATA and SATA Express ports, which means that there should be some lane switches somewhere that may add latency that PCIE3 doesn't have, thus it makes sense that they perform worse than it. A fair comparison would require a pure M.2 Slot that is directly wired with no intermediates (at some point I recall having suggested somewhere a pure hardwired 8x/8x Motherboard instead of a 16x with switches to bifurcate to 8x/8x for this very reason; it should have a slightly lower bill of materials and slightly higher performance, assuming that there is no scenario where you miss 16x bandwidth).
Now, the big question is why you get better results when the SSD is plugged into the Chipset PCIE3 rather than the Processor PCIE2. To be honest, I have absolutely no idea; logic dictates that it should be the other way around, as the Chipset is an extra hop and requires multiplexing all Chipset peripherals' data through DMI. Could it be possible that there is some extra caching going on?


About the RAID 0 thing, I have no idea. Chances are that it is a Chipset-only feature, thus it only works with NVMe SSDs connected to it, ignoring those that you plug directly into the Processor, but after some quick googling I didn't find documentation that explicitly says so.
Keep in mind that if you use two SSDs in RAID 0 via the Chipset, some operations will have their performance capped by the DMI bottleneck. Check these benchmarks: a single Samsung 950 Pro 512 GiB does 2.6 GB/s reads, whereas a RAID 0 with two caps out at 3.4 GB/s. It should scale properly if you wire both of them directly to the Processor, assuming you use something else to RAID them. For other operations that don't max out DMI, you do get nearly twice the performance.
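
For reference, here are my own back-of-the-envelope numbers for that cap (assuming DMI 3.0 is electrically equivalent to a PCIe 3.0 x4 link with 128b/130b encoding):

    4 lanes x 8 GT/s x (128/130) / 8 bits per byte ≈ 3.94 GB/s raw link bandwidth
    minus PCIe packet/protocol overhead            ≈ 3.4-3.5 GB/s usable payload

which is exactly the region where the quoted RAID 0 tops out.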

@zir_blazer :
Thank you very much for your additional investigations regarding the PCIe lane connections of my mainboard’s M.2 ports and PCIe slots.

Here is my statement:
Despite the content of the mainboard's manual, the results of my additional benchmark tests attached below indicate that the PCIe3 slot of my ASRock Z170 board is directly connected to the CPU (like the PCIe2 slot).
All tests were done with a 250 GB Samsung 970 EVO while running the mod+signed OFA driver v1.5.0.0.

A. SSD directly connected to the M.2_1 port:

Z170-250GB970EVO-NVMe-OFA1500-M2_1.png



B. SSD connected to the PCIe2 slot (via adapter):

Z170-250GB970EVO-NVMe-OFA1500-PCIe2_Pic3.png



C. SSD connected to the PCIe3 slot (via adapter):

Z170-250GB970EVO-NVMe-OFA1500-PCIe3_Pic3.png



If I should ever manage to create an Intel RAID0 array consisting of 2 PCIe connected NVMe SSDs with my PC, I will publish the M.2/PCIe connection comparison results within this thread.

@Fernando which adapter do you use/recommend?

@guigo21 :
All M.2>PCIe adapter cards that support PCIe 3.0 x4 will work fine. I chose the DELOCK card and was very satisfied with its quality.

@Fernando , really confused, a larger score is better right?

Here is my current score:
https://imgur.com/a/9wws4K5

But my userbenchmark score:
https://www.userbenchmark.com/UserRun/19658733

states:
"Performing way below expectations (11th percentile)"

I'm on Windows 7 x64 booting in Legacy mode & haven't performed a reinstall or any driver hacking, just used the Samsung clone tool. (I will most likely reinstall, as I've been noticing some small freezes in certain apps.) Still, I would've thought my benchmarks would be lower than yours or those of others who have really put in a lot of work here. As a note, my NVMe SSD is a 970 PRO & you listed a 970 EVO, and I am connecting directly to my M.2 slot rather than PCIe, but it's really confusing why UserBenchmark makes that comment "Performing below expectations".

Yes, but only if you compare results of the same benchmark tool.

I thought I pasted an imgur link of the same benchmark; Anvil’s Storage Utilities

@kai :
Each system is different, and even the same NVMe SSD model may give you different results. When I started this thread, it was not my intention to get the best possible scores. I just wanted to find out whether there is a performance difference between an M.2 and a PCIe connection of the NVMe SSD. Furthermore, the Samsung 970 Pro is much more expensive and should be better than the Samsung 970 EVO or EVO Plus SSDs.
By the way - it is hard to believe that you booted the OS Win7 x64 in Legacy mode. On which disk drive is the MBR, and where is the OS?
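
(If you want to verify the boot mode yourself, one quick check - my own sketch, assuming a default BCD store - is the boot loader path of the current OS entry:)

    # Run in an elevated PowerShell; works on Win7 x64 and Win10 alike:
    bcdedit /enum '{current}'
    # path \Windows\system32\winload.exe -> Legacy/MBR (BIOS) boot
    # path \Windows\system32\winload.efi -> UEFI boot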

Recently I have replaced my formerly used Intel Z170 chipset PC system (with a Skylake CPU) with an AMD X570 chipset one (with a Ryzen 5 3600 CPU). To benefit from the on-board M.2 ports with full PCIe 4.0 support, I purchased a 1 TB Sabrent Rocket 4.0 NVMe SSD and clean-installed Win10 x64 Pro Build 19041.21 onto it.
After having done some tests with my new AMD X570 system, I made some discoveries, which may be interesting for other users as well.

These were the NVMe drivers whose performance I tested with the PCIe 4.0 supporting system:

  1. Generic Win10 x64 in-box MS NVMe driver dated 12/07/2019: (wrongly shown as being dated 06/21/2006)

    Due to the support of the PCIe 4.0 data transfer standard by the on-board M.2 ports and the inserted NVMe SSD, the AMD X570 chipset system gave me much better benchmark results than those I got with my previously used Intel system and its PCIe 3.0 ports.

    Here are the related pictures:
    (left Pic: PCIe 4.0 supporting AMD X570 chipset system + 1 TB Sabrent Rocket 4.0 SSD, right Pic: PCIe 3.0 supporting Intel Z170 chipset + 500 GB Samsung 970 EVO SSD)

    Anvil-X570-1TBSabrent4.0-NVMe-STORNVME-M2-20H1.png

    Z170-500GB970EVO-NVMe-STORNVME-PCIe-20H1.png


  2. 64bit mod+signed Win10 OFA NVMe driver v1.5.0.0 dated 04/07/2017:

    Contrary to my previously used Intel Z170 chipset system, I was not able to get the mod+signed OFA driver v1.5.0.0 properly installed onto the AMD X570 system. Although I had disabled the "Fast Boot" and "Hibernate" options, the system became unbootable after the installation, and this problem was not repairable by the OS.
  3. 64bit Win10 Samsung NVMe driver v3.2.0.1910 WHQL dated 09/19/2019:
    (Notes: The installation had to be forced. Before starting with the installation, I set a "Restore Point"; see the command sketch after this list.)

    As you can see below, this driver gave my X570 chipset PCIe 4.0 system the best overall benchmark results.

    Anvil-X570-1TBSabrent4.0-NVMe-Samsung3201910-M2-20H1.png

  4. 64bit Win7-10 Intel NVMe driver v4.4.0.1003 WHQL dated 05/16/2019:
    (Notes: The installation had to be forced. Before starting with the installation, I set a "Restore Point".)

    Here are the results I got:

    Anvil-X570-1TBSabrent4.0-NVMe-Intel4401003-M2-20H1.png

  5. 64bit Win8-10 Intel RST NVMe driver v17.8.0.1065 WHQL dated 12/09/2019:
    (Notes: The installation had to be forced. Before starting with the installation, I set a "Restore Point".)

    Here are the results I got:

    Anvil-X570-1TBSabrent4.0-NVMe-Intel17801065-M2-20H1.png
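
Since several of the drivers above had to be force-installed, here is a minimal sketch of the preparation steps as commands (my own sketch, run in an elevated Windows PowerShell; the INF path is hypothetical, and the actual "forced" driver switch was then done via Device Manager's "Update driver > Have Disk" route):

    # Set a "Restore Point" first, as noted above
    # (Checkpoint-Computer exists in Windows PowerShell, not PowerShell Core):
    Checkpoint-Computer -Description 'Before NVMe driver swap' `
                        -RestorePointType MODIFY_SETTINGS
    # Stage the (mod+signed) driver package into the driver store;
    # the path to the INF below is only a placeholder:
    pnputil /add-driver .\nvme_driver\nvme.inf /install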



Evaluation of the results:
  • A PCIe 4.0 supporting system brings not only quicker access to applications, but much better data transfer performance as well.
  • The Samsung NVMe driver v3.2.0.1910 and the Intel NVMe driver v4.4.0.1003 seem to perform better than the Win10 in-box MS NVMe driver, and obviously they run fine even with NVMe SSDs that were not manufactured by Samsung or Intel.

Hi,

It's been a while since I followed this thread, but I recently found data on another metric that is just as important as NVMe speed: notebook battery life. What I found from my research is that the driver you use affects not only the speed of the drive but also battery life. I will refer you to this site for review: https://www.tweaktown.com/articles/8856/…udy/index3.html. They found a significant boost to battery life when they used the Microsoft driver vs. the Samsung driver on the Samsung 960 EVO. In my case I own a Samsung PM961 that I installed in a Dell XPS 13 notebook, and I can tell you that I count every minute of battery life and that the NVMe drive has a huge impact on it compared to when I had a SATA SSD in there. I'll wrap up with my findings: at least on the latest Samsung driver v3.3.0.2003 my notebook feels like it's lasting longer than before, though I also recently upgraded the ME and Intel thermal framework among other drivers. So I am curious about switching to the Microsoft driver, as my performance here is limited to 2x PCIe bandwidth anyhow. Furthermore, I am even more curious about the Open Fabrics driver in this respect.

Thanks

Just tried out the Open Fabrics driver (after making a System Restore Point and importing the certificate) on Windows 10 Pro 2004 (19041.264) Preview. Unfortunately it ended quickly with an "Automatic Restore".
I did not delete/disable hibernation (I have switched off Fast Boot already because of issues with AMD drivers), so I will try that as well.

EddieZ:
If you want the best possible performance of your NVMe SSD (which one do you use?), I recommend installing the mod+signed generic Samsung NVMe driver v3.3.0.2003.

For NVMe drives I use a WD Black SN750 500GB on M2_1 (PCIe 3.0 x4) and a WD Blue SN500 on M2_2 (PCIe 3.0 x2). The boot disk is the WD Black SN750.
Do the Samsung drivers also run on non-Samsung SSDs?

Yes, if you use the mod+signed generic ones.