The avalanche of benchmark results does not make it easy to understand the real value of the SSD.
What makes it even more confusing is that, in that case, the Performance Pro seems to sit below the competition most of the time, or on a par with it, or only slightly ahead.
There are a few benchmark tests where it seems to shine, but it is not immediately obvious whether (or why) those count for more than the others.
In addition to the detailed results, a summary of the few tests that best represent each benchmark suite would be great, instead of listing every single test, which is probably redundant. Something like:
Benchmark test - Max score - SSD#1 - SSD#2 - SSD#3
BMK1 TST1 - 2000 - 40 - 45 - 36
BMK1 TST2 - 1000 - 61 - 80 - 48
BMK2 TST1 - 2000 - 25 - 24 - 28
...
BMK4 TST1 - 1000 - 12 - 10 - 8
TOTAL - 10000 - 225 - 270 - 190
Or even better, replace the names of the benchmarks and tests in the summary with readable names such as "Average sequential write in steady state", or whatever matters most for SSDs, built from one test or a combination of several tests across one or more benchmarks.
The summary table could even adjust the weighting to show results for different types of SSD usage, chosen from a drop-down list such as "Main Windows 7 OS and apps", "SQL 2012 database server", "Web Apache x.xx Linux y.yy server", or "Media server", with the latter giving more weight, for instance, to sequential reads.
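To make the idea concrete, here is a minimal sketch of that profile-weighted scoring. The test names, scores, and weights are made up for illustration; only the mechanism (re-weighting the same raw scores per usage profile) is the point.

```python
# Hypothetical data: normalized per-test scores for three drives.
RESULTS = {
    "Avg sequential read":  {"SSD#1": 40, "SSD#2": 45, "SSD#3": 36},
    "Avg sequential write": {"SSD#1": 61, "SSD#2": 80, "SSD#3": 48},
    "Avg 4K random read":   {"SSD#1": 25, "SSD#2": 24, "SSD#3": 28},
}

# Each usage profile re-weights the tests; a media server cares more
# about sequential reads, a database server about random reads.
PROFILES = {
    "Media server":    {"Avg sequential read": 3.0,
                        "Avg sequential write": 1.0,
                        "Avg 4K random read": 1.0},
    "Database server": {"Avg sequential read": 1.0,
                        "Avg sequential write": 1.0,
                        "Avg 4K random read": 3.0},
}

def profile_totals(profile_name):
    """Return each drive's weighted total for the chosen usage profile."""
    weights = PROFILES[profile_name]
    totals = {}
    for test, scores in RESULTS.items():
        for drive, score in scores.items():
            totals[drive] = totals.get(drive, 0.0) + score * weights[test]
    return totals

print(profile_totals("Media server"))
```

Picking a different profile from the drop-down would just swap the weight set and recompute the totals from the same stored raw scores.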
Using large maximums such as 1000 or 2000 allows for future software and hardware improvements. For instance, on a 1000-point scale, a rate of 500 MB/s could be rated 20/1000.
Such ratings could even be adjusted automatically when technology advances. For instance, if an old SSD has a "20/1000" rating on Windows 7, and Windows 8 is known to improve the score of new SSDs by a factor of 1.5, the table could display the adjusted score as "30/1000 (*)", with the asterisk pointing to an "Adjusted but not retested" note.
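That adjustment step could look something like the sketch below. The 1.5x Windows 8 factor is just the example figure from above, not a real measurement, and the cap at the scale maximum is an assumption.

```python
def adjusted_score(old_score, factor, max_score=1000):
    """Scale an old rating by a known improvement factor and flag it
    as estimated rather than retested."""
    # Cap at the scale maximum so an adjustment can never exceed it.
    new_score = min(round(old_score * factor), max_score)
    return f"{new_score}/{max_score} (*)"  # (*) = adjusted but not retested

print(adjusted_score(20, 1.5))  # the example above: 20/1000 becomes 30/1000
```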
The same kind of comparative table for technical specs and prices would be great for the Intro too, with number of chips, capacity, die, RAM, dimensions, etc.
Actually, what I miss most when choosing an SSD, a CPU, a video card, etc. is a table comparing all the SSDs or CPUs currently on sale, with a way to sort them dynamically by highest performance or lowest price from the same table.
I just can't memorize every generation or the differences between brands and models, and the benchmarks here or on other sites compare just a few products at a given point in time. Two months later, the benchmark is made obsolete by new models, so you have to read through several benchmarks and reviews just to buy a bloody drive. There has to be a simpler, easier and less nerdy way to make such purchase decisions.
We need some kind of hardware database that we can consult, or use to automatically build specific configurations such as "Smallest silent computer", "Game rig", etc. from the set of best-matching parts.
I found a lot of grammar mistakes, typos and doubled words from the Intro to the Conclusion, even now. Please have the text proofread manually, not just run through a spell-checker; it makes the review look unprofessional.