I have been in and around the benchmarking and benchmarketing scene for 25 years in the PC, server, and now smartphone and tablet markets. Benchmarks have been cyclical for years, and the cycle is fairly predictable: they rotate between manufacturer-led, consortium-led, benchmark company-led and industry standard-led formations. There are hybrids as well, like manufacturer-led consortiums. Over the course of the past few years, there has been a proliferation of inappropriate or misunderstood benchmarks in the mobile world, benchmarks that serve to do nothing other than help users generate a single number, a benchmark score, that is supposed to quantify the performance and, by proxy, the experience of that device.

This impacts chipmakers and chip designers like Apple, ARM Holdings, Huawei, Intel, MediaTek, NVIDIA, Qualcomm and Samsung Electronics. It also impacts handset makers like Apple, HTC, Lenovo-Motorola, LG, Sony and Samsung Electronics and the decisions they make. Most importantly, it impacts consumers, and I'll give examples why.

These benchmarks are simply run to see the fastest theoretical performance the system could deliver, without regard to battery life, operating systems, applications or real-world use cases. Many of them simply load up all of the cores to their maximum frequency, a state phones never operate in outside of benchmarks. As a result, these benchmarks have been given the label across the industry of "inaccurate or inappropriate benchmarks" that don't accurately represent a user's experience. Some of these benchmarks, and the people who use them in reviews, have been responsible for proliferating the 8-core myth, too.

You may be asking, "why should I even care?" First of all, if you look at the history of microprocessor or SoC pricing, you will find a direct correlation between perceived performance and pricing. And don't even think of invoking the "Apple rule", as they have dominated the mobile SoC benchmarks for most of five years.

Consider first an example of something I read this morning in the DailyMail:

"Apple rival in speed test: Consumer study shows price and brand is not guarantee to finding best performing device"

In this example, the DailyMail used GeekBench to justify the article and headline. We all know in the industry that a £99 Tesco tablet doesn't outperform a £300 Apple iPad mini 3 on real benchmarks or in the real experience. Admittedly, the DailyMail example is the worst I have seen, but I see this kind of stuff every time I read reviews about a new smartphone or tablet.

So if you are an SoC manufacturer like Huawei, MediaTek, Qualcomm, or Samsung Electronics, you take a "can't beat them, join them" approach and add more processor cores to your SoC. Thus we have the 8-core myth: 8 cores so that you look better on inappropriate benchmarks. Some have done this to get the "64-bitness", too. So how is that 64-bit Android thing working out? Apple and Intel have not taken this approach of wantonly adding meaningless CPU cores, and I applaud them for taking the high road. Qualcomm, I believe, will move back to a different approach with their future Kryo core.

Why do some use inaccurate or inappropriate mobile benchmarks?

The reason people use inaccurate benchmarks is that they make it really easy to simply download, press a button and get a number telling you how fast or slow your smartphone is, in theory. It takes a lot longer to run a benchmark that reflects real-world usage. Part of the reason for this has been that press and device manufacturers have been publishing scores from these inaccurate or inappropriate benchmarks, giving them credibility.

AnTuTu and Geekbench, the most commonly used misunderstood or inappropriate mobile benchmarks

From my experience, along with that of many other experts I have talked to within the industry on this topic, the general agreement is that AnTuTu and Geekbench are the two mobile benchmarks that are the most used and misunderstood. These benchmarks are known as synthetic benchmarks. They generally test the components of a computer, or in this case a smartphone, to see their highest performance in an absolute best-case scenario, usually without much context about how those components are used. But the reality is that these benchmarks don't even remotely test what a normal user would be doing on their smartphone.
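To make concrete why an all-cores-pegged synthetic score says so little about everyday use, here is a minimal, hypothetical sketch in plain Python (not anything AnTuTu or Geekbench actually runs; the function names and the scoring formula are my own inventions for illustration). It saturates every core with arithmetic and boils the run down to one number, while measuring nothing about battery drain, thermal throttling or real application behavior:

```python
import multiprocessing as mp
import os
import time


def spin(n: int) -> int:
    """Busy-loop doing integer arithmetic at full tilt -- the kind of
    peak-load work a synthetic benchmark runs, and a state a phone
    almost never reaches outside of a benchmark."""
    acc = 0
    for i in range(n):
        acc += i * i
    return acc


def synthetic_score(iterations: int = 2_000_000) -> float:
    """Load every core at once, then reduce the run to a single number.

    Note what is *not* measured: battery use, sustained (throttled)
    performance, the OS, or any workload a user actually runs.
    """
    cores = os.cpu_count() or 1
    start = time.perf_counter()
    with mp.Pool(cores) as pool:
        pool.map(spin, [iterations] * cores)
    elapsed = time.perf_counter() - start
    # Arbitrary scaling: more cores and a shorter run yield a bigger
    # score -- which is exactly why adding cores flatters the number.
    return cores * iterations / elapsed


if __name__ == "__main__":
    print(f"synthetic score: {synthetic_score():,.0f}")
```

Notice that the score grows with the core count even if a real user would never light up all of those cores at once, which is the mechanism behind the 8-core myth described above.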