Intel CPU Innovation... or Lack Thereof?



CPU innovation is dead. How many of you feel this way? Like every time you upgrade your computer, you're saving 32 cents on your power bill this month and not getting better performance in games, creative applications, or even just general usage. Well, the thing about feelings is that they're difficult to quantify, so we gathered up every top Intel CPU from the last 10 years and put together an epic comparison for you: performance, power consumption, thermals, and price. Is innovation dead? Is it more than a feeling?

GFUEL is the sugar-free alternative energy beverage that helps you maintain focus and endurance over long days and gaming sessions. Use our code at the link below.

Let's start with our testing methodology. We used the same video card on every platform, even though the GTX 1080 wasn't even a twinkle in Jensen's eye when the Pentium Extreme Edition 955 was the king of the hill. We did this because we wanted the main variable to be CPU performance, rather than allowing this to turn into a comparison of the best overall systems through the years. With that said, DDR memory has gone all the way from 667MHz dual channel to over 2000MHz quad channel, so our compromise there was to use a high-end RAM and motherboard combination that would have been typical at the time. We went ten years back and, for every year, picked out the first Extreme Edition Intel CPU. Then, for each one, we performed the following tests: temperature under load, overall system power consumption in our skybox stare test, and a suite of modern CPU and gaming benchmarks. And since this project already had me tunneling into the dark recesses of my past, I was unable to resist benchmarking each CPU on Half-Life 2 as well. Using such old hardware presented a few challenges, not least of which was that we had to use Windows 7 on all of our builds to ensure driver and game support.

CPU Mark was the first test, and the results were very boring since they were just numbers on a spreadsheet, but then I had John turn them into a graph. Ah, much better. While the multi-threaded results that use all of the CPU's cores show incremental performance improvements, the hidden story here is the single-threaded results, where we see Intel's current 10-core flagship, the 6950X, getting outperformed by chips going all the way back to 2011. In y-cruncher, a benchmark that calculates 50 million digits of pi, we tested in multi-threaded mode, which gave us similar results to CPU Mark: incremental improvements. Cinebench proved to be interesting, as I initially expected the results would mimic CPU Mark, but I was pleasantly surprised: the 6950X sat at the top as king in both the multi-threaded and single-core tests, with the oldest chip, a Pentium, on the bottom. Lucky dog. On to our first real-world benchmark: Adobe Media Encoder again shows the older X79 chips, with their higher single-core clock speeds, beating the 6950X in the more commonly used GPU-accelerated rendering method. A video where we went more into this and why it happens can be found here.

Each of the game benchmarks seems to tell the same old story of incremental performance bumps. Rise of the Tomb Raider, our modern game engine representative, managed to scale through all generations, only slowing down when going from 8 to 10 cores. Crysis 3, representing an older triple-A engine, improved very little past the i7-965, as it is not a very multi-threaded title. And then there was Half-Life 2, whose numbers don't show us much other than boring old incremental bumps, but here are the results for nostalgia purposes.
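For anyone curious how a pile of results like this gets collected, here is a rough, hypothetical Python sketch of a benchmark harness, not our actual scripts: the executable names and flags are placeholders, and it simply times each test a few times and appends the best run to a CSV so it can be graphed per CPU later.

    # Hypothetical benchmark harness sketch: the executable names and command-line
    # flags below are placeholders, not the actual tooling used for this video.
    import csv
    import subprocess
    import time

    BENCHMARKS = {
        "y-cruncher_50m_digits": ["ycruncher.exe", "bench", "50m"],  # assumed CLI
        "cinebench_multi_core":  ["cinebench.exe", "-cb_cpux"],      # assumed CLI
    }

    def run_once(cmd):
        """Run one benchmark command and return its wall-clock time in seconds."""
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    def benchmark_cpu(cpu_name, runs=3):
        """Time every benchmark `runs` times and append the best result to a CSV."""
        with open("results.csv", "a", newline="") as f:
            writer = csv.writer(f)
            for test_name, cmd in BENCHMARKS.items():
                best = min(run_once(cmd) for _ in range(runs))
                writer.writerow([cpu_name, test_name, round(best, 3)])

    if __name__ == "__main__":
        benchmark_cpu("i7-6950X")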
So yes, for the most part each chip performed better than its predecessor, but the margin of improvement from chip to chip shows a steady downward trend. Though I'm sure for some of you this didn't come as much of a surprise: Intel has publicly stated their R&D is less focused on huge increases in processing power for consumer PCs, because they want to direct their attention to mobile, data centers, Internet of Things devices, and the cloud. So then, CPU power draw and heat output, which are very important to those markets: that's way down on newer products, right? Only sort of. At idle, power-saving features have improved things dramatically, but when working hard, on the high end at least, Intel has settled into a thermal and power budget they're comfortable with, and they seem to be just adding more cores accordingly. So efficiency is up, that is, performance per watt, but your power draw while gaming will likely remain mostly unchanged.

Let's look at pricing trends now. For an entire decade, a thousand US dollars, give or take, would get you the pinnacle of Intel consumer engineering. Not so anymore: they're asking a whopping seventeen hundred dollars for their flagship enthusiast product.

So, conclusion time then: decreasing performance gains and increasing prices. The rational human in me might point at the collapse of Moore's Law, caused by the cost and difficulty of continuing to shrink silicon transistors, and at Intel's design goals that have shifted to address growing markets rather than shrinking ones, to explain this. But the conspiracy theorist in me noticed three things. One: after 2011 we stopped seeing large single-core performance bumps from Extreme Edition chips, and after 2013 we stopped seeing tangible single-core improvements in consumer chips at all. Two: the last time AMD had a CPU in competition with Intel for the high-end market was in 2008. And three, perhaps most incriminatingly: the Intel logo looks suspiciously like the eye of a reptile turned on its side.

It's time for another Razer giveaway, and today it's all about the Kraken Pro and Kraken 7.1 V2 headsets from Razer. Both models include 50-millimeter drivers, lightweight frames, a headband designed for better weight distribution and less clamping force, larger interchangeable ear cushions that are softer and have better sound isolation, and fully retractable unidirectional microphones. The 7.1 features 7.1 virtual surround sound over its USB connection, and it's got active noise cancelling and Razer Chroma lighting as well. We're giving away five of the Pros as well as one 7.1, six in total, as part of a gaming bundle giveaway from Razer. Enter through the link in the video description.

So thanks for watching, guys. If this video sucked, you know what to do, but if it was awesome, get subscribed, hit that like button, or even consider checking out our affiliate code links to where to buy CPUs at Amazon in the video description. You can also buy a cool shirt like this one from our merch store, or join our forum, which is freaking awesome. All that is linked down below. Now that you're done doing all that stuff, you're probably wondering what to watch next, so click that little button in the top right corner to check out our latest Through the Ages, or, not the latest, this is the latest, the previous one, where we looked at CPU water blocks.

38 Comments

  1. Dave G said:

    I disagree with you Linus, and here is why…

    I have been using and programming computers since the 1970s. I went to university for my Bachelor's in Electronics Engineering, I am a multi-published author in many electronics magazines, books, and journals, and I have been a hardware developer and software developer for more than 40 years now. Yes, I'm old.

    In the early days of processors, we saw leaps and bounds from one generation to the next, where there might be such things as significant increases in core clock speed (e.g. 500MHz to 1GHz), wider registers and buses (e.g. 8-bit to 16-bit), and improvements in instructions per clock, cache, etc.

    But there are practical limits to core clocks, we can't just keep doubling it.  There are practical limits to bus width.  And practical limits to IPC, cache, etc.  So the "leaps and bounds" that we saw in earlier generations simply will not occur any more.
    Are cars and planes still doubling in speed and performance every year?

    However, the overall system performance has still increased significantly if we compare for example a "total system available then" for an i7-2600K versus a "total system available now" for an i9-9900K.  Over the past few years we moved up to faster DDR4, faster USB3.1, faster SSDs and NVMe PCI drives, faster GPUs, etc.

    However, we are also seeing a lot of software starting to lag significantly behind the hardware.
    Who needs an 8-core, 16-thread processor if they are simply surfing Facebook and YouTube and playing online card games? Even the majority of current AAA game titles don't run on engines that support more than 2 to 4 threads.

    I know that some people will talk about some of the push that AMD has brought about recently with their Threadripper CPUs, but I have to disagree about that. If we ignore price for a moment, Intel has had CPUs with high core counts in their Xeon series for many years. And as I stated above, what good is a 16-core or 32-core Threadripper or Xeon for 98% of the population, other than bragging rights for those who like "specs" and toys like rainbow-colored water cooling and RGB LEDs?

    I have had computers based on Intel processors all the way back to the 8088.
    I still have computers here in my office with everything from the i7-2600K 4-core HT to the i7-6950X 10-core HT. And while yes, limiting our look strictly to the raw performance numbers of the CPU alone, the i7-2600K is still a great processor, the overall system comparison is night and day between my i7-2600 8GB-DDR3-1333 HD6850 WD-Black-500GB Windows-7 and the i7-6950X 64GB-DDR4-2666 GTX1080 WD-Blue-SSD Windows-10. The newer system is significantly faster in all ways overall. Even the benchmarks for the processor alone show a difference big enough to be noticeable: PassMark i7-2600 = 8186, i7-6950X = 19945.

    The biggest thing that we need these days for a performance boost, in my opinion, is better use of threading, so that the high-core-count Intel i9/Xeon and AMD Threadripper processors can actually get utilized to their maximum.
    As I mentioned earlier, the majority of game engines use two to four threads. Most software that people run doesn't require high-core-count systems. Even software like Adobe Premiere Pro doesn't utilize the full hardware. My video editing system is an i7-6800K 6-core, 12-thread, 32GB DDR4-2666, dual AMD W7100 8GB, WD Blue SSD, and even on 2.5K RAW BMD footage with a LUT, color correction, and 8 to 10 GPU effects, I still never get over 50% CPU and about 15% GPU usage.
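    To put rough numbers on why those extra cores go to waste, here is a quick Amdahl's law sketch in Python; the 60% parallel fraction is made up purely for illustration:

        # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
        # the work that can run in parallel and n is the number of cores used.
        def amdahl_speedup(parallel_fraction, cores):
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

        # Illustrative only: suppose a game engine parallelizes 60% of its frame time.
        for cores in (2, 4, 8, 16):
            print(f"{cores:2d} cores -> {amdahl_speedup(0.60, cores):.2f}x speedup")

        # Prints roughly 1.43x, 1.82x, 2.11x, 2.29x: going from 4 cores to 16 buys
        # only about 26% more speed, which is why high core counts sit mostly idle
        # for software that only threads a small part of its work.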

    I also develop software (Demenzun Media Inc.), and my current 3D software can utilize up to 1024 cores and SIMD for significant performance, easily maxing any CPU at 100%. Unfortunately, the vast majority of software doesn't. So most people's CPUs will typically be sitting idle for most of their time…

    And for the gamers, since the vast majority of games are not CPU-intensive, upgrading your GPU is almost always the best way to improve performance.  Unfortunately those RTX-2080s are pretty pricey.  🙂

    Feel free to trash me in the comments, I know that I am more of a "Productivity Software" user than a Gamer, which is what most of the people who visit this channel are.

    June 29, 2019
    Reply
  2. ZImpresive said:

    2:15 Who's that Girl?

    June 29, 2019
    Reply
  3. skapegoatfilms said:

    fx 8350 still relevant 😀 haha

    June 29, 2019
    Reply
  4. Waleed Bushnaq said:

    You need to redo the test with AMD CPUs

    June 29, 2019
    Reply
  5. David Piçarra said:

    Intel sucks

    June 29, 2019
    Reply
  6. Pingman said:

    This would be good to repeat with, say, the top regular i5 and i7 (and maybe a consumer-grade i9 now?)

    June 29, 2019
    Reply
  7. Bobby Lockwood said:

    Well, for what it's worth, reusing old components still has its merits. Also, my 2011/2012 pair of X5690s did your y-cruncher run in 5.072, and the pair costs a third of the price of a single 4960X. (Not faster in all areas though.) The Cinebench score is above the 5960X, at 1529. It should be noted that Xeon X-series chips are OK for gaming but not especially good. For massive parallelism the old Xeons still have a bit of a kick.

    June 29, 2019
    Reply
  8. maZec31 said:

    Linus Intel Tips

    June 29, 2019
    Reply
  9. Alex Ackerman said:

    Oddly enough I was looking into selling my X79 Deluxe and the ebay prices for a used one are higher than I paid for it new… Sadly this was not the case for my 4960X.

    June 29, 2019
    Reply
  10. Mickeal Campbell said:

    855 FPS…… I know those CPUs are powerful, but that frame count just looks unbelievable :O

    June 29, 2019
    Reply
  11. ẤŋȡꝛøMểȡấ-Iấŋ said:

    Geez, I knew I'd be good on an i7 4th-gen chip. Seeing these test results, I'd be wasting money getting a new system just for a chip a few generations newer if performance is only up by that tiny bit. Now I can concentrate on upgrading my GTX 750 Ti.

    June 29, 2019
    Reply
  12. I like turtles! said:

    Fuck Intel.

    June 29, 2019
    Reply
  13. KV Student with gud english said:

    Time Machine?

    June 29, 2019
    Reply
  14. William Daniels said:

    More than a feeeeeeeeeelllling

    June 29, 2019
    Reply
  15. Declan O'Cuidighthigh said:

    The reptilian technocratic technocracy led by Hillary Clinton and George Soros are putting chemicals in the silicone that’s turning the processors gay, you can overcome this by downloading these vitamin supplement pills that’ll unlock secret hidden cores in your processor

    June 29, 2019
    Reply
  16. Seaboo Productions said:

    Could you do an updated version of this video, and an AMD version of this video?

    June 29, 2019
    Reply
  17. that one guy said:

    And to think I was running an AMD A10-7800 until a little while ago… which refused to boost. 4500 score in CPU Mark. Now I'm running a Xeon E5-1650 v1.

    June 29, 2019
    Reply
  18. realdomdom said:

    https://www.youtube.com/watch?v=osSMJRyxG0k

    June 29, 2019
    Reply
  19. Farbod Mohammadzadeh said:

    Make this in 2018

    June 29, 2019
    Reply
  20. BlueScope819 said:

    Here's a thought: why not have another socket for the graphics? You could provide better cooling a lot more easily, and power delivery would be easier too.

    June 29, 2019
    Reply
  21. Emanuel Abraham said:

    An Intel Pentium with a GTX 1030 will get you gaming; all the rest is extra. As long as you have a 2-core, 8GB, and a GPU, it will game.

    June 29, 2019
    Reply
  22. Jared Herman said:

    My i7-4770 is still going strong even after 5 whole years!

    June 29, 2019
    Reply
  23. Thomas Samoht said:

    Don't shrink cpu size: there's plenty of space on the mobos, and inside those cases!!! Just add more!!! GO BIG!

    June 29, 2019
    Reply
  24. ForbiddenFateGaming said:

    I still use a Core i3 with no problems. I use my PC primarily for playing games, and the games I play aren't very CPU-intensive, so so far I haven't had a reason to upgrade from the Core i3.

    June 29, 2019
    Reply
  25. TK99 said:

    You guys should have used a Pentium 4 as the baseline for last-century processors. Though I think that would require a retired firefighter present and an on-site ambulance for that testing.

    June 29, 2019
    Reply
  26. Gunther Ultrabolt Novacrunch said:

    https://www.youtube.com/watch?v=SSR6ZzjDZ94

    June 29, 2019
    Reply
  27. wogfun said:

    I've had a 6900 for about a year and have never felt like I had anything "special"… 32GB, a 1070, a 1TB SSD & eight 6gb+ drives on a super deluxe ROG mobo. Premiere runs just meh.

    June 29, 2019
    Reply
  28. BIOSHOCKFOXX said:

    Why compare something only a small portion of people can afford? Then at least make a video with the same idea, just comparing the top mainstream chips, such as the K editions starting from gen 2 with the i7-2600K (or lower) (excluding the 2700K because it's the same chip with +0.1GHz, and it came a bit later), and ending with what is currently the i7-8700K.

    June 29, 2019
    Reply
  29. MrGoatflakes said:

    The problem is that game developers, and worse, game engine developers, just aren't using much more than one thread. We could have so much better performance, but not on one thread, one core…

    June 29, 2019
    Reply
  30. danwat1234 said:

    2018 Coffee Lake: still no single-core performance improvements besides a bit more Turbo Boost. Will Whiskey Lake or Ice Lake give more? Doubt it. Maybe 6% from Skylake to Ice Lake at most.

    June 29, 2019
    Reply
  31. Ondrej Krajnak said:

    I've got a P6T Deluxe V2
    HyperX Fury 1600MHz 16GB
    Samsung 850 SSD
    i7-980X
    Gigabyte GTX 1050 2GB
    Am I still good with gaming
    with these specs?

    June 29, 2019
    Reply
  32. tntom said:

    So what you are saying is: keep my i7-4770K and ASUS Z87-Pro motherboard… and upgrade my GPU and RAM, maybe adding another SSD.

    June 29, 2019
    Reply
  33. warmfreeze said:

    The fact of the matter is…the base architecture hasn't really changed since the Pentium M…they have just been shrinking the process and tacking more shit onto the die in hopes of making it faster..

    June 29, 2019
    Reply
  34. dX said:

    Then rizen happens

    June 29, 2019
    Reply
  35. Martin said:

    delidding

    June 29, 2019
    Reply
  36. Andy T said:

    This is two years old; we've gotten even more stagnant with generations 7 and 8. A lack of competition, and the fact that most consumers don't even use the headroom they have.

    I think they're not doing the research, and I think they are holding back some new innovations. While they are the only game in town, it doesn't make sense to release the best you've got each year, because then you are obligated to improve on it for the following generation. So they are doling improvements out in measured doses, with about 7% performance gain a year.

    AMD released a new chip and now we finally get more cores. Intel has had more than 4 cores in the Xeon line for more than 10 years, but with AMD going the multi-core Threadripper route, they've finally had to adjust. Also, Intel is struggling as a fabricator: they are not getting very high yields on their 14nm process, though Samsung and another fab are already doing it, although Intel argues that they measure their 14nm in a slightly tricky way. This massively parallel architecture has to eventually make it into the CPU. I am thinking that AI research will progress, and it depends heavily on GPUs with their thousands of parallel cores to run support vector machines.

    Eventually I am hoping it finds its way into consumer hardware. I'd also like to see virtual workplaces via VR setups. Not just playing games, but jumping on the web and taking care of some business in a virtual environment. I imagine virtually picking up a document and moving it to another document that has a different spatial location. Or flipping through things on Amazon using vision. So the chips of the future will resemble AMD's APUs, with a couple of heavy-lifting cores and thousands of small parallel cores. And maybe a middle one too, so that a single-die package will have thousands of cores and each operation will be sent to the correct lane for processing.

    Physically, I think we've reached the limits of photolithography. Although I thought that a while ago, and figured we'd be limited by the wavelength of light. I don't know how they do it now, but at some point they will reach that boundary. The next option would be maybe an electron gun? I should know this, but does an accelerated electron have a wavelength? I don't think it does, just the quanta that they release. But we'll probably see 3D chips. And I'd love for heat to stop being such a huge problem. I don't know if I totally understand where it's all coming from.

    Anyway, this year Intel is finally making their laptop cores a little more competitive, so now I am desperately waiting for them to come on the market so I can replace my 10-year-old laptop (which works OK because Intel hasn't improved all that much).

    June 29, 2019
    Reply
  37. Sean Metivier said:

    At least they added cores to the Extreme Edition. The i7 (and its prior equivalents) languished forever.

    June 29, 2019
    Reply
  38. León Coretz said:

    I want to see all these retested at the same clock rate and thread count. That would control for simple brute force, and show us their real efficiencies.
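    Even without a retest, a rough per-clock, per-thread normalization of the existing scores would be a first approximation. A quick Python sketch (the scores, clocks, and thread counts below are made-up placeholders, not measured numbers):

        # Crude per-clock, per-thread normalization of a multi-threaded score.
        # Every figure below is a made-up placeholder, not a measured result.
        def normalized_score(score, clock_ghz, threads):
            """Points per GHz per thread: a rough stand-in for per-core efficiency."""
            return score / (clock_ghz * threads)

        chips = {
            "example_older_chip": (900, 3.6, 8),    # (score, GHz, threads)
            "example_newer_chip": (1800, 3.0, 20),
        }
        for name, (score, ghz, threads) in chips.items():
            print(f"{name}: {normalized_score(score, ghz, threads):.1f} points/GHz/thread")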

    June 29, 2019
    Reply
