i7-10700K to the i7-11700K
Single thread floating point: +19.0%
Multi-thread floating point: +19.5%
Single thread integer: +13.0%
Multi-thread integer: +7.3%
Jesus. Just overclocking the RAM on the 10700K would give a similar performance increase. Along with the 291W power draw, this 10nm->14nm backport is not working out well for Intel.
PS: My work desktop died on Wednesday afternoon during the SpaceX SN10 flight. I rushed to my local computer store to replace it. Out of the entire store, only 2 motherboard models with Intel's LGA1200 socket were in stock. All the other motherboards on the shelf were for AMD.
I picked the i5 10400f as it's under deep discount. I asked around and everyone agrees that Intel is competing only on price point right now.
Why didn't you just go with AMD?
OP is implying the i5 was Pareto-optimal in terms of price and performance.
Exactly. In 2020 the 10400f was about 40% more expensive; now Intel is forced to lower the price of their entire lineup to compete against AMD. These days, at most performance tiers, Intel+mobo combo are slightly cheaper than AMD's equivalent.
Of course, Intel's CPUs have fewer PCIe lanes, no ECC support, a worse upgrade path, generate more heat, etc.
Finally, my old desktop was Intel and I work on a lot of kernel/VM stuff. I don't want to switch architectures halfway. I did that before and it was painful.
just a guess, but if he went to the store looking for an LGA1200 socket motherboard, he wasn't planning on purchasing a new CPU as well
In his last paragraph, he said he bought a new processor. Or at least, he suggested a purchase by referring to its discount using present tense.
> Why didn't you just go with AMD?
It's out of stock everywhere.
I think the Ryzen 5000 series' stock situation is getting better; both the 5600X and 5800X have been consistently in stock in Belgium and the Netherlands for about 2-3 weeks.
Is it a backport? I read somewhere that it's a continuation of Skylake evolution and not a new architecture.
Rocket Lake was designed for 10nm, but Intel couldn't get 10nm to work, so they backported it to 14nm.
https://semiaccurate.com/2020/10/29/intels-palpable-desperat...
Just filter out Charlie Demerjian's grudge against Intel.
Companies don't backport, because a core is designed for a process. Its efficiency and architecture are based on the transistors it will use from the design start; caches are sized for the process too, and a lot of the architectural gains are due to the added transistor counts. If you backport a CPU you lose the efficiency of these new transistors, so energy use goes up and clocks likely go down as well. Cores take up 2x the size, or at least a lot more than the older core did on the older process, so costs go up too. You can either cut bits out of the core and lose performance, or eat it on area and therefore cost. Backporting a design not made for portability isn't just a lose/lose proposition, it is a lose/lose/lose/lose proposition. But Intel is desperate, so…
Grudge or not, it's not like he's been wrong the last few years. Turns out all the negativity is well justified...
According to the article,
"The new generation Rocket Lake is the combination of two different backported technologies. Intel took the Sunny Cove core from 10nm Ice Lake, and re-built it on 14nm, calling it now Cypress Cove. Intel also took the Xe graphics from 10nm Tiger Lake and re-built those on 14nm, but these are still called Xe graphics."
So, yes, it appears to be a backport.
If you read the first page, it'll tell you?
> My work desktop died on Wednesday afternoon during the SpaceX SN10 flight.
What happened? I've never seen a computer die during use.
Didn't you see the video? It blew up.
Spend enough time in front of a specific computer (or a little time in front of a lot of different computers) and you'll eventually come across the same thing. In my roughly 20 years in the industry I've had machines (mostly laptops) die for a variety of reasons, most often the motherboard or screen giving out, but also CPUs and hard drives. It's happened a few times with desktop computers that had been under 24/7 stress as well.
>Just overclocking the RAM [..]
>My work desktop died [..]
Are these related, perhaps? I found the juxtaposition quite humorous.
You have to give credit to whoever came up with the 14nm process. Despite being super old it's still somewhat competitive. It's crazy Intel can't fix their process. They make more cash than AMD, Nvidia, etc. They make more cash than TSMC.
Intel's 14nm process is, in terms of dimensions, closer to TSMC's 10nm, and their 10nm is only slightly larger than TSMC's 7nm:
https://fuse.wikichip.org/wp-content/uploads/2018/02/iedm-20...
That being said they really need to step up their game or else they'll be bleeding market share for years to come.
I give the people working on 14nm a ton of credit for being able to squeeze out as much performance as they have. As long as big datacenter customers worry about AMD's supply chain that will probably keep them from switching en masse, though the big TSMC plant coming on line might go a long way towards alleviating those concerns.
I did not know this. This is interesting, because it gives Intel even less of an excuse for their insane power consumption and poor performance. In recent discussions, Ryzen's performance was commonly credited to the process, along the lines of "if you ported Intel's core to 7nm it'd wreck everything else". It seems like more of a fair comparison now.
I would not be surprised if there was a lot of sunk cost fallacy on Intel's 10nm. They couldn't make it work effectively and it was so far behind that they bled out most of their process leadership. But rather than pivot and reset they've just kept hammering away at it. Of course the decision would not be trivial to make; the lead time for a new process node is significant.
It's really a pipeline problem. You start the core design 2 or 3 years out, with the design rules you expect to be able to fab on. It's relatively easy to fab a 14nm design at 10nm if the 10nm fab shows up early, but it's hard and messy to take a 10nm design and fab it at 14nm if the 10nm isn't working so well.
But when you're "close" to getting your 10nm fab working, it's hard to start a new design for 14nm. If their 7nm had shown up ready to go, that would have saved them, but it looks troubled too. Maybe they'll be able to work out the kinks faster on 7nm.
Maybe they'll design something out of character for Intel that works with the realities of their 10nm process. From what I can tell, their 10nm process is mobile-only because it doesn't clock well; they could turn that into less of a negative by doing a design they normally couldn't do because of the need for high clock speeds. It may look like a copycat of the M1, but something a lot wider might be viable with a 3GHz clock target that isn't viable at the 5GHz Intel needs to hit on desktop chips.
I read somewhere that the crunch to get 14nm fixed burned a lot of goodwill among their top engineers, and when they saw delays looming for 10nm as well they looked for greener pastures rather than go through the 14nm crunch again. But yeah, Intel's leadership should have been making better fallback plans for 10nm and beyond from the moment they knew Broadwell would have to be delayed, all the way back in 2013.
Knowing very little about the subject...
Is it plausible that they wouldn't be able to get 10nm to work, but would be able to make 7nm work? Or put differently, would abandoning 10nm be tantamount to abandoning the effort to develop a smaller process?
7 nm and 10 nm are quite different (7 nm relies more heavily on EUV IIRC), so the 10 nm delay doesn't have to affect 7 nm. However, 7nm is also facing significant delays.
I am a very casual enthusiast, but from my understanding there are many ways to skin a cat when it comes to process nodes, so company A's 10nm and company B's 10nm are not equivalent.
Intel somewhere along the line chose the wrong set of solutions to achieve their 10nm process.
IMO these mistakes will not necessarily mean that some other node will be inherently disadvantaged other than the fact that they may now find themselves scrambling for tech lead. They will still have learned a lot from 10nm and hopefully should be able to feed that into future density improvements.
The power consumption graphs are really something. Can't believe it almost cracks 300W under AVX512 workloads.
can the situation be thought of as "Intel has problems porting all their 14nm tweaks to 10nm", and is it therefore reasonable to expect Intel to pick up all those same node advantages on 10nm over again?
Intel is on my shitlist now for their new "Coalition for Content Provenance and Authenticity" with Microsoft and Adobe. I don't care how fast their CPUs are, they are dead to me.
Wait, this just looks like cryptographically signing pictures. What's so scary about that? You can do it today from your terminal if you feel like it.
I guess I agree that the 'trusted computing' stuff it seems like they're trying to do is a little scary, but the tech isn't really there yet, at least not on the desktop (look at the fiasco that is Intel SGX) and it's happening with or without whatever this CAI thing is.
I guess a world where your iPhone's camera sends signed frames to the processor's secure enclave which processes them and signs them with a key signed by Apple is... a little different from today? They do basically this for Face ID today.
Would you like to explain why for those of us out of the loop?
https://contentauthenticity.org/our-members
Basically a large group of companies are forming a group to create an agreed upon fingerprinting system for 'content' (read: everything) to destroy privacy in the name of 'fighting disinformation'.
Isn't such a thing necessary as we see US presidents falling for (deep)fake media?
Is there a privacy-respecting solution that you prefer?
Maybe I'm missing it, but Intel doesn't seem to be listed among the members?
https://contentauthenticity.org/our-members
I agree that the project sounds like garbage.
What's wrong with being able to sign a document?
Nothing.
The problem comes when it's built into the hardware so you are forced to sign all documents.
Oh this is dreadful. I moved to AMD this year. Do you know good alternative to After Effects on PC? If I could get that going I could stop my Adobe sub.
Unfortunately, the only viable alternative is Final Cut...which I think is vastly superior but is a much bigger investment (buying an Apple computer instead of just switching software).
That program is an amazing piece of software. It would be nice if there were an open source version...
RIP Intel...
They need their new process node and whatever new architecture they have in the pipe to land -soon-.
Looks like my 9900k might be the last Intel processor I buy for a long time unless things drastically change.
Either I had really bad timing or bad luck, but somehow I am ending up with Intel even though I really wanted to switch.
Around 2012 I bought a 4-core Core i5, and after some basic upgrades (maxing the RAM to 32GB and a good SSD) it had more-than-acceptable performance for my work until mid-2018. Even when I dabbled with machine learning, I borrowed a 1070 from my office and it was plenty to run my tasks.
Around end of 2018 I started working more in different offices, so I needed a more powerful laptop. No strong offering from AMD yet, so I ended up with a System76 core i7, 6-core 32GB RAM.
2020 came and with it I got back into working at home, so I got excited about upgrading my workstation. I bought all the components for a Ryzen build, a 3700X I think. What a shitshow: I had to go through three different pairs of RAM to find one that was actually compatible with the motherboard. Then I had to find a way to upgrade the BIOS, and I could only do it by getting an Athlon CPU. BIOS upgraded, then the original 3700X died. I ended up turning the whole system into a very expensive media center with just the Athlon and continued working on the laptop.
A couple of weeks ago my laptop's battery puffed up, so I decided to look again into a nice desktop. A 12-core Ryzen 5900X comes out at €900 and a basic motherboard is €300. The 10-core Intel i9-10900 is €300 and a motherboard for it is €180. Guess which one I ended up taking, and which I managed to get running without any issue whatsoever?
Here is to hope that in 6-8 years from now I finally manage to make the switch to AMD.
(Edit: why the downvotes, HN? It's an honest account of my experience.)
> a basic motherboard is €300
A 'basic' motherboard two weeks ago was the same as it is today, a low-end B550, and can be purchased for under €100.
The 10900 trades blows with the 3900x, particularly for multi-threaded, workstation-type workloads where the AMD part pulls ahead with its two extra cores. And the 3900x can be had around the same price as a 10900.
So your "honest account" comes off as a little disingenuous, which may explain the downvotes.
(Yes, the high-end Zen 3 parts are pricey and have very limited availability right now, but that is afflicting the entire industry - see also consoles, graphics cards and many other things. With their own fabs, intel are uniquely placed to supply right now, but their CPUs are more competing with the last gen of AMD's if you're talking about a workstation)
I paired up a 5950x with a $200 TUF 570 board - a middle of the road offering. One thing to consider is the BIOS version of the motherboard. The Asus one had a sticker with the build version, and what they were stocking in Microcenter was just new enough to support zen3 OOTB. A better board would allow for BIOS flashing without the CPU, and a few folks have struggled with that first flashing. Check the version first.
The threadripper boards are starting at $400 for the current generations. I regret spending what I did on the first generation boards, as they dropped support for them on the third generation. I'm waiting to see a zen3 series threadripper before I'll replace my old 1950x. The even older 3930k i7 was able to keep up on single threaded processes... which surprised me. Funny it really took 10 years before I got that wow factor again. The number of cores that I'd consider 'normal' has certainly gone up. Think the 3930k was around $600 in 2011, the 1950x around $800 in 2017, and the 5950x at $800, so that price point has not changed much. The first two were picked up at launch date, the last was a paper launch that took till January to find.
I honestly just looked at both AMD's and Intel's latest generation and compared from there.
Still, point taken. If I didn't get burned by the whole ordeal with the 3700x I think I would've taken a second look at the AMD 3000.
> why the downvotes, HN?
Your honest account seems more like a series of errors you've invited on yourself. Your motherboard prices are insane. And this litany of spontaneously breaking components makes you sound like you're just yeeting the stuff around. Maybe that's all unfair, but it's how it comes across.
Now you're comparing the prices of the i9-10900 to the 5900X, when the 5800X's performance more than matches it. The 5800X is about 10% cheaper than the 10900 here.
If you don't understand the platform and chipsets, use a picker service or even buy a bundle. There are lots of component sellers flogging matched components in tested configurations. You won't get top-spec motherboards but there's no reason to spunk €300 on a desktop motherboard.
> If you don't understand the platform and chipsets, use a picker service
Like I actually did, I have been doing since I can't even remember to build previous workstations, NAS, the dappnode under my desk, and my home server that runs the services I self-host?
- Ryzen upgrade: https://de.pcpartpicker.com/list/Dxx2nL
- Intel upgrade: https://de.pcpartpicker.com/list/rP8jCz
Looking around now, it seems that the MSI X570 is available at a lower price, but it wasn't when I first checked. So, yeah, bad timing. But this patronizing is unwarranted.
> The 5800X is about 10% cheaper than the 10900 here.
In Germany it's about 15% more expensive, and my calculation was based on price-per-core rather than absolute performance, not to mention that the 10900 has a 65W TDP vs 105W for the Ryzens.
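For the curious, here's that price-per-core arithmetic spelled out, using the CPU plus basic-motherboard prices quoted earlier in this thread (these are the thread's numbers at the time, not current prices):

```python
# Price-per-core for the two builds quoted above (EUR, CPU + basic motherboard).
builds = {
    "Ryzen 5900X":   {"cpu": 900, "mobo": 300, "cores": 12},
    "Core i9-10900": {"cpu": 300, "mobo": 180, "cores": 10},
}

for name, b in builds.items():
    total = b["cpu"] + b["mobo"]
    print(f"{name}: {total} EUR total, {total / b['cores']:.0f} EUR/core")
# Ryzen 5900X: 1200 EUR total, 100 EUR/core
# Core i9-10900: 480 EUR total, 48 EUR/core
```

By that yardstick the Intel build was less than half the price per core, which is the comparison the comment is making; it deliberately ignores per-core performance.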
AMD is beating Intel on performance, but with recent price cuts, Intel is offering VERY good value right now and also much better availability. If you don't need the absolute best, Intel is the way to go easily.
That is the most succinct and business-minded summary in the entire thread.
Relatedly, I am surprised by the commenters who believe that the trends in Intel pricing were somehow involuntary. The outperformance of AMD CPUs was clear even prior to the pandemic, and I recall talk at that time about Intel using price pressure to resist AMD's expansion of market share and gross margins. Now that silicon supply is getting tight, market share is all the more relevant because whatever demand AMD can't meet gets caught by Intel. It's not all about benchmarks (although IMO Intel is hanging in there with Rocket Lake). You actually have to get a customer to buy your product in order to generate revenue.
Thank you for that. This downvoting is making me feel like I kicked some sacred cow.
What an odd experience. I haven't heard of anyone having similar experiences with Ryzen, including myself. Maybe you got some unusually finicky motherboard?
It's a mini-ITX Gigabyte B450, the Aorus line. From what I found online, there were plenty of cases of incompatible RAM that would simply not boot.
The "odd" thing is that this Intel system I put together ended up being the very same brand. I'm regretting it a bit now because I realized that the board can only do 4K@30Hz through the HDMI port while the i9 can do 4K@60, but given that my laptop had a similar limitation and I got used to running my monitor at 1440p, I think I'll hold on to it until I feel it's time to get a dedicated GPU.
TBF, I still haven't seen enough to convince me to upgrade my desktop from 4690K (which got an RTX2080 in the meantime). Standing plan is to wait for RTX4090 or AMD's equivalent and change the whole box then.
I still have a 4770k. I have a newer gpu, and it seems to be working ok for me. I'm planning on upgrading in 2022 when DDR5 is available.
>> land soon
Not soon but yesterday
The next big desktop release will be whatever comes with the new AM5 socket, so they have some time to get their act together. I don't have high hopes that they can actually compete, and now Apple has also leapfrogged them with non-desktop chips.
Beaten by the 5800X in almost every test, unless AVX512 was used.
Given the 7nm shortage at TSMC, Intel might still beat it in price / availability.
For the 5600X/5800X, availability is fine now. They've been in stock at Amazon continuously for a few days. The 5900X and 5950X, on the other hand... good luck.
I find it really amazing that one of the things "saving" Intel at the moment is that TSMC is maxed out on 7nm capacity.
I really wanted to build a Zen 3 / RTX 30-something to replace my Ivy Bridge / GTX 1060. After months of chasing the RTX, I finally ordered a prebuilt yesterday with a 10700K / RTX 3080. Now to see if it really is "in stock and ready to ship."
The only thing I feel like I'll be missing in the 10700K compared to the 11700K is AVX512. It's been derided for throttling, but it does look kind of amazing for certain workloads.
One thing I find curious about the Zen 2 and Zen 3 product lines is that the processor I really want is OEM only: first the 3900, now the 5900. I like the idea of twelve cores at an easier-to-cool 65W. I bought an OEM Athlon back in the day, but I don't see a lot of OEM processors for sale these days. Maybe just more fallout from the chip shortage.
You can just get a higher-TDP processor and set limits so it runs as a low-TDP one. It will perform better; the most efficient silicon goes to the highest-end parts, not the other way around.
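For what it's worth, on Linux the Intel side of this is scriptable through the powercap (RAPL) sysfs interface; the path below is the standard one, but the exact constraint layout varies by platform, and on desktop Ryzen the equivalent knob is usually the PPT/Eco Mode setting in BIOS rather than anything in sysfs:

```shell
#!/bin/sh
# Sketch: cap the package long-term power limit (PL1) to 65 W on an Intel
# system via Linux's powercap/RAPL interface. Values are in microwatts.
# Check which constraint is which on your machine first:
#   grep . /sys/class/powercap/intel-rapl/intel-rapl:0/constraint_*_name
echo 65000000 | sudo tee \
  /sys/class/powercap/intel-rapl/intel-rapl:0/constraint_0_power_limit_uw
```

The limit is enforced by the firmware/hardware, so the chip boosts normally until it hits the cap rather than being clocked down flat.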
I see nothing in this review that would make me choose the Intel part over a Ryzen 5800X.
Price is the biggest factor. You can get 10 cores with Intel for cheaper than 8 with AMD, plus with Intel you get integrated graphics as well which is really important right now considering the GPU shortage. Also, Intel allows virtual GPUs for virtualization in QEMU which AMD won't do.
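For context, the Intel feature being referred to is GVT-g (mediated vGPU passthrough). A rough outline of what it looks like on Linux; the PCI address and mdev type name here are examples and vary by kernel and iGPU generation, so treat this as a sketch rather than a recipe:

```shell
#!/bin/sh
# Sketch: carve out an Intel GVT-g vGPU and hand it to a QEMU guest.
# Requires i915.enable_gvt=1 on the kernel command line and a supported iGPU.
UUID=$(uuidgen)

# Create a mediated-device instance (list mdev_supported_types/ for real names)
echo "$UUID" | sudo tee \
  /sys/bus/pci/devices/0000:00:02.0/mdev_supported_types/i915-GVTg_V5_4/create

# Attach the vGPU to the VM via VFIO
qemu-system-x86_64 -enable-kvm -m 4G -drive file=guest.qcow2 \
  -device vfio-pci,sysfsdev=/sys/bus/mdev/devices/"$UUID"
```

AMD's consumer parts have no equivalent of this; GPU virtualization there means full passthrough of a discrete card.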
"Choose Intel because it's cheaper than AMD" is one of those arguments that make you realise how quickly the world changes.
The whole situation is a kind of surreal 180-degree turn of what things looked like 5+ years ago and a testament to how far AMD has come.
"Yeah go with Intel if you don't need top performance because the CPUs offer more[0], but slower cores, albeit with much higher power draw and a decent-enough integrated GPU at a cheaper price point."
Are you sure we're not talking about early 2010s era AMD APUs?
[0]: There are as many caveats here as there were 10 years ago for AMD.
It does make sense though, given that they're still cashing in on their mature process while AMD is in the TSMC 7nm bidding war.
If I wanted more than 8 cores, the 12-core Ryzen 5900X is $549. Can you really get an Intel 10-core CPU for the same price, or within 50 bucks plus or minus, that performs better?
I'm seeing the i9-10900K for $467, but in terms of price-per-performance I think it's still behind the Ryzen, which costs $82 more.
I wouldn't look at the 10900K if you're trying to "save a buck" on the high end. Instead, the 10850K, which is for most purposes just a 100MHz-slower 10900K, can be found on sale for $350. Higher availability, and prices that meet or are below MSRP.
You can't really get a 5900X for $550, they're sold out everywhere. (I'm trying to buy one myself.)
The 8-core 5800X is available at MSRP, though (at least, I've seen Newegg stock it).
The biggest issue you're going to run into is that the 5900X may be $549 retail or whatever, but where are you going to buy one? I recently opted for a pre-built 4600G PC, and the 4700G is a great price point too... unfortunately AMD decided to open those up only to OEMs.
Can you get the 5900X at all?
AMD needs to integrate a basic GPU into its higher-end CPUs. With current GPU prices, where even old low-end models are selling for double, it would make sense for anyone who wants a good desktop experience but doesn't otherwise need a fast GPU. As it stands, AMD is missing a market segment.
There is no shortage of dedicated low-end GPUs.
I'm just cracking a smile and almost laughing at the thought that 25 years ago I wanted an Intel Pentium but had to settle for a cheaper AMD 586 clone that ran hotter and performed worse. My, my, my how the tables have turned on Intel...
... or the 5600X (probably equal perf but cheaper / lower power) or the 5900X (about the same price if you factor in power supply + cooling, still lower power).
...Which is weird, because there are other reviews that say the opposite based on a small handful of benchmarks. I'm not sure what conclusions to draw.
https://www.techradar.com/news/intel-core-i7-11700k-may-outp...
I think at this point we're just waiting for Windows and Linux to be fully supported on the M1 or for ARM to come out with a system on a chip that's documented enough for Windows and Linux to make a port.
Windows on ARM is already working on M1 Mac through parallels. The big issue now is licensing and I assume some niceties in terms of official drivers/fine-tuning support.
According to recent videos, Windows in a VM on M1 Mac today has same or better performance than Windows on ARM running natively on Surface Pro X.
https://www.macrumors.com/2020/12/22/m1-mac-windows-parallel...
https://twitter.com/imbushuo/status/1332484912549687297?ref_...
Licensing isn't an issue: a standard Windows 10 Pro license is processor-independent and includes a single-seat virtualization license. If Apple allowed it, you could even use the same license for virtualization and BootCamp like you can on amd64 macs.
I'm not sure ARM images are available, except for big OEMs. Also IIRC the boot is not restricted to Apple software (or at least can be unlocked), so I'm not sure that Apple is actually disallowing anything. Getting native drivers is another issue...
Considering the benchmarks of Qualcomm's Kryo 495, it's not surprising. Apple is way ahead of them in every regard.
> for ARM to come out with a system on a chip that's documented enough for Windows and Linux to make a port.
glances at the pile of ARM boards on his desk
don't care about Windows, but... Linux? You know that 99% of Android devices are Linux and run ARM, right? And that Debian has been available for ARM for, what, 2.5 decades? What do you think Raspberry Pis run?
There already are other SoCs for ARM based PCs. Qualcomm has the 8cx "Gen 2" chip, but it performs really badly compared to M1 (https://www.cpu-monkey.com/en/compare_cpu-qualcomm_snapdrago...).
Considering how little fab capacity there is for new designs right now (and Apple has probably booked it all out for months/years to come), I can't see this situation changing much. AMD is the closest but they also have insane supply shortages right now.
You can buy a couple different Qualcomm-based Windows ARM laptops today, like the Surface Pro X. There's even a ~5 year old Asus (or maybe Acer) laptop using a Snapdragon processor that you can find for dirt cheap on ebay.
ARM is not the gatekeeper or reason why windows on ARM is lagging--the ARM ISA is fully documented and available to license holders (of which Microsoft is certainly one). Microsoft is the reason windows on ARM is a bit of a hot mess. They let the windows ARM platform languish for years and are scrambling like mad to catch up now (both in the Azure server space, and in the consumer space).
I'm just flabbergasted that Intel has Thunderbolt support integrated into its mobile chips, but on desktop it's business as usual, left to motherboard vendors to implement Thunderbolt themselves with the usual slew of add-on chips.
Super frustrating that desktop users can't expect modern connectivity. It should be on the chip, like mobile. Notably, one can take any old TB4 cable, connect two modern Intel laptops, and get a 40Gbps networking connection between them. Alas, only a couple rare desktops will be able to participate & do this, because for whatever reason this on-chip capability that Intel has was omitted on desktop. I was so excited to be entering an era of enhanced connectivity!! Not yet, I guess.
Intel still has not reacted properly to their new situation of being objectively worse than AMD. They lowered prices a bit, that's all. They're still trying to segment the market with locked processors, even introducing more limits on RAM overclocking, and still refusing to honor the warranty when XMP is enabled. That's not a company that wants to compete, so I wouldn't expect big moves on connectivity.
Locked processors shouldn't be a thing anymore, ECC should be supported, motherboard chipset segmentation should stop, and it would be nice if the new processors weren't set to 300W of power usage. At the right price Intel's processors would have a chance even though they are slower. But like this, who would want to buy them? You are right to be flabbergasted; they should do things like integrated Thunderbolt support to improve the situation.
Nowadays they change their sockets at a much higher rate compared to AMD. Since you have to replace the motherboard to upgrade to a newer generation, the door is open to move from Intel to AMD.
Well USB-C/Thunderbolt rollout and HDMI 2 to laptops has been as glacial as ever.
How is it that mobile phones get technology support two years before laptops and desktops?
Where is big money to be made, in laptops and desktops, or in overpriced phones?
You're probably right, but the Android market has its marketplace pressures just like the PC market, and while I don't know the margins, I imagine Apple is making good margins on Macs.
It seems there IS a market for high end androids, but not for PCs, which really is an indictment of mainly Windows but also desktop Linux.
Is the 20Gbps on Rocket Lake's integrated USB 3.2 Gen 2x2 "close enough" to TB4's 40Gbps?
It's not USB4, so it can't connect to other computers. And it's already legacy, with limitations (no sharing of multiple streams on the channel, so no mixed DisplayPort). It's not Thunderbolt, so it can't connect to Thunderbolt PCIe devices.
It's about more than the throughput. It's about usability. TB4 and USB4 both have some great usability leaps over USB3.
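On the raw-throughput half of the question above: even before protocol overhead, line-code overhead alone keeps the gap at roughly 2x. The encodings below are the commonly cited ones (128b/132b for USB 3.2 Gen 2x2, 64b/66b for Thunderbolt's 40 Gbps links); real-world throughput depends on the controller and protocol layers on top:

```python
# Rough usable-bandwidth comparison: line rate minus line-code overhead only.
def usable_gbps(line_rate_gbps: float, payload_bits: int, total_bits: int) -> float:
    return line_rate_gbps * payload_bits / total_bits

usb32 = usable_gbps(20, 128, 132)  # USB 3.2 Gen 2x2, ~19.4 Gbps
tb4   = usable_gbps(40, 64, 66)    # Thunderbolt 4,    ~38.8 Gbps
print(f"USB 3.2 Gen 2x2: {usb32:.1f} Gbps, TB4: {tb4:.1f} Gbps")
```

So "close enough" only if you never need more than half the pipe, and as noted, the bigger differences are in what the link can carry, not how fast.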
I think Intel sees slots as mutually exclusive with Thunderbolt.