Yes, I know; I said as much, if you read what I wrote.
As for delays, that is a story playing out across the whole world, with component shortages and modified workspaces due to the pandemic. It's hard to fault them in that regard.
Right now I have electronic components that are not slated for delivery to my company until April 2023.
I am sure it is felt across the board with all companies. Even massive automakers reverted to key start instead of push-button ignition just so they could keep manufacturing.
That is why I will buy one when possible: to break ground and pave the way for successive generations.
Intel would hardly be the first to launch a product with an incomplete driver set or software issues.
I’ve been on the front line of crap launches from both brands for far too long, spanning decades.
One hopes that with the world returning to normal they can work out what ails them and get things into a more refined state for launch.
OEMs and workstations first, then retail after, hopefully.
I don’t have a personal loyalty to Intel for these cards; my loyalty lies with diversity and a competitive market. Right now, with only two companies competing in the space, the consumer loses out. With three, everything changes.
I wouldn't call it a shortage… I would call it massive over-demand. And Intel had capacity reserved at TSMC for Arc GPUs, so there was no problem on the hardware end; that was entirely software/drivers. And considering the cost per chip, freight really wasn't the problem for Intel either.
Legacy nodes were impacted because they are really cheap: manufacturers cancelled their orders because of COVID, then massively over-ordered, and manufacturing lead times run for months. And expanding legacy nodes really doesn't make sense, because it is hard to recoup the investment with how cheap those chips are. It is better to invest in leading-edge capacity, where companies like Apple, Nvidia and AMD will cover the cost of expansion with their orders, and later those fabs become legacy nodes that just print money.
There is no shortage of chips; there is a shortage of cheap (legacy) chips.
They only quoted a rasterization performance bump for the 4090, not the 4080!
Rasterization is what this game, and most games, use to render the 3D world.
The entire 2-4x performance bump claimed for the 4080 comes purely from DLSS and RTX.
Wow. If you don't compare specs, then what do you want to compare? TFLOPS? 3DMark scores? GPU performance is among the most linearly scaled with specs.
Let alone the doubled memory bandwidth.
Wow. Do you know memory bus width is directly bound to chip size, i.e. chip price? Why not pair lower-end GPUs with slower DRAM on more lanes instead of increasing GDDR6X speed? Do you know I/O also costs die area?
And the 4090 is already using the fastest state-of-the-art GDDR6X. Lower-end GPUs are obviously not going to get even faster GDDR6X chips.
Also, do you know a GPU's memory bus width signals what tier the die belongs to?
In the past, only mid-tier SKUs got a 192-bit bus.
Still, think it through.
How come the 4090 = 2x the 3090, and the 4080 = 1/2 the 4090, yet they still say it's 2x the 3080 Ti? The 3080 Ti was never that much slower than the 3090 Ti.
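To spell out the arithmetic behind that objection, here is a minimal sketch that takes the quoted marketing ratios at face value (they are the claims being questioned, not measured numbers):

```python
# Taking the quoted claims at face value (marketing ratios, not measurements).
perf_3090 = 1.0
perf_4090 = 2.0 * perf_3090    # claim: 4090 is 2x the 3090
perf_4080 = 0.5 * perf_4090    # claim: 4080 is half the 4090
print(perf_4080 / perf_3090)   # -> 1.0, i.e. roughly 3090-level

# Yet a "2x the 3080 Ti" claim would put the 4080 near 2x a card that is
# only slightly slower than a 3090, which contradicts the result above.
```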
If you can't understand GPU specs, here's a very good website for you:
You are being too narrowly focused on the base silicon when I use the term shortage.
That is far from what I meant when I said components.
Everything from caps to VRAM to MOSFETs and beyond was affected by the pandemic.
Graphics cards are the sum of their parts, and many parts to make them were not in sufficient supply.
Well, I focused on them because most media focus on the chip shortage. But a company like Intel usually doesn't have that problem, because they have priority on components from their suppliers, so at least for them hardware wasn't a problem. Just look at their motherboard supply: there wasn't a single supply problem with those in the last two years. And GPUs are more or less small motherboards with a graphics processor and memory.
The problem is with smaller companies in the tech business (and yeah, I find it ironic to call GM and Ford smaller companies) that don't have priority. Intel can pay more per cap and MOSFET because of the end price of their component, while a car manufacturer wants to pay the smallest amount possible for a chip or board.
They were no less affected by it, however. They have buying power when it comes to silicon, no doubt about that. But Intel has not done board-level work in quite some time, so in reality they were smaller fish in the sea compared to the likes of Cisco, etc., when it comes to those components.
Cisco was one of the few companies that could keep supplying through all of this. And even if Intel had put pressure on the supply chain for these items, there are contracts that need to be fulfilled before allocations and production can be shifted. That takes time.
So yes, they were very much hampered by the global situation. I literally watched it all unfold from my job.
Further compounding all of this was Ukraine and the supply of neon to the electronic component industry.
It's been a cluster#^#$ since this all started.
I’m not downplaying the other problems they had, but I am bringing into focus that the challenges they faced were many, and that they are plowing ahead to get something released and start eating into market share.
FFS, please look at these two slides from GTC. If those are true, then the 4090 also gets all of its performance gains from DLSS and RTX.
No. While specs may point you in the direction of their performance, it doesn't scale linearly. If this were pure compute I would say you are right, but it isn't; this is in-game performance. There are numerous factors that decide the number of FPS, from game optimization to driver optimization to CPU processing, etc.
FFS, it is directly 64 bits per 4 GB. They all have the same per-chip memory throughput; the lower models just have fewer chips, so smaller total throughput. What do you expect them to do? Special memory and GPU lanes for the models with less memory? 96 bits per 4 GB on the 12 GB model?
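For what it's worth, here is a rough sketch of that arithmetic, assuming 2 GB GDDR6X chips with 32-bit interfaces and an illustrative ~21 Gbps data rate (assumed figures for illustration, not quoted specs):

```python
# Rough bus-width/bandwidth arithmetic for a GDDR6X-style configuration.
# Assumed figures (illustration only, not official specs): 2 GB per chip,
# a 32-bit interface per chip, 21 Gbps per pin.

CHIP_CAPACITY_GB = 2
BITS_PER_CHIP = 32
DATA_RATE_GBPS = 21  # gigabits per second per pin (assumed)

def bus_and_bandwidth(total_memory_gb: int):
    chips = total_memory_gb // CHIP_CAPACITY_GB
    bus_width_bits = chips * BITS_PER_CHIP                 # "64 bits per 4 GB"
    bandwidth_gb_s = bus_width_bits * DATA_RATE_GBPS / 8   # bits -> bytes
    return chips, bus_width_bits, bandwidth_gb_s

for mem in (12, 16, 24):
    chips, bus, bw = bus_and_bandwidth(mem)
    print(f"{mem} GB -> {chips} chips, {bus}-bit bus, ~{bw:.0f} GB/s")
# 12 GB -> 6 chips, 192-bit bus, ~504 GB/s
# 16 GB -> 8 chips, 256-bit bus, ~672 GB/s
# 24 GB -> 12 chips, 384-bit bus, ~1008 GB/s
```

In other words, the narrower bus on the smaller-memory models falls straight out of mounting fewer chips; there is no separate knob to turn without redesigning the memory controller.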
Please tell me where it says that the 4080 is 1/2 of the 4090? It only has half the CUDA cores, not half the performance. Anyway, it is pointless to discuss this before we get independent tests. They said the 4090 is 2-4x the 3090 Ti and the 4080 is 2-4x the 3080 Ti, so I guess the same criteria apply whether it is rasterization or DLSS/RTX.
Intel is pretty big in the motherboard business… Sure, they outsource manufacturing to their partners, but that doesn't mean they aren't a big player; they have the industry connections for it not to be a problem. Also, they don't need to manufacture most of the cards themselves, because they have AIB partners who should have a constant supply of those components.
I am not saying that they could produce an unlimited number of cards, but with the allocation they got from TSMC and with their AIB partners, they could have had hundreds of thousands of Arc GPUs already out, since their initial release should have been in Q1 2022. Yet only a select number of cards are out. So it's not a shortage problem. There could be hardware design problems or software problems, but it is not a shortage problem.
Well unfortunately we are going to have to agree to disagree then.
Both the team at my website, who have actually been to these factories, and my own personal involvement in the industry are very much at odds with your statements.
I won’t continue arguing pointlessly with you about something I am well versed in.
If there were 471 million Intel CPUs sold last year, then about the same number of motherboards for Intel CPUs were sold (not counting dual-socket boards). And you want to tell me they don't have enough component-level parts to build 1 million Arc GPUs? Or 100 thousand?
Edit: just a correction, that was the overall x86 number sold, but Intel still held the majority of the market, so the point stands.
And that’s just what is happening in public. A report from the German-language Igor’s Lab claims that Intel’s board partners (the ones who would be putting the Arc GPU dies on boards, packaging them, and shipping them out) and the OEMs who would be putting Arc GPUs into their prebuilt computers are getting frustrated with the delays and lack of communication.
A long, conspiratorial video from YouTuber Moore’s Law is Dead goes even farther, suggesting (using a combination of “internal sources” and speculation) that people in Intel’s graphics division are “lying” to consumers and others in the company about the state of the GPUs, that the first-generation Alchemist architecture has fundamental performance-limiting flaws, and that Intel is having internal discussions about discontinuing Arc GPUs after the second-generation “Battlemage” architecture.
Seems you know a lot.
Just give me an example of ANY GPU that has half the entire spec of another GPU of the same architecture, but only a 20% performance difference.
Yeah, real-game GPU performance may not track raw power exactly, but especially within the same architecture, real gaming benchmarks scale very linearly with specs.
Also, a GPU's memory bandwidth is never simply locked to its VRAM capacity.
Especially for games that don't need that much GPU memory, the higher the memory bandwidth the better.
Explain why Nvidia released the 3070 Ti and AMD released the 6x50 XT refreshes: just for higher memory bandwidth.
The 3090 has ~3x the CUDA cores of the 3060 (10496 vs 3584), but it can't even get 2x the performance of the 3060. The 3060 also has lower bandwidth (360 GB/s vs 936 GB/s), and the memory bus width is 192-bit vs 384-bit.
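A quick sketch of those ratios, using only the numbers quoted above:

```python
# Spec ratios for the 3090 vs 3060 figures quoted above.
specs = {
    "cuda_cores": (10496, 3584),
    "bandwidth_gb_s": (936, 360),
    "bus_width_bits": (384, 192),
}

for name, (rtx3090, rtx3060) in specs.items():
    print(f"{name}: {rtx3090 / rtx3060:.2f}x")
# cuda_cores: 2.93x
# bandwidth_gb_s: 2.60x
# bus_width_bits: 2.00x
```

Despite 2-3x on every spec, the in-game gap is well under 2x, which is exactly the point about specs not translating linearly into FPS.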
Yes. That is why it is pointless to argue about performance based on specs alone. It may be better or worse depending on the architecture, and considering how Nvidia is going overboard with power consumption, I assume they will get worse performance per CUDA core across the different tiers of this generation's products.
Because I don't trust anything Nvidia says, I am waiting for independent testing. Also, it's a scummy move to position this year's 4070 as the "4080 12GB" just to cover the price hike.