The RTX 4090 is really nice: roughly 2x the rasterization performance of the RTX 3090 Ti, with only a 100 USD price bump.
But.
The RTX 4080 is a disaster, and it makes every model below the RTX 4080 a nightmare.
First of all, for the first time, Nvidia is shipping completely different GPU dies under the SAME model name (not counting cut-down dies trimmed to identical specs, like the GTX 1060 6GB on GP106/GP104).
The RTX 4080 16GB uses AD103, but the RTX 4080 12GB uses AD104. The RTX 4080 12GB is really what should have been the RTX 4070, yet it costs 200 USD more than the RTX 3080.
And, for the first time, the xx80 has such a large performance gap from the flagship SKU: the RTX 4080 12GB's specs are less than half of the RTX 4090's,
but its price is not less than half of the RTX 4090's.
For the first time, the RTX 4090, a flagship SKU, is the KING of performance per dollar. To anyone who can't afford 1,500 USD for a GPU, Nvidia says: F U.
Also, the RTX 4080 12GB has nearly the same performance as the RTX 3090 Ti, and is only about 15% faster than the RTX 3080 12GB.
So from 699 USD (assumed) to 899 USD, you only get a ~15% performance bump.
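For a quick sanity check on that value claim, here's a rough performance-per-dollar comparison using only the figures quoted above (the 699 USD is assumed, and the 15% uplift is the claim under discussion, not an independent benchmark):

```python
# Back-of-the-envelope performance-per-dollar comparison.
# Figures are the ones quoted in this thread, not measured results:
#   RTX 3080 12GB: 699 USD (assumed), baseline performance = 1.00x
#   RTX 4080 12GB: 899 USD, claimed ~1.15x the baseline
cards = {
    "RTX 3080 12GB": (1.00, 699),
    "RTX 4080 12GB": (1.15, 899),
}

baseline = cards["RTX 3080 12GB"][0] / cards["RTX 3080 12GB"][1]
for name, (perf, price) in cards.items():
    ratio = (perf / price) / baseline
    print(f"{name}: {ratio:.2f}x perf per dollar vs the 3080 12GB")
# -> with these numbers the 4080 12GB lands around 0.89x, i.e. *worse* value
#    than the previous-generation card one tier below it.
```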
And don't forget the RTX 4080 12GB already gets a memory bus of only 192-bit. So what about the mainstream SKUs, the 4060 and 4050, the volume models that are supposed to sell widely?
A 128-bit bus for the 4070? 96-bit for the 4060? 64-bit for the 4050??? Or will Nvidia even bother producing mainstream GPUs at all?
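For context on why those widths matter: peak memory bandwidth scales linearly with bus width at a given memory speed. A minimal sketch, assuming 21 Gbps GDDR6X (the announced speed on the RTX 4080 12GB); the narrower widths are the hypothetical lower-tier configurations speculated about above, not announced products:

```python
# Peak memory bandwidth in GB/s = data rate per pin (Gbps) * bus width (bits) / 8
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# 21 Gbps is the announced GDDR6X speed on the RTX 4080 12GB; the narrower
# buses below are speculative lower-tier configurations.
for bus_width in (192, 128, 96, 64):
    print(f"{bus_width:>3}-bit @ 21 Gbps -> {peak_bandwidth_gbs(21, bus_width):.0f} GB/s")
# 192-bit -> 504 GB/s, 128-bit -> 336 GB/s, 96-bit -> 252 GB/s, 64-bit -> 168 GB/s
```

Each step down the stack cuts peak bandwidth proportionally, which is why narrow buses on mainstream SKUs are worth worrying about.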
And for game developers: if players see no meaningful performance-per-dollar uplift from newer generations, why would they upgrade their GPUs at all?
The GTX 1060 has been around for 6 years now, and a huge share of players are still using it.
You were using the GTX 1070 as your performance benchmark back in 2019,
but many players today are still on GPUs slower than what you tested three years ago.
Not to mention Nvidia spent only about three minutes of the entire GTC keynote on PC GPU products. They really don't care about the PC market anymore.
Nvidia has been milking consumers since RTX began.
They've just gotten bolder because it's been proven that people will buy anything they offer regardless.
Nothing new here. It’s all been going to hell for years. They’ve been watching everything burn while collecting record profits from crypto miners. The same’s been happening to pretty much the entirety of the western IT sphere.
It’s not all bad, though. Modern games are almost exclusively trash, making these technologies worthless to everyone except for the aforementioned crypto miners.
To make it worse, people there have short memories.
The next generation of Ryzen CPUs is being hailed for being cheaper than the current gen, which, everyone seems to have forgotten, had extremely inflated prices to begin with.
Then there's Nvidia raising wattages every generation: the GTX 1080 was 180 W; the 3060 is 170 W.
Well, if you have no impulse control, you’ll be paying NVIDIA every 2-3 years. That’s why the stats are really meaningless until the broader game industry gets on board with the fancy technology. I know they usually have a flagship game or three at launch, but a LOT of people are finally becoming wary of AAA releases, so I doubt that practice can continue in the same manner for much longer. I still don’t know how many games actually use RTX, but at launch it was only 1 or 2, IIRC.
I made my 1080 Ti last about 3.5 years. Since prices have started to return to sanity, I'm “upgrading” to the 3080 Ti. There's still a $500 USD difference between the 90 and the 80… damned if I know what I'm going to play that would need the 90, but such is the graphics card marketplace.
With Bitcoin mining now on the radar of power utilities, I suspect the major market for GPUs will remain the render farms that the movie-based CGI outfits use, with bitcoin mining moving to wherever the utility costs are the lowest.
Just so no one starts making premature declarations about PC gaming: if I had 100 Enlisted Gold for every time I've heard “the PC is dead!”, I could buy everyone on the forum a premium squad.
The other variable is that AMD is far from out of the graphics game; if NVIDIA cedes the PC GPU market to AMD, there's a good chance they may never get it back.
Sad that EVGA has ended its partnership with NVIDIA. No more GPUs from them. I'll probably buy a 3070 or 3080; I'm not gonna use my life savings to buy a GPU. On top of that, have you seen the prices for Zen 4? A nice system is gonna set you back $2,000+.
idk why you say it's a disaster… they say you get 2x the performance of the 3080 Ti, and the 3080 Ti doesn't have that much of a performance difference from the 3090 Ti.
Not really. The xx90 before just didn't make sense compared to the xx80 products: you paid double for 10% more performance. Now you pay almost double for 20%? idk what the exact performance difference will be without independent testing.
Specs don't tell you everything. They may matter in compute workloads, but in games they translate differently: double the CUDA cores may only translate to a 10-20% increase in fps. For the general consumer, they just hope to produce exactly the effect you're describing now: that anything below the 4090 isn't worth buying and that you must buy it.
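As a rough illustration of why doubling cores doesn't double fps, here's a toy model in the spirit of Amdahl's law; the 30% “shader-bound” share of frame time is an assumption picked to land in the 10-20% range mentioned above, not data from any real game:

```python
# Toy model: only the shader-bound share of frame time speeds up with more cores;
# the rest (CPU, memory, fixed-function work) stays the same.
def fps_after_core_scaling(base_fps: float, shader_bound_share: float, core_ratio: float) -> float:
    frame_time = 1.0 / base_fps
    shader_time = frame_time * shader_bound_share
    other_time = frame_time - shader_time
    return 1.0 / (shader_time / core_ratio + other_time)

# Assume a game at 100 fps where 30% of frame time scales with shader throughput.
print(round(fps_after_core_scaling(100, 0.30, 2.0)))  # ~118 fps: 2x cores -> ~18% more fps
```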
Well, you can thank the shitty GPU market of the last 2 years. When they saw people willing to pay $3,000 for flagship products, they just increased prices. TSMC has also raised manufacturing prices, and there's nasty inflation for numerous reasons. Also, the 4080s are $200 and $500 more expensive than the 3080; they're doing heavy segmentation, so I guess there should be a real performance gap this time, unlike with the past 3080.
You know that memory bandwidth depends on the memory configuration and not on the GPU core, right? So depending on the memory setup (bus width and speed), you get higher or lower bandwidth.
Did you even watch GTC? They said the 4080 has double the performance of the 3080 Ti, and the 3080 Ti is only slightly behind the 3090 and 3090 Ti.
People don't understand that they don't need the shiniest new toys every 2 years to keep their PC playing games, especially f2p games that are optimized for a wide range of toasters.
Unless you're doing professional work where time = money, a mid-range PC can give you 5-10 years of usability before you need to upgrade.
I'm personally going to take a chance on Intel for my next GPU. One less sale lining Nvidia's pockets.
They have done wrong by their AIBs for years; I've personally written many press releases about their practices and marketing.
They did make great GPUs on and off over the years, but lately things have gotten extreme.
When a company with 50% market share in NA gets ticked off enough to axe the product line that makes up 70%+ of its revenue, that's a big deal.
I don't care what this card does. It's not worth supporting them anymore.
I could rant for hours; I've been at a few releases and behind-the-scenes events, and there is so much more to say.
One of the few brave ones… Personally, I wouldn't touch it with a 10-foot pole. They don't have optimized DX9 and DX11 drivers; only DX12 games are optimized. That means older games will perform like shit and won't use half the potential the card has. Not to mention the number of bugs, since it's a first-generation card.
Unless you work in a setting where you need the AV1 encoder, I wouldn't recommend Intel. Their top card is only comparable to a 3060, I think, and that's in best-case scenarios. Currently the best price/performance is still with AMD.
You have to be a bit brave. I know it's going to be a Gen 1 card that will have issues.
I don't care.
I'd rather support it for the sake of diversity in the market and work with their driver development team than keep things as they are right now.
AMD might offer a good price/performance ratio, but their market share is smaller and they have their own issues (as many people report on my forums).
For me, I want that third contender we used to have so many years ago, before it was gobbled up.
I want to take them out at the knees a little and give Intel a foothold.
With three in the market, it might help equalize things, especially if Intel prices it correctly.
Edit, further to all of that: aiming for the 3060 in terms of performance is perfect. That is the market sweet spot. Halo cards are cool, but mid-range cards are the volume movers. That is where you need to take the fight. AMD knew this for years, and it's probably why the ex-AMD graphics head is now doing the same for Intel.
Well, the 3060 would have been a good target if they had launched a year ago. But they'll probably only come out in limited quantities over the next half year, by which time the 4060 will be available (4090 in October, 4080 in November, the 4070 probably around January, and the 4060 anywhere between January and May).
So Intel is lagging a generation behind mainstream Nvidia/AMD products and will have driver problems (especially with older titles), but it has a beastly AV1 encoder that would be good for professional use cases. If we were still in the GPU shortage, every card would sell despite all those problems; now they'll mostly end up in data centers and office workstations.
Yes, it would be great if there were a third contender, but I don't see Intel catching up with Nvidia and AMD for years to come. AMD was known for shitty drivers even with decades of experience, so idk how Intel would fix theirs in a short period of time. Drivers need years of optimization across loads of older games, which Nvidia and AMD have and which Intel, as a new player in the market, doesn't.
Interestingly enough, Intel is actually the largest supplier of GPUs, because the majority of their volume chips have an iGPU on die. So their driver teams are quite mature. Now they just need to scale that up into an industry of discrete cards. It's new territory for them, but they have deep roots in the industry.
As well, yes, this card is a generation behind, but that was always going to be the case. Look what happened with Nvidia around the DX11 transition: they had mighty struggles getting their 4xx-series (Fermi) cards into production and to market, and they were well behind AMD. But given time, development, and refinement, they overcame that and are now ahead for all intents and purposes.
It can happen faster for Intel because they have a vast array of resources to put into this project now that they have something that physically exists and works as intended. Plus, the company has the motivation to see it through this time, unlike with the failed Larrabee project.
Probably as I type this, the next generation of Intel cards is already being taped out and put into some form of risk production.
As for the DX thing: that was a calculated decision.
Devote resources to the past, or focus them on the future? I personally feel that, from a corporate perspective, they made the right call in supporting the latest DX version first. They can worry about improving support for older APIs later, as required, once these cards have been in the wild for a while.
And as for professional use: that is a market dominated by Quadro, yet Intel could take Nvidia's door clean off its hinges if they get these cards into the right hands and win that market back.
If EVGA had any sense at all, they’d be calling Intel to ink a partnership like PNY has with Nvidia (workstation cards) and hitting them where it hurts.
Time will tell on this one, but I see a promising future if they stay the course and don’t pull a 14nm+++++++++++++++++ mistake again.
Well, drivers for iGPUs and discrete cards are vastly different. Considering the cards should have launched in Q1 2022, the constant delays and leaked reports show that their drivers are immature and that fixing them caused significant delays. A month ago Gamers Nexus made a video about 43 driver bugs, and the cards aren't even widely distributed yet. I'd imagine hundreds if not thousands of bugs will be discovered as they interact with different hardware and software configurations once they become widespread (and becoming widespread will be hard, considering the competition).
Intel GPUs would need to be significantly cheaper than their more mature counterparts for anyone to even consider them; otherwise buyers are paying full price to do beta testing for Intel.